# 🚀 gte-qwen2-7B-instruct

This model is based on Alibaba-NLP/gte-Qwen2-7B-instruct, which has shown excellent performance across multiple MTEB benchmark tasks, including classification, retrieval, clustering, reranking, and STS (semantic textual similarity).
## ✨ Features

- **Rich tags**: mteb, sentence-transformers, transformers, Qwen2, sentence-similarity, TensorBlock, GGUF.
- **Multi-task performance**: Demonstrates strong results across classification, retrieval, clustering, reranking, and STS tasks.
## 📚 Documentation

### Model Information

| Property | Details |
|----------|---------|
| Model Type | gte-qwen2-7B-instruct |
| Base Model | Alibaba-NLP/gte-Qwen2-7B-instruct |
| License | apache-2.0 |
### Performance Metrics

#### 1. Classification Tasks

| Dataset | Accuracy | AP | F1 |
|---------|----------|----|----|
| MTEB AmazonCounterfactualClassification (en) | 91.31343283582089 | 67.64251402604096 | 87.53372530755692 |
| MTEB AmazonPolarityClassification | 97.497825 | 96.30329547047529 | 97.49769793778039 |
| MTEB AmazonReviewsClassification (en) | 62.564 | - | 60.975777935041066 |
| MTEB Banking77Classification | 87.56818181818181 | - | 87.25826722019875 |
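The classification scores above are accuracy, average precision (AP), and F1, reported as percentages. As a quick illustration of how accuracy and a macro-averaged F1 are computed, here is a minimal, self-contained sketch on toy labels (the labels and resulting values are hypothetical, not taken from MTEB):

```python
def accuracy(y_true, y_pred):
    """Fraction of predictions that match the gold labels."""
    return sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

def macro_f1(y_true, y_pred):
    """Unweighted mean of per-class F1 scores."""
    labels = sorted(set(y_true) | set(y_pred))
    f1s = []
    for c in labels:
        tp = sum(t == c and p == c for t, p in zip(y_true, y_pred))
        fp = sum(t != c and p == c for t, p in zip(y_true, y_pred))
        fn = sum(t == c and p != c for t, p in zip(y_true, y_pred))
        precision = tp / (tp + fp) if tp + fp else 0.0
        recall = tp / (tp + fn) if tp + fn else 0.0
        f1s.append(2 * precision * recall / (precision + recall)
                   if precision + recall else 0.0)
    return sum(f1s) / len(f1s)

# Hypothetical gold labels and predictions for a 3-class task.
y_true = [0, 0, 1, 1, 1, 2]
y_pred = [0, 1, 1, 1, 0, 2]
print(round(accuracy(y_true, y_pred) * 100, 2))  # 66.67
print(round(macro_f1(y_true, y_pred) * 100, 2))  # 72.22
```

Note that MTEB reports F1 with its own averaging conventions per dataset; this sketch uses macro averaging only to show the mechanics.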
#### 2. Retrieval Tasks

| Dataset | MAP@1 | MAP@10 | MAP@100 | MAP@1000 | MRR@1 | MRR@10 | MRR@100 | MRR@1000 | NDCG@1 | NDCG@10 | NDCG@100 | NDCG@1000 | Precision@1 | Precision@10 | Precision@100 | Precision@1000 | Recall@1 | Recall@10 | Recall@100 | Recall@1000 |
|---------|-------|--------|---------|----------|-------|--------|---------|----------|--------|---------|----------|-----------|-------------|--------------|---------------|----------------|----------|-----------|------------|-------------|
| MTEB ArguAna | 36.486 | 54.842 | 55.207 | 55.207 | 37.34 | 55.143 | 55.509 | 55.509 | 36.486 | 64.273 | 65.662 | 65.662 | 36.486 | 9.395 | 0.996 | 0.1 | 36.486 | 93.954 | 99.644 | 99.644 |
| MTEB CQADupstackAndroidRetrieval | 33.997 | 48.176 | 49.82 | 49.924 | 42.06 | 53.726 | 54.398 | 54.416 | 42.06 | 55.575 | 60.744 | 61.857 | 42.06 | 11.102 | 1.73 | 0.218 | 33.997 | 70.359 | 91.642 | 97.977 |
| MTEB CQADupstackEnglishRetrieval | 35.884 | 48.14 | 49.5 | 49.63 | 44.459 | 53.751 | 54.378 | 54.415 | 44.459 | 54.157 | 58.362 | 60.178 | 44.459 | 10.248 | 1.589 | 0.207 | 35.884 | 64.798 | 82.345 | 93.267 |
| MTEB CQADupstackGamingRetrieval | 39.383 | 53.714 | 54.838 | 54.878 | 45.016 | 56.732 | 57.411 | 57.431 | 45.016 | 60.228 | 64.277 | 65.07 | 45.016 | 9.937 | 1.288 | 0.139 | 39.383 | 76.175 | 93.02 | 98.609 |
| MTEB CQADupstackGisRetrieval | 27.426 | 37.397 | 38.61 | 38.678 | 29.944 | 39.654 | 40.638 | 40.691 | 29.944 | 43.094 | 48.789 | 50.34 | 29.944 | 6.78 | 1.024 | 0.118 | 27.426 | 58.464 | 84.193 | 95.52 |
| MTEB CQADupstackMathematicaRetrieval | 19.721 | 31.604 | 32.972 | 33.077 | 25.0 | 35.843 | 36.785 | 36.842 | 25.0 | 38.606 | 44.272 | 46.527 | 25.0 | 6.321 | 0.938 | 0.102 | 19.721 | 50.246 | 78.567 | 92.345 |
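MAP@k, MRR@k, NDCG@k, Precision@k, and Recall@k are standard ranked-retrieval metrics, shown above as percentages. A minimal sketch of NDCG@k and Precision@k over a single toy ranking with binary relevance (the ranking is hypothetical, not from MTEB):

```python
import math

def dcg_at_k(relevances, k):
    """Discounted cumulative gain over the top-k ranked results."""
    return sum(rel / math.log2(i + 2) for i, rel in enumerate(relevances[:k]))

def ndcg_at_k(relevances, k):
    """DCG normalized by the DCG of an ideal (relevance-sorted) ranking."""
    ideal_dcg = dcg_at_k(sorted(relevances, reverse=True), k)
    return dcg_at_k(relevances, k) / ideal_dcg if ideal_dcg else 0.0

def precision_at_k(relevances, k):
    """Fraction of the top-k results that are relevant."""
    return sum(1 for rel in relevances[:k] if rel > 0) / k

# Binary relevance of the top 10 retrieved documents for one query.
ranking = [1, 0, 1, 1, 0, 0, 1, 0, 0, 0]
print(round(ndcg_at_k(ranking, 10) * 100, 2))
print(precision_at_k(ranking, 10) * 100)  # 40.0
```

MAP and MRR average per-query scores over the whole query set, so the benchmark numbers aggregate many such rankings.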
#### 3. Clustering Tasks

| Dataset | V-Measure |
|---------|-----------|
| MTEB ArxivClusteringP2P | 56.461169803700564 |
| MTEB ArxivClusteringS2S | 51.73600434466286 |
| MTEB BiorxivClusteringP2P | 50.09239610327673 |
| MTEB BiorxivClusteringS2S | 46.64733054606282 |
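V-measure is the harmonic mean of homogeneity (each cluster contains members of a single class) and completeness (each class is assigned to a single cluster), both defined via conditional entropy. A minimal, stdlib-only sketch on a hypothetical toy clustering (not MTEB data):

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy (natural log) of a label assignment."""
    n = len(labels)
    return -sum((c / n) * math.log(c / n) for c in Counter(labels).values())

def conditional_entropy(labels, given):
    """H(labels | given): entropy of labels within each 'given' group, weighted."""
    n = len(labels)
    groups = {}
    for lab, g in zip(labels, given):
        groups.setdefault(g, []).append(lab)
    return sum(len(m) / n * entropy(m) for m in groups.values())

def v_measure(classes, clusters):
    """Harmonic mean of homogeneity and completeness."""
    h = 1 - conditional_entropy(classes, clusters) / entropy(classes) if entropy(classes) else 1.0
    c = 1 - conditional_entropy(clusters, classes) / entropy(clusters) if entropy(clusters) else 1.0
    return 2 * h * c / (h + c) if h + c else 0.0

classes  = [0, 0, 0, 1, 1, 1]   # hypothetical gold classes
clusters = [0, 0, 1, 1, 2, 2]   # hypothetical predicted clusters
print(round(v_measure(classes, clusters), 4))
```

A perfect clustering scores 1.0 even if the cluster IDs are permuted relative to the class labels, since both conditional entropies are then zero.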
#### 4. Reranking Tasks

| Dataset | MAP | MRR |
|---------|-----|-----|
| MTEB AskUbuntuDupQuestions | 67.57827065898053 | 79.08136569493911 |
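MRR (mean reciprocal rank) averages, over all queries, the reciprocal of the rank at which the first relevant result appears. A minimal sketch on hypothetical per-query relevance lists:

```python
def mrr(rankings):
    """Mean reciprocal rank over a list of binary-relevance rankings."""
    total = 0.0
    for relevances in rankings:
        for i, rel in enumerate(relevances, start=1):
            if rel:
                total += 1.0 / i  # reciprocal rank of the first relevant hit
                break
    return total / len(rankings)

# One binary-relevance list per query (hypothetical data, not MTEB).
queries = [
    [0, 1, 0, 0],  # first relevant hit at rank 2 -> 1/2
    [1, 0, 0, 0],  # rank 1 -> 1
    [0, 0, 0, 1],  # rank 4 -> 1/4
]
print(round(mrr(queries) * 100, 2))  # 58.33
```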
#### 5. STS Tasks

| Dataset | Cosine Similarity Pearson | Cosine Similarity Spearman | Euclidean Pearson | Euclidean Spearman | Manhattan Pearson | Manhattan Spearman |
|---------|---------------------------|----------------------------|-------------------|-------------------|-------------------|--------------------|
| MTEB BIOSSES | 83.53324575999243 | 81.37173362822374 | 82.19243335103444 | 81.33679307304334 | 82.38752665975699 | 81.31510583189689 |
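The STS scores measure how well the model's similarity scores (cosine, Euclidean, or Manhattan distance over sentence embeddings) correlate with human similarity judgments; Pearson captures a linear relationship while Spearman captures a monotonic one (values above are scaled by 100). A minimal, stdlib-only sketch of both correlations on hypothetical scores:

```python
import math

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def ranks(values):
    """Rank of each value (1 = smallest); no tie handling, for brevity."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    for rank, i in enumerate(order, start=1):
        r[i] = float(rank)
    return r

def spearman(x, y):
    """Spearman correlation: Pearson computed over the ranks."""
    return pearson(ranks(x), ranks(y))

model_scores = [0.91, 0.72, 0.35, 0.88, 0.12]  # hypothetical cosine similarities
human_scores = [4.8, 3.9, 1.5, 4.2, 0.7]       # hypothetical human ratings (0-5)
print(round(pearson(model_scores, human_scores), 4))
print(round(spearman(model_scores, human_scores), 4))  # 1.0
```

Here the two score lists agree in ordering, so Spearman is exactly 1.0 while Pearson is slightly lower; production STS evaluation typically uses `scipy.stats.pearsonr` and `spearmanr`, which also handle ties.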
## 📄 License

This project is licensed under the apache-2.0 license.