# 🚀 e5-small
This document presents the performance of the e5-small model on a range of natural language processing tasks: classification, retrieval, clustering, reranking, and semantic textual similarity (STS).
## 📚 Documentation

### Model Information
| Property | Details |
|----------|---------|
| Model Type | e5-small |
| Tags | mteb, Sentence Transformers, sentence-similarity, sentence-transformers |
### Results on Different Tasks

#### Classification Tasks
| Task | Dataset | Accuracy | AP | F1 |
|------|---------|----------|----|----|
| Classification | MTEB AmazonCounterfactualClassification (en) | 76.22388059701493 | 40.27466219523129 | 70.60533006025108 |
| Classification | MTEB AmazonPolarityClassification | 87.525775 | 83.51063993897611 | 87.49342736805572 |
| Classification | MTEB AmazonReviewsClassification (en) | 42.612 | - | 42.05088045932892 |
| Classification | MTEB Banking77Classification | 81.87337662337661 | - | 81.76647866926402 |
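The accuracy and F1 figures above can be illustrated with a minimal pure-Python sketch of how these classification metrics are typically computed (MTEB reports F1 as a macro average over classes). The toy labels below are hypothetical and not taken from the actual MTEB runs.

```python
# Accuracy and macro-averaged F1 over per-example predictions.

def accuracy(y_true, y_pred):
    # Fraction of examples whose predicted label matches the gold label.
    return sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

def macro_f1(y_true, y_pred):
    # Per-class F1 from true/false positives and false negatives,
    # then an unweighted mean over all classes.
    labels = set(y_true) | set(y_pred)
    f1s = []
    for c in labels:
        tp = sum(t == c and p == c for t, p in zip(y_true, y_pred))
        fp = sum(t != c and p == c for t, p in zip(y_true, y_pred))
        fn = sum(t == c and p != c for t, p in zip(y_true, y_pred))
        precision = tp / (tp + fp) if tp + fp else 0.0
        recall = tp / (tp + fn) if tp + fn else 0.0
        f1s.append(2 * precision * recall / (precision + recall)
                   if precision + recall else 0.0)
    return sum(f1s) / len(f1s)

y_true = [0, 0, 1, 1, 2]
y_pred = [0, 1, 1, 1, 2]
print(accuracy(y_true, y_pred))  # 0.8
```

MTEB multiplies these fractions by 100, which is why the table reports values like 76.22 rather than 0.7622.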
#### Retrieval Tasks
| Task | Dataset | MAP@1 | MAP@10 | MAP@100 | MAP@1000 | MRR@1 | MRR@10 | MRR@100 | MRR@1000 | NDCG@1 | NDCG@10 | NDCG@100 | NDCG@1000 | Precision@1 | Precision@10 | Precision@100 | Precision@1000 | Recall@1 | Recall@10 | Recall@100 | Recall@1000 |
|------|---------|-------|--------|---------|----------|-------|--------|---------|----------|--------|---------|----------|-----------|-------------|--------------|---------------|----------------|----------|-----------|------------|-------------|
| Retrieval | MTEB ArguAna | 23.826 | 38.269 | 39.322 | 39.344 | 24.253 | 38.425 | 39.478 | 39.5 | 23.826 | 46.693 | 51.469 | 52.002 | 23.826 | 7.383 | 0.953 | 0.099 | 23.826 | 73.826 | 95.306 | 99.431 |
| Retrieval | MTEB CQADupstackAndroidRetrieval | 32.054 | 40.7 | 41.818 | 41.96 | 38.77 | 46.15 | 46.865 | 46.925 | 38.77 | 45.778 | 50.38 | 52.923 | 38.77 | 8.269 | 1.278 | 0.178 | 32.054 | 54.947 | 74.796 | 91.409 |
| Retrieval | MTEB CQADupstackEnglishRetrieval | 29.035 | 38.007 | 39.125 | 39.252 | 36.497 | 44.077 | 44.743 | 44.79 | 36.497 | 42.986 | 47.323 | 49.624 | 36.497 | 7.834 | 1.269 | 0.178 | 29.035 | 51.06 | 69.641 | 84.49 |
| Retrieval | MTEB CQADupstackGamingRetrieval | 37.239 | 47.873 | 48.843 | 48.913 | 42.508 | 51.44 | 52.087 | 52.13 | 42.508 | 53.314 | 57.245 | 58.794 | 42.508 | 8.458 | 1.133 | 0.132 | 37.239 | 65.99 | 82.995 | 94.128 |
| Retrieval | MTEB CQADupstackGisRetrieval | 23.039 | 29.694 | 30.588 | 30.693 | 24.633 | 31.478 | 32.299 | 32.381 | 24.633 | 33.697 | 38.08 | 40.812 | 24.633 | 5.073 | 0.753 | 0.103 | 23.039 | 44.276 | 64.4 | 85.135 |
| Retrieval | MTEB CQADupstackMathematicaRetrieval | 13.595 | 19.934 | 20.966 | 21.087 | 17.662 | 24.407 | 25.385 | 25.465 | 17.662 | 24.391 | 29.681 | 32.923 | 17.662 | 4.44 | 0.82 | 0.125 | 13.595 | 29.934 | 40.966 | 41.087 |
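The retrieval metrics above can be sketched in plain Python for a single query; MTEB averages these per-query scores over the whole query set and scales by 100. The ranked list and relevance judgments below are hypothetical, and the MAP@k normalization shown is one common convention (dividing by `min(|relevant|, k)`).

```python
import math

def precision_at_k(ranked, relevant, k):
    # Fraction of the top-k results that are relevant.
    return sum(d in relevant for d in ranked[:k]) / k

def recall_at_k(ranked, relevant, k):
    # Fraction of all relevant documents found in the top k.
    return sum(d in relevant for d in ranked[:k]) / len(relevant)

def mrr_at_k(ranked, relevant, k):
    # Reciprocal rank of the first relevant result, 0 if none in top k.
    for i, d in enumerate(ranked[:k], start=1):
        if d in relevant:
            return 1.0 / i
    return 0.0

def average_precision_at_k(ranked, relevant, k):
    # Mean of precision values at each relevant position in the top k.
    hits, total = 0, 0.0
    for i, d in enumerate(ranked[:k], start=1):
        if d in relevant:
            hits += 1
            total += hits / i
    return total / min(len(relevant), k)

def ndcg_at_k(ranked, relevant, k):
    # Binary-gain DCG discounted by log2(rank + 1), normalized by the
    # ideal DCG of placing all relevant documents first.
    dcg = sum(1.0 / math.log2(i + 1)
              for i, d in enumerate(ranked[:k], start=1) if d in relevant)
    ideal = sum(1.0 / math.log2(i + 1)
                for i in range(1, min(len(relevant), k) + 1))
    return dcg / ideal

ranked = ["d3", "d1", "d7", "d2", "d5"]   # hypothetical ranking
relevant = {"d1", "d2"}                   # hypothetical gold judgments
print(precision_at_k(ranked, relevant, 5))  # 0.4
print(mrr_at_k(ranked, relevant, 5))        # 0.5
```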
#### Clustering Tasks
| Task | Dataset | V-Measure |
|------|---------|-----------|
| Clustering | MTEB ArxivClusteringP2P | 44.13995374767436 |
| Clustering | MTEB ArxivClusteringS2S | 37.13950072624313 |
| Clustering | MTEB BiorxivClusteringP2P | 35.80600542614507 |
| Clustering | MTEB BiorxivClusteringS2S | 31.86321613256603 |
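The V-measure reported for the clustering tasks is the harmonic mean of homogeneity and completeness, both derived from label entropies. A minimal pure-Python sketch, using hypothetical toy labels rather than MTEB data:

```python
import math
from collections import Counter

def entropy(labels):
    # Shannon entropy of a label assignment.
    n = len(labels)
    return -sum(c / n * math.log(c / n) for c in Counter(labels).values())

def conditional_entropy(labels, given):
    # H(labels | given): entropy of `labels` within each group of `given`,
    # weighted by group size.
    n = len(labels)
    h = 0.0
    for g in set(given):
        subset = [l for l, gv in zip(labels, given) if gv == g]
        h += len(subset) / n * entropy(subset)
    return h

def v_measure(true_labels, cluster_labels):
    h_c, h_k = entropy(true_labels), entropy(cluster_labels)
    # Homogeneity: each cluster contains only members of a single class.
    homogeneity = 1.0 if h_c == 0 else 1.0 - conditional_entropy(true_labels, cluster_labels) / h_c
    # Completeness: all members of a class end up in the same cluster.
    completeness = 1.0 if h_k == 0 else 1.0 - conditional_entropy(cluster_labels, true_labels) / h_k
    if homogeneity + completeness == 0:
        return 0.0
    return 2 * homogeneity * completeness / (homogeneity + completeness)

print(v_measure([0, 0, 1, 1], [0, 0, 1, 1]))  # 1.0 (perfect clustering)
```

A perfect clustering scores 1.0 and a clustering that mixes every class into every cluster scores 0.0; MTEB reports the score scaled by 100.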
#### Reranking Task
| Task | Dataset | MAP | MRR |
|------|---------|-----|-----|
| Reranking | MTEB AskUbuntuDupQuestions | 59.35843292105327 | 73.72312359846987 |
#### STS Task
| Task | Dataset | Cosine Similarity Pearson | Cosine Similarity Spearman | Euclidean Pearson | Euclidean Spearman | Manhattan Pearson | Manhattan Spearman |
|------|---------|---------------------------|----------------------------|-------------------|-------------------|-------------------|--------------------|
| STS | MTEB BIOSSES | 84.55140418324174 | 84.21637675860022 | 81.26069614610006 | 83.25069210421785 | 80.17441422581014 | 81.87596198487877 |
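The STS scores above correlate model similarity scores (cosine, Euclidean, or Manhattan) with human gold ratings using Pearson and Spearman correlation. A minimal pure-Python sketch over hypothetical per-pair scores (Spearman is just Pearson applied to ranks, with ties given their average rank):

```python
import math

def pearson(x, y):
    # Pearson correlation: covariance divided by the product of
    # standard deviations.
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def ranks(values):
    # 1-based ranks; tied values receive the average of their ranks.
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(x, y):
    return pearson(ranks(x), ranks(y))

model_scores = [0.1, 0.4, 0.2, 0.9]  # hypothetical cosine similarities
gold_scores = [1.0, 2.5, 2.0, 4.0]   # hypothetical human ratings
print(round(spearman(model_scores, gold_scores), 3))  # 1.0
```

Because the toy model scores rank the pairs in exactly the gold order, Spearman is 1.0 even though Pearson is slightly below 1; MTEB reports both, scaled by 100.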