# 🚀 Salesforce/SFR-Embedding-2_R
SFR-Embedding-2_R is a sentence-embedding model evaluated on the MTEB benchmark across classification, retrieval, clustering, reranking, and semantic textual similarity (STS) tasks. The tables below report its scores on each evaluated dataset.
## 📚 Documentation
### Model Information

| Property | Details |
|---|---|
| Model Name | Salesforce/SFR-Embedding-2_R |
| Tags | mteb, sentence-transformers, transformers |
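Since the card tags the model with `sentence-transformers`, a minimal usage sketch follows. It assumes the model loads through the standard `SentenceTransformer` API; SFR-style embedding models typically expect retrieval queries to carry a task instruction prefix, so the prompt format below is illustrative rather than authoritative.

```python
# Minimal sketch, assuming the standard sentence-transformers API applies.
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("Salesforce/SFR-Embedding-2_R")

# Hypothetical instruction prefix for queries; passages are encoded as-is.
task = "Given a web search query, retrieve relevant passages that answer the query"
queries = [f"Instruct: {task}\nQuery: how do sentence embeddings work?"]
passages = [
    "Sentence embeddings map text to dense vectors whose distances "
    "reflect semantic similarity."
]

query_emb = model.encode(queries, normalize_embeddings=True)
passage_emb = model.encode(passages, normalize_embeddings=True)

# With normalized embeddings, the dot product equals cosine similarity.
print(query_emb @ passage_emb.T)
```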
### Task Results

All scores are percentages.

#### Classification Tasks

| Dataset | Accuracy | AP | F1 |
|---|---|---|---|
| MTEB AmazonCounterfactualClassification (en) | 92.716 | 69.479 | 88.025 |
| MTEB AmazonPolarityClassification | 97.311 | 96.267 | 97.310 |
| MTEB AmazonReviewsClassification (en) | 61.040 | N/A | 60.786 |
| MTEB Banking77Classification | 90.023 | N/A | 89.967 |
| MTEB EmotionClassification | 93.365 | N/A | 89.895 |
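For context on what these numbers measure: MTEB classification tasks score embeddings by fitting a lightweight classifier (logistic regression) on frozen embeddings and reporting accuracy and F1 on held-out data. The sketch below mirrors that protocol with scikit-learn on synthetic features; it is not the MTEB harness itself.

```python
# Sketch of an MTEB-style classification protocol: freeze the embeddings,
# fit logistic regression, report accuracy and macro-F1. Data is synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score, f1_score

rng = np.random.default_rng(0)
train_emb, test_emb = rng.normal(size=(200, 16)), rng.normal(size=(50, 16))
train_y = rng.integers(0, 2, size=200)
test_y = rng.integers(0, 2, size=50)

clf = LogisticRegression(max_iter=100).fit(train_emb, train_y)
pred = clf.predict(test_emb)
print("accuracy:", accuracy_score(test_y, pred))
print("macro F1:", f1_score(test_y, pred, average="macro"))
```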
#### Retrieval Tasks

| Metric | MTEB ArguAna | MTEB CQADupstackRetrieval | MTEB ClimateFEVER | MTEB DBPedia | MTEB FEVER |
|---|---|---|---|---|---|
| MAP@1 | 37.767 | 28.057 | 15.609 | 10.482 | 81.948 |
| MAP@10 | 53.908 | 39.617 | 25.584 | 25.155 | 89.475 |
| MAP@100 | 54.583 | 41.007 | 27.292 | 36.606 | 89.662 |
| MAP@1000 | 54.584 | 41.114 | 27.471 | 38.617 | 89.671 |
| MRR@1 | 38.265 | 33.740 | 34.984 | 76.000 | 88.239 |
| MRR@10 | 54.120 | 44.099 | 45.737 | 82.561 | 93.212 |
| MRR@100 | 54.781 | 44.943 | 46.497 | 82.748 | 93.239 |
| MRR@1000 | 54.782 | 44.988 | 46.534 | 82.755 | 93.239 |
| NDCG@1 | 37.767 | 33.740 | 34.984 | 63.625 | 88.239 |
| NDCG@10 | 62.340 | 46.107 | 34.427 | 51.214 | 92.155 |
| NDCG@100 | 64.894 | 51.493 | 40.908 | 56.411 | 92.735 |
| NDCG@1000 | 64.914 | 53.259 | 44.118 | 63.429 | 92.866 |
| Precision@1 | 37.767 | 33.740 | 34.984 | 76.000 | 88.239 |
| Precision@10 | 8.905 | 8.460 | 10.476 | 41.975 | 32.150 |
| Precision@100 | 0.995 | 1.322 | 1.748 | 13.260 | 13.260 |
| Precision@1000 | 0.100 | 0.165 | 0.235 | 2.493 | 2.493 |
| Recall@1 | 37.767 | 28.057 | 15.609 | 10.482 | 81.948 |
| Recall@10 | 89.047 | 60.688 | 39.619 | 31.075 | 89.475 |
| Recall@100 | 99.502 | 83.744 | 61.952 | 63.119 | 89.662 |
| Recall@1000 | 99.644 | 95.623 | 79.861 | 85.323 | 89.671 |
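As a quick illustration of what the retrieval columns mean, the sketch below computes Recall@k, Precision@k, MRR@k, and binary-relevance NDCG@k for a single query. It only mirrors the standard metric definitions, not MTEB's own evaluation implementation, and the document IDs are made up.

```python
# Illustrative retrieval metrics for one query's ranked results.
import math

def metrics_at_k(ranked_ids, relevant_ids, k):
    top_k = ranked_ids[:k]
    hits = [doc in relevant_ids for doc in top_k]
    recall = sum(hits) / len(relevant_ids)
    precision = sum(hits) / k
    # MRR: reciprocal rank of the first relevant document in the top k.
    mrr = next((1.0 / (i + 1) for i, h in enumerate(hits) if h), 0.0)
    # Binary-relevance NDCG: DCG of this ranking over the ideal DCG.
    dcg = sum(h / math.log2(i + 2) for i, h in enumerate(hits))
    idcg = sum(1.0 / math.log2(i + 2) for i in range(min(len(relevant_ids), k)))
    return {"recall": recall, "precision": precision, "mrr": mrr, "ndcg": dcg / idcg}

print(metrics_at_k(["d3", "d1", "d7"], {"d1", "d9"}, k=3))
# -> recall 0.5, precision ~0.33, MRR 0.5, NDCG ~0.39
```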
#### Clustering Tasks

| Dataset | V-Measure |
|---|---|
| MTEB ArxivClusteringP2P | 54.024 |
| MTEB ArxivClusteringS2S | 48.817 |
| MTEB BiorxivClusteringP2P | 50.759 |
| MTEB BiorxivClusteringS2S | 46.573 |
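V-measure is the harmonic mean of homogeneity and completeness of predicted clusters against gold labels, and it is invariant to how clusters are numbered. A tiny illustration with scikit-learn:

```python
# V-measure compares cluster assignments to gold labels. MTEB clusters the
# embeddings first (e.g. with k-means); here the assignments are supplied
# directly for illustration.
from sklearn.metrics import v_measure_score

gold = [0, 0, 1, 1]
pred = [1, 1, 0, 0]  # same grouping, different label names
print(v_measure_score(gold, pred))  # 1.0
```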
#### Reranking Task

| Dataset | MAP | MRR |
|---|---|---|
| MTEB AskUbuntuDupQuestions | 66.715 | 79.072 |
#### STS Task

| Dataset | Cosine Pearson | Cosine Spearman | Euclidean Pearson | Euclidean Spearman | Manhattan Pearson | Manhattan Spearman |
|---|---|---|---|---|---|---|
| MTEB BIOSSES | 88.795 | 87.604 | 87.756 | 87.604 | 87.741 | 87.506 |
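The STS columns report Pearson and Spearman correlations between the model's pairwise similarity scores (cosine, or distance-based for the Euclidean/Manhattan variants) and human gold ratings. A sketch of the cosine variant, with made-up embeddings and ratings:

```python
# Sketch of STS scoring: cosine-similarity predictions correlated with gold
# ratings via Pearson and Spearman. Embeddings and ratings are synthetic.
import numpy as np
from scipy.stats import pearsonr, spearmanr

def cosine(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

rng = np.random.default_rng(0)
emb_a = rng.normal(size=(5, 8))   # first sentence of each pair
emb_b = rng.normal(size=(5, 8))   # second sentence of each pair
gold = [4.5, 3.0, 2.5, 1.0, 0.5]  # hypothetical human ratings

pred = [cosine(a, b) for a, b in zip(emb_a, emb_b)]
pearson, _ = pearsonr(pred, gold)
spearman, _ = spearmanr(pred, gold)
print("Pearson:", pearson, "Spearman:", spearman)
```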