# 🚀 SGPT-125M-weightedmean-nli-bitfit
SGPT-125M-weightedmean-nli-bitfit is a model for sentence-similarity tasks. It has been evaluated on a range of text classification, retrieval, clustering, and related tasks; the detailed results on each dataset are listed below.
## 📚 Documentation
### Model Information

| Attribute | Details |
|------|------|
| Model Type | Model for sentence similarity |
| Tags | sentence-transformers, feature-extraction, sentence-similarity, mteb |
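Since the tags mark this as a sentence-transformers checkpoint, it can be loaded directly with that library. Below is a minimal usage sketch; the Hub repository id `Muennighoff/SGPT-125M-weightedmean-nli-bitfit` is an assumption and should be replaced with wherever the weights are actually hosted.

```python
# Minimal sketch, assuming the weights are hosted on the Hugging Face Hub under
# "Muennighoff/SGPT-125M-weightedmean-nli-bitfit" (repo id is an assumption).
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("Muennighoff/SGPT-125M-weightedmean-nli-bitfit")

sentences = [
    "A man is eating food.",
    "A man is eating a piece of bread.",
    "The girl is carrying a baby.",
]

# Encode sentences into fixed-size embeddings; the pooling configuration saved
# with the model (weighted-mean pooling) is applied automatically.
embeddings = model.encode(sentences, convert_to_tensor=True)

# Cosine similarity between the first sentence and the remaining ones.
scores = util.cos_sim(embeddings[0], embeddings[1:])
print(scores)
```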
### Evaluation Results

#### Classification Tasks
| Dataset | Language | Accuracy | AP | F1 |
|------|------|------|------|------|
| MTEB AmazonCounterfactualClassification | en | 65.88059701492537 | 28.685493163579785 | 59.79951005816335 |
| MTEB AmazonCounterfactualClassification | de | 59.07922912205568 | 73.91887421019034 | 56.6316368658711 |
| MTEB AmazonCounterfactualClassification | en-ext | 64.91754122938531 | 16.360681214864226 | 53.126592061523766 |
| MTEB AmazonCounterfactualClassification | ja | 56.423982869378996 | 12.143003571907899 | 45.76363777987471 |
| MTEB AmazonPolarityClassification | default | 74.938225 | 69.58187110320567 | 74.72744058439321 |
| MTEB AmazonReviewsClassification | en | 35.098 | - | 34.73265651435726 |
| MTEB AmazonReviewsClassification | de | 24.516 | - | 24.21748200448397 |
| MTEB AmazonReviewsClassification | es | 29.097999999999995 | - | 28.620040162757093 |
| MTEB AmazonReviewsClassification | fr | 27.395999999999997 | - | 27.146888644986284 |
| MTEB AmazonReviewsClassification | ja | 21.724 | - | 21.37230564276654 |
| MTEB AmazonReviewsClassification | zh | 23.976 | - | 23.741137981755482 |
| MTEB Banking77Classification | default | 74.67857142857142 | - | 74.61743413995573 |
#### Retrieval Tasks
| Dataset | MAP@1 | MAP@10 | MAP@100 | MAP@1000 | MAP@3 | MAP@5 | NDCG@1 | NDCG@10 | NDCG@100 | NDCG@1000 | NDCG@3 | NDCG@5 | Precision@1 | Precision@10 | Precision@100 | Precision@1000 | Precision@3 | Precision@5 | Recall@1 | Recall@10 | Recall@100 | Recall@1000 | Recall@3 | Recall@5 |
|------|------|------|------|------|------|------|------|------|------|------|------|------|------|------|------|------|------|------|------|------|------|------|------|------|
| MTEB ArguAna | 13.442000000000002 | 24.275 | 25.588 | 25.659 | 20.092 | 22.439999999999998 | 13.442000000000002 | 31.04 | 37.529 | 39.348 | 22.342000000000002 | 26.595999999999997 | 13.442000000000002 | 5.299 | 0.836 | 0.098 | 9.625 | 7.852 | 13.442000000000002 | 52.986999999999995 | 83.64200000000001 | 97.795 | 28.876 | 39.26 |
| MTEB CQADupstackAndroidRetrieval | 18.221999999999998 | 24.506 | 25.611 | 25.758 | 22.264999999999997 | 23.698 | 23.033 | 28.719 | 33.748 | 37.056 | 25.240000000000002 | 27.12 | 23.033 | 5.408 | 1.004 | 0.158 | 11.874 | 8.927 | 18.221999999999998 | 36.355 | 58.724 | 81.33500000000001 | 26.334000000000003 | 31.4 |
| MTEB CQADupstackEnglishRetrieval | 12.058 | 16.051000000000002 | 16.772000000000002 | 16.871 | 14.78 | 15.5 | 15.35 | 18.804000000000002 | 22.346 | 25.007 | 16.768 | 17.692 | 15.35 | 3.51 | 0.664 | 0.11100000000000002 | 7.983 | 5.656 | 12.058 | 23.644000000000002 | 39.76 | 58.56 | 17.541999999999998 | 20.232 |
| MTEB CQADupstackGamingRetrieval | 21.183 | 28.9 | 29.858 | 29.953999999999997 | 26.58 | 27.912 | 24.765 | 33.339999999999996 | 37.997 | 40.416000000000004 | 29.044999999999998 | 31.121 | 24.765 | 5.599 | 0.8699999999999999 | 0.11499999999999999 | 13.270999999999999 | 9.367 | 21.183 | 43.875 | 65.005 | 83.017 | 32.232 | 37.308 |
| MTEB CQADupstackGisRetrieval | 11.350999999999999 | 14.953 | 15.623000000000001 | 15.716 | 13.603000000000002 | 14.343 | 12.429 | 17.319000000000003 | 20.990000000000002 | 23.899 | 14.605 | 15.89 | 12.429 | 2.701 | 0.48700000000000004 | 0.078 | 6.026 | 4.3839999999999995 | 11.350999999999999 | 23.536 | 40.942 | 64.05 | 16.195 | 19.264 |
| MTEB CQADupstackMathematicaRetrieval | 8.08 | 11.691 | 12.312 | 12.439 | 10.344000000000001 | 10.996 | 10.697 | 14.48 | 18.160999999999998 | 21.886 | 11.872 | 12.834000000000001 | 10.697 | 2.811 | 0.551 | 0.10200000000000001 | 5.804 | 4.154 | 8.08 | 20.235 | 37.525999999999996 | 65.106 | 12.803999999999998 | 15.498999999999999 |
| MTEB CQADupstackPhysicsRetrieval | 13.908999999999999 | 19.256 | 20.286 | 20.429 | 17.399 | 18.398999999999997 | 17.421 | 23.105999999999998 | 28.128999999999998 | 31.480999999999998 | 19.789 | 21.237000000000002 | 17.421 | 4.331 | 0.839 | 0.131 | 9.4 | 6.776 | 13.908999999999999 | 31.086999999999996 | 52.946000000000005 | 76.546 | 21.351 | 25.264999999999997 |
| MTEB CQADupstackProgrammersRetrieval | 12.598 | 17.304 | 18.209 | 18.328 | 15.784 | 16.669999999999998 | 15.867999999999999 | 20.623 | 25.093 | 28.498 | 17.912 | 19.198 | 15.867999999999999 | 3.7670000000000003 | 0.716 | 0.11800000000000001 | 8.638 | 6.21 | 12.598 | 27.144000000000002 | 46.817 | 71.86099999999999 | 19.231 | 22.716 |
| MTEB CQADupstackRetrieval | 12.738416666666666 | 17.235916666666668 | 18.063333333333333 | 18.18433333333333 | 15.74775 | 16.57825 | 15.487416 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
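For readers unfamiliar with the cut-off metrics in this table, the sketch below illustrates how a single NDCG@10 value can be computed for one query with scikit-learn. The relevance labels and model scores are made up for illustration and are not taken from any of the datasets above.

```python
# Illustrative only: an NDCG@10-style score for one query, computed with
# scikit-learn; the relevance judgements and scores below are invented.
import numpy as np
from sklearn.metrics import ndcg_score

# Graded relevance of 12 candidate documents for one query (ground truth).
true_relevance = np.asarray([[3, 2, 0, 0, 1, 0, 2, 0, 0, 0, 1, 0]])
# Similarity scores the model assigned to the same 12 documents.
model_scores = np.asarray([[0.9, 0.3, 0.6, 0.1, 0.8, 0.2, 0.7, 0.05, 0.4, 0.15, 0.5, 0.25]])

print("NDCG@10:", ndcg_score(true_relevance, model_scores, k=10))
```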
#### Clustering Tasks

| Dataset | V-measure |
|------|------|
| MTEB ArxivClusteringP2P | 34.742482477870766 |
| MTEB ArxivClusteringS2S | 24.67870651472156 |
| MTEB BlurbsClusteringS2S | 8.00311862863495 |
| MTEB BiorxivClusteringP2P | 28.93427045246491 |
| MTEB BiorxivClusteringS2S | 23.080939123955474 |
#### Reranking Tasks

| Dataset | MAP | MRR |
|------|------|------|
| MTEB AskUbuntuDupQuestions | 52.63439984994702 | 65.75704612408214 |
#### STS Tasks

| Dataset | Cos Sim Pearson | Cos Sim Spearman | Euclidean Pearson | Euclidean Spearman | Manhattan Pearson | Manhattan Spearman |
|------|------|------|------|------|------|------|
| MTEB BIOSSES | 72.78000135012542 | 70.92812216947605 | 77.1169214949292 | 77.10175681583313 | 76.84527031837595 | 77.0704308008438 |
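The STS columns are correlation coefficients between human similarity ratings and the similarity of the two sentence embeddings under different distance functions. The sketch below shows how the cosine-similarity Pearson/Spearman pair could be computed; the embeddings and gold scores are placeholders, not data from BIOSSES.

```python
# Illustrative sketch: Pearson and Spearman correlation between gold STS ratings
# and cosine similarities of sentence-embedding pairs. All inputs are placeholders.
import numpy as np
from scipy.stats import pearsonr, spearmanr

rng = np.random.default_rng(0)
emb_a = rng.normal(size=(100, 768))   # embeddings of the first sentence in each pair
emb_b = rng.normal(size=(100, 768))   # embeddings of the second sentence in each pair
gold = rng.uniform(0, 5, size=100)    # human similarity ratings

# Cosine similarity of each embedding pair.
cos = np.sum(emb_a * emb_b, axis=1) / (
    np.linalg.norm(emb_a, axis=1) * np.linalg.norm(emb_b, axis=1)
)

print("Cos Sim Pearson: ", pearsonr(gold, cos)[0])
print("Cos Sim Spearman:", spearmanr(gold, cos)[0])
```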
#### Bitext Mining Tasks

| Dataset | Accuracy | F1 | Precision | Recall |
|------|------|------|------|------|
| MTEB BUCC (de-en) | 1.0960334029227559 | 1.0925539318023658 | 1.0908141962421711 | 1.0960334029227559 |
| MTEB BUCC (fr-en) | 0.02201188641866608 | 0.02201188641866608 | 0.02201188641866608 | 0.02201188641866608 |
| MTEB BUCC (ru-en) | 0.0 | 0.0 | 0.0 | 0.0 |
| MTEB BUCC (zh-en) | 0.0 | 0.0 | 0.0 | 0.0 |
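All of the figures above follow the MTEB naming scheme, so scores of this kind can in principle be reproduced with the `mteb` benchmark package. A rough sketch follows, using the classic `MTEB(tasks=[...])` interface with a small illustrative subset of tasks and the same assumed repository id as earlier; newer `mteb` releases may prefer selecting tasks via `mteb.get_tasks`.

```python
# Rough reproduction sketch using the MTEB benchmark package. The task list is a
# small subset chosen for illustration; the repo id is assumed as before.
from mteb import MTEB
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("Muennighoff/SGPT-125M-weightedmean-nli-bitfit")

evaluation = MTEB(tasks=["Banking77Classification", "BIOSSES", "ArguAna"])
evaluation.run(model, output_folder="results/sgpt-125m-weightedmean-nli-bitfit")
```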