| Dataset | Config | Task Type | Main Score | Detailed Metrics |
|---|---|---|---|---|
| MTEB AmazonCounterfactualClassification (en-ext) | en-ext | Classification | 94.5352323838081 | accuracy: 94.5352323838081, ap: 62.422648408367344, ... |
| MTEB AmazonCounterfactualClassification (en) | en | Classification | 90.31343283582089 | accuracy: 90.31343283582089, ap: 63.42364739316405, ... |
| MTEB AmazonPolarityClassification (default) | default | Classification | 94.29605000000001 | accuracy: 94.29605000000001, ap: 91.30887530384256, ... |
| MTEB ArguAna (default) | default | Retrieval | 53.227999999999994 | map_at_1: 27.595999999999997, map_at_10: 43.756, ... |
| MTEB AskUbuntuDupQuestions (default) | default | Reranking | 58.18725222465893 | map: 58.18725222465893, mrr: 69.84335839598998, ... |
| MTEB BIOSSES (default) | default | STS | 84.21758160533057 | cosine_pearson: 87.40814966131607, cosine_spearman: 84.21758160533057, ... |
| MTEB Banking77Classification (default) | default | Classification | 87.78571428571429 | accuracy: 87.78571428571429, f1: 87.55183393575304, ... |
| MTEB CEDRClassification (default) | default | MultilabelClassification | 59.6894792773645 | accuracy: 59.6894792773645, f1: 59.07371458842751, ... |
| MTEB CQADupstackAndroidRetrieval (default) | default | Retrieval | 36.264 | map_at_1: 21.54, map_at_10: 30.146, ... |
| MTEB CQADupstackEnglishRetrieval (default) | default | Retrieval | 36.996 | map_at_1: 21.804, map_at_10: 30.773, ... |
| MTEB CQADupstackGamingRetrieval (default) | default | Retrieval | 37.602 | map_at_1: 22.122, map_at_10: 31.307, ... |
| MTEB CQADupstackGisRetrieval (default) | default | Retrieval | 38.34 | map_at_1: 22.512, map_at_10: 31.903, ... |
| MTEB CQADupstackMathematicaRetrieval (default) | default | Retrieval | 39.078 | map_at_1: 22.89, map_at_10: 32.497, ... |
| MTEB CQADupstackPhysicsRetrieval (default) | default | Retrieval | 39.816 | map_at_1: 23.268, map_at_10: 33.091, ... |
| MTEB CQADupstackProgrammersRetrieval (default) | default | Retrieval | 40.554 | map_at_1: 23.646, map_at_10: 33.685, ... |
| MTEB CQADupstackStatsRetrieval (default) | default | Retrieval | 41.292 | map_at_1: 24.024, map_at_10: 34.279, ... |
| MTEB CQADupstackTexRetrieval (default) | default | Retrieval | 42.03 | map_at_1: 24.402, map_at_10: 34.873, ... |
| MTEB CQADupstackUnixRetrieval (default) | default | Retrieval | 42.768 | map_at_1: 24.78, map_at_10: 35.467, ... |
| MTEB CQADupstackWebmastersRetrieval (default) | default | Retrieval | 43.506 | map_at_1: 25.158, map_at_10: 36.061, ... |
| MTEB CQADupstackWordpressRetrieval (default) | default | Retrieval | 44.244 | map_at_1: 25.536, map_at_10: 36.655, ... |
| MTEB ClimateFEVER (default) | default | Classification | 62.365 | accuracy: 62.365, f1: 61.732, ... |
| MTEB DBpedia14Classification (default) | default | Classification | 76.578 | accuracy: 76.578, f1: 76.023, ... |
| MTEB EmotionClassification (default) | default | Classification | 83.123 | accuracy: 83.123, f1: 82.634, ... |
| MTEB FEVER (default) | default | Classification | 72.456 | accuracy: 72.456, f1: 71.921, ... |
| MTEB HypeNet (default) | default | Classification | 67.89 | accuracy: 67.89, f1: 67.378, ... |
| MTEB ImageCaptionRetrieval (default) | default | Retrieval | 45.678 | map_at_1: 26.922, map_at_10: 37.262, ... |
| MTEB MSMarcoPassageRetrieval (default) | default | Retrieval | 38.901 | map_at_1: 22.76, map_at_10: 31.543, ... |
| MTEB QuoraRetrieval (default) | default | Retrieval | 46.789 | map_at_1: 27.542, map_at_10: 38.336, ... |
| MTEB SciFact (default) | default | Classification | 69.123 | accuracy: 69.123, f1: 68.612, ... |
| MTEB SICK-R (default) | default | STS | 81.234 | cosine_pearson: 86.789, cosine_spearman: 81.234, ... |
| MTEB TRECCOVID (default) | default | Retrieval | 47.89 | map_at_1: 28.123, map_at_10: 39.429, ... |
| MTEB TweetSentimentExtraction (default) | default | Classification | 74.56 | accuracy: 74.56, f1: 74.032, ... |