# 🚀 speed-embedding-7b-instruct
This document presents performance metrics for the speed-embedding-7b-instruct model on MTEB tasks spanning classification, retrieval, clustering, reranking, and semantic textual similarity (STS).
## 📚 Documentation
### Model Information

| Property | Details |
|----------|---------|
| Model Name | speed-embedding-7b-instruct |
### Task Performance Metrics

#### Classification Tasks
| Dataset | Accuracy | AP | F1 |
|---------|----------|----|----|
| MTEB AmazonCounterfactualClassification (en) | 76.67 | 39.07 | 70.25 |
| MTEB AmazonPolarityClassification | 96.18 | 94.84 | 96.18 |
| MTEB AmazonReviewsClassification (en) | 56.28 | - | 55.45 |
| MTEB Banking77Classification | 88.64 | - | 88.59 |
| MTEB EmotionClassification | 51.03 | - | 47.08 |
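Accuracy and F1 follow their standard definitions. The sketch below is a minimal pure-Python illustration on toy labels, not the MTEB evaluation harness; it uses binary F1 for simplicity, whereas MTEB averages F1 over classes for multi-class datasets:

```python
def accuracy(y_true, y_pred):
    """Fraction of predictions that exactly match the gold label."""
    return sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

def f1(y_true, y_pred, positive=1):
    """Binary F1: harmonic mean of precision and recall for the positive class."""
    tp = sum(t == p == positive for t, p in zip(y_true, y_pred))
    fp = sum(p == positive and t != positive for t, p in zip(y_true, y_pred))
    fn = sum(t == positive and p != positive for t, p in zip(y_true, y_pred))
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return 2 * precision * recall / (precision + recall) if precision + recall else 0.0

# Toy judgements, purely illustrative
y_true = [1, 0, 1, 1, 0, 1]
y_pred = [1, 0, 0, 1, 1, 1]
```

Because F1 balances precision and recall per class, it can sit well below raw accuracy on datasets with imbalanced label distributions.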
#### Retrieval Tasks
| Dataset | NDCG@1 | NDCG@3 | NDCG@5 | NDCG@10 | NDCG@100 | MAP@1 | MAP@3 | MAP@5 | MAP@10 | MAP@100 | Recall@1 | Recall@3 | Recall@5 | Recall@10 | Recall@100 | Precision@1 | Precision@3 | Precision@5 | Precision@10 | Precision@100 | MRR@1 | MRR@3 | MRR@5 | MRR@10 | MRR@100 |
|---------|--------|--------|--------|---------|----------|-------|-------|-------|--------|---------|----------|----------|----------|-----------|------------|-------------|-------------|-------------|--------------|---------------|-------|-------|-------|--------|---------|
| MTEB ArguAna | 33.64 | 49.40 | 54.11 | 59.29 | 62.02 | 33.64 | 45.51 | 48.10 | 50.25 | 50.95 | 33.64 | 60.67 | 72.19 | 88.19 | 99.43 | 33.64 | 20.22 | 14.44 | 8.82 | 0.99 | 34.00 | 45.61 | 48.26 | 50.39 | 51.10 |
| MTEB CQADupstackRetrieval | 30.90 | 36.33 | 38.77 | 41.62 | 47.12 | 25.76 | 32.62 | 34.35 | 35.80 | 37.10 | 25.76 | 39.62 | 45.94 | 54.44 | 78.66 | 30.90 | 17.10 | 12.28 | 7.57 | 1.22 | 30.90 | 37.46 | 38.97 | 40.10 | 41.00 |
| MTEB ClimateFEVER | 38.05 | 31.84 | 33.70 | 37.77 | 45.00 | 16.68 | 23.62 | 25.81 | 28.02 | 30.06 | 16.68 | 28.34 | 34.49 | 43.47 | 67.98 | 38.05 | 23.78 | 17.85 | 11.64 | 1.94 | 38.05 | 46.76 | 48.72 | 49.98 | 50.69 |
| MTEB DBPedia | 63.25 | 54.01 | 51.50 | 49.74 | 54.75 | 10.64 | 16.73 | 20.10 | 24.57 | 35.22 | 10.64 | 17.86 | 22.64 | 30.11 | 60.93 | 75.00 | 58.08 | 50.00 | 40.35 | 12.66 | 75.00 | 80.04 | 80.78 | 81.36 | 81.58 |
| MTEB FEVER | 82.16 | 86.84 | 87.80 | 88.53 | 89.17 | 76.34 | 83.91 | 84.65 | 85.06 | 85.26 | 76.34 | 90.61 | 93.10 | 95.17 | 97.59 | 82.16 | 33.26 | 20.65 | 10.67 | 1.12 | 82.16 | 88.35 | 88.79 | 88.98 | 89.03 |
| MTEB FiQA2018 | 55.09 | 52.48 | 53.55 | 56.05 | 62.54 | 29.19 | 42.60 | 45.86 | 48.24 | 50.30 | 29.19 | 47.47 | 54.38 | 62.73 | 86.02 | 55.09 | 34.98 | 25.28 | 15.23 | 2.22 | 55.09 | 61.32 | 62.36 | 63.17 | 63.81 |
| MTEB HotpotQA | 78.87 | 70.13 | 73.02 | 75.17 | 77.98 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
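The ranking metrics above can be illustrated with a minimal pure-Python sketch (toy relevance judgements, not the MTEB evaluation harness; NDCG is shown with binary gains):

```python
import math

def ndcg_at_k(relevances, k):
    """NDCG@k for one query: `relevances` lists the relevance of each
    retrieved result in ranked order; gains are discounted by log2(rank + 1)
    and normalized by the ideal (sorted) ranking."""
    dcg = sum(rel / math.log2(i + 2) for i, rel in enumerate(relevances[:k]))
    ideal = sorted(relevances, reverse=True)
    idcg = sum(rel / math.log2(i + 2) for i, rel in enumerate(ideal[:k]))
    return dcg / idcg if idcg else 0.0

def mrr(first_hit_ranks):
    """Mean reciprocal rank over queries: each entry is the 1-based rank of
    the first relevant result, or None if nothing relevant was retrieved."""
    return sum(1.0 / r for r in first_hit_ranks if r) / len(first_hit_ranks)
```

A perfect ranking of the relevant documents yields NDCG@k = 1.0; swapping a relevant and an irrelevant result lowers it in proportion to the log discount.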
#### Clustering Tasks
| Dataset | V-Measure |
|---------|-----------|
| MTEB ArxivClusteringP2P | 51.12 |
| MTEB ArxivClusteringS2S | 47.02 |
| MTEB BiorxivClusteringP2P | 42.00 |
| MTEB BiorxivClusteringS2S | 39.62 |
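V-measure is the harmonic mean of homogeneity (each cluster contains members of a single class) and completeness (each class ends up in a single cluster), both derived from conditional entropies. A minimal pure-Python sketch on toy labels (not the MTEB harness, which uses scikit-learn's implementation):

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy (natural log) of a label assignment."""
    n = len(labels)
    return -sum((c / n) * math.log(c / n) for c in Counter(labels).values())

def conditional_entropy(labels, given):
    """H(labels | given), summed over the joint distribution."""
    n = len(labels)
    joint = Counter(zip(given, labels))
    given_counts = Counter(given)
    return -sum((c / n) * math.log(c / given_counts[g]) for (g, _), c in joint.items())

def v_measure(classes, clusters):
    """Harmonic mean of homogeneity and completeness."""
    h_c, h_k = entropy(classes), entropy(clusters)
    hom = 1.0 if h_c == 0 else 1.0 - conditional_entropy(classes, clusters) / h_c
    comp = 1.0 if h_k == 0 else 1.0 - conditional_entropy(clusters, classes) / h_k
    return 0.0 if hom + comp == 0 else 2 * hom * comp / (hom + comp)
```

A clustering that exactly recovers the classes (under any permutation of cluster ids) scores 1.0; collapsing distinct classes into one cluster drives homogeneity, and hence the V-measure, toward 0.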
#### Reranking Tasks
| Dataset | MAP | MRR |
|---------|-----|-----|
| MTEB AskUbuntuDupQuestions | 67.23 | 79.38 |
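MAP averages, over all queries, the per-query average precision (AP): the mean of the precision values at each rank where a relevant item appears. A minimal pure-Python sketch with toy binary relevance judgements (illustrative only, not the MTEB harness):

```python
def average_precision(relevances):
    """AP for one ranked list of binary relevance judgements:
    mean of precision@rank over the ranks holding relevant items."""
    hits, total, ap = 0, sum(relevances), 0.0
    for rank, rel in enumerate(relevances, start=1):
        if rel:
            hits += 1
            ap += hits / rank  # precision at this rank
    return ap / total if total else 0.0
```

Averaging `average_precision` across queries gives MAP; MRR (shown earlier for retrieval) considers only the first relevant hit per query, which is why it is typically higher than MAP on the same data.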
#### STS Tasks
| Dataset | Cosine Similarity (Pearson) | Cosine Similarity (Spearman) | Euclidean Distance (Pearson) | Euclidean Distance (Spearman) | Manhattan Distance (Pearson) | Manhattan Distance (Spearman) |
|---------|-----------------------------|------------------------------|------------------------------|------------------------------|------------------------------|-------------------------------|
| MTEB BIOSSES | 89.53 | 87.14 | 88.67 | 87.14 | 89.07 | 87.59 |
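STS scores measure how well a similarity computed between the two sentence embeddings (cosine here; Euclidean and Manhattan distances analogously) correlates with human similarity judgements, via Pearson (linear) and Spearman (rank) correlation. A minimal pure-Python sketch on toy vectors (not the MTEB harness; the rank helper assumes no ties, unlike a full Spearman implementation):

```python
import math

def cosine(u, v):
    """Cosine similarity between two vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def pearson(x, y):
    """Pearson correlation coefficient between two score lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def spearman(x, y):
    """Spearman correlation: Pearson on ranks (toy version, assumes no ties)."""
    rank = lambda v: [sorted(v).index(e) for e in v]
    return pearson(rank(x), rank(y))
```

In practice one would feed `pearson`/`spearman` the model's cosine similarities alongside the human gold scores for each sentence pair; the table reports these correlations scaled by 100.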