🚀 PEFT
This model is built with the PEFT (Parameter-Efficient Fine-Tuning) library and targets the sentence-similarity pipeline, covering text-related capabilities such as text embedding, information retrieval, and text classification.
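For illustration, here is a minimal usage sketch for producing sentence embeddings with 🤗 Transformers and PEFT. The base-model identifier, the adapter path, the pooling strategy, and the example sentences are assumptions made for the sketch, not confirmed details of this checkpoint:

```python
# Hypothetical usage sketch: load a base causal LM, attach this PEFT adapter,
# mean-pool the last hidden states, and compare two sentences by cosine similarity.
import torch
import torch.nn.functional as F
from transformers import AutoModel, AutoTokenizer
from peft import PeftModel

BASE_MODEL = "meta-llama/Meta-Llama-3-8B-Instruct"  # assumed base model
ADAPTER_ID = "<path-or-repo-of-this-peft-adapter>"  # placeholder

tokenizer = AutoTokenizer.from_pretrained(BASE_MODEL)
if tokenizer.pad_token is None:
    tokenizer.pad_token = tokenizer.eos_token

base = AutoModel.from_pretrained(BASE_MODEL, torch_dtype=torch.bfloat16)
model = PeftModel.from_pretrained(base, ADAPTER_ID)
model.eval()

sentences = ["A man is playing a guitar.", "Someone plays an instrument."]
batch = tokenizer(sentences, padding=True, truncation=True, return_tensors="pt")

with torch.no_grad():
    hidden = model(**batch).last_hidden_state  # (batch, seq_len, hidden_dim)

# Mean pooling over non-padding tokens, then L2-normalize.
mask = batch["attention_mask"].unsqueeze(-1).to(hidden.dtype)
emb = (hidden * mask).sum(dim=1) / mask.sum(dim=1)
emb = F.normalize(emb, p=2, dim=1)

print("cosine similarity:", (emb[0] @ emb[1]).item())
```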
📚 Documentation
Model Information
| Property | Details |
|----------|---------|
| Library Name | peft |
| Pipeline Tag | sentence-similarity |
| Tags | text-embedding, embeddings, information-retrieval, beir, text-classification, language-model, text-clustering, text-semantic-similarity, text-evaluation, text-reranking, feature-extraction, sentence-similarity, Sentence Similarity, natural_questions, ms_marco, fever, hotpot_qa, mteb |
Model Index
LLM2Vec-Meta-Llama-3-supervised
The following are the results of the model on different tasks and datasets:
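These scores follow the MTEB evaluation protocol. As a rough guide to reproducing numbers of this kind (the task list and the stand-in encoder below are assumptions, not the actual setup behind this card):

```python
# Hypothetical reproduction sketch: the `mteb` package runs benchmark tasks
# against any encoder exposing `encode(list_of_texts)`. The model below is a
# stand-in; swap in your own embedding model.
from mteb import MTEB
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")  # stand-in encoder

evaluation = MTEB(tasks=["Banking77Classification", "BIOSSES"])
results = evaluation.run(model, output_folder="mteb_results")
print(results)
```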
Classification Tasks
- MTEB AmazonCounterfactualClassification (en):
- Metrics:
- Accuracy: 79.94029850746269
- AP: 44.93223506764482
- F1: 74.30328994013465
- MTEB AmazonPolarityClassification:
- Metrics:
- Accuracy: 86.06680000000001
- AP: 81.97124658709345
- F1: 86.00558036874241
- MTEB AmazonReviewsClassification (en):
- Metrics:
- Accuracy: 46.836
- F1: 46.05094679201488
- MTEB Banking77Classification:
- Metrics:
- Accuracy: 88.0487012987013
- F1: 88.00953788281542
Retrieval Tasks
- MTEB ArguAna:
- Metrics:
- MAP@1: 37.980000000000004
- MAP@10: 54.167
- MAP@100: 54.735
- MAP@1000: 54.738
- MRR@1: 38.549
- MRR@10: 54.351000000000006
- MRR@100: 54.932
- MRR@1000: 54.935
- NDCG@1: 37.980000000000004
- NDCG@10: 62.778999999999996
- NDCG@100: 64.986
- NDCG@1000: 65.036
- Precision@1: 37.980000000000004
- Precision@10: 9.011
- Precision@100: 0.993
- Precision@1000: 0.1
- Recall@1: 37.980000000000004
- Recall@10: 90.114
- Recall@100: 99.289
- Recall@1000: 99.644
- MTEB CQADupstackAndroidRetrieval:
- Metrics:
- MAP@1: 37.742
- MAP@10: 51.803
- MAP@100: 53.556000000000004
- MAP@1000: 53.652
- MRR@1: 46.924
- MRR@10: 57.857
- MRR@100: 58.592
- MRR@1000: 58.619
- NDCG@1: 46.924
- NDCG@10: 58.733999999999995
- NDCG@100: 63.771
- NDCG@1000: 64.934
- Precision@1: 46.924
- Precision@10: 11.431
- Precision@100: 1.73
- Precision@1000: 0.213
- Recall@1: 37.742
- Recall@10: 71.34
- Recall@100: 91.523
- Recall@1000: 98.494
- MTEB CQADupstackEnglishRetrieval:
- Metrics:
- MAP@1: 34.183
- MAP@10: 46.837
- MAP@100: 48.126000000000005
- MAP@1000: 48.25
- MRR@1: 43.376
- MRR@10: 52.859
- MRR@100: 53.422000000000004
- MRR@1000: 53.456
- NDCG@1: 43.376
- NDCG@10: 53.223
- NDCG@100: 57.175
- NDCG@1000: 58.86900000000001
- Precision@1: 43.376
- Precision@10: 10.236
- Precision@100: 1.5730000000000002
- Precision@1000: 0.203
- Recall@1: 34.183
- Recall@10: 64.866
- Recall@100: 81.26100000000001
- Recall@1000: 91.412
- MTEB CQADupstackGamingRetrieval:
- Metrics:
- MAP@1: 44.878
- MAP@10: 58.656
- MAP@100: 59.668
- MAP@1000: 59.704
- MRR@1: 51.975
- MRR@10: 62.357
- MRR@100: 62.907999999999994
- MRR@1000: 62.925
- NDCG@1: 51.975
- NDCG@10: 64.95100000000001
- NDCG@100: 68.414
- NDCG@1000: 69.077
- Precision@1: 51.975
- Precision@10: 10.502
- Precision@100: 1.31
- Precision@1000: 0.13899999999999998
- Recall@1: 44.878
- Recall@10: 79.746
- Recall@100: 94.17
- Recall@1000: 98.80499999999999
- MTEB CQADupstackGisRetrieval:
- Metrics:
- MAP@1: 28.807
- MAP@10: 39.431
- MAP@100: 40.56
- MAP@1000: 40.617999999999995
- MRR@1: 31.186000000000003
- MRR@10: 41.654
- MRR@100: 42.58
- MRR@1000: 42.623
- NDCG@1: 31.186000000000003
- NDCG@10: 45.297
- NDCG@100: 50.515
- NDCG@1000: 52.005
- Precision@1: 31.186000000000003
- Precision@10: 7.073
- Precision@100: 1.0210000000000001
- Precision@1000: 0.11900000000000001
- Recall@1: 28.807
- Recall@10: 61.138999999999996
- Recall@100: 84.491
- Recall@1000: 95.651
- MTEB CQADupstackMathematicaRetrieval:
- Metrics:
- MAP@1: 20.607
- MAP@10: 31.944
- MAP@100: 33.317
- MAP@1000: 33.428000000000004
- MRR@1: 25.622
- MRR@10: 36.726
- MRR@100: 37.707
- MRR@1000: 37.761
- NDCG@1: 25.622
- NDCG@10: 38.462
- NDCG@100: 44.327
- NDCG@1000: 46.623
- Precision@1: 25.622
- Precision@10: 7.425
- Precision@100: 1.173
- Precision@1000: 0.149
- Recall@1: 20.607
- Recall@10: 53.337
- Recall@100: 78.133
- Recall@1000: 94.151
- MTEB CQADupstackPhysicsRetrieval:
- Metrics:
- MAP@1: 33.814
- MAP@10: 47.609
- MAP@100: 48.972
- MAP@1000: 49.061
- MRR@1: 42.059999999999995
- MRR@10: 53.074
- MRR@100: 53.76800000000001
- MRR@1000: 53.794
- NDCG@1: 42.059999999999995
- NDCG@10: 54.419
- NDCG@100: 59.508
- NDCG@1000: 60.858000000000004
- Precision@1: 42.059999999999995
- Precision@10: 10.231
- Precision@100: 1.4789999999999999
- Precision@1000: 0.17700000000000002
- Recall@1: 33.814
- Recall@10: 68.88
- Recall@100: 89.794
- Recall@1000: 98.058
- MTEB CQADupstackProgrammersRetrieval:
- Metrics:
- MAP@1: 29.668
- MAP@10: 43.032
- MAP@100: 44.48
- MAP@1000: 44.574000000000005
- MRR@1: 37.785000000000004
- MRR@10: 48.898
- MRR@100: 49.728
- MRR@1000: 49.769000000000005
- NDCG@1: 37.785000000000004
- NDCG@10: 50.21099999999999
- NDCG@100: 55.657999999999994
- NDCG@1000: 57.172
- Precision@1: 37.785000000000004
- Precision@10: 9.669
- Precision@100: 1.4409999999999998
- Precision@1000: 0.174
- Recall@1: 29.668
- Recall@10: 65.575
- Recall@100: 87.977
- Recall@1000: 97.615
- MTEB CQADupstackRetrieval:
- Metrics:
- MAP@1: 30.29925
- MAP@10: 41.98708333333333
- MAP@100: 43.306916666666666
- MAP@1000: 43.40716666666667
- MRR@1: 36.24483333333334
- MRR@10: 46.32666666666667
- MRR@100: 47.13983333333333
- MRR@1000: 47.18058333333334
- NDCG@1: 36.24483333333334
- NDCG@10: 48.251916666666666
- NDCG@100: 53.3555
- NDCG@1000: 55.024249999999995
- Precision@1: 36.24483333333334
- Precision@10: 8.666833333333333
- Precision@100: 1.3214166666666665
- Precision@1000: 0.16475
- Recall@1: 30.29925
- Recall@10: 62.232333333333344
- Recall@100: 84.151
- Recall@1000: 95.37333333333333
- MTEB CQADupstackStatsRetrieval:
- Metrics:
- MAP@1: 28.996
- MAP@10: 38.047
- MAP@100: 39.121
- MAP@1000: 39.202999999999996
- MRR@1: 32.362
- MRR@10: 40.717999999999996
- MRR@100: 41.586
- MRR@1000: 41.641
- NDCG@1: 32.362
- NDCG@10: 43.105
- NDCG@100: 48.026
- NDCG@1000: 49.998
- Precision@1: 32.362
- Precision@10: 6.7940000000000005
- Precision@100: 1.0170000000000001
- Precision@1000: 0.125
- Recall@1: 28.996
- Recall@10: 55.955
- Recall@100: 77.744
- Recall@1000: 92.196
- MTEB CQADupstackTexRetrieval:
- Metrics:
- (metric values for this dataset are not provided in the source)
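The retrieval metrics above (Precision@k, Recall@k, MRR, MAP, NDCG) follow their standard definitions; a toy, self-contained sketch of a few of them for a single ranked list (illustrative document IDs only, not tied to any dataset):

```python
# Toy illustration of the cutoff metrics reported above.
def precision_at_k(ranked, relevant, k):
    hits = sum(1 for doc in ranked[:k] if doc in relevant)
    return hits / k

def recall_at_k(ranked, relevant, k):
    hits = sum(1 for doc in ranked[:k] if doc in relevant)
    return hits / len(relevant)

def reciprocal_rank(ranked, relevant):
    for rank, doc in enumerate(ranked, start=1):
        if doc in relevant:
            return 1.0 / rank
    return 0.0

ranked = ["d3", "d1", "d7", "d2", "d9"]  # system ranking for one query
relevant = {"d1", "d2"}                  # gold relevant documents

print(precision_at_k(ranked, relevant, 5))  # 0.4
print(recall_at_k(ranked, relevant, 5))     # 1.0
print(reciprocal_rank(ranked, relevant))    # 0.5 (first relevant at rank 2)
```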
Clustering Tasks
- MTEB ArxivClusteringP2P:
- Metric:
- V-measure: 44.27081216556421
- MTEB ArxivClusteringS2S:
- Metric:
- V-measure: 46.8490872532913
- MTEB BiorxivClusteringP2P:
- Metric:
- V-measure: 32.34687321141145
- MTEB BiorxivClusteringS2S:
- Metric:
- V-measure: 36.69881680534123
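V-measure scores a clustering against gold labels as the harmonic mean of homogeneity and completeness; a minimal sketch with scikit-learn on toy labels:

```python
# Toy illustration of the V-measure reported for the clustering tasks above.
from sklearn.metrics import v_measure_score

true_labels = [0, 0, 1, 1, 2, 2]  # gold categories
pred_labels = [0, 0, 1, 2, 2, 2]  # predicted cluster assignments

print(v_measure_score(true_labels, pred_labels))
```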
Reranking Task
- MTEB AskUbuntuDupQuestions:
- Metrics:
- MAP: 65.18525400430678
- MRR: 78.80149936244119
STS Task
- MTEB BIOSSES:
- Metric:
- Cosine Similarity Spearman: 84.92301936595548
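The BIOSSES score is the Spearman correlation between cosine similarities of sentence-pair embeddings and human similarity ratings; a toy sketch with placeholder embeddings and scores:

```python
# Toy illustration of the cosine-similarity Spearman metric used for STS tasks.
import numpy as np
from scipy.stats import spearmanr

# Placeholder embeddings for 4 sentence pairs and their human ratings (0-5 scale).
emb_a = np.random.rand(4, 8)
emb_b = np.random.rand(4, 8)
human_scores = [1.0, 2.5, 4.0, 4.8]

cos = np.sum(emb_a * emb_b, axis=1) / (
    np.linalg.norm(emb_a, axis=1) * np.linalg.norm(emb_b, axis=1)
)
rho, _ = spearmanr(cos, human_scores)
print("Spearman:", rho)
```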
📄 License
This project is licensed under the MIT license.