Modernbert Base Gooaq
Developed by tomaarsen
This is a sentence-transformers model based on ModernBERT-base, specialized for sentence-similarity computation and information-retrieval tasks.
Downloads: 3,092
Released: 12/19/2024
Model Overview
This model extracts sentence features and computes similarity between them, making it suitable for scenarios such as information retrieval and question-answering systems. It was trained with a cached multiple-negatives ranking loss and performs well on several benchmarks.
Model Features
Efficient sentence feature extraction
Efficiently converts sentences into high-dimensional vector representations that capture semantic information
Optimized similarity computation
Trained with a cached multiple-negatives ranking loss, which optimizes sentence-similarity computation
Large-scale training data
Trained on more than 3 million examples, giving the model strong generalization ability
Model Capabilities
Sentence-similarity computation
Information retrieval
Feature extraction
Question-answering system support
Use Cases
Information Retrieval
Document search
Retrieve relevant content from large document collections given a query sentence
Achieves accuracy@10 of 0.8 on the NanoNQ dataset
Question Answering
Question matching
Match user questions against similar questions in a knowledge base
Achieves accuracy@10 of 0.82 on the NanoMSMARCO dataset
---
language:
- en
tags:
- sentence-transformers
- sentence-similarity
- feature-extraction
- generated_from_trainer
- dataset_size:3012496
- loss:CachedMultipleNegativesRankingLoss
base_model: answerdotai/ModernBERT-base
widget:
- source_sentence: how much is a car title transfer in minnesota?
  sentences:
  - This complex is a larger molecule than the original crystal violet stain and iodine and is insoluble in water. ... Conversely, the the outer membrane of Gram negative bacteria is degraded and the thinner peptidoglycan layer of Gram negative cells is unable to retain the crystal violet-iodine complex and the color is lost.
  - Get insurance on the car and provide proof. Bring this information (including the title) to the Minnesota DVS office, as well as $10 for the filing fee and $7.25 for the titling fee. There is also a $10 transfer tax, as well as a 6.5% sales tax on the purchase price.
  - 'One of the risks of DNP is that it accelerates the metabolism to a dangerously fast level. Our metabolic system operates at the rate it does for a reason – it is safe. Speeding up the metabolism may help burn off fat, but it can also trigger a number of potentially dangerous side effects, such as: fever.'
- source_sentence: what is the difference between 18 and 20 inch tires?
  sentences:
  - The only real difference is a 20" rim would be more likely to be damaged, as you pointed out. Beyond looks, there is zero benefit for the 20" rim. Also, just the availability of tires will likely be much more limited for the larger rim. ... Tire selection is better for 18" wheels than 20" wheels.
  - '[''Open your Outlook app on your mobile device and click on the Settings gear icon.'', ''Under Settings, click on the Signature option.'', ''Enter either a generic signature that could be used for all email accounts tied to your Outlook app, or a specific signature, Per Account Signature, for each email account.'']'
  - The average normal body temperature is around 98.6 degrees Fahrenheit, or 37 degrees Celsius. If your body temperature drops to just a few degrees lower than this, your blood vessels in your hands, feet, arms, and legs start to get narrower.
- source_sentence: whom the bell tolls meaning?
  sentences:
  - 'Answer: Humans are depicted in Hindu art often in sensuous and erotic postures.'
  - The phrase "For whom the bell tolls" refers to the church bells that are rung when a person dies. Hence, the author is suggesting that we should not be curious as to for whom the church bell is tolling for. It is for all of us.
  - '[''Automatically.'', ''When connected to car Bluetooth and,'', ''Manually.'']'
- source_sentence: how long before chlamydia symptoms appear?
  sentences:
  - Most people who have chlamydia don't notice any symptoms. If you do get symptoms, these usually appear between 1 and 3 weeks after having unprotected sex with an infected person. For some people they don't develop until many months later. Sometimes the symptoms can disappear after a few days.
  - '[''Open the My Verizon app . ... '', ''Tap the Menu icon. ... '', ''Tap Manage device for the appropriate mobile number. ... '', ''Tap Transfer content between phones. ... '', ''Tap Start Transfer.'']'
  - 'Psychiatrist vs Psychologist A psychiatrist is classed as a medical doctor, they include a physical examination of symptoms in their assessment and are able to prescribe medicine: a psychologist is also a doctor by virtue of their PHD level qualification, but is not medically trained and cannot prescribe.'
- source_sentence: are you human korean novela?
  sentences:
  - Many cysts heal on their own, which means that conservative treatments like rest and anti-inflammatory painkillers can often be enough to get rid of them. However, in some cases, routine drainage of the sac may be necessary to reduce symptoms.
  - A relative of European pear varieties like Bartlett and Anjou, the Asian pear is great used in recipes or simply eaten out of hand. It retains a crispness that works well in slaws and salads, and it holds its shape better than European pears when baked and cooked.
  - 'Are You Human? (Korean: 너도 인간이니; RR: Neodo Inganini; lit. Are You Human Too?) is a 2018 South Korean television series starring Seo Kang-jun and Gong Seung-yeon. It aired on KBS2''s Mondays and Tuesdays at 22:00 (KST) time slot, from June 4 to August 7, 2018.'
datasets:
- sentence-transformers/gooaq
pipeline_tag: sentence-similarity
library_name: sentence-transformers
metrics:
- cosine_accuracy@1
- cosine_accuracy@3
- cosine_accuracy@5
- cosine_accuracy@10
- cosine_precision@1
- cosine_precision@3
- cosine_precision@5
- cosine_precision@10
- cosine_recall@1
- cosine_recall@3
- cosine_recall@5
- cosine_recall@10
- cosine_ndcg@10
- cosine_mrr@10
- cosine_map@100
model-index:
- name: SentenceTransformer based on answerdotai/ModernBERT-base
  results:
  - task:
      type: information-retrieval
      name: Information Retrieval
    dataset:
      name: NanoNQ
      type: NanoNQ
    metrics:
    - {type: cosine_accuracy@1, value: 0.38, name: Cosine Accuracy@1}
    - {type: cosine_accuracy@3, value: 0.64, name: Cosine Accuracy@3}
    - {type: cosine_accuracy@5, value: 0.7, name: Cosine Accuracy@5}
    - {type: cosine_accuracy@10, value: 0.8, name: Cosine Accuracy@10}
    - {type: cosine_precision@1, value: 0.38, name: Cosine Precision@1}
    - {type: cosine_precision@3, value: 0.22, name: Cosine Precision@3}
    - {type: cosine_precision@5, value: 0.14400000000000002, name: Cosine Precision@5}
    - {type: cosine_precision@10, value: 0.08199999999999999, name: Cosine Precision@10}
    - {type: cosine_recall@1, value: 0.36, name: Cosine Recall@1}
    - {type: cosine_recall@3, value: 0.62, name: Cosine Recall@3}
    - {type: cosine_recall@5, value: 0.67, name: Cosine Recall@5}
    - {type: cosine_recall@10, value: 0.74, name: Cosine Recall@10}
    - {type: cosine_ndcg@10, value: 0.5673854489333459, name: Cosine Ndcg@10}
    - {type: cosine_mrr@10, value: 0.5237460317460316, name: Cosine Mrr@10}
    - {type: cosine_map@100, value: 0.5116785860647901, name: Cosine Map@100}
  - task:
      type: information-retrieval
      name: Information Retrieval
    dataset:
      name: NanoMSMARCO
      type: NanoMSMARCO
    metrics:
    - {type: cosine_accuracy@1, value: 0.32, name: Cosine Accuracy@1}
    - {type: cosine_accuracy@3, value: 0.56, name: Cosine Accuracy@3}
    - {type: cosine_accuracy@5, value: 0.66, name: Cosine Accuracy@5}
    - {type: cosine_accuracy@10, value: 0.82, name: Cosine Accuracy@10}
    - {type: cosine_precision@1, value: 0.32, name: Cosine Precision@1}
    - {type: cosine_precision@3, value: 0.18666666666666665, name: Cosine Precision@3}
    - {type: cosine_precision@5, value: 0.132, name: Cosine Precision@5}
    - {type: cosine_precision@10, value: 0.08199999999999999, name: Cosine Precision@10}
    - {type: cosine_recall@1, value: 0.32, name: Cosine Recall@1}
    - {type: cosine_recall@3, value: 0.56, name: Cosine Recall@3}
    - {type: cosine_recall@5, value: 0.66, name: Cosine Recall@5}
    - {type: cosine_recall@10, value: 0.82, name: Cosine Recall@10}
    - {type: cosine_ndcg@10, value: 0.555381357077638, name: Cosine Ndcg@10}
    - {type: cosine_mrr@10, value: 0.47249206349206346, name: Cosine Mrr@10}
    - {type: cosine_map@100, value: 0.4797949229011178, name: Cosine Map@100}
  - task:
      type: nano-beir
      name: Nano BEIR
    dataset:
      name: NanoBEIR mean
      type: NanoBEIR_mean
    metrics:
    - {type: cosine_accuracy@1, value: 0.35, name: Cosine Accuracy@1}
    - {type: cosine_accuracy@3, value: 0.6000000000000001, name: Cosine Accuracy@3}
    - {type: cosine_accuracy@5, value: 0.6799999999999999, name: Cosine Accuracy@5}
    - {type: cosine_accuracy@10, value: 0.81, name: Cosine Accuracy@10}
    - {type: cosine_precision@1, value: 0.35, name: Cosine Precision@1}
    - {type: cosine_precision@3, value: 0.2033333333333333, name: Cosine Precision@3}
    - {type: cosine_precision@5, value: 0.138, name: Cosine Precision@5}
    - {type: cosine_precision@10, value: 0.08199999999999999, name: Cosine Precision@10}
    - {type: cosine_recall@1, value: 0.33999999999999997, name: Cosine Recall@1}
    - {type: cosine_recall@3, value: 0.5900000000000001, name: Cosine Recall@3}
    - {type: cosine_recall@5, value: 0.665, name: Cosine Recall@5}
    - {type: cosine_recall@10, value: 0.78, name: Cosine Recall@10}
    - {type: cosine_ndcg@10, value: 0.5613834030054919, name: Cosine Ndcg@10}
    - {type: cosine_mrr@10, value: 0.4981190476190476, name: Cosine Mrr@10}
    - {type: cosine_map@100, value: 0.49573675448295396, name: Cosine Map@100}
---
SentenceTransformer based on answerdotai/ModernBERT-base
This is a sentence-transformers model finetuned from answerdotai/ModernBERT-base on the gooaq dataset. It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
This model was finetuned with train_st_gooaq.py on an RTX 3090, although only 10GB of VRAM was used.
Model Details
Model Description
- Model Type: Sentence Transformer
- Base model: answerdotai/ModernBERT-base
- Maximum Sequence Length: 512 tokens
- Output Dimensionality: 768 dimensions
- Similarity Function: Cosine Similarity
- Training Dataset: gooaq
- Language: en
Model Sources
- Documentation: Sentence Transformers Documentation
- Repository: Sentence Transformers on GitHub
- Hugging Face: Sentence Transformers on Hugging Face
Full Model Architecture
SentenceTransformer(
(0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: ModernBertModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
)
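The Pooling module above is configured for masked mean pooling (pooling_mode_mean_tokens: True): each sentence embedding is the average of its token embeddings, with padding positions excluded via the attention mask. A minimal numpy sketch of that operation, using tiny 3-dimensional toy token embeddings in place of ModernBERT's 768-dimensional outputs:

```python
import numpy as np

# Toy stand-ins for a transformer's output: 2 sentences, 4 token positions,
# 3-dim embeddings (the real model produces 768-dim vectors).
token_embeddings = np.array([
    [[1.0, 0.0, 2.0], [3.0, 2.0, 0.0], [0.0, 0.0, 0.0], [0.0, 0.0, 0.0]],
    [[2.0, 2.0, 2.0], [4.0, 0.0, 2.0], [0.0, 4.0, 2.0], [0.0, 0.0, 0.0]],
])
# 1 = real token, 0 = padding; padded positions must not influence the mean.
attention_mask = np.array([
    [1, 1, 0, 0],
    [1, 1, 1, 0],
])

mask = attention_mask[:, :, None]               # (2, 4, 1)
summed = (token_embeddings * mask).sum(axis=1)  # sum over real tokens only
counts = mask.sum(axis=1)                       # number of real tokens
sentence_embeddings = summed / counts           # masked mean

print(sentence_embeddings)
# First sentence: mean of its two real tokens -> [2.0, 1.0, 1.0]
```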
Usage
Direct Usage (Sentence Transformers)
First install the Sentence Transformers library:
pip install -U sentence-transformers
Then you can load this model and run inference.
from sentence_transformers import SentenceTransformer
# Download from the 🤗 Hub
model = SentenceTransformer("tomaarsen/ModernBERT-base-gooaq")
# Run inference
sentences = [
'are you human korean novela?',
"Are You Human? (Korean: 너도 인간이니; RR: Neodo Inganini; lit. Are You Human Too?) is a 2018 South Korean television series starring Seo Kang-jun and Gong Seung-yeon. It aired on KBS2's Mondays and Tuesdays at 22:00 (KST) time slot, from June 4 to August 7, 2018.",
'A relative of European pear varieties like Bartlett and Anjou, the Asian pear is great used in recipes or simply eaten out of hand. It retains a crispness that works well in slaws and salads, and it holds its shape better than European pears when baked and cooked.',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 768]
# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
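Since the model's similarity function is cosine similarity, retrieval with it amounts to ranking corpus embeddings by cosine similarity to the query embedding. A self-contained sketch of that ranking step, with toy 3-dimensional vectors standing in for model.encode(...) output:

```python
import numpy as np

def cosine_sim(query_vec, corpus_mat):
    """Cosine similarity between one vector and each row of a matrix."""
    q = query_vec / np.linalg.norm(query_vec)
    c = corpus_mat / np.linalg.norm(corpus_mat, axis=1, keepdims=True)
    return c @ q

# Toy embeddings standing in for model.encode(...) output.
query_embedding = np.array([1.0, 0.0, 1.0])
corpus_embeddings = np.array([
    [1.0, 0.1, 0.9],    # doc 0: close to the query
    [0.0, 1.0, 0.0],    # doc 1: orthogonal to the query
    [-1.0, 0.0, -1.0],  # doc 2: opposite direction
])

scores = cosine_sim(query_embedding, corpus_embeddings)
ranking = np.argsort(-scores)  # highest similarity first
print(ranking)  # doc 0 first, doc 2 last
```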
Evaluation
Metrics
Information Retrieval
- Datasets: NanoNQ and NanoMSMARCO
- Evaluated with InformationRetrievalEvaluator
Metric | NanoNQ | NanoMSMARCO |
---|---|---|
cosine_accuracy@1 | 0.38 | 0.32 |
cosine_accuracy@3 | 0.64 | 0.56 |
cosine_accuracy@5 | 0.7 | 0.66 |
cosine_accuracy@10 | 0.8 | 0.82 |
cosine_precision@1 | 0.38 | 0.32 |
cosine_precision@3 | 0.22 | 0.1867 |
cosine_precision@5 | 0.144 | 0.132 |
cosine_precision@10 | 0.082 | 0.082 |
cosine_recall@1 | 0.36 | 0.32 |
cosine_recall@3 | 0.62 | 0.56 |
cosine_recall@5 | 0.67 | 0.66 |
cosine_recall@10 | 0.74 | 0.82 |
cosine_ndcg@10 | 0.5674 | 0.5554 |
cosine_mrr@10 | 0.5237 | 0.4725 |
cosine_map@100 | 0.5117 | 0.4798 |
Nano BEIR
- Dataset: NanoBEIR_mean
- Evaluated with NanoBEIREvaluator
Metric | Value |
---|---|
cosine_accuracy@1 | 0.35 |
cosine_accuracy@3 | 0.6 |
cosine_accuracy@5 | 0.68 |
cosine_accuracy@10 | 0.81 |
cosine_precision@1 | 0.35 |
cosine_precision@3 | 0.2033 |
cosine_precision@5 | 0.138 |
cosine_precision@10 | 0.082 |
cosine_recall@1 | 0.34 |
cosine_recall@3 | 0.59 |
cosine_recall@5 | 0.665 |
cosine_recall@10 | 0.78 |
cosine_ndcg@10 | 0.5614 |
cosine_mrr@10 | 0.4981 |
cosine_map@100 | 0.4957 |
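The accuracy@k, MRR@k, and NDCG@k figures in these tables follow the standard information-retrieval definitions. A small illustrative sketch (not the evaluator's actual code) that computes them for toy per-query results, assuming a single relevant document per query:

```python
import math

# Toy results: for each query, the 0-based rank at which its single relevant
# document was retrieved (None = not retrieved at all).
first_relevant_rank = [0, 2, 1, None, 4]
k = 10
n = len(first_relevant_rank)

hits = [r for r in first_relevant_rank if r is not None and r < k]
accuracy_at_k = len(hits) / n                    # fraction found in the top k
mrr_at_k = sum(1.0 / (r + 1) for r in hits) / n  # mean reciprocal rank
# With one relevant document per query the ideal DCG is 1, so NDCG reduces to
# the discounted gain at the rank where the relevant document appears.
ndcg_at_k = sum(1.0 / math.log2(r + 2) for r in hits) / n

print(accuracy_at_k)  # 0.8
```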
Training Details
Training Dataset
gooaq
- Dataset: gooaq at b089f72
- Size: 3,012,496 training samples
- Columns: question and answer
- Approximate statistics based on the first 1000 samples:

 | question | answer |
---|---|---|
type | string | string |
details | min: 8 tokens, mean: 12.0 tokens, max: 21 tokens | min: 15 tokens, mean: 58.17 tokens, max: 190 tokens |
- Samples:

question | answer |
---|---|
what is the difference between clay and mud mask? | The main difference between the two is that mud is a skin-healing agent, while clay is a cosmetic, drying agent. Clay masks are most useful for someone who has oily skin and is prone to breakouts of acne and blemishes. |
myki how much on card? | A full fare myki card costs $6 and a concession, seniors or child myki costs $3. For more information about how to use your myki, visit ptv.vic.gov.au or call 1800 800 007. |
how to find out if someone blocked your phone number on iphone? | If you get a notification like "Message Not Delivered" or you get no notification at all, that's a sign of a potential block. Next, you could try calling the person. If the call goes right to voicemail or rings once (or a half ring) then goes to voicemail, that's further evidence you may have been blocked. |
- Loss: CachedMultipleNegativesRankingLoss with these parameters: { "scale": 20.0, "similarity_fct": "cos_sim" }
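CachedMultipleNegativesRankingLoss scores each question against every answer in the batch, so all non-matching answers act as in-batch negatives: cosine similarities are multiplied by the scale (20.0) and used as logits in a softmax cross-entropy whose target is the matching answer. The "cached" variant only changes how gradients are computed, so the large batch size fits in memory. A numpy sketch of the underlying objective on a toy 3x3 similarity matrix:

```python
import numpy as np

scale = 20.0  # the "scale" parameter above

# Toy cosine similarities between 3 questions (rows) and 3 answers (columns);
# the diagonal holds each question's true answer, the rest are in-batch negatives.
cos_sim = np.array([
    [0.9, 0.2, 0.1],
    [0.3, 0.8, 0.2],
    [0.1, 0.4, 0.7],
])

logits = scale * cos_sim
# Softmax cross-entropy with the diagonal as the target class for each row.
log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
loss = -np.mean(np.diag(log_probs))
print(loss)  # small, since each true answer already dominates its row
```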
Evaluation Dataset
gooaq
- Dataset: gooaq at b089f72
- Size: 3,012,496 evaluation samples
- Columns: question and answer
- Approximate statistics based on the first 1000 samples:

 | question | answer |
---|---|---|
type | string | string |
details | min: 8 tokens, mean: 12.05 tokens, max: 21 tokens | min: 13 tokens, mean: 59.08 tokens, max: 116 tokens |
- Samples:

question | answer |
---|---|
how do i program my directv remote with my tv? | ['Press MENU on your remote.', 'Select Settings & Help > Settings > Remote Control > Program Remote.', 'Choose the device (TV, audio, DVD) you wish to program. ... ', 'Follow the on-screen prompts to complete programming.'] |
are rodrigues fruit bats nocturnal? | Before its numbers were threatened by habitat destruction, storms, and hunting, some of those groups could number 500 or more members. Sunrise, sunset. Rodrigues fruit bats are most active at dawn, at dusk, and at night. |
why does your heart rate increase during exercise bbc bitesize? | During exercise there is an increase in physical activity and muscle cells respire more than they do when the body is at rest. The heart rate increases during exercise. The rate and depth of breathing increases - this makes sure that more oxygen is absorbed into the blood, and more carbon dioxide is removed from it. |
- Loss: CachedMultipleNegativesRankingLoss with these parameters: { "scale": 20.0, "similarity_fct": "cos_sim" }
Training Hyperparameters
Non-Default Hyperparameters
- eval_strategy: steps
- per_device_train_batch_size: 2048
- per_device_eval_batch_size: 2048
- learning_rate: 8e-05
- num_train_epochs: 1
- warmup_ratio: 0.05
- bf16: True
- batch_sampler: no_duplicates
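As an illustration only (the actual train_st_gooaq.py script may differ), these non-default values map onto SentenceTransformerTrainingArguments from the sentence-transformers v3 API roughly like so; the output_dir is hypothetical:

```python
from sentence_transformers.training_args import (
    BatchSamplers,
    SentenceTransformerTrainingArguments,
)

# Only the non-default values from above; everything else keeps its default.
args = SentenceTransformerTrainingArguments(
    output_dir="models/ModernBERT-base-gooaq",  # hypothetical output path
    eval_strategy="steps",
    per_device_train_batch_size=2048,
    per_device_eval_batch_size=2048,
    learning_rate=8e-5,
    num_train_epochs=1,
    warmup_ratio=0.05,
    bf16=True,
    batch_sampler=BatchSamplers.NO_DUPLICATES,
)
```

The no_duplicates batch sampler matters here because the loss treats every other in-batch answer as a negative; duplicate answers in a batch would create false negatives.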
All Hyperparameters
- overwrite_output_dir: False
- do_predict: False
- eval_strategy: steps
- prediction_loss_only: True
- per_device_train_batch_size: 2048
- per_device_eval_batch_size: 2048
- per_gpu_train_batch_size: None
- per_gpu_eval_batch_size: None
- gradient_accumulation_steps: 1
- eval_accumulation_steps: None
- torch_empty_cache_steps: None
- learning_rate: 8e-05
- weight_decay: 0.0
- adam_beta1: 0.9
- adam_beta2: 0.999
- adam_epsilon: 1e-08
- max_grad_norm: 1.0
- num_train_epochs: 1
- max_steps: -1
- lr_scheduler_type: linear
- lr_scheduler_kwargs: {}
- warmup_ratio: 0.05
- warmup_steps: 0
- log_level: passive
- log_level_replica: warning
- log_on_each_node: True
- logging_nan_inf_filter: True
- save_safetensors: True
- save_on_each_node: False
- save_only_model: False
- restore_callback_states_from_checkpoint: False
- no_cuda: False
- use_cpu: False
- use_mps_device: False
- seed: 42
- data_seed: None
- jit_mode_eval: False
- use_ipex: False
- bf16: True
- fp16: False
- fp16_opt_level: O1
- half_precision_backend: auto
- bf16_full_eval: False
- fp16_full_eval: False
- tf32: None
- local_rank: 0
- ddp_backend: None
- tpu_num_cores: None
- tpu_metrics_debug: False
- debug: []
- dataloader_drop_last: False
- dataloader_num_workers: 0
- dataloader_prefetch_factor: None
- past_index: -1
- disable_tqdm: False
- remove_unused_columns: True
- label_names: None
- load_best_model_at_end: False
- ignore_data_skip: False
- fsdp: []
- fsdp_min_num_params: 0
- fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- fsdp_transformer_layer_cls_to_wrap: None
- accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- deepspeed: None
- label_smoothing_factor: 0.0
- optim: adamw_torch
- optim_args: None
- adafactor: False
- group_by_length: False
- length_column_name: length
- ddp_find_unused_parameters: None
- ddp_bucket_cap_mb: None
- ddp_broadcast_buffers: False
- dataloader_pin_memory: True
- dataloader_persistent_workers: False
- skip_memory_metrics: True
- use_legacy_prediction_loop: False
- push_to_hub: False
- resume_from_checkpoint: None
- hub_model_id: None
- hub_strategy: every_save
- hub_private_repo: None
- hub_always_push: False
- gradient_checkpointing: False
- gradient_checkpointing_kwargs: None
- include_inputs_for_metrics: False
- include_for_metrics: []
- eval_do_concat_batches: True
- fp16_backend: auto
- push_to_hub_model_id: None
- push_to_hub_organization: None
- mp_parameters:
- auto_find_batch_size: False
- full_determinism: False
- torchdynamo: None
- ray_scope: last
- ddp_timeout: 1800
- torch_compile: False
- torch_compile_backend: None
- torch_compile_mode: None
- dispatch_batches: None
- split_batches: None
- include_tokens_per_second: False
- include_num_input_tokens_seen: False
- neftune_noise_alpha: None
- optim_target_modules: None
- batch_eval_metrics: False
- eval_on_start: False
- use_liger_kernel: False
- eval_use_gather_object: False
- average_tokens_across_devices: False
- prompts: None
- batch_sampler: no_duplicates
- multi_dataset_batch_sampler: proportional
Training Logs
Epoch | Step | Training Loss | Validation Loss | NanoNQ_cosine_ndcg@10 | NanoMSMARCO_cosine_ndcg@10 | NanoBEIR_mean_cosine_ndcg@10 |
---|---|---|---|---|---|---|
0 | 0 | - | - | 0.0388 | 0.0785 | 0.0587 |
0.0068 | 10 | 6.9066 | - | - | - | - |
0.0136 | 20 | 4.853 | - | - | - | - |
0.0204 | 30 | 2.5305 | - | - | - | - |
0.0272 | 40 | 1.3877 | - | - | - | - |
0.0340 | 50 | 0.871 | 0.3358 | 0.4385 | 0.4897 | 0.4641 |
0.0408 | 60 | 0.6463 | - | - | - | - |
0.0476 | 70 | 0.5336 | - | - | - | - |
0.0544 | 80 | 0.4601 | - | - | - | - |
0.0612 | 90 | 0.4057 | - | - | - | - |
0.0680 | 100 | 0.366 | 0.1523 | 0.5100 | 0.4477 | 0.4789 |
0.0748 | 110 | 0.3498 | - | - | - | - |
0.0816 | 120 | 0.3297 | - | - | - | - |
0.0884 | 130 | 0.3038 | - | - | - | - |
0.0952 | 140 | 0.3062 | - | - | - | - |
0.1020 | 150 | 0.2976 | 0.1176 | 0.5550 | 0.4742 | 0.5146 |
0.1088 | 160 | 0.2843 | - | - | - | - |
0.1156 | 170 | 0.2732 | - | - | - | - |
0.1224 | 180 | 0.2549 | - | - | - | - |
0.1292 | 190 | 0.2584 | - | - | - | - |
0.1360 | 200 | 0.2451 | 0.1018 | 0.5313 | 0.4846 | 0.5079 |
0.1428 | 210 | 0.2521 | - | - | - | - |
0.1496 | 220 | 0.2451 | - | - | - | - |
0.1564 | 230 | 0.2367 | - | - | - | - |
0.1632 | 240 | 0.2359 | - | - | - | - |
0.1700 | 250 | 0.2343 | 0.0947 | 0.5489 | 0.4823 | 0.5156 |
0.1768 | 260 | 0.2263 | - | - | - | - |
0.1835 | 270 | 0.2225 | - | - | - | - |
0.1903 | 280 | 0.2219 | - | - | - | - |
0.1971 | 290 | 0.2136 | - | - | - | - |
0.2039 | 300 | 0.2202 | 0.0932 | 0.5165 | 0.4674 | 0.4920 |
0.2107 | 310 | 0.2198 | - | - | - | - |
0.2175 | 320 | 0.21 | - | - | - | - |
0.2243 | 330 | 0.207 | - | - | - | - |
0.2311 | 340 | 0.1972 | - | - | - | - |
0.2379 | 350 | 0.2037 | 0.0877 | 0.5231 | 0.5039 | 0.5135 |
0.2447 | 360 | 0.2054 | - | - | - | - |
0.2515 | 370 | 0.197 | - | - | - | - |
0.2583 | 380 | 0.1922 | - | - | - | - |
0.2651 | 390 | 0.1965 | - | - | - | - |
0.2719 | 400 | 0.1962 | 0.0843 | 0.5409 | 0.4746 | 0.5078 |
0.2787 | 410 | 0.186 | - | - | - | - |
0.2855 | 420 | 0.1911 | - | - | - | - |
0.2923 | 430 | 0.1969 | - | - | - | - |
0.2991 | 440 | 0.193 | - | - | - | - |
0.3059 | 450 | 0.1912 | 0.0763 | 0.5398 | 0.5083 | 0.5241 |
0.3127 | 460 | 0.1819 | - | - | - | - |
0.3195 | 470 | 0.1873 | - | - | - | - |
0.3263 | 480 | 0.1899 | - | - | - | - |
0.3331 | 490 | 0.1764 | - | - | - | - |
0.3399 | 500 | 0.1828 | 0.0728 | 0.5439 | 0.5176 | 0.5308 |
0.3467 | 510 | 0.1753 | - | - | - | - |
0.3535 | 520 | 0.1725 | - | - | - | - |
0.3603 | 530 | 0.1758 | - | - | - | - |
0.3671 | 540 | 0.183 | - | - | - | - |
0.3739 | 550 | 0.1789 | 0.0733 | 0.5437 | 0.5185 | 0.5311 |
0.3807 | 560 | 0.1773 | - | - | - | - |
0.3875 | 570 | 0.1764 | - | - | - | - |
0.3943 | 580 | 0.1638 | - | - | - | - |
0.4011 | 590 | 0.1809 | - | - | - | - |
0.4079 | 600 | 0.1727 | 0.0700 | 0.5550 | 0.5021 | 0.5286 |
0.4147 | 610 | 0.1664 | - | - | - | - |
0.4215 | 620 | 0.1683 | - | - | - | - |
0.4283 | 630 | 0.1622 | - | - | - | - |
0.4351 | 640 | 0.1592 | - | - | - | - |
0.4419 | 650 | 0.168 | 0.0662 | 0.5576 | 0.4843 | 0.5210 |
0.4487 | 660 | 0.1696 | - | - | - | - |
0.4555 | 670 | 0.1609 | - | - | - | - |
0.4623 | 680 | 0.1644 | - | - | - | - |
0.4691 | 690 | 0.1643 | - | - | - | - |
0.4759 | 700 | 0.1604 | 0.0660 | 0.5605 | 0.5042 | 0.5323 |
0.4827 | 710 | 0.1634 | - | - | - | - |
0.4895 | 720 | 0.1515 | - | - | - | - |
0.4963 | 730 | 0.1592 | - | - | - | - |
0.5031 | 740 | 0.1597 | - | - | - | - |
0.5099 | 750 | 0.1617 | 0.0643 | 0.5576 | 0.4830 | 0.5203 |
0.5167 | 760 | 0.1512 | - | - | - | - |
0.5235 | 770 | 0.1563 | - | - | - | - |
0.5303 | 780 | 0.1529 | - | - | - | - |
0.5370 | 790 | 0.1547 | - | - | - | - |
0.5438 | 800 | 0.1548 | 0.0620 | 0.5538 | 0.5271 | 0.5405 |
0.5506 | 810 | 0.1533 | - | - | - | - |
0.5574 | 820 | 0.1504 | - | - | - | - |
0.5642 | 830 | 0.1489 | - | - | - | - |
0.5710 | 840 | 0.1534 | - | - | - | - |
0.5778 | 850 | 0.1507 | 0.0611 | 0.5697 | 0.5095 | 0.5396 |
0.5846 | 860 | 0.1475 | - | - | - | - |
0.5914 | 870 | 0.1474 | - | - | - | - |
0.5982 | 880 | 0.1499 | - | - | - | - |
0.6050 | 890 | 0.1454 | - | - | - | - |
0.6118 | 900 | 0.1419 | 0.0620 | 0.5586 | 0.5229 | 0.5407 |
0.6186 | 910 | 0.1465 | - | - | - | - |
0.6254 | 920 | 0.1436 | - | - | - | - |
0.6322 | 930 | 0.1464 | - | - | - | - |
0.6390 | 940 | 0.1418 | - | - | - | - |
0.6458 | 950 | 0.1443 | 0.0565 | 0.5627 | 0.5458 | 0.5543 |
0.6526 | 960 | 0.1458 | - | - | - | - |
0.6594 | 970 | 0.1431 | - | - | - | - |
0.6662 | 980 | 0.1417 | - | - | - | - |
0.6730 | 990 | 0.1402 | - | - | - | - |
0.6798 | 1000 | 0.1431 | 0.0563 | 0.5499 | 0.5366 | 0.5432 |
0.6866 | 1010 | 0.1386 | - | - | - | - |
0.6934 | 1020 | 0.1413 | - | - | - | - |
0.7002 | 1030 | 0.1381 | - | - | - | - |
0.7070 | 1040 | 0.1364 | - | - | - | - |
0.7138 | 1050 | 0.1346 | 0.0545 | 0.5574 | 0.5416 | 0.5495 |
0.7206 | 1060 | 0.1338 | - | - | - | - |
0.7274 | 1070 | 0.1378 | - | - | - | - |
0.7342 | 1080 | 0.135 | - | - | - | - |
0.7410 | 1090 | 0.1336 | - | - | - | - |
0.7478 | 1100 | 0.1393 | 0.0541 | 0.5776 | 0.5362 | 0.5569 |
0.7546 | 1110 | 0.1427 | - | - | - | - |
0.7614 | 1120 | 0.1378 | - | - | - | - |
0.7682 | 1130 | 0.1346 | - | - | - | - |
0.7750 | 1140 | 0.1423 | - | - | - | - |
0.7818 | 1150 | 0.1368 | 0.0525 | 0.5681 | 0.5237 | 0.5459 |
0.7886 | 1160 | 0.1392 | - | - | - | - |
0.7954 | 1170 | 0.1321 | - | - | - | - |
0.8022 | 1180 | 0.1387 | - | - | - | - |
0.8090 | 1190 | 0.134 | - | - | - | - |
0.8158 | 1200 | 0.1369 | 0.0515 | 0.5613 | 0.5416 | 0.5514 |
0.8226 | 1210 | 0.1358 | - | - | - | - |
0.8294 | 1220 | 0.1401 | - | - | - | - |
0.8362 | 1230 | 0.1334 | - | - | - | - |
0.8430 | 1240 | 0.1331 | - | - | - | - |
0.8498 | 1250 | 0.1324 | 0.0510 | 0.5463 | 0.5546 | 0.5505 |
0.8566 | 1260 | 0.135 | - | - | - | - |
0.8634 | 1270 | 0.1367 | - | - | - | - |
0.8702 | 1280 | 0.1356 | - | - | - | - |
0.8770 | 1290 | 0.1291 | - | - | - | - |
0.8838 | 1300 | 0.1313 | 0.0498 | 0.5787 | 0.5552 | 0.5670 |
0.8906 | 1310 | 0.1334 | - | - | - | - |
0.8973 | 1320 | 0.1389 | - | - | - | - |
0.9041 | 1330 | 0.1302 | - | - | - | - |
0.9109 | 1340 | 0.1319 | - | - | - | - |
0.9177 | 1350 | 0.1276 | 0.0504 | 0.5757 | 0.5575 | 0.5666 |
0.9245 | 1360 | 0.1355 | - | - | - | - |
0.9313 | 1370 | 0.1289 | - | - | - | - |
0.9381 | 1380 | 0.1335 | - | - | - | - |
0.9449 | 1390 | 0.1298 | - | - | - | - |
0.9517 | 1400 | 0.1279 | 0.0497 | 0.5743 | 0.5567 | 0.5655 |
0.9585 | 1410 | 0.1324 | - | - | - | - |
0.9653 | 1420 | 0.1306 | - | - | - | - |
0.9721 | 1430 | 0.1313 | - | - | - | - |
0.9789 | 1440 | 0.135 | - | - | - | - |
0.9857 | 1450 | 0.1293 | 0.0493 | 0.5671 | 0.5554 | 0.5612 |
0.9925 | 1460 | 0.133 | - | - | - | - |
0.9993 | 1470 | 0.1213 | - | - | - | - |
1.0 | 1471 | - | - | 0.5674 | 0.5554 | 0.5614 |
Framework Versions
- Python: 3.11.10
- Sentence Transformers: 3.3.1
- Transformers: 4.48.0.dev0
- PyTorch: 2.6.0.dev20241112+cu121
- Accelerate: 1.2.0
- Datasets: 3.2.0
- Tokenizers: 0.21.0
Citation
BibTeX
Sentence Transformers
@inproceedings{reimers-2019-sentence-bert,
title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
author = "Reimers, Nils and Gurevych, Iryna",
booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
month = "11",
year = "2019",
publisher = "Association for Computational Linguistics",
url = "https://arxiv.org/abs/1908.10084",
}
CachedMultipleNegativesRankingLoss
@misc{gao2021scaling,
title={Scaling Deep Contrastive Learning Batch Size under Memory Limited Setup},
author={Luyu Gao and Yunyi Zhang and Jiawei Han and Jamie Callan},
year={2021},
eprint={2101.06983},
archivePrefix={arXiv},
primaryClass={cs.LG}
}