🚀 GTE-ModernColBERT-v1
This is a PyLate model based on Alibaba-NLP/gte-modernbert-base. It maps sentences and paragraphs to sequences of 128-dimensional dense vectors and uses the MaxSim operator for semantic textual similarity.
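To make the MaxSim scoring concrete, here is a minimal sketch (not part of the model card) of how a ColBERT-style late-interaction score can be computed between one encoded query and one encoded document. The function name maxsim_score and the use of plain PyTorch are illustrative; PyLate handles this internally.

import torch

def maxsim_score(query_embeddings: torch.Tensor, document_embeddings: torch.Tensor) -> torch.Tensor:
    # query_embeddings: (num_query_tokens, 128), document_embeddings: (num_doc_tokens, 128),
    # assuming the token vectors are already L2-normalized, as ColBERT-style models typically produce.
    similarities = query_embeddings @ document_embeddings.T  # (num_query_tokens, num_doc_tokens)
    # For each query token, keep its best-matching document token, then sum over query tokens.
    return similarities.max(dim=1).values.sum()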
🚀 Quick Start
First, install the PyLate library:
pip install -U pylate
Retrieval
PyLate provides a streamlined interface for indexing and retrieving documents with ColBERT models. Indexing relies on a Voyager HNSW index to handle document embeddings efficiently and enable fast retrieval.
Indexing documents
First, load the ColBERT model and initialize a Voyager index, then encode and index your documents:
from pylate import indexes, models, retrieve
# Step 1: Load the ColBERT model
model = models.ColBERT(
    model_name_or_path="lightonai/GTE-ModernColBERT-v1",
)

# Step 2: Initialize the Voyager index
index = indexes.Voyager(
    index_folder="pylate-index",
    index_name="index",
    override=True,  # This overwrites the existing index if it already exists
)

# Step 3: Encode the documents
documents_ids = ["1", "2", "3"]
documents = ["document 1 text", "document 2 text", "document 3 text"]

documents_embeddings = model.encode(
    documents,
    batch_size=32,
    is_query=False,  # Ensure that it is set to False to indicate that these are documents, not queries
    show_progress_bar=True,
)

# Step 4: Add the document embeddings to the index by providing the embeddings and their corresponding ids
index.add_documents(
    documents_ids=documents_ids,
    documents_embeddings=documents_embeddings,
)
Note that you do not have to recreate the index and re-encode the documents every time. Once an index has been created and documents added, you can reuse it by loading it:
# To load an index, simply instantiate it with the correct folder/name and without overriding it
index = indexes.Voyager(
    index_folder="pylate-index",
    index_name="index",
)
Retrieving top-k documents for queries
Once the documents are indexed, you can retrieve the top-k most relevant documents for a given set of queries. To do so, initialize the ColBERT retriever with the index you want to search, encode the queries, and then retrieve the top-k documents to get the ids and relevance scores of the best matches:
# Step 1: Initialize the ColBERT retriever
retriever = retrieve.ColBERT(index=index)

# Step 2: Encode the queries
queries_embeddings = model.encode(
    ["query for document 3", "query for document 1"],
    batch_size=32,
    is_query=True,  # Ensure that it is set to True to indicate that these are queries
    show_progress_bar=True,
)

# Step 3: Retrieve the top-k documents
scores = retriever.retrieve(
    queries_embeddings=queries_embeddings,
    k=10,  # Retrieve the top 10 matches for each query
)
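As a quick sanity check, you can print the retrieval results. The sketch below assumes the results come back as one list of hits per query, each hit carrying a document id and a relevance score; verify the exact schema against the PyLate documentation for your installed version.

# Inspect the results: one list of hits per query (field names assumed; check the PyLate docs)
for query_index, query_hits in enumerate(scores):
    print(f"Query {query_index}:")
    for hit in query_hits:
        print(f"  document id={hit['id']}, score={hit['score']:.4f}")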
Reranking
If you only want to use the ColBERT model to rerank on top of a first-stage retrieval pipeline, without building an index, you can simply use the rank function and pass the queries and documents to rerank:
from pylate import rank, models

queries = [
    "query A",
    "query B",
]

documents = [
    ["document A", "document B"],
    ["document 1", "document C", "document B"],
]

documents_ids = [
    [1, 2],
    [1, 3, 2],
]

# Load the ColBERT model (pylate_model_id should hold the model id, e.g. "lightonai/GTE-ModernColBERT-v1")
model = models.ColBERT(
    model_name_or_path=pylate_model_id,
)

# Encode the queries and the candidate documents separately
queries_embeddings = model.encode(
    queries,
    is_query=True,
)

documents_embeddings = model.encode(
    documents,
    is_query=False,
)

# Rerank each query's candidate documents by their MaxSim scores
reranked_documents = rank.rerank(
    documents_ids=documents_ids,
    queries_embeddings=queries_embeddings,
    documents_embeddings=documents_embeddings,
)
✨ Key Features
- Semantic similarity: maps sentences and paragraphs to sequences of 128-dimensional dense vectors and computes semantic textual similarity with the MaxSim operator.
- Efficient retrieval: uses a Voyager HNSW index for fast document retrieval.
- Long-context handling: performs strongly on long-context embedding benchmarks and can handle documents longer than its training length.
📦 Installation
pip install -U pylate
💻 Usage Examples
Basic Usage
# Load the ColBERT model
from pylate import models

model = models.ColBERT(
    model_name_or_path="lightonai/GTE-ModernColBERT-v1",
)

# Encode documents
documents = ["document 1 text", "document 2 text"]
documents_embeddings = model.encode(
    documents,
    is_query=False,
)

# Encode queries
queries = ["query for document 1"]
queries_embeddings = model.encode(
    queries,
    is_query=True,
)
Advanced Usage
# Document indexing and retrieval
from pylate import indexes, retrieve

# Initialize the Voyager index
index = indexes.Voyager(
    index_folder="pylate-index",
    index_name="index",
    override=True,
)

# Add document embeddings to the index
documents_ids = ["1", "2"]
index.add_documents(
    documents_ids=documents_ids,
    documents_embeddings=documents_embeddings,
)

# Initialize the ColBERT retriever
retriever = retrieve.ColBERT(index=index)

# Retrieve the top 10 documents
scores = retriever.retrieve(
    queries_embeddings=queries_embeddings,
    k=10,
)
📚 Documentation
Model Details
Property | Details |
---|---|
Model Type | PyLate model |
Base Model | Alibaba-NLP/gte-modernbert-base |
Document Length | 300 tokens |
Query Length | 32 tokens |
Output Dimensionality | 128 dimensions |
Similarity Function | MaxSim |
Training Dataset | ms-marco-en-bge-gemma |
Language | English |
License | Apache 2.0 |
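The output dimensionality from the table can be checked directly by encoding a short query and inspecting the returned token-level embeddings. This is a minimal sketch; the example query text is illustrative.

from pylate import models

model = models.ColBERT(
    model_name_or_path="lightonai/GTE-ModernColBERT-v1",
)

# Encode one query; the result is a sequence of per-token vectors for each input text.
embeddings = model.encode(["what is colbert?"], is_query=True)

# The last dimension should match the 128-dimensional output listed above.
print(embeddings[0].shape)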
Document Length
GTE-ModernColBERT was trained on MS MARCO with knowledge distillation using a document length of 300 tokens, which explains the default document length. However, as shown in the ModernBERT paper, ColBERT models can generalize to document lengths far beyond their training length, and GTE-ModernColBERT actually scores well above the current state of the art on a long-context embedding benchmark; see the LongEmbed results. Simply adjust the document length parameter as needed when loading the model:
model = models.ColBERT(
    model_name_or_path="lightonai/GTE-ModernColBERT-v1",
    document_length=8192,
)
ModernBERT itself was only trained with an 8k context length, but GTE-ModernColBERT seems to generalize to even larger context sizes; this is not guaranteed, so test it yourself!
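One simple self-test is to load the model with an extended document length, encode a long document, and confirm it is not truncated to the default 300 tokens before evaluating retrieval quality on your own data. A minimal sketch, where the long_text placeholder is illustrative:

from pylate import models

# Load with an extended document length (illustrative value; ModernBERT was trained with 8k context)
model = models.ColBERT(
    model_name_or_path="lightonai/GTE-ModernColBERT-v1",
    document_length=8192,
)

long_text = "..."  # replace with a document longer than 300 tokens
embeddings = model.encode([long_text], is_query=False)

# Seeing more than 300 token embeddings indicates the longer document was kept
print(embeddings[0].shape)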
Model Sources
- Documentation: PyLate Documentation
- Repository: PyLate on GitHub
- Hugging Face: PyLate models on Hugging Face
Full Model Architecture
ColBERT(
  (0): Transformer({'max_seq_length': 299, 'do_lower_case': False}) with Transformer model: ModernBertModel
  (1): Dense({'in_features': 768, 'out_features': 128, 'bias': False, 'activation_function': 'torch.nn.modules.linear.Identity'})
)
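This architecture string is what you should see when printing a loaded model, assuming PyLate's ColBERT module follows the sentence-transformers convention of printing its module pipeline:

from pylate import models

model = models.ColBERT(model_name_or_path="lightonai/GTE-ModernColBERT-v1")
print(model)  # should display the Transformer + Dense pipeline shown above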
🔧 Technical Details
Training Hyperparameters
Non-default hyperparameters
- eval_strategy: steps
- per_device_train_batch_size: 16
- learning_rate: 3e-05
- bf16: True
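These non-default values map onto standard Hugging Face training arguments. A minimal sketch of setting them, assuming a recent transformers version; the output_dir value and the use of plain TrainingArguments are illustrative, as the card does not show the actual training script:

from transformers import TrainingArguments

# Mirror the non-default hyperparameters listed above (output_dir is illustrative)
training_args = TrainingArguments(
    output_dir="gte-moderncolbert-run",
    eval_strategy="steps",
    per_device_train_batch_size=16,
    learning_rate=3e-5,
    bf16=True,
)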
All hyperparameters
Click to expand
- overwrite_output_dir: False
- do_predict: False
- eval_strategy: steps
- prediction_loss_only: True
- per_device_train_batch_size: 16
- per_device_eval_batch_size: 8
- per_gpu_train_batch_size: None
- per_gpu_eval_batch_size: None
- gradient_accumulation_steps: 1
- eval_accumulation_steps: None
- torch_empty_cache_steps: None
- learning_rate: 3e-05
- weight_decay: 0.0
- adam_beta1: 0.9
- adam_beta2: 0.999
- adam_epsilon: 1e-08
- max_grad_norm: 1.0
- num_train_epochs: 3
- max_steps: -1
- lr_scheduler_type: linear
- lr_scheduler_kwargs: {}
- warmup_ratio: 0.0
- warmup_steps: 0
- log_level: passive
- log_level_replica: warning
- log_on_each_node: True
- logging_nan_inf_filter: True
- save_safetensors: True
- save_on_each_node: False
- save_only_model: False
- restore_callback_states_from_checkpoint: False
- no_cuda: False
- use_cpu: False
- use_mps_device: False
- seed: 42
- data_seed: None
- jit_mode_eval: False
- use_ipex: False
- bf16: True
- fp16: False
- fp16_opt_level: O1
- half_precision_backend: auto
- bf16_full_eval: False
- fp16_full_eval: False
- tf32: None
- local_rank: 6
- ddp_backend: None
- tpu_num_cores: None
- tpu_metrics_debug: False
- debug: []
- dataloader_drop_last: True
- dataloader_num_workers: 0
- dataloader_prefetch_factor: None
- past_index: -1
- disable_tqdm: False
- remove_unused_columns: True
- label_names: None
- load_best_model_at_end: False
- ignore_data_skip: False
- fsdp: []
- fsdp_min_num_params: 0
- fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- fsdp_transformer_layer_cls_to_wrap: None
- accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- deepspeed: None
- label_smoothing_factor: 0.0
- optim: adamw_torch
- optim_args: None
- adafactor: False
- group_by_length: False
- length_column_name: length
- ddp_find_unused_parameters: None
- ddp_bucket_cap_mb: None
- ddp_broadcast_buffers: False
- dataloader_pin_memory: True
- dataloader_persistent_workers: False
- skip_memory_metrics: True
- use_legacy_prediction_loop: False
- push_to_hub: False
- resume_from_checkpoint: None
- hub_model_id: None
- hub_strategy: every_save
- hub_private_repo: None
- hub_always_push: False
- gradient_checkpointing: False
- gradient_checkpointing_kwargs: None
- include_inputs_for_metrics: False
- include_for_metrics: []
- eval_do_concat_batches: True
- fp16_backend: auto
- push_to_hub_model_id: None
- push_to_hub_organization: None
- mp_parameters:
- auto_find_batch_size: False
- full_determinism: False
- torchdynamo: None
- ray_scope: last
- ddp_timeout: 1800
- torch_compile: False
- torch_compile_backend: None
- torch_compile_mode: None
- dispatch_batches: None
- split_batches: None
- include_tokens_per_second: False
- include_num_input_tokens_seen: False
- neftune_noise_alpha: None
- optim_target_modules: None
- batch_eval_metrics: False
- eval_on_start: False
- use_liger_kernel: False
- eval_use_gather_object: False
- average_tokens_across_devices: False
- prompts: None
- batch_sampler: batch_sampler
- multi_dataset_batch_sampler: proportional
Training Logs
Click to expand
Epoch | Step | Training Loss | NanoClimateFEVER_MaxSim_ndcg@10 | NanoDBPedia_MaxSim_ndcg@10 | NanoFEVER_MaxSim_ndcg@10 | NanoFiQA2018_MaxSim_ndcg@10 | NanoHotpotQA_MaxSim_ndcg@10 | NanoMSMARCO_MaxSim_ndcg@10 | NanoNFCorpus_MaxSim_ndcg@10 | NanoNQ_MaxSim_ndcg@10 | NanoQuoraRetrieval_MaxSim_ndcg@10 | NanoSCIDOCS_MaxSim_ndcg@10 | NanoArguAna_MaxSim_ndcg@10 | NanoSciFact_MaxSim_ndcg@10 | NanoTouche2020_MaxSim_ndcg@10 | NanoBEIR_mean_MaxSim_ndcg@10 |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
0.004 | 20 | 0.0493 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.008 | 40 | 0.0434 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.012 | 60 | 0.0324 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.016 | 80 | 0.0238 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.02 | 100 | 0.0202 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.024 | 120 | 0.0186 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.028 | 140 | 0.0172 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.032 | 160 | 0.0164 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.036 | 180 | 0.0157 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.04 | 200 | 0.0153 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.044 | 220 | 0.0145 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.048 | 240 | 0.014 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.052 | 260 | 0.0138 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.056 | 280 | 0.0135 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.06 | 300 | 0.0132 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.064 | 320 | 0.0129 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.068 | 340 | 0.0126 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.072 | 360 | 0.0123 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.076 | 380 | 0.0122 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.08 | 400 | 0.012 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.084 | 420 | 0.0121 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.088 | 440 | 0.0115 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.092 | 460 | 0.0113 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.096 | 480 | 0.0112 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.1 | 500 | 0.0111 | 0.3085 | 0.6309 | 0.9206 | 0.5303 | 0.8618 | 0.6893 | 0.3703 | 0.7163 | 0.9548 | 0.3885 | 0.4682 | 0.7930 | 0.5982 | 0.6331 |
0.104 | 520 | 0.0109 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.108 | 540 | 0.0109 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.112 | 560 | 0.0109 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.116 | 580 | 0.0105 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.12 | 600 | 0.0102 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.124 | 620 | 0.0104 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.128 | 640 | 0.0103 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.132 | 660 | 0.01 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.136 | 680 | 0.0101 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.14 | 700 | 0.0098 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.144 | 720 | 0.0097 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.148 | 740 | 0.0097 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.152 | 760 | 0.0096 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.156 | 780 | 0.0096 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.16 | 800 | 0.0094 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.164 | 820 | 0.0096 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.168 | 840 | 0.0095 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.172 | 860 | 0.0093 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.176 | 880 | 0.0092 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.18 | 900 | 0.0093 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.184 | 920 | 0.009 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.188 | 940 | 0.009 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.192 | 960 | 0.0089 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.196 | 980 | 0.0089 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.2 | 1000 | 0.0089 | 0.3148 | 0.6586 | 0.9335 | 0.5374 | 0.8810 | 0.6805 | 0.3746 | 0.7368 | 0.9486 | 0.3955 | 0.4824 | 0.8219 | 0.6089 | 0.6442 |
0.204 | 1020 | 0.0088 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.208 | 1040 | 0.0089 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.212 | 1060 | 0.0088 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.216 | 1080 | 0.0086 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.22 | 1100 | 0.0087 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.224 | 1120 | 0.0088 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.228 | 1140 | 0.0086 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.232 | 1160 | 0.0086 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.236 | 1180 | 0.0084 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.24 | 1200 | 0.0086 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.244 | 1220 | 0.0085 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.248 | 1240 | 0.0084 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.252 | 1260 | 0.0084 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.256 | 1280 | 0.0081 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.26 | 1300 | 0.0083 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.264 | 1320 | 0.0084 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.268 | 1340 | 0.0082 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.272 | 1360 | 0.0082 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.276 | 1380 | 0.008 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.28 | 1400 | 0.0078 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.284 | 1420 | 0.0079 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.288 | 1440 | 0.0078 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.292 | 1460 | 0.0081 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.296 | 1480 | 0.0081 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.3 | 1500 | 0.0079 | 0.3510 | 0.6590 | 0.9285 | 0.5463 | 0.8893 | 0.6853 | 0.3800 | 0.7370 | 0.9513 | 0.3980 | 0.5268 | 0.8268 | 0.6130 | 0.6533 |
0.304 | 1520 | 0.0078 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.308 | 1540 | 0.0078 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.312 | 1560 | 0.0077 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.316 | 1580 | 0.0078 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.32 | 1600 | 0.0078 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.324 | 1620 | 0.0078 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.328 | 1640 | 0.0078 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.332 | 1660 | 0.0076 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.336 | 1680 | 0.0076 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.34 | 1700 | 0.0077 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.344 | 1720 | 0.0076 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.348 | 1740 | 0.0074 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.352 | 1760 | 0.0074 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.356 | 1780 | 0.0075 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.36 | 1800 | 0.0076 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.364 | 1820 | 0.0075 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.368 | 1840 | 0.0073 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.372 | 1860 | 0.0075 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.376 | 1880 | 0.0073 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.38 | 1900 | 0.0074 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.384 | 1920 | 0.0072 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.388 | 1940 | 0.0072 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.392 | 1960 | 0.0071 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.396 | 1980 | 0.0073 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.4 | 2000 | 0.0071 | 0.3551 | 0.6807 | 0.9311 | 0.5340 | 0.8951 | 0.7019 | 0.3767 | 0.7460 | 0.9559 | 0.3912 | 0.5121 | 0.8245 | 0.6058 | 0.6546 |
0.404 | 2020 | 0.0073 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.408 | 2040 | 0.0072 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.412 | 2060 | 0.0071 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.416 | 2080 | 0.0073 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.42 | 2100 | 0.0069 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.424 | 2120 | 0.0071 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.428 | 2140 | 0.0069 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.432 | 2160 | 0.0071 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.436 | 2180 | 0.0071 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.44 | 2200 | 0.007 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.444 | 2220 | 0.0069 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.448 | 2240 | 0.0071 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.452 | 2260 | 0.0069 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.456 | 2280 | 0.0069 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.46 | 2300 | 0.0069 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.464 | 2320 | 0.0069 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.468 | 2340 | 0.0069 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.472 | 2360 | 0.0068 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.476 | 2380 | 0.0068 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.48 | 2400 | 0.0067 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.484 | 2420 | 0.0068 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.488 | 2440 | 0.0067 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.492 | 2460 | 0.0068 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.496 | 2480 | 0.0069 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.5 | 2500 | 0.0068 | 0.3647 | 0.6883 | 0.9435 | 0.5624 | 0.8946 | 0.7065 | 0.3815 | 0.7709 | 0.9658 | 0.3993 | 0.5631 | 0.8371 | 0.6076 | 0.6681 |
0.504 | 2520 | 0.0067 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.508 | 2540 | 0.0068 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.512 | 2560 | 0.0067 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.516 | 2580 | 0.0068 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.52 | 2600 | 0.0066 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.524 | 2620 | 0.0067 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.528 | 2640 | 0.0067 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.532 | 2660 | 0.0067 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.536 | 2680 | 0.0067 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.54 | 2700 | 0.0068 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.544 | 2720 | 0.0066 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.548 | 2740 | 0.0067 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.552 | 2760 | 0.0064 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.556 | 2780 | 0.0064 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.56 | 2800 | 0.0066 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.564 | 2820 | 0.0063 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.568 | 2840 | 0.0066 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.572 | 2860 | 0.0066 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.576 | 2880 | 0.0065 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.58 | 2900 | 0.0066 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.584 | 2920 | 0.0065 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.588 | 2940 | 0.0063 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.592 | 2960 | 0.0066 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.596 | 2980 | 0.0065 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.6 | 3000 | 0.0064 | 0.3585 | 0.7081 | 0.9409 | 0.5474 | 0.8915 | 0.7037 | 0.3796 | 0.7763 | 0.9540 | 0.4038 | 0.5628 | 0.8424 | 0.6042 | 0.6672 |
0.604 | 3020 | 0.0064 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.608 | 3040 | 0.0063 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.612 | 3060 | 0.0064 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.616 | 3080 | 0.0065 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.62 | 3100 | 0.0065 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.624 | 3120 | 0.0064 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.628 | 3140 | 0.0064 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.632 | 3160 | 0.0062 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.636 | 3180 | 0.0062 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.64 | 3200 | 0.0063 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.644 | 3220 | 0.0064 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.648 | 3240 | 0.0063 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.652 | 3260 | 0.0063 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.656 | 3280 | 0.0063 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.66 | 3300 | 0.0064 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.664 | 3320 | 0.0063 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.668 | 3340 | 0.0061 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.672 | 3360 | 0.0062 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.676 | 3380 | 0.0061 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.68 | 3400 | 0.0063 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.684 | 3420 | 0.006 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.688 | 3440 | 0.0061 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.692 | 3460 | 0.0062 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.696 | 3480 | 0.0062 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.7 | 3500 | 0.0061 | 0.3783 | 0.7080 | 0.9441 | 0.5603 | 0.8902 | 0.7022 | 0.3824 | 0.7780 | 0.9612 | 0.3995 | 0.5414 | 0.8450 | 0.6049 | 0.6689 |