🚀 ABrinkmann/sbert_xtremedistil-l6-h256-uncased-mean-cosine-h32
This is a sentence-transformers model: it maps sentences and paragraphs to a 32-dimensional dense vector space and can be used for tasks such as clustering or semantic search.
🚀 Quick Start
Using this model is straightforward once you have sentence-transformers installed.
📦 Installation
pip install -U sentence-transformers
💻 Usage Examples
Basic Usage
from sentence_transformers import SentenceTransformer

sentences = ["This is an example sentence", "Each sentence is converted"]

# Load the model from the Hugging Face Hub
model = SentenceTransformer('ABrinkmann/sbert_xtremedistil-l6-h256-uncased-mean-cosine-h32')

# Encode the sentences; each embedding is a 32-dimensional vector
embeddings = model.encode(sentences)
print(embeddings)
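Because the embeddings were trained with a cosine objective, they plug directly into similarity-based tasks such as semantic search. Below is a minimal sketch using util.semantic_search from sentence-transformers; the corpus and query sentences are made-up examples, and keep in mind that the model truncates inputs to 16 tokens (see the architecture section below).

from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer('ABrinkmann/sbert_xtremedistil-l6-h256-uncased-mean-cosine-h32')

# Made-up corpus and query; any short English sentences work the same way
corpus = ["A man is eating food.", "A man is playing a guitar.", "The new movie is awesome."]
query = "Someone is having a meal."

# Encode to tensors so util.semantic_search can score them directly
corpus_embeddings = model.encode(corpus, convert_to_tensor=True)
query_embedding = model.encode(query, convert_to_tensor=True)

# Rank the corpus by cosine similarity and keep the top 2 hits
hits = util.semantic_search(query_embedding, corpus_embeddings, top_k=2)[0]
for hit in hits:
    print(f"{corpus[hit['corpus_id']]} (score: {hit['score']:.4f})")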
📚 Documentation
Evaluation Results
For an automated evaluation of this model, see the Sentence Embeddings Benchmark: https://seb.sbert.net
Training
The model was trained with the following parameters:
DataLoader
torch.utils.data.dataloader.DataLoader of length 251, with the following parameters:
{'batch_size': 32, 'sampler': 'torch.utils.data.sampler.RandomSampler', 'batch_sampler': 'torch.utils.data.sampler.BatchSampler'}
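In code, this configuration corresponds to a standard DataLoader with shuffle=True, which is what produces the RandomSampler/BatchSampler pair above; 251 batches at batch_size 32 implies roughly 8,000 training pairs. A sketch, with placeholder InputExample pairs since the actual training data is not documented here:

from torch.utils.data import DataLoader
from sentence_transformers import InputExample

# Placeholder pairs with similarity labels in [0, 1]; the real training data is not documented
train_examples = [
    InputExample(texts=["First sentence", "A similar sentence"], label=0.9),
    InputExample(texts=["First sentence", "An unrelated sentence"], label=0.1),
]

# shuffle=True yields torch's RandomSampler, matching the logged parameters
train_dataloader = DataLoader(train_examples, shuffle=True, batch_size=32)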
Loss Function
sentence_transformers.losses.CosineSimilarityLoss.CosineSimilarityLoss
Parameters of the fit() method:
{
"epochs": 1,
"evaluation_steps": 1000,
"evaluator": "sentence_transformers.evaluation.EmbeddingSimilarityEvaluator.EmbeddingSimilarityEvaluator",
"max_grad_norm": 1,
"optimizer_class": "<class 'transformers.optimization.AdamW'>",
"optimizer_params": {
"lr": 2e-05
},
"scheduler": "WarmupLinear",
"steps_per_epoch": null,
"warmup_steps": 26,
"weight_decay": 0.01
}
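Putting the pieces together, a training call with these settings might look like the sketch below (reusing train_dataloader from the DataLoader sketch above; the evaluator's dev sentences and scores are placeholders). Note that the logged optimizer class, transformers' AdamW, is deprecated in recent transformers releases; torch.optim.AdamW is the usual drop-in replacement.

from sentence_transformers import SentenceTransformer, losses, evaluation

model = SentenceTransformer('ABrinkmann/sbert_xtremedistil-l6-h256-uncased-mean-cosine-h32')
train_loss = losses.CosineSimilarityLoss(model)

# Placeholder dev set; the actual evaluation data is not documented here
evaluator = evaluation.EmbeddingSimilarityEvaluator(
    sentences1=["A sentence"],
    sentences2=["Another sentence"],
    scores=[0.5],
)

model.fit(
    train_objectives=[(train_dataloader, train_loss)],
    evaluator=evaluator,
    epochs=1,
    evaluation_steps=1000,
    warmup_steps=26,
    optimizer_params={"lr": 2e-05},
    weight_decay=0.01,
    max_grad_norm=1,
)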
Full Model Architecture
SentenceTransformer(
(0): Transformer({'max_seq_length': 16, 'do_lower_case': False}) with Transformer model: BertModel
(1): Pooling({'word_embedding_dimension': 256, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False})
(2): Dense({'in_features': 256, 'out_features': 32, 'bias': True, 'activation_function': 'torch.nn.modules.activation.Tanh'})
)
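For reference, the same stack can be assembled module by module with sentence-transformers. The base checkpoint name below is an assumption inferred from the model name (presumably microsoft/xtremedistil-l6-h256-uncased), and building it this way initializes a fresh Dense head rather than loading the trained weights:

from torch import nn
from sentence_transformers import SentenceTransformer, models

# Assumed base checkpoint, inferred from the model name (6 layers, hidden size 256)
word_embedding_model = models.Transformer('microsoft/xtremedistil-l6-h256-uncased', max_seq_length=16)

# Mean pooling over token embeddings, matching pooling_mode_mean_tokens=True above
pooling_model = models.Pooling(word_embedding_model.get_word_embedding_dimension(), pooling_mode='mean')

# Dense projection from 256 down to 32 dimensions with a Tanh activation
dense_model = models.Dense(in_features=256, out_features=32, bias=True, activation_function=nn.Tanh())

model = SentenceTransformer(modules=[word_embedding_model, pooling_model, dense_model])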
Citation and Authors