🚀 bert-large-finetuned-squad2
This model is fine-tuned from bert-large-uncased and achieves strong results on the SQuAD 2.0 dataset. It can be used for question-answering tasks.
🚀 Quick Start
This model is built on bert-large-uncased and fine-tuned on the SQuAD 2.0 dataset. You can find the corresponding papers here (model) and here (data).
✨ Key Features
- Language: English
- Tags: pytorch, question-answering
- Dataset: squad2
- Metrics: exact, f1
| Property | Details |
|----------|---------|
| Model type | Question-answering model fine-tuned from bert-large-uncased |
| Training data | SQuAD 2.0 |
💻 Usage Examples
Basic Usage
```python
from transformers import pipeline

model_name = "phiyodr/bert-large-finetuned-squad2"

# Build a question-answering pipeline backed by this model and its tokenizer.
nlp = pipeline('question-answering', model=model_name, tokenizer=model_name)

inputs = {
    'question': 'What discipline did Winkelmann create?',
    'context': 'Johann Joachim Winckelmann was a German art historian and archaeologist. He was a pioneering Hellenist who first articulated the difference between Greek, Greco-Roman and Roman art. "The prophet and founding hero of modern archaeology", Winckelmann was one of the founders of scientific archaeology and first applied the categories of style on a large, systematic basis to the history of art. '
}
nlp(inputs)
```
🔧 Technical Details
Training procedure
```json
{
    "base_model": "bert-large-uncased",
    "do_lower_case": true,
    "learning_rate": 3e-5,
    "num_train_epochs": 4,
    "max_seq_length": 384,
    "doc_stride": 128,
    "max_query_length": 64,
    "batch_size": 96
}
```
Evaluation results
```json
{
    "exact": 76.22336393497852,
    "f1": 79.72527570261339,
    "total": 11873,
    "HasAns_exact": 76.19770580296895,
    "HasAns_f1": 83.21157193271408,
    "HasAns_total": 5928,
    "NoAns_exact": 76.24894869638352,
    "NoAns_f1": 76.24894869638352,
    "NoAns_total": 5945
}
```
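The overall `exact` and `f1` scores are the sample-weighted averages of the answerable (HasAns) and unanswerable (NoAns) sub-scores, which gives a quick consistency check on the numbers above:

```python
# Sub-scores and example counts copied from the evaluation results above.
HAS = {"exact": 76.19770580296895, "f1": 83.21157193271408, "total": 5928}
NO = {"exact": 76.24894869638352, "f1": 76.24894869638352, "total": 5945}

def overall(metric):
    # Weight each subset's score by its number of examples.
    return (HAS[metric] * HAS["total"] + NO[metric] * NO["total"]) / (HAS["total"] + NO["total"])

print(overall("exact"))  # ≈ 76.2234, matching the reported "exact"
print(overall("f1"))     # ≈ 79.7253, matching the reported "f1"
```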