🚀 lmqg/t5-small-squad-qg-ae
Model Card
This model is a version of t5-small fine-tuned jointly for question generation and answer extraction via lmqg on lmqg/qg_squad (dataset name: default).
🚀 Quick Start
Model Overview
Usage Examples
Basic Usage
Using the lmqg library:
```python
from lmqg import TransformersQG

# Load the joint question generation & answer extraction model
model = TransformersQG(language="en", model="lmqg/t5-small-squad-qg-ae")

# Generate question-answer pairs directly from a passage
question_answer_pairs = model.generate_qa("William Turner was an English painter who specialised in watercolour landscapes")
```
Advanced Usage
Using the transformers library:
```python
from transformers import pipeline

pipe = pipeline("text2text-generation", "lmqg/t5-small-squad-qg-ae")

# Question generation: the answer span is highlighted with <hl> tokens
question = pipe("generate question: <hl> Beyonce <hl> further expanded her acting career, starring as blues singer Etta James in the 2008 musical biopic, Cadillac Records.")

# Answer extraction: the target sentence is highlighted with <hl> tokens
answer = pipe("extract answers: <hl> Beyonce further expanded her acting career, starring as blues singer Etta James in the 2008 musical biopic, Cadillac Records. <hl> Her performance in the film received praise from critics, and she garnered several nominations for her portrayal of James, including a Satellite Award nomination for Best Supporting Actress, and a NAACP Image Award nomination for Outstanding Supporting Actress.")
```
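The two task prefixes and the `<hl>` highlight token do all the work in the inputs above: for question generation the answer span is wrapped in `<hl>` tokens, while for answer extraction the target sentence is wrapped. A minimal sketch of building such inputs (the helper names are our own, not part of lmqg or transformers):

```python
HL = "<hl>"

def qg_input(paragraph: str, answer: str) -> str:
    """Wrap the answer span in <hl> tokens and prepend the QG task prefix."""
    highlighted = paragraph.replace(answer, f"{HL} {answer} {HL}", 1)
    return f"generate question: {highlighted}"

def ae_input(paragraph: str, sentence: str) -> str:
    """Wrap the target sentence in <hl> tokens and prepend the AE task prefix."""
    highlighted = paragraph.replace(sentence, f"{HL} {sentence} {HL}", 1)
    return f"extract answers: {highlighted}"

text = "William Turner was an English painter who specialised in watercolour landscapes"
print(qg_input(text, "William Turner"))
# → generate question: <hl> William Turner <hl> was an English painter who specialised in watercolour landscapes
```

The resulting strings can be passed straight to the `text2text-generation` pipeline shown above.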
📚 Documentation
Evaluation Metrics
Question Generation Metrics
Raw metrics file
Question & Answer Generation Metrics
Raw metrics file
Answer Extraction Metrics
Raw metrics file
Training Hyperparameters
The following hyperparameters were used during fine-tuning:
- Dataset path: lmqg/qg_squad
- Dataset name: default
- Input types: ['paragraph_answer', 'paragraph_sentence']
- Output types: ['question', 'answer']
- Prefix types: ['qg', 'ae']
- Model: t5-small
- Max length: 512
- Max output length: 32
- Epochs: 7
- Batch size: 64
- Learning rate: 0.0001
- Mixed-precision training (fp16): False
- Random seed: 1
- Gradient accumulation steps: 1
- Label smoothing: 0.15
The full configuration can be found in the fine-tuning config file.
📄 License
This model is released under the CC BY 4.0 license.
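Collected as a config dict, the hyperparameters listed above look roughly as follows (the key names mirror lmqg's command-line conventions but are illustrative, not the exact config-file schema):

```python
# Illustrative fine-tuning configuration for lmqg/t5-small-squad-qg-ae;
# values are taken from the hyperparameter list above, key names are assumptions.
config = {
    "dataset_path": "lmqg/qg_squad",
    "dataset_name": "default",
    "input_types": ["paragraph_answer", "paragraph_sentence"],
    "output_types": ["question", "answer"],
    "prefix_types": ["qg", "ae"],
    "model": "t5-small",
    "max_length": 512,
    "max_length_output": 32,
    "epoch": 7,
    "batch": 64,
    "lr": 0.0001,
    "fp16": False,
    "random_seed": 1,
    "gradient_accumulation_steps": 1,
    "label_smoothing": 0.15,
}
```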
📖 Citation
```bibtex
@inproceedings{ushio-etal-2022-generative,
    title = "{G}enerative {L}anguage {M}odels for {P}aragraph-{L}evel {Q}uestion {G}eneration",
    author = "Ushio, Asahi and
      Alva-Manchego, Fernando and
      Camacho-Collados, Jose",
    booktitle = "Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing",
    month = dec,
    year = "2022",
    address = "Abu Dhabi, U.A.E.",
    publisher = "Association for Computational Linguistics",
}
```