🚀 lmqg/t5-small-squad-qg-ae
Model Card
This model is a fine-tuned version of t5-small, jointly fine-tuned with lmqg
on lmqg/qg_squad (dataset name: default) for question generation and answer extraction.
🚀 Quick Start
Model Overview
Usage Examples
Basic Usage
With the lmqg library:
from lmqg import TransformersQG
model = TransformersQG(language="en", model="lmqg/t5-small-squad-qg-ae")
question_answer_pairs = model.generate_qa("William Turner was an English painter who specialised in watercolour landscapes")
Advanced Usage
With the transformers library:
from transformers import pipeline
pipe = pipeline("text2text-generation", "lmqg/t5-small-squad-qg-ae")
question = pipe("generate question: <hl> Beyonce <hl> further expanded her acting career, starring as blues singer Etta James in the 2008 musical biopic, Cadillac Records.")
answer = pipe("extract answers: <hl> Beyonce further expanded her acting career, starring as blues singer Etta James in the 2008 musical biopic, Cadillac Records. <hl> Her performance in the film received praise from critics, and she garnered several nominations for her portrayal of James, including a Satellite Award nomination for Best Supporting Actress, and a NAACP Image Award nomination for Outstanding Supporting Actress.")
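The two task prefixes expect differently highlighted inputs: for question generation, the answer span is wrapped in `<hl>` tokens; for answer extraction, the whole target sentence is wrapped. A minimal sketch of building these inputs (the helper names below are illustrative, not part of the lmqg API):

```python
def make_qg_input(paragraph: str, answer: str) -> str:
    """Wrap the answer span in <hl> tokens and prepend the QG task prefix."""
    highlighted = paragraph.replace(answer, f"<hl> {answer} <hl>", 1)
    return f"generate question: {highlighted}"

def make_ae_input(paragraph: str, sentence: str) -> str:
    """Wrap the target sentence in <hl> tokens and prepend the AE task prefix."""
    highlighted = paragraph.replace(sentence, f"<hl> {sentence} <hl>", 1)
    return f"extract answers: {highlighted}"

text = ("William Turner was an English painter. "
        "He specialised in watercolour landscapes.")
print(make_qg_input(text, "William Turner"))
print(make_ae_input(text, "William Turner was an English painter."))
```

The same paragraph can be fed through both helpers, so a single document yields both QG and AE inputs for the joint model.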
📚 Documentation
Evaluation Metrics
Question Generation Metrics
Raw metrics file
Question and Answer Generation Metrics
Raw metrics file
Answer Extraction Metrics
Raw metrics file
Training Hyperparameters
The following hyperparameters were used during fine-tuning:
- Dataset path: lmqg/qg_squad
- Dataset name: default
- Input types: ['paragraph_answer', 'paragraph_sentence']
- Output types: ['question', 'answer']
- Prefix types: ['qg', 'ae']
- Model: t5-small
- Max length: 512
- Max output length: 32
- Epochs: 7
- Batch size: 64
- Learning rate: 0.0001
- Mixed-precision (fp16) training: False
- Random seed: 1
- Gradient accumulation steps: 1
- Label smoothing: 0.15
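The hyperparameters above can be gathered into a single dictionary for reference; the field names below are an illustrative approximation of an lmqg fine-tuning config, not necessarily its exact schema:

```python
# Hypothetical reconstruction of the fine-tuning configuration listed above;
# key names mirror the list, not necessarily lmqg's actual config keys.
config = {
    "dataset_path": "lmqg/qg_squad",
    "dataset_name": "default",
    "input_types": ["paragraph_answer", "paragraph_sentence"],
    "output_types": ["question", "answer"],
    "prefix_types": ["qg", "ae"],
    "model": "t5-small",
    "max_length": 512,
    "max_length_output": 32,
    "epoch": 7,
    "batch": 64,
    "lr": 0.0001,
    "fp16": False,
    "random_seed": 1,
    "gradient_accumulation_steps": 1,
    "label_smoothing": 0.15,
}

# With 1 gradient-accumulation step, the effective batch size per
# optimizer update equals the per-device batch size.
effective_batch = config["batch"] * config["gradient_accumulation_steps"]
print(effective_batch)
```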
The full configuration can be found in the fine-tuning config file.
📄 License
This model is released under the CC BY 4.0 license.
📖 Citation
@inproceedings{ushio-etal-2022-generative,
title = "{G}enerative {L}anguage {M}odels for {P}aragraph-{L}evel {Q}uestion {G}eneration",
author = "Ushio, Asahi and
Alva-Manchego, Fernando and
Camacho-Collados, Jose",
booktitle = "Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing",
month = dec,
year = "2022",
address = "Abu Dhabi, U.A.E.",
publisher = "Association for Computational Linguistics",
}