# 🚀 T5-Base Fine-Tuned on SQuAD for Question Generation
This project fine-tunes the T5-Base model on the SQuAD dataset for question generation. Given an answer and its surrounding context, the model generates a question related to that answer.
## 🚀 Quick Start
### Code Example
```python
import torch
from transformers import T5Tokenizer, T5ForConditionalGeneration

trained_model_path = 'ZhangCheng/T5-Base-Fine-Tuned-for-Question-Generation'
trained_tokenizer_path = 'ZhangCheng/T5-Base-Fine-Tuned-for-Question-Generation'

class QuestionGeneration:

    def __init__(self, model_dir=None):
        self.model = T5ForConditionalGeneration.from_pretrained(trained_model_path)
        self.tokenizer = T5Tokenizer.from_pretrained(trained_tokenizer_path)
        self.device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')
        self.model = self.model.to(self.device)
        self.model.eval()

    def generate(self, answer: str, context: str):
        # The model expects inputs in the "<answer> ... <context> ..." format
        # it was fine-tuned on.
        input_text = '<answer> %s <context> %s ' % (answer, context)
        encoding = self.tokenizer.encode_plus(
            input_text,
            return_tensors='pt'
        )
        # Move the encoded inputs onto the same device as the model.
        input_ids = encoding['input_ids'].to(self.device)
        attention_mask = encoding['attention_mask'].to(self.device)
        outputs = self.model.generate(
            input_ids=input_ids,
            attention_mask=attention_mask
        )
        question = self.tokenizer.decode(
            outputs[0],
            skip_special_tokens=True,
            clean_up_tokenization_spaces=True
        )
        return {'question': question, 'answer': answer, 'context': context}

if __name__ == "__main__":
    context = 'ZhangCheng fine-tuned T5 on SQuAD dataset for question generation.'
    answer = 'ZhangCheng'
    QG = QuestionGeneration()
    qa = QG.generate(answer, context)
    print(qa['question'])
```
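As written, the `generate()` call above uses the library's default decoding settings, which are typically greedy search with a fairly short maximum output length. Hugging Face's `generate()` accepts standard decoding arguments if you want longer or more varied questions; the sketch below is illustrative, and the specific values (`max_length=64`, `num_beams=4`) are assumptions rather than settings from the original fine-tuning.

```python
import torch
from transformers import T5Tokenizer, T5ForConditionalGeneration

model_name = 'ZhangCheng/T5-Base-Fine-Tuned-for-Question-Generation'
model = T5ForConditionalGeneration.from_pretrained(model_name)
tokenizer = T5Tokenizer.from_pretrained(model_name)

text = ('<answer> SQuAD <context> ZhangCheng fine-tuned T5 on SQuAD dataset '
        'for question generation.')
inputs = tokenizer(text, return_tensors='pt')

# Illustrative decoding options (assumed, not from the model card):
# beam search with a longer length cap than the greedy default.
outputs = model.generate(
    **inputs,
    max_length=64,
    num_beams=4,
    early_stopping=True,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```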
## 📚 Documentation
### Model Information
| Attribute | Details |
|-----------|---------|
| Model type | Fine-tuned T5-Base model |
| Training data | SQuAD dataset |
### Examples
The following examples show how the model turns a given answer and context into a question; a sketch for running them programmatically follows the list.
- Example 1:
  - Input: `<answer> T5 <context> Cheng fine-tuned T5 on SQuAD for question generation.`
- Example 2:
  - Input: `<answer> SQuAD <context> Cheng fine-tuned T5 on SQuAD dataset for question generation.`
- Example 3:
  - Input: `<answer> thousands <context> Transformers provides thousands of pre-trained models to perform tasks on different modalities such as text, vision, and audio.`
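To try these inputs end to end, a minimal sketch reusing the `QuestionGeneration` class from the Quick Start section (the loop itself is illustrative and not part of the original card):

```python
# Illustrative sketch: run the three documented example inputs through the
# QuestionGeneration class defined in the Quick Start section above.
examples = [
    ('T5', 'Cheng fine-tuned T5 on SQuAD for question generation.'),
    ('SQuAD', 'Cheng fine-tuned T5 on SQuAD dataset for question generation.'),
    ('thousands', 'Transformers provides thousands of pre-trained models to perform '
                  'tasks on different modalities such as text, vision, and audio.'),
]

qg = QuestionGeneration()
for answer, context in examples:
    result = qg.generate(answer, context)
    print(f"Answer: {answer!r} -> Question: {result['question']}")
```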