# 🚀 T5 Small for Conversation Summarization
This project uses the T5 Small model for dialogue summarization, distilling a conversation down to its core content.
## 🚀 Quick Start

This model is based on T5 Small and fine-tuned for dialogue summarization. Example usage:
```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

# Load the fine-tuned checkpoint and its tokenizer
model_checkpoint = "ahlad/t5-small-finetuned-samsum"
tokenizer = AutoTokenizer.from_pretrained(model_checkpoint)
model = AutoModelForSeq2SeqLM.from_pretrained(model_checkpoint)

input_text = """
Emma: Did you finish the book I lent you?
Liam: Yes, I couldn’t put it down! The twist at the end was insane.
Emma: I know, right? I didn’t see it coming at all. What did you think of the main character?
Liam: Honestly, I thought they were a bit frustrating at first, but they grew on me.
Emma: Same here. I loved how they developed by the end. Are you up for another book from the series?
Liam: Absolutely! Pass it my way.
"""

# Tokenize the dialogue, generate a summary, and decode it back to text
inputs = tokenizer(input_text, return_tensors="pt")
outputs = model.generate(**inputs)
summary = tokenizer.decode(outputs[0], skip_special_tokens=True)
print("Summary:", summary)
```
## 💻 Usage Examples

### Basic Usage

Basic usage is identical to the Quick Start example above: load the checkpoint, tokenize a dialogue, and call `model.generate`.
### Advanced Usage

No dedicated advanced example is provided yet; depending on your needs, you can tune the generation parameters, fine-tune the model further, or combine it with other techniques.
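As one sketch of what tuning generation parameters might look like, the snippet below uses beam search with a length limit and n-gram repetition blocking. The specific values (`num_beams=4`, `max_new_tokens=60`, etc.) are illustrative assumptions, not settings recommended by the model author:

```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model_checkpoint = "ahlad/t5-small-finetuned-samsum"
tokenizer = AutoTokenizer.from_pretrained(model_checkpoint)
model = AutoModelForSeq2SeqLM.from_pretrained(model_checkpoint)

dialogue = """
Emma: Did you finish the book I lent you?
Liam: Yes, I couldn't put it down! The twist at the end was insane.
Emma: I know, right? What did you think of the main character?
Liam: They grew on me. Pass me the next book in the series!
"""

# Truncate very long dialogues to the encoder's 512-token input limit
inputs = tokenizer(dialogue, return_tensors="pt", truncation=True, max_length=512)

# Beam search with repetition blocking; these values are illustrative, not tuned
outputs = model.generate(
    **inputs,
    num_beams=4,              # keep 4 candidate beams during decoding
    max_new_tokens=60,        # cap the summary length
    no_repeat_ngram_size=3,   # block repeated 3-grams in the output
    early_stopping=True,      # stop when all beams have finished
)
summary = tokenizer.decode(outputs[0], skip_special_tokens=True)
print("Summary:", summary)
```

Beam search tends to produce more fluent summaries than greedy decoding at the cost of extra compute; `max_new_tokens` is worth setting explicitly, since the default generation length can truncate summaries.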
## 📚 Documentation

### Model Information
| Property | Details |
| --- | --- |
| Dataset | Samsung/samsum |
| Base model | google-t5/t5-small |
| Task | Summarization |