🚀 T5 Small for Conversation Summarization
This project uses a T5 Small model for dialogue summarization, distilling a conversation down to its core content.
🚀 Quick Start
The model is a T5 Small fine-tuned for dialogue summarization. Example usage:
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer
model_checkpoint = "ahlad/t5-small-finetuned-samsum"
tokenizer = AutoTokenizer.from_pretrained(model_checkpoint)
model = AutoModelForSeq2SeqLM.from_pretrained(model_checkpoint)
input_text = """
Emma: Did you finish the book I lent you?
Liam: Yes, I couldn’t put it down! The twist at the end was insane.
Emma: I know, right? I didn’t see it coming at all. What did you think of the main character?
Liam: Honestly, I thought they were a bit frustrating at first, but they grew on me.
Emma: Same here. I loved how they developed by the end. Are you up for another book from the series?
Liam: Absolutely! Pass it my way.
"""
inputs = tokenizer(input_text, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=60)  # illustrative cap; without it the library's short default length can truncate the summary
summary = tokenizer.decode(outputs[0], skip_special_tokens=True)
print("Summary:", summary)
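t5-small inputs are typically capped at 512 tokens, so very long chats should be split into chunks of whole turns before summarization. Below is a minimal sketch of such a splitter; the `split_dialogue` name and the `max_chars` character-budget heuristic are illustrative assumptions, not part of the checkpoint:

```python
def split_dialogue(dialogue: str, max_chars: int = 1500) -> list[str]:
    """Split a multi-turn dialogue into chunks of whole turns.

    max_chars is a rough character budget standing in for the model's
    512-token input limit; tune it for your tokenizer.
    """
    chunks, current, size = [], [], 0
    for turn in dialogue.strip().splitlines():
        # Flush the current chunk before a turn that would overflow it.
        if current and size + len(turn) > max_chars:
            chunks.append("\n".join(current))
            current, size = [], 0
        current.append(turn)
        size += len(turn)
    if current:
        chunks.append("\n".join(current))
    return chunks
```

Each chunk can then be summarized separately, and the partial summaries concatenated (or summarized once more) to cover the whole conversation.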
💻 Usage Examples
Basic Usage
Basic usage is identical to the Quick Start snippet above.
Advanced Usage
No advanced example ships with this card; the model can be fine-tuned further or combined with other techniques as needed.
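As one sketch of a more advanced decoding setup, the helper below wraps `model.generate` with beam search and an explicit length cap. It reuses the `model` and `tokenizer` loaded in the Quick Start snippet; the decoding parameters (`num_beams=4`, `max_new_tokens=60`) are illustrative defaults, not values published with this checkpoint:

```python
def summarize(dialogue, model, tokenizer, num_beams=4, max_new_tokens=60):
    """Summarize a dialogue with beam search instead of greedy decoding."""
    inputs = tokenizer(dialogue, return_tensors="pt",
                       truncation=True, max_length=512)
    output_ids = model.generate(
        **inputs,
        num_beams=num_beams,            # keep several candidate sequences
        max_new_tokens=max_new_tokens,  # avoid truncating the summary early
        early_stopping=True,            # stop each beam at end-of-sequence
    )
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)
```

Beam search trades extra compute for summaries that are usually more fluent than greedy output; larger `num_beams` values give diminishing returns.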
📚 Documentation
Model Information
| Property | Details |
| --- | --- |
| Dataset | Samsung/samsum |
| Base model | google-t5/t5-small |
| Task | Summarization |