# 🚀 Fine-tuned Chatbot Model

The fine-tuned chatbot model is built on t5-small and optimized for customer-support scenarios. It handles customer inquiries efficiently and returns accurate, timely replies, improving the customer-service experience.
## 🚀 Quick Start

### Model Information
| Attribute | Details |
| --- | --- |
| Supported languages | English, French |
| Maximum generation length | 512 |
| Tags | text-generation-inference, customer support |
| License | apache-2.0 |
### Model Overview

The fine-tuned chatbot model is built on t5-small and tailored for customer-support use cases. The training data used for fine-tuning comes from the Bitext Customer Support LLM Chatbot Training Dataset.
## 💻 Usage Examples

### Basic Usage
```python
from transformers import pipeline

# Load the fine-tuned model as a text2text-generation pipeline
pipe = pipeline("text2text-generation", model="mrSoul7766/CUSsupport-chat-t5-small")

# The model expects an "answer: " task prefix before the user query
user_query = "How could I track the compensation?"
answer = pipe(f"answer: {user_query}", max_length=512)
print(answer[0]['generated_text'])
```

Example output:

```
I'm on it! I'm here to assist you in tracking the compensation. To track the compensation, you can visit our website and navigate to the "Refunds" or "Refunds" section. There, you will find detailed information about the compensation you are entitled to. If you have any other questions or need further assistance, please don't hesitate to let me know. I'm here to help!
```
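The pipeline call above prepends an `answer:` task prefix to the user query. When serving several queries at once, the same prefix can be applied up front. The `build_prompts` helper below is a hypothetical sketch, not part of the model repository or the `transformers` API:

```python
def build_prompts(queries):
    """Prepend the 'answer: ' task prefix used in the examples above."""
    return [f"answer: {q.strip()}" for q in queries]

prompts = build_prompts([
    "How could I track the compensation?",
    "I am waiting for a refund of $2?",
])
print(prompts[0])  # answer: How could I track the compensation?
```

The resulting list can then be passed to the pipeline, since `transformers` pipelines also accept a list of inputs.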
### Advanced Usage
```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Load the tokenizer and seq2seq model directly
tokenizer = AutoTokenizer.from_pretrained("mrSoul7766/CUSsupport-chat-t5-small")
model = AutoModelForSeq2SeqLM.from_pretrained("mrSoul7766/CUSsupport-chat-t5-small")

# Encode the query, generate a response, and decode it
max_length = 512
input_ids = tokenizer.encode("I am waiting for a refund of $2?", return_tensors="pt")
output_ids = model.generate(input_ids, max_length=max_length)
response = tokenizer.decode(output_ids[0], skip_special_tokens=True)
print(response)
```

Example output:

```
I'm on it! I completely understand your anticipation for a refund of $2. Rest assured, I'm here to assist you every step of the way. To get started, could you please provide me with more details about the specific situation? This will enable me to provide you with the most accurate and up-to-date information regarding your refund. Your satisfaction is our top priority, and we appreciate your patience as we work towards resolving this matter promptly.
```
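Both snippets repeat the same prefix-and-decode steps, which can be collected into one place. This is a minimal sketch; the `chat` helper is hypothetical (not part of the `transformers` API) and works with any callable that follows the text2text-generation pipeline's interface:

```python
def chat(pipe, user_query, max_length=512):
    """Run one customer-support turn through a text2text-generation pipeline.

    `pipe` is any callable that takes a prompt string and returns a list of
    dicts with a 'generated_text' key, e.g. the pipeline from the basic example.
    """
    prompt = f"answer: {user_query}"
    result = pipe(prompt, max_length=max_length)
    return result[0]["generated_text"]
```

With the pipeline loaded as in the basic example, a call would look like `chat(pipe, "How could I track the compensation?")`.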
## 📄 License

This project is licensed under the apache-2.0 license.