🚀 Flux-Prompt-Enhance
Flux-Prompt-Enhance is a text generation model based on google-t5/t5-base. It expands short prompts into more detailed, richer descriptions and is suited to text-to-text generation tasks.
🚀 Quick Start
The following example uses the Flux-Prompt-Enhance model to enhance a prompt:
```python
import torch
from transformers import pipeline, AutoTokenizer, AutoModelForSeq2SeqLM

# Run on GPU if one is available
device = "cuda" if torch.cuda.is_available() else "cpu"

model_checkpoint = "gokaygokay/Flux-Prompt-Enhance"
tokenizer = AutoTokenizer.from_pretrained(model_checkpoint)
model = AutoModelForSeq2SeqLM.from_pretrained(model_checkpoint)

enhancer = pipeline('text2text-generation',
                    model=model,
                    tokenizer=tokenizer,
                    repetition_penalty=1.2,
                    device=device)

max_target_length = 256
# The model expects this task prefix on every input
prefix = "enhance prompt: "

short_prompt = "beautiful house with text 'hello'"
answer = enhancer(prefix + short_prompt, max_length=max_target_length)
final_answer = answer[0]['generated_text']
print(final_answer)
```
💻 Usage Examples
Basic Usage
Basic usage is identical to the Quick Start example above: load the checkpoint, prepend the "enhance prompt: " prefix, and call the pipeline with repetition_penalty=1.2 and max_length=256.
Advanced Usage
```python
import torch
from transformers import pipeline, AutoTokenizer, AutoModelForSeq2SeqLM

device = "cuda" if torch.cuda.is_available() else "cpu"

model_checkpoint = "gokaygokay/Flux-Prompt-Enhance"
tokenizer = AutoTokenizer.from_pretrained(model_checkpoint)
model = AutoModelForSeq2SeqLM.from_pretrained(model_checkpoint)

# A stronger repetition penalty discourages repeated phrases in the output
enhancer = pipeline('text2text-generation',
                    model=model,
                    tokenizer=tokenizer,
                    repetition_penalty=1.5,
                    device=device)

# A larger max_length allows longer, more detailed descriptions
max_target_length = 300
prefix = "enhance prompt: "

short_prompt = "beautiful house with text 'hello'"
answer = enhancer(prefix + short_prompt, max_length=max_target_length)
final_answer = answer[0]['generated_text']
print(final_answer)
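The basic and advanced examples differ only in their generation settings. As a minimal sketch (the `with_prefix` helper below is hypothetical, not part of the model repo), the shared preprocessing and the per-example settings can be factored out:

```python
# Hypothetical helper, not part of the Flux-Prompt-Enhance repo: prepend the
# task prefix the model was trained with to each short prompt.
def with_prefix(prompts, prefix="enhance prompt: "):
    return [prefix + p for p in prompts]

# Generation settings from the basic and advanced examples above; either dict
# can be unpacked into the pipeline call, e.g. enhancer(text, **basic).
basic = {"max_length": 256, "repetition_penalty": 1.2}
advanced = {"max_length": 300, "repetition_penalty": 1.5}

inputs = with_prefix(["beautiful house with text 'hello'", "a cat on a roof"])
print(inputs[0])
```

This keeps the prefix in one place, which matters because the model only produces enhanced descriptions when the input carries the exact training prefix.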
📄 License
This project is released under the Apache-2.0 license.
📚 Documentation
Model Information

| Property | Details |
|----------|---------|
| Base model | google-t5/t5-base |
| Training dataset | gokaygokay/prompt-enhancer-dataset |
| Language | en |
| Library | transformers |
| Task type | text2text-generation |
| License | apache-2.0 |