Bart Large Paraphrase Generator En De V2
Developed by bettertextapp
Large-scale English-German paraphrase generation model based on BART architecture
Downloads 121
Release Time : 4/25/2025
Model Overview
This English-German paraphrase generation model, trained on the BART architecture, produces text that preserves the original meaning while varying its expression.
Model Features
Bilingual Paraphrasing Capability
Supports paraphrase generation between English and German
Large-Scale Pretraining
Built on the BART-large architecture, the model has strong text comprehension and generation capabilities
Model Capabilities
Text Paraphrasing
Bilingual Text Generation
Semantic-Preserving Rewriting
Use Cases
Content Creation
Article Rewriting
Rewrites existing articles while preserving their meaning, producing fresh phrasing
Language Learning
Bilingual Expression Practice
Provides language learners with different expressions of the same content
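The card does not include usage instructions, so the following is a minimal sketch of how a BART-based seq2seq paraphraser like this is typically invoked with the Hugging Face transformers library. The model ID is an assumption inferred from the card title; verify the exact repository name before use.

```python
# NOTE: the model ID below is an assumption inferred from the card title;
# confirm the exact repository name on the Hugging Face Hub before use.
MODEL_ID = "bettertextapp/bart_large_paraphrase_generator_en_de_v2"

def paraphrase(text: str, num_return_sequences: int = 3) -> list[str]:
    """Return several candidate paraphrases of `text` using beam search."""
    # Imports are deferred so this module loads even without transformers installed.
    from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForSeq2SeqLM.from_pretrained(MODEL_ID)

    inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=256)
    outputs = model.generate(
        **inputs,
        num_beams=5,  # beam search; must be >= num_return_sequences
        num_return_sequences=num_return_sequences,
        max_length=128,
    )
    return [tokenizer.decode(o, skip_special_tokens=True) for o in outputs]

# Example call (downloads the model on first use):
# paraphrase("The quick brown fox jumps over the lazy dog.")
```

Since the card describes the model as bilingual, the input may be English or German; the decoding parameters (beam count, length limits) shown here are illustrative defaults, not values documented for this model.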
Featured Recommended AI Models
Qwen2.5 VL 7B Abliterated Caption It I1 GGUF
Apache-2.0
Quantized version of Qwen2.5-VL-7B-Abliterated-Caption-it, supporting multilingual image description tasks.
Image-to-Text · Transformers · Supports Multiple Languages
Developed by mradermacher · Downloads 167
Nunchaku Flux.1 Dev Colossus
Other
The Nunchaku-quantized version of Colossus Project Flux, designed to generate high-quality images from text prompts while minimizing the performance loss introduced by quantization and improving inference efficiency.
Image Generation · English
Developed by nunchaku-tech · Downloads 235
Qwen2.5 VL 7B Abliterated Caption It GGUF
Apache-2.0
A static quantized version of the Qwen2.5-VL-7B model, focused on image caption generation and supporting multiple languages.
Image-to-Text · Transformers · Supports Multiple Languages
Developed by mradermacher · Downloads 133
Olmocr 7B 0725 FP8
Apache-2.0
olmOCR-7B-0725-FP8 is a document OCR model based on Qwen2.5-VL-7B-Instruct, fine-tuned on the olmOCR-mix-0225 dataset and then quantized to FP8.
Image-to-Text · Transformers · English
Developed by allenai · Downloads 881
Lucy 128k GGUF
Apache-2.0
Lucy-128k is a model built on Qwen3-1.7B, focused on agentic web search and lightweight browsing, and runs efficiently on mobile devices.
Large Language Model · Transformers · English
Developed by Mungert · Downloads 263