DiscoLM Mfto 7B German v0.1
This is an experimental German-language model merge that combines two strong German models from OpenPipe and DiscoResearch, with the aim of improving German text generation quality.
Large Language Model
Transformers · Supports Multiple Languages · #German optimization · #Multilingual role-playing · #Low grammar errors

Downloads 15
Release Time: 1/21/2024
Model Overview
Using the SLERP (spherical linear interpolation) merge method, it combines the OpenPipe/mistral-ft-optimized-1227 and DiscoResearch/DiscoLM_German_7b_v1 models. The merge focuses on enhancing German text generation and is especially suited to German role-playing scenarios.
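SLERP interpolates between two models' weight tensors along the arc of a hypersphere rather than along a straight line, which tends to preserve the geometry of each parent's weights better than plain averaging. A minimal sketch of the per-tensor math (not the actual merge pipeline used for this model; the function and fallback threshold are illustrative):

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two flattened weight tensors.

    t=0 returns v0, t=1 returns v1; intermediate t follows the great-circle
    arc between the (direction-normalized) vectors.
    """
    # Normalize only to measure the angle; interpolate the raw tensors.
    v0n = v0 / (np.linalg.norm(v0) + eps)
    v1n = v1 / (np.linalg.norm(v1) + eps)
    dot = np.clip(np.dot(v0n, v1n), -1.0, 1.0)
    theta = np.arccos(dot)
    if theta < eps:
        # Nearly parallel tensors: fall back to linear interpolation.
        return (1 - t) * v0 + t * v1
    s = np.sin(theta)
    return (np.sin((1 - t) * theta) / s) * v0 + (np.sin(t * theta) / s) * v1
```

In a real merge this is applied tensor by tensor across both checkpoints, often with a different interpolation factor `t` for attention and MLP layers.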
Model Features
German optimization
Merges two of the best-performing German models to improve German generation quality
Role-playing support
Specifically optimized for German role-playing scenarios
Bilingual capability
Supports both German and English generation
Model Capabilities
German text generation
English text generation
Role-playing dialogue
Multi-turn conversation
Use Cases
Entertainment
German role-playing
Used for dialogue generation in German role-playing games
Generates natural and fluent German role-playing dialogues
Content creation
German content generation
Generates various types of German text content
Reduces grammar errors and improves text fluency