# Trainable on a single GPU
TinyMistral-248M GGUF
License: Apache-2.0
TinyMistral-248M is a small language model based on the Mistral 7B architecture, scaled down to approximately 248 million parameters, and intended primarily for fine-tuning on downstream tasks.
Tags: Large Language Model, English
Author: afrideva
© 2025 AIbase