
# Multi-round pre-training

Vikhr 7b 0.1
Apache-2.0
Vikhr is a Russian-language model based on the Mistral architecture that underwent three rounds of pre-training on 400 million tokens. It outperforms the original Mistral on Russian-language tasks but may fall short on code processing.
Large Language Model · Transformers · Supports Multiple Languages
Vikhrmodels
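
The listing tags the model as a Transformers-compatible large language model. Below is a minimal loading and generation sketch, assuming a Hugging Face repository id of `Vikhrmodels/Vikhr-7b-0.1` (the exact repo id is not stated on this page).

```python
# Minimal sketch: load the model with Hugging Face Transformers and generate text.
# The repository id "Vikhrmodels/Vikhr-7b-0.1" is an assumption, not stated on this page.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Vikhrmodels/Vikhr-7b-0.1"  # assumed repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# Russian-language prompt, since the model targets Russian tasks.
prompt = "Столица России — "
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```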