
EuroLLM-1.7B

Developed by utter-project
EuroLLM-1.7B is the first pre-trained model in the EuroLLM series, capable of understanding and generating text in multiple European and related languages.
Downloads 3,444
Release Time: 8/6/2024

Model Overview

EuroLLM-1.7B is a 1.7-billion-parameter model trained on 4 trillion tokens, supporting multiple European and related languages. Its instruction fine-tuned variant, EuroLLM-1.7B-Instruct, performs well on tasks such as machine translation.

Model Features

Multilingual support
Supports multiple European and related languages, including Bulgarian, Croatian, Czech, Danish, Dutch, and English, among others.
Efficient architecture
Adopts a standard dense Transformer architecture, using techniques such as Grouped Query Attention (GQA), Pre-Layer Normalization with RMSNorm, the SwiGLU activation function, and Rotary Position Embedding (RoPE) to balance inference speed and downstream task performance.
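Two of the components named above, RMSNorm and SwiGLU, are simple enough to sketch directly. The following is a minimal NumPy illustration of the general techniques, not EuroLLM's actual implementation (weight shapes and epsilon value are illustrative assumptions):

```python
import numpy as np

def rms_norm(x, weight, eps=1e-6):
    # RMSNorm: rescale by the reciprocal root-mean-square of the features.
    # Unlike LayerNorm, no mean is subtracted and no bias is added.
    rms = np.sqrt(np.mean(x * x, axis=-1, keepdims=True) + eps)
    return (x / rms) * weight

def swiglu(x, w_gate, w_up, w_down):
    # SwiGLU feed-forward block: a SiLU-gated linear unit,
    # SiLU(x @ W_gate) * (x @ W_up), projected back down by W_down.
    silu = lambda z: z / (1.0 + np.exp(-z))  # SiLU (a.k.a. swish) activation
    return (silu(x @ w_gate) * (x @ w_up)) @ w_down
```

With `weight` set to ones, `rms_norm` produces a vector whose root-mean-square is 1; the learned `weight` then rescales each feature dimension.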
Large-scale training
Trained on 4 trillion tokens, with a wide range of data sources, including web data, parallel data, and high-quality datasets.
Excellent performance
Performs well in machine translation and on general benchmarks, and is competitive with models of similar size.

Model Capabilities

Text generation
Machine translation
Multilingual processing

Use Cases

Machine translation
Multilingual translation
Supports translation tasks between multiple languages, such as English to Portuguese, German to English, etc.
Performs well on benchmarks such as FLORES-200, WMT-23, and WMT-24, outperforming Gemma-2B and remaining competitive with Gemma-7B.
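The translation use case above can be sketched with the Hugging Face transformers library. The model ID and the completion-style prompt format below are assumptions for illustration, not an official recipe:

```python
def translation_prompt(text, src="English", tgt="Portuguese"):
    # Base (non-instruct) models are typically prompted completion-style;
    # this particular format is an illustrative assumption.
    return f"{src}: {text}\n{tgt}:"

if __name__ == "__main__":
    # Imports deferred so the prompt helper works without transformers installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "utter-project/EuroLLM-1.7B"  # assumed Hugging Face model ID
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id)

    inputs = tokenizer(translation_prompt("The weather is nice today."),
                       return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=64)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

For English-to-German or other pairs, change the `src` and `tgt` arguments; the instruction-tuned EuroLLM-1.7B-Instruct variant would instead expect its own chat/instruction format.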
General text generation
Multilingual text generation
Generates coherent text in multiple languages, suitable for multilingual content creation.