
EuroLLM-1.7B-Instruct

Developed by utter-project
EuroLLM-1.7B-Instruct is the first instruction fine-tuned model in the EuroLLM series. It can understand and generate text in multiple European and other related languages, and it performs strongly on tasks such as machine translation.
Downloads 6,829
Release Date: 8/6/2024

Model Overview

EuroLLM-1.7B-Instruct is a 1.7-billion-parameter instruction fine-tuned model focused on multilingual text understanding and generation, and it is particularly well suited to machine translation tasks.
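A minimal usage sketch is shown below. It assumes the checkpoint is published on the Hugging Face Hub under the utter-project organization and ships a standard chat template; both are assumptions here, and the exact prompt format may differ from the official model card.

```python
# Minimal usage sketch; the Hub id and the presence of a chat template are
# assumptions, not guarantees from this page.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "utter-project/EuroLLM-1.7B-Instruct"  # assumed Hub id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

messages = [
    {"role": "user", "content": "List three official languages of the European Union."}
]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
)
output_ids = model.generate(input_ids, max_new_tokens=64)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```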

Model Features

Multilingual support
Supports text understanding and generation in multiple European and other related languages.
Instruction fine-tuning
Fine-tuned on EuroBlocks, an instruction-tuning dataset focused on general instruction following and machine translation.
High performance
Competitive with comparable models on machine translation and general benchmarks.
Advanced architecture
Uses Grouped Query Attention (GQA), pre-layer normalization with RMSNorm, the SwiGLU activation function, and Rotary Position Embedding (RoPE); a minimal sketch of two of these components follows this list.
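For illustration only, the pre-normalization and feed-forward pieces named above can be sketched in PyTorch roughly as follows. This is not the EuroLLM source, and the dimensions are placeholders.

```python
# Illustrative sketch of RMSNorm and a SwiGLU feed-forward block as used in
# many LLaMA-style decoders; not the actual EuroLLM implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F

class RMSNorm(nn.Module):
    def __init__(self, dim: int, eps: float = 1e-6):
        super().__init__()
        self.weight = nn.Parameter(torch.ones(dim))
        self.eps = eps

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Normalize by the root-mean-square of the features, then rescale.
        rms = torch.rsqrt(x.pow(2).mean(dim=-1, keepdim=True) + self.eps)
        return x * rms * self.weight

class SwiGLUFeedForward(nn.Module):
    def __init__(self, dim: int, hidden_dim: int):
        super().__init__()
        self.gate_proj = nn.Linear(dim, hidden_dim, bias=False)
        self.up_proj = nn.Linear(dim, hidden_dim, bias=False)
        self.down_proj = nn.Linear(hidden_dim, dim, bias=False)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # SwiGLU: SiLU-gated linear unit followed by a down projection.
        return self.down_proj(F.silu(self.gate_proj(x)) * self.up_proj(x))

# Pre-layer-norm usage: normalize before the sub-layer, then add a residual.
x = torch.randn(1, 8, 256)            # (batch, sequence, placeholder model dim)
block = SwiGLUFeedForward(256, 1024)  # placeholder hidden size
norm = RMSNorm(256)
y = x + block(norm(x))
```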

Model Capabilities

Multilingual text generation
Machine translation
Instruction following

Use Cases

Machine translation
English-to-Portuguese translation
Translates English text into Portuguese; strong results on the English-to-Portuguese direction of the FLORES-200 benchmark (see the translation sketch after this list).
Multilingual translation
Translates between multiple European languages; strong results on the WMT-23 and WMT-24 test sets.
General text generation
Multilingual text generation
Generates coherent text in multiple languages; outperforms TinyLlama-v1.1 on the HellaSwag benchmark.
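For the translation use case, a hedged sketch using the high-level transformers pipeline API is shown below. Recent transformers versions apply the model's chat template to message lists automatically; the Hub id and the prompt wording are assumptions, not the official template.

```python
# Translation sketch via the text-generation pipeline; the prompt wording and
# the Hub id below are assumptions made for illustration.
from transformers import pipeline

pipe = pipeline("text-generation", model="utter-project/EuroLLM-1.7B-Instruct")  # assumed Hub id
messages = [
    {"role": "user",
     "content": "Translate the following English sentence into Portuguese: "
                "'The weather in Lisbon is lovely today.'"}
]
result = pipe(messages, max_new_tokens=64)
# With chat-style input, generated_text holds the full conversation; the last
# entry is the assistant reply.
print(result[0]["generated_text"][-1]["content"])
```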