
NLLB-200 Distilled 1.3B EasyProject

Developed by ychenNLP
NLLB-200 Distilled 1.3B (EasyProject) is a distilled multilingual translation model supporting 200 languages, evaluated on the FLORES-200 benchmark.
Downloads 52
Release Time: 5/3/2023

Model Overview

This model focuses on high-quality multilingual machine translation tasks, supporting direct translation between 200 languages, with particular attention to the translation quality of low-resource languages.
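The model can be used with the Hugging Face Transformers library in the same way as other NLLB-200 checkpoints. Below is a minimal usage sketch; the repo ID, example sentence, and language pair are assumptions for illustration, while the FLORES-200 language codes (eng_Latn, fra_Latn) follow the standard NLLB convention.

```python
# Minimal translation sketch, assuming the checkpoint is hosted on the
# Hugging Face Hub under the repo ID below (adjust to the actual checkpoint).
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

MODEL_ID = "ychenNLP/nllb-200-distilled-1.3B-easyproject"  # assumed repo ID

# NLLB tokenizers take FLORES-200 language codes such as "eng_Latn" or "fra_Latn".
tokenizer = AutoTokenizer.from_pretrained(MODEL_ID, src_lang="eng_Latn")
model = AutoModelForSeq2SeqLM.from_pretrained(MODEL_ID)

text = "Machine translation helps people communicate across languages."
inputs = tokenizer(text, return_tensors="pt")

# Force decoding to start with the target-language token to select the output language.
generated = model.generate(
    **inputs,
    forced_bos_token_id=tokenizer.convert_tokens_to_ids("fra_Latn"),
    max_length=128,
)
print(tokenizer.batch_decode(generated, skip_special_tokens=True)[0])
```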

Model Features

Extensive Language Coverage
Supports 200 languages, including many low-resource languages and dialect variants
Multilingual Translation
Supports direct translation between any two of the supported languages without using English as a pivot (see the sketch after this list)
Low-Resource Language Optimization
Specifically improves translation quality for low-resource languages
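To illustrate pivot-free translation, the sketch below translates directly from French to Swahili. The repo ID and example sentence are assumptions; the target language is selected solely by the forced beginning-of-sentence token, so no English intermediate is involved.

```python
# Direct French -> Swahili translation sketch (no English pivot),
# assuming the same illustrative repo ID as above.
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

MODEL_ID = "ychenNLP/nllb-200-distilled-1.3B-easyproject"  # assumed repo ID
tokenizer = AutoTokenizer.from_pretrained(MODEL_ID, src_lang="fra_Latn")
model = AutoModelForSeq2SeqLM.from_pretrained(MODEL_ID)

inputs = tokenizer("La traduction automatique rapproche les peuples.", return_tensors="pt")
generated = model.generate(
    **inputs,
    forced_bos_token_id=tokenizer.convert_tokens_to_ids("swh_Latn"),  # Swahili target
    max_length=128,
)
print(tokenizer.batch_decode(generated, skip_special_tokens=True)[0])
```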

Model Capabilities

Text Translation
Multilingual Translation
Low-Resource Language Processing

Use Cases

Cross-Language Communication
Multilingual Content Localization
Translate content into multiple languages to adapt it for global markets (see the sketch after this section)
Enhances content accessibility worldwide
Language Research
Low-Resource Language Research
Provides translation support for linguists working with low-resource languages
Promotes language preservation and documentation
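For the content localization use case above, a simple approach is to loop over target language codes and regenerate the same source content for each. The repo ID, language list, and example sentence below are illustrative assumptions.

```python
# Localization sketch: translate one English passage into several target languages.
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

MODEL_ID = "ychenNLP/nllb-200-distilled-1.3B-easyproject"  # assumed repo ID
tokenizer = AutoTokenizer.from_pretrained(MODEL_ID, src_lang="eng_Latn")
model = AutoModelForSeq2SeqLM.from_pretrained(MODEL_ID)

content = "Our product is now available in your region."
targets = ["fra_Latn", "spa_Latn", "zho_Hans", "swh_Latn"]  # FLORES-200 codes

inputs = tokenizer(content, return_tensors="pt")
for lang in targets:
    generated = model.generate(
        **inputs,
        forced_bos_token_id=tokenizer.convert_tokens_to_ids(lang),
        max_length=128,
    )
    print(lang, "->", tokenizer.batch_decode(generated, skip_special_tokens=True)[0])
```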