
NLLB-200 Distilled 600M CTranslate2

Developed by entai2965
NLLB-200 is a neural machine translation model supporting 200 languages, with special focus on translation quality for low-resource languages.
Downloads: 37
Release date: 11/20/2024

Model Overview

NLLB-200 is a machine translation model developed by Facebook Research that supports single-sentence translation between 200 languages, with particular attention to translation quality for low-resource languages. This checkpoint is a distilled variant with approximately 600 million parameters, converted to the CTranslate2 format for efficient inference.
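
Because the model is distributed in the CTranslate2 format, it can be loaded with the ctranslate2 Python package and paired with the Hugging Face NLLB tokenizer. The sketch below is a minimal example of single-sentence translation; the local model directory name, the device, and the English-to-French language pair are assumptions for illustration.

```python
import ctranslate2
import transformers

# The directory name below is an assumption; it should point to the converted
# CTranslate2 model (for example, the output of ct2-transformers-converter).
translator = ctranslate2.Translator("nllb-200-distilled-600M-ctranslate2", device="cpu")
tokenizer = transformers.AutoTokenizer.from_pretrained(
    "facebook/nllb-200-distilled-600M", src_lang="eng_Latn"
)

source = "Machine translation helps people communicate across languages."
source_tokens = tokenizer.convert_ids_to_tokens(tokenizer.encode(source))

# NLLB expects the FLORES-200 code of the target language as the
# first target token (here: French).
results = translator.translate_batch([source_tokens], target_prefix=[["fra_Latn"]])

target_tokens = results[0].hypotheses[0][1:]  # drop the language code token
print(tokenizer.decode(tokenizer.convert_tokens_to_ids(target_tokens)))
```

The target_prefix argument forces the decoder to start with the requested language code, which is how NLLB selects the output language.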

Model Features

Multilingual Support
Supports translation between 200 languages, with special focus on low-resource languages
Distillation Technology
Uses model distillation techniques to reduce model size while maintaining performance
Fairness Considerations
Pays particular attention to translation quality for low-resource languages, including many African languages

Model Capabilities

Single-sentence machine translation
Multilingual translation
Low-resource language translation (see the sketch after this list)
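
As a rough illustration of the multilingual and low-resource capabilities above, the sketch below translates one source sentence into several target languages in a single batch by repeating the source tokens and varying the target-language prefix. Language codes follow the FLORES-200 convention; the model directory name and the chosen target languages are assumptions.

```python
import ctranslate2
import transformers

translator = ctranslate2.Translator("nllb-200-distilled-600M-ctranslate2", device="cpu")
tokenizer = transformers.AutoTokenizer.from_pretrained(
    "facebook/nllb-200-distilled-600M", src_lang="eng_Latn"
)

sentence = "Clean water is essential for good health."
tokens = tokenizer.convert_ids_to_tokens(tokenizer.encode(sentence))

# FLORES-200 codes for three African target languages:
# Hausa, Yoruba, and Swahili.
targets = ["hau_Latn", "yor_Latn", "swh_Latn"]

results = translator.translate_batch(
    [tokens] * len(targets),
    target_prefix=[[code] for code in targets],
)

for code, result in zip(targets, results):
    output_tokens = result.hypotheses[0][1:]  # strip the language code token
    print(code, tokenizer.decode(tokenizer.convert_tokens_to_ids(output_tokens)))
```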

Use Cases

Research
Machine Translation Research
Used for machine translation research, particularly on translation techniques for low-resource languages
Education
Multilingual Educational Material Translation
Helps translate educational materials into multiple languages, especially low-resource languages