
NLLB-200 Distilled 1.3B CT2 int8

Published by OpenNMT
NLLB-200 Distilled 1.3B is a neural machine translation model originally developed by Meta that supports translation among 200 languages; this release is a CTranslate2 conversion with int8 quantization for efficient inference.
Downloads: 101
Release date: 11/30/2023

Model Overview

This is the distilled 1.3B-parameter version of the translation model from the No Language Left Behind (NLLB) project, converted to the CTranslate2 format with a focus on reduced memory usage and faster inference for multilingual translation.

Model Features

Multilingual Support
Supports translation between 200 languages, covering most major languages and dialects worldwide.
Efficient Inference
Uses CTranslate2 with int8 quantization, cutting memory usage by roughly 2-4x while maintaining inference speed.
Optimized Deployment
Runs efficiently on both CPU and GPU, making it suitable for production deployment.
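
The one-off conversion and model loading described above can be sketched as follows. This is a minimal sketch, not part of the original listing: the output directory name is an assumption, and `ct2-transformers-converter` is the converter CLI shipped with the `ctranslate2` package.

```python
# Conversion (one-off, requires the `ctranslate2` and `transformers` packages;
# the output directory name below is an assumption, adjust as needed):
#   ct2-transformers-converter --model facebook/nllb-200-distilled-1.3B \
#       --quantization int8 --output_dir nllb-200-distilled-1.3B-ct2-int8

def load_translator(model_dir="nllb-200-distilled-1.3B-ct2-int8", device="cpu"):
    """Load the int8 CTranslate2 model.

    The import is deferred so the function can be defined (and the module
    inspected) without ctranslate2 installed.
    """
    import ctranslate2
    return ctranslate2.Translator(model_dir, device=device)
```

Switching `device` to `"cuda"` runs the same model on GPU without any reconversion, which is what makes the deployment story portable between CPU and GPU hosts.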

Model Capabilities

Text Translation
Multilingual Translation
Low-Resource Language Translation
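
To illustrate how these capabilities are exercised: CTranslate2 operates on tokenized text, and NLLB selects the output language via a language-code token (e.g. "fra_Latn") supplied as a target prefix. The sketch below is an assumption-laden example, using the original Meta tokenizer from Hugging Face for tokenization and a hypothetical local model directory.

```python
def make_target_prefixes(tgt_lang: str, batch_size: int):
    """NLLB chooses its output language via a language-code token
    (e.g. "fra_Latn") passed as the target prefix, one per source."""
    return [[tgt_lang]] * batch_size

def translate(texts, src_lang="eng_Latn", tgt_lang="fra_Latn",
              model_dir="nllb-200-distilled-1.3B-ct2-int8"):
    """Sketch of batch translation: tokenize with the original NLLB
    tokenizer, translate with CTranslate2, then detokenize.
    Imports are deferred so the helpers above stay importable without
    the libraries installed; model_dir is a hypothetical local path."""
    import ctranslate2
    import transformers
    tokenizer = transformers.AutoTokenizer.from_pretrained(
        "facebook/nllb-200-distilled-1.3B", src_lang=src_lang)
    translator = ctranslate2.Translator(model_dir, device="cpu")
    sources = [tokenizer.convert_ids_to_tokens(tokenizer.encode(t))
               for t in texts]
    results = translator.translate_batch(
        sources, target_prefix=make_target_prefixes(tgt_lang, len(texts)))
    out = []
    for r in results:
        tokens = r.hypotheses[0][1:]  # drop the language-code prefix token
        out.append(tokenizer.decode(tokenizer.convert_tokens_to_ids(tokens)))
    return out
```

Low-resource pairs work the same way: only the `src_lang`/`tgt_lang` codes change, since all 200 languages share one model.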

Use Cases

Globalization Applications
Multilingual Content Localization
Provides multilingual content translation for global applications, with translation in any direction among the 200 supported languages.
Research Applications
Low-Resource Language Research
Offers translation support for linguistic research and low-resource language preservation.