
NLLB-200 Distilled 1.3B CT2 int8

Developed by winstxnhdw
NLLB-200 Distilled 1.3B is the distilled version of Meta's No Language Left Behind (NLLB) project, supporting translation tasks between 200 languages.
Downloads: 7,666
Release Date: 6/21/2023

Model Overview

This model is the distilled version of NLLB-200, focused on multilingual machine translation and supporting translation between any pair of its 200 languages.

Model Features

Multilingual Support
Supports translation between 200 languages, including many low-resource languages alongside the major world languages.
Distilled Model
This version was obtained by knowledge distillation from the larger NLLB-200 model, shrinking it to 1.3B parameters while largely preserving translation quality.
Efficient Inference
Converted to the CTranslate2 format with int8 quantization, which reduces memory usage and speeds up inference for production deployment (see the example below).
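As a rough sketch of how a CTranslate2 int8 conversion like this is typically used, the snippet below loads the converted weights with the `ctranslate2` Python package and tokenizes input with the Hugging Face tokenizer of the base model. The local model directory name and the chosen language codes are assumptions for illustration, not taken from this page.

```python
import ctranslate2
import transformers

# Assumed local directory containing the CTranslate2 int8 conversion of the model.
model_path = "nllb-200-distilled-1.3B-ct2-int8"

# Load the quantized translator; device="cpu" works anywhere, "cuda" if a GPU is available.
translator = ctranslate2.Translator(model_path, device="cpu")

# The tokenizer comes from the original Hugging Face checkpoint of the base model.
tokenizer = transformers.AutoTokenizer.from_pretrained(
    "facebook/nllb-200-distilled-1.3B", src_lang="eng_Latn"
)

# CTranslate2 expects a list of token strings, not token ids.
source_tokens = tokenizer.convert_ids_to_tokens(tokenizer.encode("Hello, how are you?"))

# The target language is selected with a FLORES-200 code passed as a target prefix.
results = translator.translate_batch([source_tokens], target_prefix=[["fra_Latn"]])

# Drop the leading language token before decoding the hypothesis back to text.
target_tokens = results[0].hypotheses[0][1:]
print(tokenizer.decode(tokenizer.convert_tokens_to_ids(target_tokens)))
```

Running int8 weights this way trades a small amount of translation quality for a noticeably smaller memory footprint and faster CPU inference, which is the main point of the ct2-int8 variant.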

Model Capabilities

Text Translation
Multilingual Translation
Cross-language Information Conversion
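To illustrate the multilingual translation capability listed above, one batched call can translate the same sentence into several target languages by pairing each copy of the source with a different FLORES-200 target prefix. As before, the model path and language codes are illustrative assumptions.

```python
import ctranslate2
import transformers

# Assumed local path to the CTranslate2 int8 conversion (same as in the previous sketch).
translator = ctranslate2.Translator("nllb-200-distilled-1.3B-ct2-int8", device="cpu")
tokenizer = transformers.AutoTokenizer.from_pretrained(
    "facebook/nllb-200-distilled-1.3B", src_lang="eng_Latn"
)

text = "No language should be left behind."
source_tokens = tokenizer.convert_ids_to_tokens(tokenizer.encode(text))

# One copy of the source per requested target language; FLORES-200 codes pick the direction.
targets = ["spa_Latn", "deu_Latn", "zho_Hans"]
results = translator.translate_batch(
    [source_tokens] * len(targets),
    target_prefix=[[code] for code in targets],
)

for code, result in zip(targets, results):
    tokens = result.hypotheses[0][1:]  # strip the leading language token
    print(code, tokenizer.decode(tokenizer.convert_tokens_to_ids(tokens)))
```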

Use Cases

Global Applications
Multilingual Website Content Translation
Automatically translates website content into multiple languages to support global business expansion.
Improves content accessibility and expands global audience reach.
Cross-language Communication
Supports real-time translation for communication between users of different languages.
Eliminates language barriers and promotes international exchange.
Academic Research
Multilingual Literature Translation
Helps researchers access and understand academic literature in different languages.
Promotes cross-language academic exchange.