
NLLB-MoE 54B 4-bit

Developed by KnutJaegersberg
NLLB-MoE is a Mixture of Experts machine translation model developed by Meta that supports 200 languages and is among the most advanced open-access machine translation models available.
Downloads: 17
Release Time: 12/16/2023

Model Overview

NLLB-MoE is a large-scale multilingual machine translation model built on a Mixture of Experts (MoE) architecture. It focuses on translation for low-resource languages and uses expert output masking to improve performance.

Model Features

Large-scale multilingual support
Supports translation between 200 languages, including many low-resource languages
Mixture of Experts architecture
Uses a sparse MoE architecture in which only a subset of experts is active per token, enabling efficient large-scale training and inference
Expert output masking
Applies expert output masking to regularize the experts and improve translation quality
Efficient inference optimization
Loads in 4-bit precision through bitsandbytes and Hugging Face Transformers for faster loading and a smaller memory footprint (see the loading sketch after this list)
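
Below is a minimal loading sketch, assuming the 4-bit weights are published under the hypothetical repository ID "KnutJaegersberg/nllb-moe-54b-4bit" and can be loaded with Hugging Face Transformers together with a bitsandbytes 4-bit configuration; the repository ID and quantization settings should be adjusted to match the actual model files.

    import torch
    from transformers import AutoModelForSeq2SeqLM, AutoTokenizer, BitsAndBytesConfig

    # Hypothetical repository ID; replace with the actual repo name.
    model_id = "KnutJaegersberg/nllb-moe-54b-4bit"

    # 4-bit quantization is handled by bitsandbytes at load time.
    quant_config = BitsAndBytesConfig(
        load_in_4bit=True,
        bnb_4bit_compute_dtype=torch.float16,
    )

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForSeq2SeqLM.from_pretrained(
        model_id,
        quantization_config=quant_config,
        device_map="auto",  # spread layers across available GPUs/CPU
    )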

Model Capabilities

Text translation
Multilingual translation (see the usage sketch after this list)
Low-resource language processing
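
As a usage illustration, the following sketch translates an English sentence into Asturian (FLORES-200 codes "eng_Latn" and "ast_Latn"), reusing the tokenizer and model from the loading sketch above; the sentence and language pair are illustrative choices only.

    # English -> Asturian translation with NLLB-style language codes.
    tokenizer.src_lang = "eng_Latn"
    inputs = tokenizer(
        "Machine translation helps keep small languages in everyday use.",
        return_tensors="pt",
    ).to(model.device)

    generated = model.generate(
        **inputs,
        # Force the decoder to start with the target-language token.
        forced_bos_token_id=tokenizer.convert_tokens_to_ids("ast_Latn"),
        max_new_tokens=64,
    )
    print(tokenizer.batch_decode(generated, skip_special_tokens=True)[0])

The forced_bos_token_id argument tells the decoder which target language to generate, which is how NLLB-style models select among the 200 supported languages.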

Use Cases

Globalization applications
Multilingual content localization
Provides multilingual content translation for global applications and websites
Translation across all 200 supported languages
Low-resource language preservation
Provides machine translation support for endangered and low-resource languages
Helps narrow the translation-quality gap between low-resource and mainstream languages
Academic research
Cross-language research
Supports translation of academic papers and research materials across languages