NLLB-200 Distilled 600M Ja-Zh
Developed by neverLife
This is a distilled sequence-to-sequence model based on the NLLB-200 architecture, specifically designed for Japanese-to-Chinese translation tasks.
Machine Translation
Tags: Transformers · Multilingual · Japanese-Chinese Translation · NLLB Distilled Model · Low-resource Optimization

Downloads: 174
Release date: 5/15/2023
Model Overview
This model is a lightweight version of NLLB-200 focused on Japanese-to-Chinese text translation. It uses the Transformer encoder-decoder architecture, maintaining high translation quality while substantially reducing model size.
Model Features
Efficient Distilled Model
Extracts core translation capabilities from the large NLLB-200 model through knowledge distillation, maintaining performance while reducing computational resource requirements.
Bilingual Specialized Translation
Optimized specifically for Japanese-to-Chinese translation scenarios, handling everyday language and simple professional terminology.
Lightweight Deployment
Significantly reduced parameter size compared to the full NLLB-200 model, suitable for deployment in resource-limited environments.
Model Capabilities
Japanese-to-Chinese text translation
Handling everyday conversation translation
Support for beam search decoding
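A typical way to run an NLLB-family checkpoint for Ja→Zh translation with beam search is sketched below using the Hugging Face Transformers API. The model identifier is a placeholder (the card does not give the exact repository path); the NLLB language codes `jpn_Jpan` and `zho_Hans` and the 128-token length used here follow the capabilities listed above.

```python
"""Sketch: Japanese-to-Chinese translation with an NLLB distilled checkpoint.

The model name below is a hypothetical placeholder -- substitute the actual
repository id of this model on your model hub.
"""

# NLLB-200 language codes for Japanese (source) and Simplified Chinese (target).
SRC_LANG = "jpn_Jpan"
TGT_LANG = "zho_Hans"


def generation_config(num_beams: int = 5, max_length: int = 128) -> dict:
    """Beam-search decoding settings matching the card's stated limits."""
    return {"num_beams": num_beams, "max_length": max_length}


def translate_ja_to_zh(text: str, model_name: str = "PLACEHOLDER/nllb-200-distilled-600M-ja-zh") -> str:
    """Translate one Japanese sentence into Chinese (requires `transformers` + model download)."""
    from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_name, src_lang=SRC_LANG)
    model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

    inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=128)
    output_ids = model.generate(
        **inputs,
        # Force the decoder to start generating in the target language.
        forced_bos_token_id=tokenizer.convert_tokens_to_ids(TGT_LANG),
        **generation_config(),
    )
    return tokenizer.batch_decode(output_ids, skip_special_tokens=True)[0]
```

Beam search (here `num_beams=5`, a common default for NLLB models) trades some decoding speed for higher-quality output than greedy decoding.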
Use Cases
Language Services
Everyday Conversation Translation
Translates Japanese everyday conversations into natural and fluent Chinese
Achieves a BLEU score of 55.834, indicating good translation quality
Simple Document Translation
Translates non-specialized Japanese documents into Chinese
Can handle moderately long texts (up to 128 tokens)
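The BLEU score quoted above measures n-gram overlap between a model's output and a reference translation. The stdlib-only sketch below illustrates the computation (modified n-gram precisions up to 4-grams, geometric mean, brevity penalty) on character-tokenized Chinese text; production evaluation would use a standard tool such as sacreBLEU rather than this simplified version.

```python
"""Minimal sentence-level BLEU sketch (uniform n-gram weights, single reference).

Simplified for illustration; real BLEU scores like the 55.834 on this card
are computed corpus-level with a standard toolkit.
"""
import math
from collections import Counter


def bleu(hypothesis: list, reference: list, max_n: int = 4) -> float:
    """BLEU in [0, 100] for pre-tokenized hypothesis/reference token lists."""
    precisions = []
    for n in range(1, max_n + 1):
        hyp_ngrams = Counter(tuple(hypothesis[i:i + n]) for i in range(len(hypothesis) - n + 1))
        ref_ngrams = Counter(tuple(reference[i:i + n]) for i in range(len(reference) - n + 1))
        # Clipped (modified) n-gram precision: count overlap, capped by reference counts.
        overlap = sum((hyp_ngrams & ref_ngrams).values())
        total = max(sum(hyp_ngrams.values()), 1)
        precisions.append(overlap / total)
    if min(precisions) == 0:
        return 0.0
    geo_mean = math.exp(sum(math.log(p) for p in precisions) / max_n)
    # Brevity penalty discourages overly short hypotheses.
    bp = 1.0 if len(hypothesis) >= len(reference) else math.exp(1 - len(reference) / len(hypothesis))
    return 100.0 * bp * geo_mean
```

For Chinese output, BLEU is conventionally computed over characters (e.g. `bleu(list(hyp_text), list(ref_text))`), since Chinese has no whitespace word boundaries.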