distilbert-base-en-ur-cased
This is a distilled version of distilbert-base-multilingual-cased that handles only English and Urdu while preserving the original model's representation capabilities.
Release date: 3/2/2022
Model Overview
This model is a distilled variant of multilingual BERT, focused on English and Urdu processing while preserving the accuracy of the original model.
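A minimal loading sketch with the Hugging Face transformers library follows; the hub identifier Geotrend/distilbert-base-en-ur-cased is an assumption based on the model name and should be adjusted to the actual repository id.

```python
# Minimal sketch: load the model and run a forward pass on English and Urdu text.
# The hub identifier below is an assumption based on the model name.
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("Geotrend/distilbert-base-en-ur-cased")
model = AutoModel.from_pretrained("Geotrend/distilbert-base-en-ur-cased")

# Tokenize one English and one Urdu sentence in a single batch.
inputs = tokenizer(["Hello, world!", "ہیلو دنیا"], padding=True, return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (batch, sequence_length, hidden_size)
```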
Model Features
Multilingual support
Optimized specifically for English and Urdu while maintaining the representation capabilities of the original multilingual model.
Distilled architecture
Lighter than the original model while maintaining the same representation quality (see the parameter-count sketch after this list).
Efficient inference
Faster inference and lower resource consumption thanks to the distilled architecture.
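As a rough check on the lightweight claim, the sketch below compares parameter counts against the multilingual baseline. The bilingual repository identifier is an assumption; the baseline is the publicly available distilbert-base-multilingual-cased.

```python
# Sketch: compare parameter counts of the bilingual model and the full
# multilingual model. The bilingual hub identifier is an assumption.
from transformers import AutoModel

small = AutoModel.from_pretrained("Geotrend/distilbert-base-en-ur-cased")
full = AutoModel.from_pretrained("distilbert-base-multilingual-cased")

print(f"en-ur model:        {small.num_parameters():,} parameters")
print(f"multilingual model: {full.num_parameters():,} parameters")
```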
Model Capabilities
Text representation generation (an embedding sketch follows this list)
Multilingual text processing
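The sketch below turns variable-length inputs into fixed-size text representations by mean pooling the last hidden states over non-padding tokens. Mean pooling is an illustrative choice, not something the model card specifies, and the hub identifier is assumed as above.

```python
# Sketch: fixed-size sentence embeddings via mean pooling over real tokens.
# Mean pooling and the hub identifier are illustrative assumptions.
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("Geotrend/distilbert-base-en-ur-cased")
model = AutoModel.from_pretrained("Geotrend/distilbert-base-en-ur-cased")

def embed(texts):
    inputs = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state   # (batch, seq, hidden)
    mask = inputs["attention_mask"].unsqueeze(-1)    # (batch, seq, 1)
    return (hidden * mask).sum(1) / mask.sum(1)      # average over real tokens only

print(embed(["Hello, world!", "ہیلو دنیا"]).shape)  # (2, hidden_size)
```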
Use Cases
Natural Language Processing
Cross-lingual information retrieval
Can serve as the shared encoder in information retrieval systems that match English queries against Urdu documents, and vice versa (see the similarity sketch after this list).
Text classification
Classifying English and Urdu texts with a fine-tuned classification head (a setup sketch also follows this list).
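For the cross-lingual retrieval use case, a hedged sketch: score Urdu candidates against an English query with cosine similarity between mean-pooled embeddings. The hub identifier, the pooling strategy, and the example sentences are all illustrative assumptions.

```python
# Sketch: cross-lingual retrieval scoring with cosine similarity between an
# English query and Urdu candidates. Identifiers and sentences are illustrative.
import torch
import torch.nn.functional as F
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("Geotrend/distilbert-base-en-ur-cased")
model = AutoModel.from_pretrained("Geotrend/distilbert-base-en-ur-cased")

def embed(texts):
    inputs = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state
    mask = inputs["attention_mask"].unsqueeze(-1)
    return (hidden * mask).sum(1) / mask.sum(1)

query = embed(["Where is the library?"])
docs = embed(["لائبریری کہاں ہے؟", "آج موسم اچھا ہے۔"])  # Urdu candidates
scores = F.cosine_similarity(query, docs)  # one score per candidate
print(scores)  # a higher score suggests a better match for the query
```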
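For text classification, the model can be wrapped with a sequence-classification head. The head is randomly initialized and must be fine-tuned on labeled English/Urdu data before its predictions are meaningful; the hub identifier and label count below are illustrative assumptions.

```python
# Sketch: classification setup. The head added by
# AutoModelForSequenceClassification is untrained until fine-tuned.
# Hub identifier and num_labels are illustrative assumptions.
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "Geotrend/distilbert-base-en-ur-cased"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id, num_labels=2)

inputs = tokenizer(["Great product!", "یہ بہت برا ہے۔"], padding=True, return_tensors="pt")
logits = model(**inputs).logits  # (batch, num_labels)
print(logits.shape)
```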