Distilbert Base En Fr Es De Zh Cased
Developed by Geotrend
This is a lightweight version of distilbert-base-multilingual-cased that supports five languages (English, French, Spanish, German, and Chinese) while preserving the accuracy of the original model.
Downloads 35
Release Time: 3/2/2022
Model Overview
This model is a lightweight version of distilbert-base-multilingual-cased that supports five languages and produces the same representations as the original model for those languages.
Model Features
Multilingual support
Supports text processing in five languages: English, French, Spanish, German, and Chinese
Lightweight version
A smaller variant of distilbert-base-multilingual-cased that preserves the original model's accuracy
Representation consistency
Produces representations identical to those of the original model
Model Capabilities
Multilingual text processing
Text representation generation
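As a sketch of how these representations might be generated, the snippet below loads the model through the standard Hugging Face transformers API. The model id `Geotrend/distilbert-base-en-fr-es-de-zh-cased` follows this card's naming; the calls themselves (`AutoTokenizer`, `AutoModel`) are ordinary transformers usage.

```python
# Sketch: generating multilingual text representations with this model.
# Assumes the Hugging Face hub id "Geotrend/distilbert-base-en-fr-es-de-zh-cased".
import torch
from transformers import AutoTokenizer, AutoModel

model_name = "Geotrend/distilbert-base-en-fr-es-de-zh-cased"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name)

# One sentence per supported-language example (English, French, Chinese).
texts = ["Hello world", "Bonjour le monde", "你好，世界"]
inputs = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# last_hidden_state holds one vector per token for each input sentence.
print(outputs.last_hidden_state.shape)
```

Because this model is cased, inputs should keep their original capitalization rather than being lowercased before tokenization.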
Use Cases
Natural Language Processing
Multilingual text classification
Classify text written in any of the five supported languages
Cross-language information retrieval
Retrieve information across the supported languages
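The cross-language retrieval use case can be sketched by mean-pooling token embeddings into one vector per text and ranking documents by cosine similarity to a query. Note this is only an illustration: the base model is not fine-tuned for retrieval, so a task-specific fine-tuned model would normally be used in practice, and the pooling helper `embed` here is a hypothetical name, not part of the model's API.

```python
# Minimal cross-language retrieval sketch: mean-pooled embeddings plus
# cosine similarity. Illustrative only; the base model is not fine-tuned
# for retrieval, so the scores should not be read as a quality benchmark.
import torch
import torch.nn.functional as F
from transformers import AutoTokenizer, AutoModel

model_name = "Geotrend/distilbert-base-en-fr-es-de-zh-cased"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name)

def embed(texts):
    """Mean-pool token embeddings into one fixed-size vector per text."""
    inputs = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state
    mask = inputs["attention_mask"].unsqueeze(-1)     # ignore padding tokens
    return (hidden * mask).sum(dim=1) / mask.sum(dim=1)

query = embed(["Where is the train station?"])         # English query
docs = embed(["Où est la gare ?", "El gato duerme."])  # French / Spanish docs
scores = F.cosine_similarity(query, docs)              # one score per document
print(scores)
```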