Distilbert Base Zh Cased
This is a compact, distilled Chinese model derived from a multilingual foundation model. The distillation approach can target any custom set of languages, and the resulting model generates the same semantic representations as the original.
Release date: 3/2/2022
Model Overview
This model is a distilled Chinese version of distilbert-base-multilingual-cased. It reproduces the representations of the original model, preserving the original accuracy, and is suited to Chinese text processing tasks.
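A minimal usage sketch with the Hugging Face transformers library. The Hub id Geotrend/distilbert-base-zh-cased is inferred from the title, so treat it as an assumption and adjust it if your copy lives under a different name:

```python
import torch
from transformers import AutoTokenizer, AutoModel

MODEL_ID = "Geotrend/distilbert-base-zh-cased"  # assumed Hub id, inferred from the title

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModel.from_pretrained(MODEL_ID).eval()

inputs = tokenizer("今天天气很好。", return_tensors="pt")  # "The weather is nice today."
with torch.no_grad():
    outputs = model(**inputs)

# One contextual vector per token: (batch, sequence_length, hidden_size)
print(outputs.last_hidden_state.shape)
```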
Model Features
Semantic Representation Consistency
Generates the same semantic representations as the original multilingual model, so the original accuracy is preserved.
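One way to sanity-check this claim is to encode the same Chinese sentence with both models and compare the hidden states. A sketch, assuming the Hub ids used above and that both tokenizers split the sentence identically (distillation of this kind typically reduces only the vocabulary, so this usually holds for in-language text):

```python
import torch
from transformers import AutoTokenizer, AutoModel

SMALL_ID = "Geotrend/distilbert-base-zh-cased"   # assumed Hub id
FULL_ID = "distilbert-base-multilingual-cased"

tok_small = AutoTokenizer.from_pretrained(SMALL_ID)
tok_full = AutoTokenizer.from_pretrained(FULL_ID)
small = AutoModel.from_pretrained(SMALL_ID).eval()
full = AutoModel.from_pretrained(FULL_ID).eval()

text = "机器学习正在改变自然语言处理。"  # "Machine learning is changing NLP."
with torch.no_grad():
    h_small = small(**tok_small(text, return_tensors="pt")).last_hidden_state
    h_full = full(**tok_full(text, return_tensors="pt")).last_hidden_state

# If both tokenizers split the sentence the same way, the hidden states
# should agree up to numerical tolerance when the claim holds.
if h_small.shape == h_full.shape:
    print("max abs diff:", (h_small - h_full).abs().max().item())
else:
    print("tokenizations differ:", h_small.shape, h_full.shape)
```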
Multilingual Support
The distillation approach supports any custom set of languages and can produce distilled versions of other multilingual Transformers as well.
Efficient and Compact
Smaller and lighter than the original multilingual model while maintaining comparable performance.
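The size difference can be checked directly. A small sketch, again assuming the Hub ids used above:

```python
from transformers import AutoModel

small = AutoModel.from_pretrained("Geotrend/distilbert-base-zh-cased")  # assumed Hub id
full = AutoModel.from_pretrained("distilbert-base-multilingual-cased")

# The saving comes chiefly from the reduced vocabulary: the embedding
# matrix shrinks while the Transformer layers stay the same.
print(f"distilled: {small.num_parameters():,} parameters")
print(f"original:  {full.num_parameters():,} parameters")
```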
Model Capabilities
Chinese Text Representation Generation
Semantic Similarity Calculation (see the sketch after this list)
Text Classification
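DistilBERT does not ship sentence embeddings out of the box, so the sketch below uses mean pooling over token states, a common but assumed choice, to score the similarity of two sentences (the example sentences are invented):

```python
import torch
import torch.nn.functional as F
from transformers import AutoTokenizer, AutoModel

MODEL_ID = "Geotrend/distilbert-base-zh-cased"  # assumed Hub id
tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModel.from_pretrained(MODEL_ID).eval()

def embed(texts):
    """Mean-pool token states into one vector per sentence."""
    enc = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**enc).last_hidden_state
    mask = enc["attention_mask"].unsqueeze(-1).float()
    return (hidden * mask).sum(1) / mask.sum(1)

a, b = embed(["我喜欢看电影。", "我爱看影片。"])  # two near-paraphrases
print(F.cosine_similarity(a, b, dim=0).item())  # closer to 1 = more similar
```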
Use Cases
Natural Language Processing
Text Classification
Can be used for Chinese text classification tasks.
Maintains accuracy comparable to the original model.
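A hedged sketch of attaching a classification head for fine-tuning. The label count and the example sentence are placeholders, and the head is randomly initialized until it is trained on labeled data:

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

MODEL_ID = "Geotrend/distilbert-base-zh-cased"  # assumed Hub id
tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
# num_labels is a placeholder; the new head must be fine-tuned before use.
model = AutoModelForSequenceClassification.from_pretrained(MODEL_ID, num_labels=2)

enc = tokenizer("这部电影非常精彩!", return_tensors="pt")  # "This movie is wonderful!"
with torch.no_grad():
    logits = model(**enc).logits
print(logits.softmax(dim=-1))  # untrained probabilities; train first for real use
```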
Semantic Search
Used to build Chinese semantic search engines.
Capable of generating high-quality text representations.
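A toy retrieval sketch built on the same mean-pooled embeddings as the similarity example; the corpus and query are invented, and a real system would index the vectors rather than recompute them:

```python
import torch
import torch.nn.functional as F
from transformers import AutoTokenizer, AutoModel

MODEL_ID = "Geotrend/distilbert-base-zh-cased"  # assumed Hub id
tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModel.from_pretrained(MODEL_ID).eval()

def embed(texts):
    """Mean-pool token states into one vector per sentence."""
    enc = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**enc).last_hidden_state
    mask = enc["attention_mask"].unsqueeze(-1).float()
    return (hidden * mask).sum(1) / mask.sum(1)

corpus = ["北京是中国的首都。", "猫是一种常见的宠物。", "长城位于中国北方。"]
query = "中国的首都在哪里?"  # "Where is the capital of China?"

doc_vecs = F.normalize(embed(corpus), dim=-1)
q_vec = F.normalize(embed([query]), dim=-1)
scores = (q_vec @ doc_vecs.T).squeeze(0)  # cosine similarity per document
best = scores.argmax().item()
print(corpus[best], scores[best].item())
```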