Distilbert Base Chinese Amazon Zh 20000
Developed by ASCCCCCCCC
This model is a fine-tuned version of bert-base-chinese on an unspecified dataset, intended primarily for Chinese text-processing tasks.
Downloads 14
Release Time : 3/2/2022
Model Overview
This is a fine-tuned DistilBERT Chinese model suitable for Chinese text classification or other natural language processing tasks.
Model Features
Lightweight Model
Built on the DistilBERT architecture, it is lighter and faster at inference than the original BERT model.
Chinese Optimization
Specifically fine-tuned and optimized for Chinese text.
Model Capabilities
Chinese Text Classification
Natural Language Understanding
Use Cases
E-commerce
Product Review Classification
Sentiment analysis or classification of Chinese product reviews from e-commerce platforms.
Achieved 50.92% accuracy on the evaluation set.
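As a sketch of how this use case might look in practice, the snippet below loads the model through the Hugging Face transformers text-classification pipeline. The Hub id is an assumption inferred from the model name and author, and the sample reviews are illustrative only; verify the actual repository id and label mapping before relying on the outputs.

```python
from transformers import pipeline

# Assumed Hub id, inferred from the model name and author; verify before use.
MODEL_ID = "ASCCCCCCCC/distilbert-base-chinese-amazon_zh_20000"


def classify_reviews(texts, model_id=MODEL_ID):
    """Classify Chinese product reviews with the fine-tuned model."""
    clf = pipeline("text-classification", model=model_id)
    return clf(texts)


if __name__ == "__main__":
    # Hypothetical e-commerce reviews (positive / negative).
    reviews = [
        "这个产品质量很好，物流也快。",
        "收到的商品和描述不符，很失望。",
    ]
    for review, result in zip(reviews, classify_reviews(reviews)):
        print(review, result["label"], round(result["score"], 4))
```

Note that, given the reported 50.92% evaluation accuracy, predictions from this checkpoint should be treated with caution for production use.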