DistilBERT Base Spanish Uncased
Developed by dccuchile
A lightweight Spanish pre-trained language model based on the DistilBERT architecture, suitable for various natural language processing tasks.
Downloads: 1,686
Released: 3/2/2022
Model Overview
This model is a lightweight version of BERT, specifically optimized for Spanish, retaining most of BERT's performance while significantly reducing model size and computational requirements.
Model Features
Lightweight and Efficient
Roughly 40% smaller than the original BERT model while retaining about 97% of its performance.
Spanish Language Optimization
Specifically trained and optimized for Spanish language characteristics.
Fast Inference
The smaller model size translates into faster inference.
Model Capabilities
Text Classification
Named Entity Recognition
Sentiment Analysis
Question Answering
Text Similarity Calculation
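The text-similarity capability above comes down to comparing sentence embeddings. A minimal sketch of that comparison, assuming mean-pooled embeddings have already been produced by the model (the vectors below are made-up stand-ins, not real model output):

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Stand-in embeddings; in practice these would be mean-pooled
# hidden states from the DistilBERT encoder for two Spanish sentences.
emb_a = np.array([0.2, 0.7, 0.1, 0.4])
emb_b = np.array([0.25, 0.65, 0.05, 0.5])

score = cosine_similarity(emb_a, emb_b)
print(round(score, 3))
```

Scores close to 1.0 indicate near-identical meaning; scores near 0 indicate unrelated texts.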
Use Cases
Customer Service
Spanish Customer Feedback Classification
Automatically classify Spanish customer feedback into predefined categories
Accuracy of up to 92%
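For this use case, a classification head on top of the encoder maps each piece of feedback to one of the predefined categories. A minimal sketch of the final decision step, assuming the model has already produced logits for a feedback text (the category names and logit values below are hypothetical):

```python
import numpy as np

CATEGORIES = ["queja", "consulta", "elogio", "solicitud"]  # hypothetical labels

def predict_category(logits: np.ndarray) -> tuple[str, float]:
    """Softmax over classifier logits; return the top category and its probability."""
    exp = np.exp(logits - logits.max())  # subtract max for numerical stability
    probs = exp / exp.sum()
    idx = int(probs.argmax())
    return CATEGORIES[idx], float(probs[idx])

# Stand-in logits; in practice these come from a fine-tuned
# sequence-classification head on the DistilBERT encoder.
label, prob = predict_category(np.array([2.1, -0.3, 0.4, -1.0]))
print(label, round(prob, 3))
```

The same decision step applies to any fixed label set; only the fine-tuned head changes.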
Content Analysis
Spanish News Sentiment Analysis
Analyze the sentiment tendency of Spanish news articles