DistilBERT Base Uncased Fine-tuned on CoLA
This model is a fine-tuned version of DistilBERT-base-uncased on the CoLA (Corpus of Linguistic Acceptability) dataset, designed for grammatical acceptability judgment tasks.
Downloads: 33
Released: 3/8/2025
Model Overview
This model is a lightweight version of DistilBERT, fine-tuned specifically for judging the grammatical acceptability of English sentences. It maintains high accuracy while being smaller in size and faster in inference.
Model Features
Lightweight and efficient
Retains 97% of BERT-base's performance through knowledge distillation while reducing model size by 40%
Fast inference
Inference is 60% faster than the original BERT model
Specialized fine-tuning
Optimized specifically for grammatical acceptability judgment tasks on the CoLA dataset
Model Capabilities
Grammar correctness judgment
Text classification
Natural language understanding
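To make "grammar correctness judgment" concrete: the classifier produces two logits per sentence, which are softmaxed into acceptability probabilities. The helper and label names below are an illustrative sketch (CoLA conventionally uses 0 = unacceptable, 1 = acceptable), not this model's actual API.

```python
import math

# Assumed CoLA label order: index 0 = unacceptable, index 1 = acceptable.
LABELS = ["unacceptable", "acceptable"]

def interpret_logits(logits):
    """Softmax over the two class logits; return (label, confidence)."""
    shifted = [x - max(logits) for x in logits]  # numerical stability
    exps = [math.exp(x) for x in shifted]
    total = sum(exps)
    probs = [e / total for e in exps]
    best = max(range(len(probs)), key=probs.__getitem__)
    return LABELS[best], probs[best]
```

For example, `interpret_logits([-1.2, 2.3])` returns the label `"acceptable"` with high confidence, since the second logit dominates after the softmax.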
Use Cases
Educational technology
Grammar checking tool
Integrated into writing-assistance tools to automatically assess the grammatical acceptability of English sentences
Achieves a Matthews correlation coefficient (MCC) of 0.544
Linguistic research
Linguistic experiments
Used in experiments studying speakers' acceptability judgments of grammatical structures
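The Matthews correlation coefficient cited above is the standard CoLA metric; it balances all four confusion-matrix cells, which matters because CoLA's classes are imbalanced. A minimal computation, using made-up counts (not the model's actual results):

```python
import math

def matthews_corrcoef(tp, tn, fp, fn):
    """MCC = (TP*TN - FP*FN) / sqrt((TP+FP)(TP+FN)(TN+FP)(TN+FN))."""
    denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    return (tp * tn - fp * fn) / denom if denom else 0.0

# Illustrative confusion-matrix counts only:
print(matthews_corrcoef(50, 40, 10, 10))  # → 0.6333...
```

An MCC of 1.0 means perfect prediction, 0.0 means no better than chance, so the 0.544 reported here indicates a substantial (though imperfect) correlation with human acceptability labels.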