# Text Classification Optimization
## Check Event Wa
Apache-2.0 · Venkatesh4342 · 15 · 2
Tags: Large Language Model, Transformers

A text classification model fine-tuned from distilbert-base-uncased; it reaches an accuracy and F1 score of 1.0 on its evaluation set.
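The accuracy and F1 figures quoted above can be reproduced from a list of predictions and gold labels. The sketch below uses hypothetical binary labels, not this model's actual evaluation data:

```python
# Minimal sketch: computing accuracy and binary F1 from predictions.
# The label lists below are hypothetical, not the model's evaluation data.

def accuracy(y_true, y_pred):
    correct = sum(t == p for t, p in zip(y_true, y_pred))
    return correct / len(y_true)

def f1_binary(y_true, y_pred, positive=1):
    tp = sum(t == positive and p == positive for t, p in zip(y_true, y_pred))
    fp = sum(t != positive and p == positive for t, p in zip(y_true, y_pred))
    fn = sum(t == positive and p != positive for t, p in zip(y_true, y_pred))
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return 2 * precision * recall / (precision + recall) if precision + recall else 0.0

y_true = [1, 0, 1, 1, 0, 1]
y_pred = [1, 0, 1, 1, 0, 1]  # perfect predictions
print(accuracy(y_true, y_pred))  # 1.0
print(f1_binary(y_true, y_pred))  # 1.0
```

A perfect score like this on a small evaluation set is possible, but it is worth checking the evaluation-set size before reading much into it.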
## Distilbert Base Uncased Finetuned Cola
Apache-2.0 · stuser2023 · 179 · 2
Tags: Large Language Model, Transformers

A fine-tuned version of DistilBERT on the CoLA (Corpus of Linguistic Acceptability) dataset, designed for grammatical acceptability judgment tasks.
## Distilroberta Base SmithsModel2
Apache-2.0 · stevems1 · 22 · 0
Tags: Large Language Model, Transformers

A model fine-tuned from distilroberta-base for specific downstream NLP tasks.
## Distilbert500e
Apache-2.0 · bigmorning · 27 · 0
Tags: Large Language Model, Transformers

A model fine-tuned from distilbert-base-uncased; the target task and dataset are not documented.
## Tamillion
monsoon-nlp · 58 · 2
Tags: Large Language Model, Transformers, Other

A Tamil model pre-trained with the ELECTRA framework; the second version was trained on TPUs with an expanded corpus.
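ELECTRA pre-trains a discriminator to decide, token by token, whether a generator has replaced the original token. A minimal illustration of how those token-level targets are derived (the token sequences here are hypothetical, not from the Tamil corpus):

```python
# Illustration of ELECTRA-style replaced-token-detection targets:
# the discriminator's label is 1 where the corrupted sequence differs
# from the original, 0 where the token is unchanged.
# Token sequences below are hypothetical examples.

def replaced_token_labels(original, corrupted):
    return [int(o != c) for o, c in zip(original, corrupted)]

original  = ["the", "chef", "cooked", "the", "meal"]
corrupted = ["the", "chef", "ate",    "the", "meal"]  # generator swapped one token
print(replaced_token_labels(original, corrupted))  # [0, 0, 1, 0, 0]
```

Because every token yields a training signal (not just the masked ones, as in BERT-style masked language modeling), ELECTRA tends to be more sample-efficient, which is attractive for lower-resource languages such as Tamil.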
## Albert Fa Base V2 Sentiment Deepsentipers Multi
Apache-2.0 · m3hrdadfi · 24 · 0
Tags: Large Language Model, Transformers, Other

A lightweight BERT model designed for self-supervised learning of Persian language representations.