# RoBERTa optimization
## Roberta With Kornli

- Author: pongjin · License: Apache-2.0
- Task: Text Classification · Tags: Transformers, Korean
- Downloads: 52 · Likes: 7

Fine-tuned from klue/roberta-base on the MNLI and XNLI portions of the kor_nli dataset, designed for Korean zero-shot classification.
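A minimal usage sketch with the Transformers zero-shot pipeline. The Hub id `pongjin/roberta_with_kornli` is inferred from the author and model name above, so verify it on the Hub before use.

```python
from transformers import pipeline

# Hub id inferred from the entry above (author "pongjin"); verify before use.
classifier = pipeline(
    "zero-shot-classification",
    model="pongjin/roberta_with_kornli",
)

result = classifier(
    "배송이 빠르고 제품 품질이 좋아요.",        # "Shipping was fast and the quality is good."
    candidate_labels=["긍정", "부정", "중립"],   # positive / negative / neutral
)
print(result["labels"][0], round(result["scores"][0], 3))
```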
## Roberta Base Serbian Upos

- Author: KoichiYasuoka
- Task: Sequence Labeling · Tags: Transformers, Other
- Downloads: 64 · Likes: 0

A RoBERTa model for Serbian, specialized for part-of-speech (UPOS) tagging and dependency parsing.
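A sketch of UPOS tagging through the token-classification pipeline; the id `KoichiYasuoka/roberta-base-serbian-upos` is inferred from the entry above.

```python
from transformers import pipeline

# Id inferred from the entry above; verify on the Hub.
tagger = pipeline(
    "token-classification",
    model="KoichiYasuoka/roberta-base-serbian-upos",
    aggregation_strategy="simple",  # merge subword pieces into whole words
)

for tok in tagger("Danas je lep dan."):  # "Today is a nice day."
    print(tok["word"], tok["entity_group"])
```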
## Phobert Base

- Author: vinai · License: MIT
- Task: Large Language Model · Tags: Other
- Downloads: 368.06k · Likes: 53

PhoBERT is a state-of-the-art pretrained language model for Vietnamese. It follows the RoBERTa pretraining procedure and performs strongly across a range of Vietnamese NLP tasks.
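PhoBERT expects word-segmented Vietnamese input (multi-syllable words joined with underscores, e.g. produced by VnCoreNLP). A minimal feature-extraction sketch using the published `vinai/phobert-base` id; swapping in `vinai/phobert-large` for the large variant below works the same way.

```python
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("vinai/phobert-base")
model = AutoModel.from_pretrained("vinai/phobert-base")

# Input must be word-segmented beforehand (e.g. with VnCoreNLP);
# multi-syllable words are joined with "_".
sentence = "Chúng_tôi là những nghiên_cứu_viên ."

inputs = tokenizer(sentence, return_tensors="pt")
with torch.no_grad():
    features = model(**inputs).last_hidden_state  # shape: (1, seq_len, hidden)
print(features.shape)
```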
## Phobert Large

- Author: vinai · License: MIT
- Task: Large Language Model · Tags: Other
- Downloads: 23.47k · Likes: 10

The large variant of PhoBERT, pretrained for Vietnamese with the same RoBERTa-based procedure and achieving strong results on multiple Vietnamese NLP tasks.
## Bertweet Base

- Author: vinai · License: MIT
- Task: Large Language Model
- Downloads: 74.86k · Likes: 37

BERTweet is the first publicly available language model pretrained specifically on English tweets, trained with the RoBERTa pretraining procedure.
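BERTweet's tokenizer ships an optional tweet normalizer (user mentions, URLs, and emoji are rewritten into canonical tokens). A minimal sketch with the published `vinai/bertweet-base` id; the `normalization=True` option additionally requires the `emoji` package.

```python
import torch
from transformers import AutoModel, AutoTokenizer

# normalization=True applies BERTweet's tweet normalizer
# (mentions -> @USER, URLs -> HTTPURL, emoji -> text); needs the `emoji` package.
tokenizer = AutoTokenizer.from_pretrained("vinai/bertweet-base", normalization=True)
model = AutoModel.from_pretrained("vinai/bertweet-base")

tweet = "SC has first two presumptive cases of coronavirus, DHEC confirms https://t.co/x via @postandcourier"
inputs = tokenizer(tweet, return_tensors="pt")
with torch.no_grad():
    features = model(**inputs).last_hidden_state
print(features.shape)
```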
## Roberta Fake News

- Author: ghanashyamvtatti
- Task: Text Classification
- Downloads: 26 · Likes: 3

A RoBERTa-based fake-news detector that classifies a news article as real or fake from its text content.
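A sketch using the text-classification pipeline. The id `ghanashyamvtatti/roberta-fake-news` is inferred from the entry above, and the label names depend on how the model was trained, so inspect the output rather than assuming them.

```python
from transformers import pipeline

# Id inferred from the entry above; verify on the Hub before use.
detector = pipeline("text-classification", model="ghanashyamvtatti/roberta-fake-news")

article = "Scientists announced today that drinking coffee makes you immortal."
print(detector(article))  # e.g. [{'label': ..., 'score': ...}]
```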
## Roberta Base Bne Finetuned Amazon Reviews Multi

- Author: Proggleb
- Task: Text Classification · Tags: Transformers
- Downloads: 13 · Likes: 0

A text-classification model fine-tuned from roberta-base-bne on the multilingual Amazon reviews dataset, used mainly for sentiment analysis of Spanish reviews.
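A sentiment-analysis sketch. The id `Proggleb/roberta-base-bne-finetuned-amazon_reviews_multi` is inferred from the entry above; models fine-tuned on this dataset often predict a star rating rather than a binary label, so check the label set in the output.

```python
from transformers import pipeline

# Id inferred from the entry above; verify on the Hub.
sentiment = pipeline(
    "text-classification",
    model="Proggleb/roberta-base-bne-finetuned-amazon_reviews_multi",
)

print(sentiment("El producto llegó tarde y la caja venía dañada."))
```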