BERTweet Base

Developed by vinai (VinAI Research)
BERTweet is the first publicly available language model specifically pretrained on English tweets, built upon the RoBERTa pretraining approach.
Downloads 74.86k
Release Time: 3/2/2022

Model Overview

BERTweet is a pretrained language model optimized for English tweets, suitable for natural language processing tasks such as sentiment analysis and named entity recognition.
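
A minimal usage sketch (assuming the Hugging Face transformers library and the vinai/bertweet-base checkpoint on the Hugging Face Hub) that loads the pretrained encoder and embeds a single tweet:

```python
import torch
from transformers import AutoModel, AutoTokenizer

# "vinai/bertweet-base" is the checkpoint name assumed here.
# normalization=True applies BERTweet's tweet normalization before BPE encoding
# (user mentions become @USER, URLs become HTTPURL).
tokenizer = AutoTokenizer.from_pretrained("vinai/bertweet-base", normalization=True)
model = AutoModel.from_pretrained("vinai/bertweet-base")

tweet = "SC has first two presumptive cases of coronavirus, DHEC confirms via @user https://t.co/example"
inputs = tokenizer(tweet, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# One contextual embedding per (sub)token; the first position (<s>) is often
# used as a sentence-level representation for downstream classifiers.
print(outputs.last_hidden_state.shape)
```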

Model Features

Tweet-specific pretraining
Pretrained exclusively on English tweets, so it better captures tweet-specific language such as user mentions, URLs, hashtags, and informal spelling (see the tokenizer sketch after this list)
Large-scale training data
Trained on 850 million English tweets (16 billion tokens), including COVID-19 related tweets
Multi-task applicability
Outperforms strong baselines such as RoBERTa-base and XLM-R-base on POS tagging, named entity recognition, sentiment analysis, and irony (sarcasm) detection
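
The tweet-specific preprocessing referenced in the first feature can be seen directly in the tokenizer. A small sketch, assuming the vinai/bertweet-base tokenizer with its normalization option enabled:

```python
from transformers import AutoTokenizer

# With normalization=True, the tokenizer is expected to rewrite user mentions
# as @USER and URLs as HTTPURL before applying BPE.
tokenizer = AutoTokenizer.from_pretrained("vinai/bertweet-base", normalization=True)

raw = "Huge thanks to @jack for the shoutout! More info: https://t.co/xyz"
print(tokenizer.tokenize(raw))
# Expected to contain tokens such as "@USER" and "HTTPURL" in place of the
# original mention and URL.
```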

Model Capabilities

Text understanding
Sentiment analysis
Named entity recognition
POS tagging
Sarcasm detection
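
The classification-style capabilities above (e.g. sentiment analysis, sarcasm detection) require a task-specific head on top of the encoder. A minimal fine-tuning setup sketch, assuming vinai/bertweet-base and an illustrative three-class sentiment scheme; the label set and data below are placeholders, not part of the released model:

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("vinai/bertweet-base", normalization=True)
# A fresh classification head is attached on top of the pretrained encoder;
# it must be fine-tuned on labeled tweets before it produces useful predictions.
model = AutoModelForSequenceClassification.from_pretrained(
    "vinai/bertweet-base", num_labels=3
)

# Toy labeled batch (hypothetical data): 0 = negative, 1 = neutral, 2 = positive.
tweets = ["This update is amazing!", "Meh, nothing new here."]
labels = torch.tensor([2, 1])

batch = tokenizer(tweets, padding=True, truncation=True, max_length=128, return_tensors="pt")
outputs = model(**batch, labels=labels)

# outputs.loss is the cross-entropy loss to backpropagate during fine-tuning;
# outputs.logits holds the per-class scores.
print(outputs.loss, outputs.logits.shape)
```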

Use Cases

Social media analysis
Tweet sentiment analysis: analyze the sentiment expressed in tweets; BERTweet reports strong results on tweet sentiment classification benchmarks.
Sarcasm detection: identify sarcastic or ironic expressions in tweets; BERTweet achieves good results on irony detection benchmarks.
Information extraction
Named entity recognition: extract entities such as person names and locations from tweets; BERTweet performs strongly on tweet NER benchmarks (see the token-classification sketch after this list).
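
For the information-extraction use case, the encoder would typically be fine-tuned with a token-classification head. A sketch of that setup, assuming vinai/bertweet-base and an illustrative BIO tag set; the labels and example sentence are hypothetical:

```python
from transformers import AutoModelForTokenClassification, AutoTokenizer

# Illustrative BIO label scheme for person and location entities (not part of the
# released checkpoint); a real setup would use the labels of the target dataset.
labels = ["O", "B-PER", "I-PER", "B-LOC", "I-LOC"]

tokenizer = AutoTokenizer.from_pretrained("vinai/bertweet-base", normalization=True)
model = AutoModelForTokenClassification.from_pretrained(
    "vinai/bertweet-base",
    num_labels=len(labels),
    id2label=dict(enumerate(labels)),
    label2id={label: i for i, label in enumerate(labels)},
)

# After fine-tuning on an annotated tweet NER corpus, per-token logits can be
# decoded back into entity spans such as person names and locations.
encoded = tokenizer("Serena Williams lands in Paris for the exhibition", return_tensors="pt")
logits = model(**encoded).logits  # shape: (1, sequence_length, num_labels)
print(logits.shape)
```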