BERTweet COVID-19 Base Uncased
BERTweet is the first large-scale language model publicly released for English tweets; it follows the RoBERTa pre-training procedure and is tailored to processing social media text.
Downloads: 15
Release date: 3/2/2022
Model Overview
This uncased checkpoint is a BERTweet variant whose pretraining corpus includes COVID-19 related tweets, making it well suited to social media text analysis tasks.
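The checkpoint loads through the standard Transformers Auto classes. Below is a minimal sketch for pulling contextual tweet embeddings; the Hub id vinai/bertweet-covid19-base-uncased and the example tweet are assumptions for illustration, not taken from this card.

```python
import torch
from transformers import AutoModel, AutoTokenizer

# Assumed Hub id for this checkpoint; BERTweet ships a slow tokenizer, hence use_fast=False.
MODEL_ID = "vinai/bertweet-covid19-base-uncased"
tokenizer = AutoTokenizer.from_pretrained(MODEL_ID, use_fast=False)
model = AutoModel.from_pretrained(MODEL_ID)

tweet = "stay home and stay safe during the covid-19 pandemic"
inputs = tokenizer(tweet, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# (batch, sequence_length, hidden_size) contextual token embeddings for downstream use.
print(outputs.last_hidden_state.shape)
```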
Model Features
Social media specialization
Tuned to the characteristics of tweets, so it handles the informal expressions, abbreviations, hashtags, user mentions, and URLs typical of social media text better than general-purpose models.
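The tokenizer side of this specialization can be seen directly: BertweetTokenizer exposes a normalization option that maps user mentions and URLs to placeholder tokens and translates emoji before BPE. The sketch below assumes the vinai/bertweet-covid19-base-uncased Hub id, an invented example tweet, and that the optional nltk and emoji packages are installed.

```python
from transformers import AutoTokenizer

# normalization=True applies BERTweet's tweet normalization (user mentions -> @USER,
# URLs -> HTTPURL, emoji translated to text); it requires the nltk and emoji packages.
tokenizer = AutoTokenizer.from_pretrained(
    "vinai/bertweet-covid19-base-uncased", use_fast=False, normalization=True
)

raw_tweet = "sc reports its first presumptive covid-19 case https://t.co/xyz via @user"
print(tokenizer.tokenize(raw_tweet))
# The mention and the link should surface as the placeholders @USER and HTTPURL.
```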
Large-scale training data
Pretrained on 850 million English tweets (16 billion tokens), including COVID-19 related tweets.
RoBERTa optimization
Trained with the RoBERTa pre-training procedure, whose masked-language-modelling setup and optimization recipe are more robust and efficient than the original BERT training scheme.
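Because pre-training uses RoBERTa's masked-language-modelling objective, the checkpoint can be probed with a fill-mask pipeline before any fine-tuning. This is only an illustrative sketch; the Hub id and example sentence are assumptions, and actual predictions will vary.

```python
from transformers import pipeline

# Probe the masked-language-model head inherited from RoBERTa-style pre-training.
fill_mask = pipeline("fill-mask", model="vinai/bertweet-covid19-base-uncased")

for prediction in fill_mask("wash your hands and wear a <mask> ."):
    print(f"{prediction['token_str']:>12}  {prediction['score']:.3f}")
```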
Model Capabilities
Tweet text understanding
Social media sentiment analysis
Topic classification
Named entity recognition
Use Cases
Social media analysis
COVID-19 tweet analysis
Analyzing public sentiment and concerns in COVID-19 related tweets
Brand sentiment monitoring
Monitoring and analyzing discussions about specific brands on social media
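For either of these analysis scenarios, the usual pattern is to put a sentence-classification head on top of the encoder and fine-tune it on labelled tweets. The sketch below is a minimal, hypothetical setup (assumed Hub id, an illustrative 3-way sentiment label scheme, untrained head), not a procedure described on this page.

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

MODEL_ID = "vinai/bertweet-covid19-base-uncased"  # assumed Hub id
tokenizer = AutoTokenizer.from_pretrained(MODEL_ID, use_fast=False)
# Hypothetical 3-way label scheme: 0 = negative, 1 = neutral, 2 = positive.
model = AutoModelForSequenceClassification.from_pretrained(MODEL_ID, num_labels=3)

tweets = [
    "so grateful for the health workers fighting covid-19",
    "another lockdown announcement, this is exhausting",
]
batch = tokenizer(tweets, padding=True, truncation=True, max_length=128, return_tensors="pt")

# The classification head is randomly initialised here; fine-tune on labelled tweets
# (e.g. with the Trainer API) before these probabilities are meaningful.
with torch.no_grad():
    probs = model(**batch).logits.softmax(dim=-1)
print(probs)
```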
Natural language processing research
Social media language model research
Serving as a benchmark model for research related to social media text processing