FinTwitBERT-sentiment
FinTwitBERT-sentiment is a fine-tuned model for classifying the sentiment of financial tweets. It uses FinTwitBERT as its base model, which was pre-trained on 10 million financial tweets. This makes it better suited to the informal language of financial posts on social media than models trained on more formal financial texts such as news headlines.
Quick Start
Using Hugging Face's transformers library, you can load the model and tokenizer into a text-classification pipeline.
Usage Examples
Basic Usage
```python
from transformers import pipeline

pipe = pipeline(
    "sentiment-analysis",
    model="StephanAkkerman/FinTwitBERT-sentiment",
)

print(pipe("Nice 9% pre market move for $para, pump my calls Uncle Buffett"))
```
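If you prefer to work with the model and tokenizer directly rather than through the pipeline, the following is a minimal sketch. The label names are read from the model's `id2label` config instead of being hard-coded, since the exact label strings are not listed in this README.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Load the tokenizer and the fine-tuned classification model.
tokenizer = AutoTokenizer.from_pretrained("StephanAkkerman/FinTwitBERT-sentiment")
model = AutoModelForSequenceClassification.from_pretrained("StephanAkkerman/FinTwitBERT-sentiment")

text = "Nice 9% pre market move for $para, pump my calls Uncle Buffett"
inputs = tokenizer(text, return_tensors="pt")

# Run a forward pass without tracking gradients.
with torch.no_grad():
    logits = model(**inputs).logits

# Map the highest-scoring class index back to its label via the model config.
predicted_id = logits.argmax(dim=-1).item()
print(model.config.id2label[predicted_id])
```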
⨠Features
- Specialized for Financial Tweets: FinTwitBERT-sentiment is fine-tuned specifically for sentiment classification of financial tweets, making it well suited to this domain.
- Trained on Relevant Data: It has been trained on two datasets, one with real human-labeled financial tweets and a larger synthetic dataset, which improves robustness.
Installation
Install the Hugging Face `transformers` library and a backend such as PyTorch, for example with `pip install transformers torch`.
Documentation
Intended Uses
FinTwitBERT-sentiment is intended for classifying financial tweets or other financial social media texts.
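The pipeline also accepts a list of texts, which is convenient for scoring many tweets at once. A short sketch; the example tweets below are made up for illustration:

```python
from transformers import pipeline

pipe = pipeline(
    "sentiment-analysis",
    model="StephanAkkerman/FinTwitBERT-sentiment",
)

# Hypothetical tweets, for illustration only.
tweets = [
    "$AAPL breaking out to new highs, loving this momentum",
    "Cutting my losses on $TSLA, this chart looks terrible",
]

# The pipeline returns one {'label': ..., 'score': ...} dict per input.
for tweet, result in zip(tweets, pipe(tweets)):
    print(f"{result['label']} ({result['score']:.3f}): {tweet}")
```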
Dataset
FinTwitBERT-sentiment has been trained on two datasets, both hosted on the Hugging Face Hub (a loading sketch follows the list):
- [TimKoornstra/financial-tweets-sentiment](https://huggingface.co/datasets/TimKoornstra/financial-tweets-sentiment): 38,091 human-labeled tweets
- [TimKoornstra/synthetic-financial-tweets-sentiment](https://huggingface.co/datasets/TimKoornstra/synthetic-financial-tweets-sentiment): 1,428,771 synthetic tweets
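Both datasets can be loaded with the `datasets` library. A minimal sketch; the split and column names are not documented here, so inspect the dataset object before relying on them:

```python
from datasets import load_dataset

# Load the human-labeled tweet dataset from the Hugging Face Hub.
dataset = load_dataset("TimKoornstra/financial-tweets-sentiment")

# Inspect the available splits, columns, and a sample row before use.
print(dataset)
print(dataset["train"][0])  # assumes a "train" split exists
```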
More Information
For a comprehensive overview, including the training setup and analysis of the model, visit the [FinTwitBERT GitHub repository](https://github.com/TimKoornstra/FinTwitBERT).
License
This project is licensed under the MIT License. See the LICENSE file for details.
Citing & Authors
If you use FinTwitBERT or FinTwitBERT-sentiment in your research, please cite us as follows, noting that both authors contributed equally to this work:
```bibtex
@misc{FinTwitBERT,
  author       = {Stephan Akkerman and Tim Koornstra},
  title        = {FinTwitBERT: A Specialized Language Model for Financial Tweets},
  year         = {2023},
  publisher    = {GitHub},
  journal      = {GitHub repository},
  howpublished = {\url{https://github.com/TimKoornstra/FinTwitBERT}}
}
```
Additionally, if you utilize the sentiment classifier, please cite:
```bibtex
@misc{FinTwitBERT-sentiment,
  author       = {Stephan Akkerman and Tim Koornstra},
  title        = {FinTwitBERT-sentiment: A Sentiment Classifier for Financial Tweets},
  year         = {2023},
  publisher    = {Hugging Face},
  howpublished = {\url{https://huggingface.co/StephanAkkerman/FinTwitBERT-sentiment}}
}
```
Information Table
| Property | Details |
|----------|---------|
| Model Type | Fine-tuned model for financial tweet sentiment classification |
| Training Data | [TimKoornstra/financial-tweets-sentiment](https://huggingface.co/datasets/TimKoornstra/financial-tweets-sentiment) (38,091 human-labeled tweets) and [TimKoornstra/synthetic-financial-tweets-sentiment](https://huggingface.co/datasets/TimKoornstra/synthetic-financial-tweets-sentiment) (1,428,771 synthetic tweets) |
| Metrics | accuracy, f1 |
| Pipeline Tag | text-classification |
| Base Model | StephanAkkerman/FinTwitBERT |
| Tags | NLP, BERT, FinBERT, FinTwitBERT, sentiment, finance, financial-analysis, sentiment-analysis, financial-sentiment-analysis, twitter, tweets, tweet-analysis, stocks, stock-market, crypto, cryptocurrency |