twitter-bitcoin-emotion-classification
This model is a fine-tuned version of vinai/bertweet-base on an unknown dataset. It classifies emotions in Bitcoin-related tweets; its accuracy and other evaluation metrics are reported below.
Quick Start
The model can be used through the standard Hugging Face text-classification API.
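A minimal inference sketch, assuming the checkpoint is published on the Hugging Face Hub. The repository id below is a placeholder, and the emotion label set depends on the (unreported) training data:

```python
from transformers import pipeline

# NOTE: the Hub repository id is a placeholder; replace it with the
# actual id of this fine-tuned checkpoint.
classifier = pipeline(
    "text-classification",
    model="<your-username>/twitter-bitcoin-emotion-classification",
)

# The returned labels are whatever the (unreported) training data defined.
print(classifier("Bitcoin just broke its all-time high!"))
```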
Evaluation Results
It achieves the following results on the evaluation set:
- Loss: 1.7032
- Accuracy: 0.6561
- F1: 0.6572
- Macro F1: 0.6226
- Precision: 0.6630
- Recall: 0.6561
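Recall equals accuracy here, which is a property of weighted-average recall, so the per-class metrics were most likely aggregated with `average="weighted"` (macro F1 aside). A minimal `compute_metrics` sketch under that assumption, in the shape `transformers.Trainer` expects:

```python
import numpy as np
from sklearn.metrics import accuracy_score, f1_score, precision_score, recall_score

def compute_metrics(eval_pred):
    """Metric function in the shape expected by transformers.Trainer.
    The averaging choices are assumptions inferred from the reported numbers."""
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    return {
        "accuracy": accuracy_score(labels, preds),
        "f1": f1_score(labels, preds, average="weighted"),
        "macro_f1": f1_score(labels, preds, average="macro"),
        "precision": precision_score(labels, preds, average="weighted"),
        "recall": recall_score(labels, preds, average="weighted"),
    }
```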
Documentation
Model description
More information needed
Intended uses & limitations
More information needed
Training and evaluation data
More information needed
Training procedure
Training hyperparameters
The following hyperparameters were used during training (a hedged `TrainingArguments` sketch follows the list):
- learning_rate: 1e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: ADAMW_TORCH with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 5
- mixed_precision_training: Native AMP
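For reproduction, the reported values map onto `transformers.TrainingArguments` roughly as below; `output_dir` and any setting not listed above are assumptions:

```python
from transformers import TrainingArguments

# A sketch of the reported configuration; output_dir is a placeholder.
training_args = TrainingArguments(
    output_dir="twitter-bitcoin-emotion-classification",  # placeholder
    learning_rate=1e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    gradient_accumulation_steps=2,  # effective train batch size: 32
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=5,
    fp16=True,  # Native AMP mixed-precision training
)
```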
Training results

| Training Loss | Epoch  | Step  | Validation Loss | Accuracy | F1     | Macro F1 | Precision | Recall |
|:-------------:|:------:|:-----:|:---------------:|:--------:|:------:|:--------:|:---------:|:------:|
| 2.9772        | 0.2092 | 500   | 2.6140          | 0.4598   | 0.3456 | 0.2565   | 0.2855    | 0.4598 |
| 2.0831        | 0.4185 | 1000  | 1.9186          | 0.5043   | 0.4434 | 0.4006   | 0.5917    | 0.5043 |
| 1.7199        | 0.6277 | 1500  | 1.6219          | 0.5694   | 0.5441 | 0.5039   | 0.5912    | 0.5694 |
| 1.56          | 0.8370 | 2000  | 1.4268          | 0.6031   | 0.5922 | 0.5572   | 0.6167    | 0.6031 |
| 1.3383        | 1.0460 | 2500  | 1.3762          | 0.6220   | 0.6180 | 0.5814   | 0.6289    | 0.6220 |
| 1.3473        | 1.2553 | 3000  | 1.3732          | 0.6188   | 0.6132 | 0.5774   | 0.6367    | 0.6188 |
| 1.1643        | 1.4645 | 3500  | 1.4119          | 0.6239   | 0.6247 | 0.5894   | 0.6391    | 0.6239 |
| 1.2617        | 1.6738 | 4000  | 1.3296          | 0.6195   | 0.6219 | 0.5792   | 0.6453    | 0.6195 |
| 1.2549        | 1.8830 | 4500  | 1.3182          | 0.6330   | 0.6351 | 0.5975   | 0.6460    | 0.6330 |
| 0.8836        | 2.0921 | 5000  | 1.4422          | 0.6247   | 0.6298 | 0.6006   | 0.6528    | 0.6247 |
| 0.8652        | 2.3013 | 5500  | 1.4682          | 0.6475   | 0.6488 | 0.6120   | 0.6563    | 0.6475 |
| 0.9502        | 2.5106 | 6000  | 1.5682          | 0.6402   | 0.6438 | 0.6128   | 0.6559    | 0.6402 |
| 0.8769        | 2.7198 | 6500  | 1.5807          | 0.6392   | 0.6435 | 0.6100   | 0.6583    | 0.6392 |
| 0.7853        | 2.9291 | 7000  | 1.5285          | 0.6389   | 0.6403 | 0.6077   | 0.6555    | 0.6389 |
| 0.6828        | 3.1381 | 7500  | 1.6189          | 0.6467   | 0.6499 | 0.6170   | 0.6588    | 0.6467 |
| 0.6566        | 3.3474 | 8000  | 1.4571          | 0.6530   | 0.6519 | 0.6140   | 0.6558    | 0.6530 |
| 0.6767        | 3.5566 | 8500  | 1.6671          | 0.6567   | 0.6585 | 0.6216   | 0.6613    | 0.6567 |
| 0.6806        | 3.7659 | 9000  | 1.5203          | 0.6518   | 0.6542 | 0.6195   | 0.6628    | 0.6518 |
| 0.7289        | 3.9751 | 9500  | 1.5431          | 0.6524   | 0.6568 | 0.6224   | 0.6669    | 0.6524 |
| 0.5355        | 4.1841 | 10000 | 1.6772          | 0.6501   | 0.6528 | 0.6185   | 0.6619    | 0.6501 |
| 0.5352        | 4.3934 | 10500 | 1.7032          | 0.6561   | 0.6572 | 0.6226   | 0.6630    | 0.6561 |
| 0.615         | 4.6026 | 11000 | 1.6694          | 0.6504   | 0.6527 | 0.6175   | 0.6608    | 0.6504 |
| 0.5843        | 4.8119 | 11500 | 1.6443          | 0.6541   | 0.6567 | 0.6213   | 0.6634    | 0.6541 |
Framework versions
- Transformers 4.50.3
- Pytorch 2.6.0+cu124
- Datasets 3.5.0
- Tokenizers 0.21.1
License
This project is licensed under the MIT license.