# bank-transactions-statements-classification
This model classifies bank transaction statements. It is a fine-tuned version of flaubert/flaubert_small_cased adapted to this classification task.
## Quick Start
This model is a fine-tuned version of flaubert/flaubert_small_cased on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 1.0458
- Accuracy: 0.7683
- F1 Macro: 0.7945
- F1 Weighted: 0.7635
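A minimal inference sketch with the 🤗 Transformers `pipeline` API. The repository id below is a placeholder (the hosting namespace is not given in this card), and the example input and label set depend on the undocumented training data; FlauBERT being a French model, a French statement line is assumed:

```python
from transformers import pipeline

# Placeholder repository id: substitute the actual namespace/checkpoint path.
classifier = pipeline(
    "text-classification",
    model="<namespace>/bank-transactions-statements-classification",
)

# Hypothetical French bank-statement line; real labels depend on the training data.
print(classifier("PRELEVEMENT SEPA EDF FACTURE ELECTRICITE"))
# -> [{"label": ..., "score": ...}]
```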
## Documentation
### Training and evaluation data
More information needed
### Model description
More information needed
### Intended uses & limitations
More information needed
### Training procedure
#### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 15
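For reference, the list above maps onto `TrainingArguments` roughly as follows (a sketch against the Transformers 4.34 API; the `output_dir` and the omitted `Trainer` wiring are assumptions, and the Adam betas/epsilon and linear scheduler in the list are the library defaults):

```python
from transformers import TrainingArguments

# Sketch of the hyperparameters listed above.
training_args = TrainingArguments(
    output_dir="bank-transactions-statements-classification",  # assumed
    learning_rate=2e-05,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    lr_scheduler_type="linear",   # library default, as logged above
    num_train_epochs=15,
)
```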
#### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 Macro | F1 Weighted |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:--------:|:-----------:|
| No log | 0.29 | 50 | 3.6955 | 0.1012 | 0.0238 | 0.0880 |
| No log | 0.58 | 100 | 3.2965 | 0.2150 | 0.0569 | 0.1598 |
| No log | 0.87 | 150 | 3.1122 | 0.2530 | 0.0833 | 0.1889 |
| No log | 1.16 | 200 | 2.6838 | 0.3622 | 0.1800 | 0.3051 |
| No log | 1.45 | 250 | 2.5128 | 0.3808 | 0.1938 | 0.3139 |
| No log | 1.74 | 300 | 2.1573 | 0.4913 | 0.3241 | 0.4522 |
| No log | 2.03 | 350 | 2.0208 | 0.5220 | 0.3910 | 0.4832 |
| No log | 2.33 | 400 | 2.0454 | 0.5053 | 0.4090 | 0.4613 |
| No log | 2.62 | 450 | 1.7601 | 0.5599 | 0.4682 | 0.5303 |
| 3.1338 | 2.91 | 500 | 1.6837 | 0.5965 | 0.5489 | 0.5736 |
| 3.1338 | 3.2 | 550 | 1.6337 | 0.5885 | 0.5744 | 0.5609 |
| 3.1338 | 3.49 | 600 | 1.4553 | 0.6491 | 0.6219 | 0.6322 |
| 3.1338 | 3.78 | 650 | 1.4483 | 0.6531 | 0.6441 | 0.6345 |
| 3.1338 | 4.07 | 700 | 1.4108 | 0.6625 | 0.6810 | 0.6522 |
| 3.1338 | 4.36 | 750 | 1.3241 | 0.6924 | 0.6999 | 0.6769 |
| 3.1338 | 4.65 | 800 | 1.3254 | 0.6824 | 0.6960 | 0.6703 |
| 3.1338 | 4.94 | 850 | 1.3349 | 0.6937 | 0.6952 | 0.6759 |
| 3.1338 | 5.23 | 900 | 1.2264 | 0.7057 | 0.7157 | 0.6931 |
| 3.1338 | 5.52 | 950 | 1.3012 | 0.6891 | 0.7061 | 0.6748 |
| 1.6259 | 5.81 | 1000 | 1.2756 | 0.7071 | 0.7224 | 0.6925 |
| 1.6259 | 6.1 | 1050 | 1.1432 | 0.7317 | 0.7440 | 0.7267 |
| 1.6259 | 6.4 | 1100 | 1.2014 | 0.7290 | 0.7434 | 0.7161 |
| 1.6259 | 6.69 | 1150 | 1.1029 | 0.7483 | 0.7656 | 0.7367 |
| 1.6259 | 6.98 | 1200 | 1.1643 | 0.7310 | 0.7470 | 0.7227 |
| 1.6259 | 7.27 | 1250 | 1.1112 | 0.7477 | 0.7561 | 0.7371 |
| 1.6259 | 7.56 | 1300 | 1.1662 | 0.7350 | 0.7668 | 0.7254 |
| 1.6259 | 7.85 | 1350 | 1.0756 | 0.7577 | 0.7823 | 0.7530 |
| 1.6259 | 8.14 | 1400 | 1.1390 | 0.7403 | 0.7657 | 0.7318 |
| 1.6259 | 8.43 | 1450 | 1.1555 | 0.7437 | 0.7637 | 0.7377 |
| 1.092 | 8.72 | 1500 | 1.1086 | 0.7437 | 0.7686 | 0.7384 |
| 1.092 | 9.01 | 1550 | 1.0789 | 0.7510 | 0.7780 | 0.7427 |
| 1.092 | 9.3 | 1600 | 1.0613 | 0.7543 | 0.7823 | 0.7492 |
| 1.092 | 9.59 | 1650 | 1.0750 | 0.7477 | 0.7701 | 0.7382 |
| 1.092 | 9.88 | 1700 | 1.1412 | 0.7423 | 0.7772 | 0.7349 |
| 1.092 | 10.17 | 1750 | 1.0580 | 0.7617 | 0.7918 | 0.7549 |
| 1.092 | 10.47 | 1800 | 1.0667 | 0.7670 | 0.7856 | 0.7580 |
| 1.092 | 10.76 | 1850 | 1.1344 | 0.7403 | 0.7757 | 0.7332 |
| 1.092 | 11.05 | 1900 | 1.0808 | 0.7603 | 0.7944 | 0.7571 |
| 1.092 | 11.34 | 1950 | 1.0367 | 0.7690 | 0.7932 | 0.7655 |
| 0.9029 | 11.63 | 2000 | 1.0921 | 0.7577 | 0.7861 | 0.7504 |
| 0.9029 | 11.92 | 2050 | 1.0833 | 0.7603 | 0.7912 | 0.7541 |
| 0.9029 | 12.21 | 2100 | 1.0523 | 0.7716 | 0.7968 | 0.7662 |
| 0.9029 | 12.5 | 2150 | 1.0467 | 0.7683 | 0.7939 | 0.7614 |
| 0.9029 | 12.79 | 2200 | 1.0515 | 0.7703 | 0.7987 | 0.7667 |
| 0.9029 | 13.08 | 2250 | 1.0604 | 0.7696 | 0.8020 | 0.7654 |
| 0.9029 | 13.37 | 2300 | 1.0900 | 0.7716 | 0.8002 | 0.7663 |
| 0.9029 | 13.66 | 2350 | 1.0348 | 0.7743 | 0.8009 | 0.7686 |
| 0.9029 | 13.95 | 2400 | 1.0495 | 0.7656 | 0.7929 | 0.7610 |
| 0.9029 | 14.24 | 2450 | 1.0411 | 0.7670 | 0.7956 | 0.7624 |
| 0.7924 | 14.53 | 2500 | 1.0458 | 0.7683 | 0.7945 | 0.7635 |
| 0.7924 | 14.83 | 2550 | 1.0401 | 0.7696 | 0.7982 | 0.7649 |
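The accuracy and F1 columns above can be reproduced with a `compute_metrics` hook of this shape (a sketch using scikit-learn, not necessarily the author's exact code):

```python
import numpy as np
from sklearn.metrics import accuracy_score, f1_score

def compute_metrics(eval_pred):
    # eval_pred is the (logits, labels) pair the Trainer passes at evaluation time.
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    return {
        "accuracy": accuracy_score(labels, preds),
        "f1_macro": f1_score(labels, preds, average="macro"),
        "f1_weighted": f1_score(labels, preds, average="weighted"),
    }
```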
### Framework versions
- Transformers 4.34.0
- Pytorch 2.0.1+cu118
- Datasets 2.14.5
- Tokenizers 0.14.1
## License
This model is released under the MIT license.