Sal Base
Developed by bolu61
This model was trained from scratch on an unknown dataset and reaches 87.86% accuracy on the evaluation set.
Downloads 216
Release Time: 2/11/2025
Model Overview
A general-purpose model trained from scratch that performs well on multiple evaluation metrics and is suitable for a range of classification tasks.
Model Features
High accuracy
Overall accuracy on the evaluation set reaches 87.86%, with strong scores in most categories.
Multi-class support
Supports classification across multiple categories: Eol, Msg, Cmd, Var, Dff, Pgr, Stk, Itm, Yml, Obj, Tre.
Stable training
Training uses the AdamW optimizer with a cosine learning rate scheduler and converges stably.
Model Capabilities
Multi-class classification
High-precision prediction
Imbalanced data processing
Use Cases
Text classification
Document classification
Classify different types of documents.
The F1 score for the Pgr category reaches 0.9405.
Message classification
Distinguish different types of messages.
The ROC AUC value for the Msg category reaches 0.9836.
Object recognition
Object classification
Identify and classify different objects.
Both the precision and recall for the Obj category reach 1.0.
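At inference time, a multi-class classifier like this one produces one score per category, and the predicted label is the argmax after a softmax. A minimal sketch of that decoding step using the 11 category names from this card (the logit values here are made up purely for illustration):

```python
import math

# The 11 categories listed on this card.
LABELS = ["Eol", "Msg", "Cmd", "Var", "Dff", "Pgr",
          "Stk", "Itm", "Yml", "Obj", "Tre"]

def softmax(logits):
    """Numerically stable softmax over a list of raw scores."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def decode(logits):
    """Return (label, probability) for the highest-scoring class."""
    probs = softmax(logits)
    best = max(range(len(probs)), key=probs.__getitem__)
    return LABELS[best], probs[best]

# Hypothetical logits for one input; Pgr (index 5) scores highest.
label, prob = decode([0.1, -1.2, 0.3, -0.5, -2.0, 4.2,
                      0.0, 1.1, -0.3, -1.5, 0.2])
print(label)  # Pgr
```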
🚀 sal-base
This sal-base model was trained from scratch on an unknown dataset. It achieves the following results on the evaluation set:
Metric | Value |
---|---|
Loss | 0.7435 |
Accuracy | 0.8786 |
Precision Eol | 0.8682 |
Recall Eol | 0.9233 |
F1 Eol | 0.8949 |
Support Eol | 378 |
Roc Auc Eol | 0.9700 |
Precision Msg | 0.7391 |
Recall Msg | 0.5484 |
F1 Msg | 0.6296 |
Support Msg | 31 |
Roc Auc Msg | 0.9836 |
Precision Cmd | 0.2174 |
Recall Cmd | 0.4167 |
F1 Cmd | 0.2857 |
Support Cmd | 12 |
Roc Auc Cmd | 0.9762 |
Precision Var | 0.0 |
Recall Var | 0.0 |
F1 Var | 0.0 |
Support Var | 15 |
Roc Auc Var | 0.9956 |
Precision Dff | 0.0 |
Recall Dff | 0.0 |
F1 Dff | 0.0 |
Support Dff | 4 |
Roc Auc Dff | 0.9990 |
Precision Pgr | 0.8901 |
Recall Pgr | 0.9970 |
F1 Pgr | 0.9405 |
Support Pgr | 333 |
Roc Auc Pgr | 0.9991 |
Precision Stk | 0.9630 |
Recall Stk | 0.65 |
F1 Stk | 0.7761 |
Support Stk | 40 |
Roc Auc Stk | 0.9453 |
Precision Itm | 0.9821 |
Recall Itm | 0.7639 |
F1 Itm | 0.8594 |
Support Itm | 144 |
Roc Auc Itm | 0.9382 |
Precision Yml | 1.0 |
Recall Yml | 0.8537 |
F1 Yml | 0.9211 |
Support Yml | 41 |
Roc Auc Yml | 0.9964 |
Precision Obj | 1.0 |
Recall Obj | 1.0 |
F1 Obj | 1.0 |
Support Obj | 8 |
Roc Auc Obj | 1.0 |
Precision Tre | 1.0 |
Recall Tre | 0.9583 |
F1 Tre | 0.9787 |
Support Tre | 24 |
Roc Auc Tre | 1.0 |
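The per-category F1 scores above follow directly from the reported precision and recall, since F1 is their harmonic mean: F1 = 2·P·R / (P + R). A quick check against two rows of the table:

```python
def f1(precision, recall):
    """Harmonic mean of precision and recall (0.0 when both are zero)."""
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# Eol row: precision 0.8682, recall 0.9233
print(round(f1(0.8682, 0.9233), 4))  # 0.8949
# Stk row: precision 0.9630, recall 0.65
print(round(f1(0.9630, 0.65), 4))    # 0.7761
```

This also explains the zero F1 for the Var and Dff categories: with zero precision and recall (no correct positive predictions), the harmonic mean is zero even though their ROC AUC remains high.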
🔧 Technical Details
Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-06
- train_batch_size: 10
- eval_batch_size: 10
- seed: 45242
- optimizer: AdamW (torch implementation) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100.0
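With a warmup ratio of 0.1, the cosine schedule ramps the learning rate linearly from 0 to 5e-06 over the first 10% of steps, then decays it along a half-cosine toward 0. A minimal sketch of that schedule's semantics (not the training library's exact implementation; the total step count of 7600 is inferred from the 76 steps per epoch visible in the log below times 100 epochs):

```python
import math

def lr_at(step, total_steps, base_lr=5e-06, warmup_ratio=0.1):
    """Linear warmup followed by cosine decay to zero."""
    warmup_steps = int(total_steps * warmup_ratio)
    if step < warmup_steps:
        # Warmup phase: linear ramp from 0 to base_lr.
        return base_lr * step / max(1, warmup_steps)
    # Decay phase: half-cosine from base_lr down to 0.
    progress = (step - warmup_steps) / max(1, total_steps - warmup_steps)
    return base_lr * 0.5 * (1.0 + math.cos(math.pi * progress))

total = 7600  # 76 steps/epoch * 100 epochs (inferred)
print(lr_at(760, total))    # end of warmup: full 5e-06
print(lr_at(total, total))  # end of training: decayed to 0
```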
Training results
Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision Eol | Recall Eol | F1 Eol | Support Eol | Roc Auc Eol | Precision Msg | Recall Msg | F1 Msg | Support Msg | Roc Auc Msg | Precision Cmd | Recall Cmd | F1 Cmd | Support Cmd | Roc Auc Cmd | Precision Var | Recall Var | F1 Var | Support Var | Roc Auc Var | Precision Dff | Recall Dff | F1 Dff | Support Dff | Roc Auc Dff | Precision Pgr | Recall Pgr | F1 Pgr | Support Pgr | Roc Auc Pgr | Precision Stk | Recall Stk | F1 Stk | Support Stk | Roc Auc Stk | Precision Itm | Recall Itm | F1 Itm | Support Itm | Roc Auc Itm | Precision Yml | Recall Yml | F1 Yml | Support Yml | Roc Auc Yml | Precision Obj | Recall Obj | F1 Obj | Support Obj | Roc Auc Obj | Precision Tre | Recall Tre | F1 Tre | Support Tre | Roc Auc Tre |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
2.5311 | 1.0 | 76 | 2.4941 | 0.0194 | 0.0 | 0.0 | 0.0 | 378 | 0.4851 | 0.0456 | 0.4516 | 0.0828 | 31 | 0.7149 | 0.0073 | 0.1667 | 0.0140 | 12 | 0.4709 | 0.0 | 0.0 | 0.0 | 15 | 0.4025 | 0.0 | 0.0 | 0.0 | 4 | 0.8850 | 0.0 | 0.0 | 0.0 | 333 | 0.6887 | 0.0 | 0.0 | 0.0 | 40 | 0.4469 | 0.1429 | 0.0208 | 0.0364 | 144 | 0.5840 | 0.5 | 0.0244 | 0.0465 | 41 | 0.7767 | 0.0 | 0.0 | 0.0 | 8 | 0.1738 | 0.0 | 0.0 | 0.0 | 24 | 0.7487 |
2.1099 | 2.0 | 152 | 2.0052 | 0.4388 | 0.4979 | 0.3095 | 0.3817 | 378 | 0.5048 | 0.1190 | 0.1613 | 0.1370 | 31 | 0.8382 | 0.0609 | 0.5833 | 0.1102 | 12 | 0.7734 | 0.0 | 0.0 | 0.0 | 15 | 0.5176 | 0.0 | 0.0 | 0.0 | 4 | 0.8207 | 0.5325 | 0.9339 | 0.6783 | 333 | 0.9186 | 0.0 | 0.0 | 0.0 | 40 | 0.5659 | 0.3571 | 0.0694 | 0.1163 | 144 | 0.7699 | 0.5 | 0.0488 | 0.0889 | 41 | 0.8385 | 0.0 | 0.0 | 0.0 | 8 | 0.4631 | 0.0 | 0.0 | 0.0 | 24 | 0.7796 |
1.5835 | 3.0 | 228 | 1.3663 | 0.5777 | 0.4690 | 0.8201 | 0.5967 | 378 | 0.7462 | 0.0 | 0.0 | 0.0 | 31 | 0.8730 | 0.0 | 0.0 | 0.0 | 12 | 0.9100 | 0.0 | 0.0 | 0.0 | 15 | 0.8133 | 0.0 | 0.0 | 0.0 | 4 | 0.6306 | 0.7972 | 0.8498 | 0.8227 | 333 | 0.9584 | 0.0 | 0.0 | 0.0 | 40 | 0.6980 | 0.2 | 0.0139 | 0.0260 | 144 | 0.8902 | 0.0 | 0.0 | 0.0 | 41 | 0.9466 | 0.0 | 0.0 | 0.0 | 8 | 0.8749 | 0.0 | 0.0 | 0.0 | 24 | 0.8314 |
1.3092 | 4.0 | 304 | 1.0934 | 0.6806 | 0.5640 | 0.8624 | 0.6820 | 378 | 0.8593 | 0.0 | 0.0 | 0.0 | 31 | 0.8850 | 0.0 | 0.0 | 0.0 | 12 | 0.9299 | 0.0 | 0.0 | 0.0 | 15 | 0.8156 | 0.0 | 0.0 | 0.0 | 4 | 0.6399 | 0.8944 | 0.9159 | 0.9050 | 333 | 0.9802 | 0.0 | 0.0 | 0.0 | 40 | 0.8137 | 0.6306 | 0.4861 | 0.5490 | 144 | 0.9450 | 0.0 | 0.0 | 0.0 | 41 | 0.9842 | 0.0 | 0.0 | 0.0 | 8 | 0.9250 | 0.0 | 0.0 | 0.0 | 24 | 0.9508 |
0.9523 | 5.0 | 380 | 0.9353 | 0.7262 | 0.6149 | 0.8492 | 0.7133 | 378 | 0.8953 | 0.0 | 0.0 | 0.0 | 31 | 0.9026 | 0.0 | 0.0 | 0.0 | 12 | 0.9492 | 0.0 | 0.0 | 0.0 | 15 | 0.8372 | 0.0 | 0.0 | 0.0 | 4 | 0.6399 | 0.8646 | 0.9970 | 0.9261 | 333 | 0.9892 | 0.0 | 0.0 | 0.0 | 40 | 0.8796 | 0.7661 | 0.6597 | 0.7090 | 144 | 0.9474 | 0.0 | 0.0 | 0.0 | 41 | 0.9895 | 0.0 | 0.0 | 0.0 | 8 | 0.9598 | 0.0 | 0.0 | 0.0 | 24 | 0.9965 |
0.8714 | 6.0 | 456 | 0.7562 | 0.8039 | 0.7164 | 0.9153 | 0.8037 | 378 | 0.9491 | 0.0 | 0.0 | 0.0 | 31 | 0.9143 | 0.0 | 0.0 | 0.0 | 12 | 0.9499 | 0.0 | 0.0 | 0.0 | 15 | 0.8298 | 0.0 | 0.0 | 0.0 | 4 | 0.7083 | 0.925 | 1.0 | 0.9610 | 333 | 0.9978 | 1.0 | 0.35 | 0.5185 | 40 | 0.9170 | 0.7516 | 0.7986 | 0.7744 | 144 | 0.9590 | 1.0 | 0.3171 | 0.4815 | 41 | 0.9919 | 0.0 | 0.0 | 0.0 | 8 | 0.9793 | 1.0 | 0.2917 | 0.4516 | 24 | 0.9989 |
0.7294 | 7.0 | 532 | 0.6222 | 0.8282 | 0.7596 | 0.8862 | 0.8181 | 378 | 0.9575 | 0.0 | 0.0 | 0.0 | 31 | 0.9330 | 0.0 | 0.0 | 0.0 | 12 | 0.9479 | 0.0 | 0.0 | 0.0 | 15 | 0.8359 | 0.0 | 0.0 | 0.0 | 4 | 0.8302 | 0.9276 | 1.0 | 0.9624 | 333 | 0.9986 | 1.0 | 0.375 | 0.5455 | 40 | 0.9634 | 0.7815 | 0.8194 | 0.8 | 144 | 0.9655 | 0.7826 | 0.8780 | 0.8276 | 41 | 0.9924 | 0.0 | 0.0 | 0.0 | 8 | 0.9976 | 0.9412 | 0.6667 | 0.7805 | 24 | 0.9993 |
0.5118 | 8.0 | 608 | 0.5242 | 0.8534 | 0.7680 | 0.9021 | 0.8297 | 378 | 0.9634 | 0.0 | 0.0 | 0.0 | 31 | 0.9457 | 0.0 | 0.0 | 0.0 | 12 | 0.9656 | 0.0 | 0.0 | 0.0 | 15 | 0.8963 | 0.0 | 0.0 | 0.0 | 4 | 0.9035 | 0.9302 | 1.0 | 0.9638 | 333 | 0.9993 | 1.0 | 0.375 | 0.5455 | 40 | 0.9755 | 0.8873 | 0.875 | 0.8811 | 144 | 0.9720 | 0.9091 | 0.9756 | 0.9412 | 41 | 0.9972 | 0.0 | 0.0 | 0.0 | 8 | 0.9991 | 0.9231 | 1.0 | 0.96 | 24 | 0.9995 |
0.4041 | 9.0 | 684 | 0.5220 | 0.8485 | 0.8020 | 0.8571 | 0.8286 | 378 | 0.9552 | 0.8333 | 0.3226 | 0.4651 | 31 | 0.9577 | 0.0 | 0.0 | 0.0 | 12 | 0.9672 | 0.0 | 0.0 | 0.0 | 15 | 0.9258 | 0.0 | 0.0 | 0.0 | 4 | 0.8845 | 0.8582 | 1.0 | 0.9237 | 333 | 0.9990 | 0.9667 | 0.725 | 0.8286 | 40 | 0.9734 | 0.9487 | 0.7708 | 0.8506 | 144 | 0.9765 | 0.9091 | 0.9756 | 0.9412 | 41 | 0.9977 | 0.75 | 0.375 | 0.5 | 8 | 0.9993 | 0.9231 | 1.0 | 0.96 | 24 | 0.9997 |
0.4858 | 10.0 | 760 | 0.4709 | 0.8748 | 0.7968 | 0.9127 | 0.8508 | 378 | 0.9622 | 0.8333 | 0.3226 | 0.4651 | 31 | 0.9592 | 0.0 | 0.0 | 0.0 | 12 | 0.9721 | 0.0 | 0.0 | 0.0 | 15 | 0.9589 | 0.0 | 0.0 | 0.0 | 4 | 0.8913 | 0.9328 | 1.0 | 0.9652 | 333 | 0.9995 | 1.0 | 0.55 | 0.7097 | 40 | 0.9741 | 0.9690 | 0.8681 | 0.9158 | 144 | 0.9798 | 0.9286 | 0.9512 | 0.9398 | 41 | 0.9982 | 0.75 | 0.375 | 0.5 | 8 | 0.9994 | 0.9231 | 1.0 | 0.96 | 24 | 0.9999 |
0.252 | 11.0 | 836 | 0.4155 | 0.8738 | 0.7949 | 0.9127 | 0.8498 | 378 | 0.9694 | 0.8462 | 0.3548 | 0.5 | 31 | 0.9737 | 0.375 | 0.25 | 0.3 | 12 | 0.9741 | 0.0 | 0.0 | 0.0 | 15 | 0.9900 | 0.0 | 0.0 | 0.0 | 4 | 0.9608 | 0.9354 | 1.0 | 0.9666 | 333 | 0.9996 | 1.0 | 0.45 | 0.6207 | 40 | 0.9663 | 0.9615 | 0.8681 | 0.9124 | 144 | 0.9811 | 0.9268 | 0.9268 | 0.9268 | 41 | 0.9984 | 0.75 | 0.375 | 0.5 | 8 | 0.9995 | 0.9231 | 1.0 | 0.96 | 24 | 1.0000 |
0.2958 | 12.0 | 912 | 0.4171 | 0.8699 | 0.8354 | 0.9127 | 0.8723 | 378 | 0.9718 | 0.8125 | 0.4194 | 0.5532 | 31 | 0.9773 | 0.6667 | 0.3333 | 0.4444 | 12 | 0.9763 | 0.0 | 0.0 | 0.0 | 15 | 0.9921 | 0.0 | 0.0 | 0.0 | 4 | 0.9518 | 0.8952 | 1.0 | 0.9447 | 333 | 0.9997 | 0.9565 | 0.55 | 0.6984 | 40 | 0.9705 | 0.8810 | 0.7708 | 0.8222 | 144 | 0.9817 | 0.9268 | 0.9268 | 0.9268 | 41 | 0.9986 | 0.8571 | 0.75 | 0.8 | 8 | 0.9998 | 0.9231 | 1.0 | 0.96 | 24 | 0.9997 |
0.3351 | 13.0 | 988 | 0.4281 | 0.8786 | 0.8438 | 0.8862 | 0.8645 | 378 | 0.9676 | 0.8421 | 0.5161 | 0.64 | 31 | 0.9798 | 0.625 | 0.4167 | 0.5 | 12 | 0.9788 | 0.0 | 0.0 | 0.0 | 15 | 0.9867 | 0.0 | 0.0 | 0.0 | 4 | 0.9464 | 0.9074 | 1.0 | 0.9514 | 333 | 0.9996 | 1.0 | 0.575 | 0.7302 | 40 | 0.9758 | 0.8865 | 0.8681 | 0.8772 | 144 | 0.9840 | 0.9048 | 0.9268 | 0.9157 | 41 | 0.9984 | 0.8571 | 0.75 | 0.8 | 8 | 0.9998 | 0.9231 | 1.0 | 0.96 | 24 | 1.0 |
0.1799 | 14.0 | 1064 | 0.3970 | 0.8806 | 0.8159 | 0.9259 | 0.8674 | 378 | 0.9704 | 0.8462 | 0.3548 | 0.5 | 31 | 0.9825 | 0.625 | 0.4167 | 0.5 | 12 | 0.9790 | 0.0 | 0.0 | 0.0 | 15 | 0.9934 | 0.0 | 0.0 | 0.0 | 4 | 0.9976 | 0.9354 | 1.0 | 0.9666 | 333 | 0.9997 | 1.0 | 0.375 | 0.5455 | 40 | 0.9539 | 0.9143 | 0.8889 | 0.9014 | 144 | 0.9859 | 0.9459 | 0.8537 | 0.8974 | 41 | 0.9961 | 0.8571 | 0.75 | 0.8 | 8 | 0.9998 | 0.96 | 1.0 | 0.9796 | 24 | 1.0 |
0.2573 | 15.0 | 1140 | 0.3927 | 0.8942 | 0.8589 | 0.9180 | 0.8875 | 378 | 0.9713 | 0.8571 | 0.5806 | 0.6923 | 31 | 0.9833 | 0.8333 | 0.4167 | 0.5556 | 12 | 0.9814 | 0.0 | 0.0 | 0.0 | 15 | 0.9899 | 0.0 | 0.0 | 0.0 | 4 | 0.9922 | 0.9276 | 1.0 | 0.9624 | 333 | 0.9997 | 1.0 | 0.55 | 0.7097 | 40 | 0.9706 | 0.8966 | 0.9028 | 0.8997 | 144 | 0.9851 | 0.9474 | 0.8780 | 0.9114 | 41 | 0.9982 | 1.0 | 0.75 | 0.8571 | 8 | 0.9998 | 0.9231 | 1.0 | 0.96 | 24 | 1.0 |
0.1706 | 16.0 | 1216 | 0.3886 | 0.8835 | 0.8357 | 0.9286 | 0.8797 | 378 | 0.9739 | 0.85 | 0.5484 | 0.6667 | 31 | 0.9848 | 0.8333 | 0.4167 | 0.5556 | 12 | 0.9799 | 0.0 | 0.0 | 0.0 | 15 | 0.9940 | 0.0 | 0.0 | 0.0 | 4 | 0.9963 | 0.9171 | 0.9970 | 0.9554 | 333 | 0.9997 | 1.0 | 0.475 | 0.6441 | 40 | 0.9615 | 0.9084 | 0.8264 | 0.8655 | 144 | 0.9826 | 0.9487 | 0.9024 | 0.925 | 41 | 0.9970 | 1.0 | 0.75 | 0.8571 | 8 | 0.9998 | 1.0 | 1.0 | 1.0 | 24 | 1.0 |
0.1121 | 17.0 | 1292 | 0.4100 | 0.8777 | 0.8540 | 0.9286 | 0.8897 | 378 | 0.9717 | 0.8235 | 0.4516 | 0.5833 | 31 | 0.9849 | 0.625 | 0.4167 | 0.5 | 12 | 0.9783 | 0.0 | 0.0 | 0.0 | 15 | 0.9931 | 0.0 | 0.0 | 0.0 | 4 | 0.9954 | 0.8904 | 1.0 | 0.9420 | 333 | 0.9998 | 1.0 | 0.625 | 0.7692 | 40 | 0.9721 | 0.8790 | 0.7569 | 0.8134 | 144 | 0.9789 | 1.0 | 0.8537 | 0.9211 | 41 | 0.9974 | 1.0 | 1.0 | 1.0 | 8 | 1.0 | 1.0 | 1.0 | 1.0 | 24 | 1.0 |
0.1414 | 18.0 | 1368 | 0.4740 | 0.8757 | 0.8509 | 0.9206 | 0.8844 | 378 | 0.9646 | 0.85 | 0.5484 | 0.6667 | 31 | 0.9854 | 0.8333 | 0.4167 | 0.5556 | 12 | 0.9780 | 0.0 | 0.0 | 0.0 | 15 | 0.9932 | 0.0 | 0.0 | 0.0 | 4 | 0.9966 | 0.8833 | 1.0 | 0.9380 | 333 | 0.9996 | 1.0 | 0.575 | 0.7302 | 40 | 0.9652 | 0.8862 | 0.7569 | 0.8165 | 144 | 0.9758 | 0.9722 | 0.8537 | 0.9091 | 41 | 0.9975 | 1.0 | 1.0 | 1.0 | 8 | 1.0 | 1.0 | 1.0 | 1.0 | 24 | 1.0 |
0.0935 | 19.0 | 1444 | 0.4503 | 0.8689 | 0.8769 | 0.9233 | 0.8995 | 378 | 0.9722 | 0.8077 | 0.6774 | 0.7368 | 31 | 0.9890 | 0.25 | 0.4167 | 0.3125 | 12 | 0.9780 | 0.0 | 0.0 | 0.0 | 15 | 0.9939 | 0.0 | 0.0 | 0.0 | 4 | 0.9983 | 0.8582 | 1.0 | 0.9237 | 333 | 0.9997 | 0.9630 | 0.65 | 0.7761 | 40 | 0.9524 | 0.9592 | 0.6528 | 0.7769 | 144 | 0.9767 | 0.9737 | 0.9024 | 0.9367 | 41 | 0.9982 | 1.0 | 0.75 | 0.8571 | 8 | 1.0 | 1.0 | 1.0 | 1.0 | 24 | 1.0 |
0.0794 | 20.0 | 1520 | 0.4563 | 0.8709 | 0.8886 | 0.9074 | 0.8979 | 378 | 0.9740 | 0.875 | 0.6774 | 0.7636 | 31 | 0.9879 | 0.7143 | 0.4167 | 0.5263 | 12 | 0.9795 | 0.0 | 0.0 | 0.0 | 15 | 0.9936 | 0.0 | 0.0 | 0.0 | 4 | 0.9976 | 0.8473 | 1.0 | 0.9174 | 333 | 0.9998 | 0.9630 | 0.65 | 0.7761 | 40 | 0.9514 | 0.8534 | 0.6875 | 0.7615 | 144 | 0.9779 | 0.95 | 0.9268 | 0.9383 | 41 | 0.9996 | 1.0 | 1.0 | 1.0 | 8 | 1.0 | 1.0 | 1.0 | 1.0 | 24 | 1.0 |
0.1209 | 21.0 | 1596 | 0.5927 | 0.8505 | 0.8014 | 0.9286 | 0.8603 | 378 | 0.9639 | 0.85 | 0.5484 | 0.6667 | 31 | 0.9857 | 0.8333 | 0.4167 | 0.5556 | 12 | 0.9813 | 0.0 | 0.0 | 0.0 | 15 | 0.9939 | 0.0 | 0.0 | 0.0 | 4 | 0.9956 | 0.8513 | 0.9970 | 0.9184 | 333 | 0.9997 | 1.0 | 0.375 | 0.5455 | 40 |