# large-algae-vit-rgb
## Quick Start
This model is a fine-tuned version of samitizerxu/large-algae-vit-rgb on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 1.1659
- Accuracy: 0.5798
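Below is a minimal inference sketch using the generic `Auto*` classes from 🤗 Transformers. The repo id shown is the base model named above (this card does not give a separate Hub id for the fine-tuned checkpoint), and the image path is a placeholder.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

# Repo id of the base model named above; swap in the fine-tuned
# checkpoint's own Hub id if it is published separately.
model_id = "samitizerxu/large-algae-vit-rgb"

processor = AutoImageProcessor.from_pretrained(model_id)
model = AutoModelForImageClassification.from_pretrained(model_id)

image = Image.open("algae_sample.jpg").convert("RGB")  # placeholder path
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

pred = logits.argmax(-1).item()
print(model.config.id2label[pred])
```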
## Documentation
### Model description
More information needed
### Intended uses & limitations
More information needed
### Training and evaluation data
More information needed
### Training procedure
#### Training hyperparameters
The following hyperparameters were used during training (a sketch mapping them onto `TrainingArguments` follows the list):
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 30
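As a sketch, the list above maps onto 🤗 `TrainingArguments` roughly as follows; argument names are as of Transformers 4.26, `output_dir` is a placeholder, and the `Trainer`/dataset wiring is omitted.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="large-algae-vit-rgb",  # placeholder
    learning_rate=5e-05,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    gradient_accumulation_steps=4,  # 32 * 4 = 128 total train batch size
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=30,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-08,
)
```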
#### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 1.2115 | 1.0 | 120 | 0.9078 | 0.6315 |
| 1.1249 | 2.0 | 240 | 0.9217 | 0.6320 |
| 1.1385 | 3.0 | 360 | 0.9518 | 0.6180 |
| 1.1347 | 4.0 | 480 | 1.0201 | 0.6068 |
| 1.1358 | 5.0 | 600 | 1.0801 | 0.5892 |
| 1.098 | 6.0 | 720 | 1.0932 | 0.5851 |
| 1.0882 | 7.0 | 840 | 1.0347 | 0.6033 |
| 1.0688 | 8.0 | 960 | 1.0403 | 0.6056 |
| 1.0863 | 9.0 | 1080 | 1.0466 | 0.6009 |
| 1.1253 | 10.0 | 1200 | 1.2308 | 0.5511 |
| 1.0393 | 11.0 | 1320 | 1.1434 | 0.5869 |
| 1.0749 | 12.0 | 1440 | 1.2155 | 0.5622 |
| 1.0433 | 13.0 | 1560 | 1.2466 | 0.5522 |
| 1.0141 | 14.0 | 1680 | 1.1880 | 0.5563 |
| 1.0516 | 15.0 | 1800 | 1.1006 | 0.5992 |
| 1.0696 | 16.0 | 1920 | 1.0971 | 0.5751 |
| 0.9867 | 17.0 | 2040 | 1.1689 | 0.5827 |
| 1.0234 | 18.0 | 2160 | 1.1846 | 0.5751 |
| 1.0364 | 19.0 | 2280 | 1.1480 | 0.5739 |
| 1.0314 | 20.0 | 2400 | 1.0977 | 0.5880 |
| 1.0179 | 21.0 | 2520 | 1.1258 | 0.5851 |
| 1.0584 | 22.0 | 2640 | 1.1569 | 0.5822 |
| 1.0222 | 23.0 | 2760 | 1.1672 | 0.5839 |
| 0.996 | 24.0 | 2880 | 1.1737 | 0.5798 |
| 1.0343 | 25.0 | 3000 | 1.1588 | 0.5792 |
| 0.9854 | 26.0 | 3120 | 1.1758 | 0.5763 |
| 0.9753 | 27.0 | 3240 | 1.1715 | 0.5763 |
| 0.9881 | 28.0 | 3360 | 1.1403 | 0.5839 |
| 1.0057 | 29.0 | 3480 | 1.1765 | 0.5781 |
| 0.9824 | 30.0 | 3600 | 1.1659 | 0.5798 |
#### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1+cu116
- Datasets 2.9.0
- Tokenizers 0.13.2
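To confirm a local environment matches these pins, a quick check:

```python
# Print installed versions to compare against the pins above.
import transformers, torch, datasets, tokenizers

print("Transformers:", transformers.__version__)  # expected 4.26.1
print("PyTorch:", torch.__version__)              # expected 1.13.1+cu116
print("Datasets:", datasets.__version__)          # expected 2.9.0
print("Tokenizers:", tokenizers.__version__)      # expected 0.13.2
```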
## License
This model is licensed under the Apache-2.0 license.