Amoros Beaugosse Batch 64 Epochs 150 Test Large 2025 05 31 74882 Bs64 Freeze
Developed by Amoros
This model is a fine-tuned version of facebook/dinov2-large, adapted to a specific image classification task.
Downloads: 1,215
Released: 2025-05-31
Model Overview
This model is a fine-tuned version of facebook/dinov2-large intended for vision tasks; its exact downstream use is not documented.
Model Features
Fine-tuned Based on DINOv2-large
Fine-tuned on the DINOv2-large backbone, so it should inherit that model's strong visual feature extraction.
Long-term Training
Trained for 150 epochs, which should allow the model to converge well.
Mixed Precision Training
Trained with native AMP mixed precision, which improves training efficiency.
Model Capabilities
Visual Feature Extraction
Image Classification (Inference)
Use Cases
Computer Vision
Image Classification
May be applicable to image classification tasks.
Achieves an F1 micro score of 0.6595 and an accuracy of 0.5713 on the evaluation set.
Amoros_Beaugosse_batch_64_epochs_150_test-large-2025_05_31_74882-bs64_freeze
This model is a fine-tuned version of facebook/dinov2-large on the None dataset. It achieves the following results on the evaluation set:
- Loss: 0.0794
- F1 Micro: 0.6595
- F1 Macro: 0.5426
- Accuracy: 0.5713
- Learning Rate: 0.0000
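The gap between F1 Micro (0.6595) and F1 Macro (0.5426) suggests uneven per-class performance: micro-averaging pools all decisions, while macro-averaging weights every class equally, so rare, hard classes drag the macro score down. (Note also that for single-label classification micro F1 equals accuracy; since the reported F1 Micro and Accuracy differ, the evaluation may be multi-label.) A minimal pure-Python sketch on toy labels, not the model's actual data, illustrates the difference:

```python
from collections import Counter

def f1_micro_macro(y_true, y_pred):
    """Compute micro- and macro-averaged F1 over class labels."""
    classes = sorted(set(y_true) | set(y_pred))
    tp, fp, fn = Counter(), Counter(), Counter()
    for t, p in zip(y_true, y_pred):
        if t == p:
            tp[t] += 1
        else:
            fp[p] += 1  # predicted p, but true class was t
            fn[t] += 1  # missed the true class t
    def f1(c):
        denom = 2 * tp[c] + fp[c] + fn[c]
        return 2 * tp[c] / denom if denom else 0.0
    micro_denom = 2 * sum(tp.values()) + sum(fp.values()) + sum(fn.values())
    micro = 2 * sum(tp.values()) / micro_denom if micro_denom else 0.0
    macro = sum(f1(c) for c in classes) / len(classes)
    return micro, macro

# Toy example: class "a" is frequent and easy, class "b" is rare and hard,
# so micro F1 exceeds macro F1 -- the same pattern as in the metrics above.
y_true = ["a", "a", "a", "a", "b"]
y_pred = ["a", "a", "a", "b", "a"]
micro, macro = f1_micro_macro(y_true, y_pred)  # micro=0.6, macro=0.375
```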
Quick Start
This section provides a brief introduction to the Amoros_Beaugosse model, a fine-tuned version of the facebook/dinov2-large model.
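The card does not include usage code, so the following is a hedged sketch of how a Transformers image-classification checkpoint like this one is typically loaded and queried. The Hub id is left as a placeholder (it is not stated in the card), and the sigmoid/threshold decision rule in `predict_labels` is an assumption based on the multi-label-style metrics above:

```python
import math

def predict_labels(logits, id2label, threshold=0.5):
    """Multi-label decision rule (assumed, not confirmed by the card):
    sigmoid each logit and keep every class at or above the threshold."""
    sigmoid = lambda z: 1.0 / (1.0 + math.exp(-z))
    return [id2label[i] for i, z in enumerate(logits) if sigmoid(z) >= threshold]

def run_inference(image_path, model_id):
    """Hypothetical end-to-end call; needs transformers, torch, and Pillow.
    Pass the model's actual Hub id as model_id."""
    from PIL import Image
    import torch
    from transformers import AutoImageProcessor, AutoModelForImageClassification
    processor = AutoImageProcessor.from_pretrained(model_id)
    model = AutoModelForImageClassification.from_pretrained(model_id)
    inputs = processor(images=Image.open(image_path), return_tensors="pt")
    with torch.no_grad():
        logits = model(**inputs).logits[0].tolist()
    return predict_labels(logits, model.config.id2label)
```

The threshold of 0.5 is only a starting point; with imbalanced classes, tuning it per class on a validation split usually improves macro F1.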
Features
- Fine-tuned from the facebook/dinov2-large model.
- Achieves the evaluation metrics listed above, including Loss, F1 Micro, F1 Macro, and Accuracy.
Documentation
Model description
This model is a fine-tuned version of facebook/dinov2-large on the None dataset.
Intended uses & limitations
More information needed
Training and evaluation data
More information needed
Technical Details
Training procedure
Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.001
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: ADAMW_TORCH with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 150
- mixed_precision_training: Native AMP
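A linear scheduler, as listed above, ramps the learning rate up over any warmup steps and then decays it linearly to zero over the remaining training steps. (The per-epoch rates in the results table below drop in factors of ten, 0.001 → 0.0001 → 1e-05, which looks more like plateau-based reduction than a pure linear ramp, so treat this as a sketch of the reported scheduler type rather than the observed behavior. The warmup length is not reported; zero is assumed.)

```python
def linear_lr(step, total_steps, base_lr=1e-3, warmup_steps=0):
    """Linear schedule: ramp up over warmup_steps, then decay linearly to 0."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    remaining = total_steps - step
    return base_lr * max(0.0, remaining / max(1, total_steps - warmup_steps))

# From the table below: 489 steps/epoch x 150 epochs = 73350 total steps.
total_steps = 489 * 150
```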
Training results
Training Loss | Epoch | Step | Validation Loss | F1 Micro | F1 Macro | Accuracy | Learning Rate |
---|---|---|---|---|---|---|---|
No log | 1.0 | 489 | 0.1054 | 0.4744 | 0.1919 | 0.3282 | 0.001 |
0.2259 | 2.0 | 978 | 0.0981 | 0.5294 | 0.2757 | 0.3986 | 0.001 |
0.1119 | 3.0 | 1467 | 0.0943 | 0.5452 | 0.3540 | 0.4125 | 0.001 |
0.1063 | 4.0 | 1956 | 0.0947 | 0.5346 | 0.3396 | 0.4013 | 0.001 |
0.104 | 5.0 | 2445 | 0.0943 | 0.5597 | 0.3631 | 0.4329 | 0.001 |
0.1031 | 6.0 | 2934 | 0.0946 | 0.5400 | 0.3305 | 0.4131 | 0.001 |
0.1029 | 7.0 | 3423 | 0.0934 | 0.5622 | 0.3664 | 0.4370 | 0.001 |
0.1022 | 8.0 | 3912 | 0.0955 | 0.5511 | 0.3686 | 0.4273 | 0.001 |
0.1021 | 9.0 | 4401 | 0.0934 | 0.5745 | 0.3633 | 0.4571 | 0.001 |
0.102 | 10.0 | 4890 | 0.0930 | 0.5688 | 0.3772 | 0.4417 | 0.001 |
0.1027 | 11.0 | 5379 | 0.0930 | 0.5624 | 0.3707 | 0.4352 | 0.001 |
0.102 | 12.0 | 5868 | 0.0920 | 0.5713 | 0.3767 | 0.4449 | 0.001 |
0.1017 | 13.0 | 6357 | 0.0924 | 0.5641 | 0.3580 | 0.4338 | 0.001 |
0.1014 | 14.0 | 6846 | 0.0917 | 0.5733 | 0.3675 | 0.4502 | 0.001 |
0.1006 | 15.0 | 7335 | 0.0904 | 0.5817 | 0.3965 | 0.4611 | 0.001 |
0.1011 | 16.0 | 7824 | 0.0906 | 0.5759 | 0.4032 | 0.4497 | 0.001 |
0.1007 | 17.0 | 8313 | 0.0917 | 0.5629 | 0.3868 | 0.4328 | 0.001 |
0.1009 | 18.0 | 8802 | 0.0910 | 0.5791 | 0.3982 | 0.4546 | 0.001 |
0.1007 | 19.0 | 9291 | 0.0909 | 0.5657 | 0.3833 | 0.4363 | 0.001 |
0.1006 | 20.0 | 9780 | 0.0905 | 0.5832 | 0.3929 | 0.4619 | 0.001 |
0.1008 | 21.0 | 10269 | 0.0917 | 0.5678 | 0.4099 | 0.4367 | 0.001 |
0.0998 | 22.0 | 10758 | 0.0868 | 0.6078 | 0.4424 | 0.4921 | 0.0001 |
0.0947 | 23.0 | 11247 | 0.0861 | 0.6140 | 0.4472 | 0.5008 | 0.0001 |
0.0937 | 24.0 | 11736 | 0.0853 | 0.6166 | 0.4589 | 0.5022 | 0.0001 |
0.0932 | 25.0 | 12225 | 0.0849 | 0.6163 | 0.4571 | 0.5025 | 0.0001 |
0.0922 | 26.0 | 12714 | 0.0845 | 0.6215 | 0.4645 | 0.5119 | 0.0001 |
0.0912 | 27.0 | 13203 | 0.0842 | 0.6259 | 0.4661 | 0.5159 | 0.0001 |
0.091 | 28.0 | 13692 | 0.0839 | 0.6245 | 0.4658 | 0.5133 | 0.0001 |
0.0905 | 29.0 | 14181 | 0.0839 | 0.6248 | 0.4696 | 0.5141 | 0.0001 |
0.0903 | 30.0 | 14670 | 0.0835 | 0.6276 | 0.4716 | 0.5202 | 0.0001 |
0.09 | 31.0 | 15159 | 0.0832 | 0.6303 | 0.4792 | 0.5222 | 0.0001 |
0.0892 | 32.0 | 15648 | 0.0831 | 0.6310 | 0.4858 | 0.5261 | 0.0001 |
0.0893 | 33.0 | 16137 | 0.0826 | 0.6338 | 0.4873 | 0.5307 | 0.0001 |
0.0884 | 34.0 | 16626 | 0.0826 | 0.6320 | 0.4740 | 0.5241 | 0.0001 |
0.0882 | 35.0 | 17115 | 0.0824 | 0.6342 | 0.4855 | 0.5302 | 0.0001 |
0.0886 | 36.0 | 17604 | 0.0823 | 0.6351 | 0.4845 | 0.5326 | 0.0001 |
0.0881 | 37.0 | 18093 | 0.0822 | 0.6340 | 0.4825 | 0.5273 | 0.0001 |
0.0882 | 38.0 | 18582 | 0.0823 | 0.6383 | 0.4913 | 0.5369 | 0.0001 |
0.0876 | 39.0 | 19071 | 0.0819 | 0.6400 | 0.4970 | 0.5369 | 0.0001 |
0.0877 | 40.0 | 19560 | 0.0819 | 0.6372 | 0.4887 | 0.5315 | 0.0001 |
0.0864 | 41.0 | 20049 | 0.0821 | 0.6317 | 0.4833 | 0.5239 | 0.0001 |
0.0867 | 42.0 | 20538 | 0.0814 | 0.6395 | 0.5033 | 0.5369 | 0.0001 |
0.0871 | 43.0 | 21027 | 0.0812 | 0.6456 | 0.5000 | 0.5460 | 0.0001 |
0.087 | 44.0 | 21516 | 0.0812 | 0.6400 | 0.4966 | 0.5371 | 0.0001 |
0.0863 | 45.0 | 22005 | 0.0815 | 0.6392 | 0.5049 | 0.5344 | 0.0001 |
0.0863 | 46.0 | 22494 | 0.0812 | 0.6419 | 0.5045 | 0.5395 | 0.0001 |
0.0859 | 47.0 | 22983 | 0.0809 | 0.6452 | 0.5071 | 0.5442 | 0.0001 |
0.0858 | 48.0 | 23472 | 0.0811 | 0.6451 | 0.5108 | 0.5449 | 0.0001 |
0.0861 | 49.0 | 23961 | 0.0812 | 0.6415 | 0.4906 | 0.5406 | 0.0001 |
0.0856 | 50.0 | 24450 | 0.0808 | 0.6449 | 0.5024 | 0.5432 | 0.0001 |
0.0857 | 51.0 | 24939 | 0.0807 | 0.6466 | 0.5080 | 0.5475 | 0.0001 |
0.0857 | 52.0 | 25428 | 0.0808 | 0.6432 | 0.5082 | 0.5414 | 0.0001 |
0.0852 | 53.0 | 25917 | 0.0806 | 0.6507 | 0.5132 | 0.5525 | 0.0001 |
0.0847 | 54.0 | 26406 | 0.0806 | 0.6436 | 0.5143 | 0.5420 | 0.0001 |
0.0849 | 55.0 | 26895 | 0.0809 | 0.6429 | 0.5096 | 0.5409 | 0.0001 |
0.0847 | 56.0 | 27384 | 0.0807 | 0.6486 | 0.5029 | 0.5485 | 0.0001 |
0.0845 | 57.0 | 27873 | 0.0807 | 0.6439 | 0.5007 | 0.5412 | 0.0001 |
0.0848 | 58.0 | 28362 | 0.0806 | 0.6497 | 0.4993 | 0.5520 | 0.0001 |
0.0843 | 59.0 | 28851 | 0.0804 | 0.6445 | 0.4995 | 0.5391 | 0.0001 |
0.0839 | 60.0 | 29340 | 0.0801 | 0.6549 | 0.5226 | 0.5597 | 0.0001 |
0.0844 | 61.0 | 29829 | 0.0807 | 0.6450 | 0.4941 | 0.5454 | 0.0001 |
0.0832 | 62.0 | 30318 | 0.0801 | 0.6470 | 0.5134 | 0.5438 | 0.0001 |
0.084 | 63.0 | 30807 | 0.0804 | 0.6494 | 0.5026 | 0.5517 | 0.0001 |
0.0834 | 64.0 | 31296 | 0.0802 | 0.6448 | 0.5091 | 0.5431 | 0.0001 |
0.0841 | 65.0 | 31785 | 0.0804 | 0.6504 | 0.5109 | 0.5513 | 0.0001 |
0.0837 | 66.0 | 32274 | 0.0802 | 0.6483 | 0.5137 | 0.5487 | 0.0001 |
0.0833 | 67.0 | 32763 | 0.0801 | 0.6517 | 0.5166 | 0.5557 | 0.0001 |
0.0836 | 68.0 | 33252 | 0.0798 | 0.6553 | 0.5184 | 0.5574 | 0.0001 |
0.0835 | 69.0 | 33741 | 0.0802 | 0.6516 | 0.5112 | 0.5549 | 0.0001 |
0.0827 | 70.0 | 34230 | 0.0798 | 0.6536 | 0.5232 | 0.5561 | 0.0001 |
0.0832 | 71.0 | 34719 | 0.0801 | 0.6510 | 0.5223 | 0.5536 | 0.0001 |
0.0831 | 72.0 | 35208 | 0.0799 | 0.6534 | 0.5130 | 0.5583 | 0.0001 |
0.0832 | 73.0 | 35697 | 0.0799 | 0.6489 | 0.5129 | 0.5487 | 0.0001 |
0.0836 | 74.0 | 36186 | 0.0799 | 0.6451 | 0.5035 | 0.5437 | 0.0001 |
0.0827 | 75.0 | 36675 | 0.0798 | 0.6520 | 0.5196 | 0.5533 | 0.0001 |
0.0827 | 76.0 | 37164 | 0.0797 | 0.6507 | 0.5247 | 0.5498 | 0.0001 |
0.0832 | 77.0 | 37653 | 0.0797 | 0.6537 | 0.5186 | 0.5574 | 0.0001 |
0.0824 | 78.0 | 38142 | 0.0796 | 0.6520 | 0.5284 | 0.5534 | 0.0001 |
0.0828 | 79.0 | 38631 | 0.0795 | 0.6536 | 0.5135 | 0.5572 | 0.0001 |
0.0824 | 80.0 | 39120 | 0.0797 | 0.6519 | 0.5117 | 0.5523 | 0.0001 |
0.0822 | 81.0 | 39609 | 0.0795 | 0.6548 | 0.5192 | 0.5586 | 0.0001 |
0.0824 | 82.0 | 40098 | 0.0796 | 0.6550 | 0.5164 | 0.5610 | 0.0001 |
0.0822 | 83.0 | 40587 | 0.0796 | 0.6556 | 0.5408 | 0.5607 | 0.0001 |
0.0818 | 84.0 | 41076 | 0.0792 | 0.6562 | 0.5267 | 0.5631 | 0.0001 |
0.0826 | 85.0 | 41565 | 0.0795 | 0.6517 | 0.5200 | 0.5559 | 0.0001 |
0.0819 | 86.0 | 42054 | 0.0794 | 0.6546 | 0.5127 | 0.5579 | 0.0001 |
0.0822 | 87.0 | 42543 | 0.0794 | 0.6566 | 0.5185 | 0.5613 | 0.0001 |
0.0818 | 88.0 | 43032 | 0.0794 | 0.6549 | 0.5269 | 0.5598 | 0.0001 |
0.0817 | 89.0 | 43521 | 0.0795 | 0.6555 | 0.5239 | 0.5585 | 0.0001 |
0.082 | 90.0 | 44010 | 0.0794 | 0.6518 | 0.5180 | 0.5536 | 0.0001 |
0.082 | 91.0 | 44499 | 0.0787 | 0.6601 | 0.5286 | 0.5699 | 1e-05 |
0.0804 | 92.0 | 44988 | 0.0786 | 0.6590 | 0.5243 | 0.5656 | 1e-05 |
0.0803 | 93.0 | 45477 | 0.0785 | 0.6585 | 0.5256 | 0.5645 | 1e-05 |
0.0792 | 94.0 | 45966 | 0.0785 | 0.6592 | 0.5281 | 0.5646 | 1e-05 |
0.0789 | 95.0 | 46455 | 0.0785 | 0.6603 | 0.5329 | 0.5696 | 1e-05 |
0.0788 | 96.0 | 46944 | 0.0785 | 0.6602 | 0.5236 | 0.5685 | 1e-05 |
0.0786 | 97.0 | 47433 | 0.0785 | 0.6590 | 0.5270 | 0.5653 | 1e-05 |
0.0789 | 98.0 | 47922 | 0.0784 | 0.6629 | 0.5348 | 0.5727 | 1e-05 |
0.0783 | 99.0 | 48411 | 0.0784 | 0.6626 | 0.5344 | 0.5726 | 1e-05 |
0.0789 | 100.0 | 48900 | 0.0785 | 0.6607 | 0.5257 | 0.5709 | 1e-05 |
0.0783 | 101.0 | 49389 | 0.0783 | 0.6620 | 0.5332 | 0.5723 | 1e-05 |
0.0783 | 102.0 | 49878 | 0.0783 | 0.6644 | 0.5335 | 0.5750 | 1e-05 |
0.0781 | 103.0 | 50367 | 0.0783 | 0.6652 | 0.5375 | 0.5796 | 1e-05 |
0.0782 | 104.0 | 50856 | 0.0783 | 0.6644 | 0.5414 | 0.5751 | 1e-05 |
0.0776 | 105.0 | 51345 | 0.0783 | 0.6646 | 0.5412 | 0.5776 | 1e-05 |
0.0778 | 106.0 | 51834 | 0.0782 | 0.6670 | 0.5439 | 0.5803 | 1e-05 |
0.0777 | 107.0 | 52323 | 0.0781 | 0.6652 | 0.5333 | 0.5771 | 1e-05 |
0.0778 | 108.0 | 52812 | 0.0782 | 0.6628 | 0.5354 | 0.5716 | 1e-05 |
0.078 | 109.0 | 53301 | 0.0781 | 0.6640 | 0.5352 | 0.5752 | 1e-05 |
0.0785 | 110.0 | 53790 | 0.0780 | 0.6655 | 0.5345 | 0.5752 | 1e-05 |
0.0772 | 111.0 | 54279 | 0.0781 | 0.6639 | 0.5403 | 0.5748 | 1e-05 |
0.0779 | 112.0 | 54768 | 0.0780 | 0.6648 | 0.5373 | 0.5767 | 1e-05 |
0.0774 | 113.0 | 55257 | 0.0781 | 0.6658 | 0.5446 | 0.5792 | 1e-05 |
0.0774 | 114.0 | 55746 | 0.0780 | 0.6672 | 0.5445 | 0.5801 | 1e-05 |
0.078 | 115.0 | 56235 | 0.0782 | 0.6671 | 0.5445 | 0.5816 | 1e-05 |
0.0773 | 116.0 | 56724 | 0.0782 | 0.6647 | 0.5352 | 0.5756 | 1e-05 |
0.0779 | 117.0 | 57213 | 0.0781 | 0.6641 | 0.5323 | 0.5758 | 1e-05 |
0.0769 | 118.0 | 57702 | 0.0781 | 0.6655 | 0.5342 | 0.5774 | 1e-05 |
0.0773 | 119.0 | 58191 | 0.0780 | 0.6655 | 0.5362 | 0.5769 | 0.0000 |
0.0771 | 120.0 | 58680 | 0.0780 | 0.6663 | 0.5425 | 0.5777 | 0.0000 |
0.0769 | 121.0 | 59169 | 0.0781 | 0.6668 | 0.5404 | 0.5806 | 0.0000 |
0.0769 | 122.0 | 59658 | 0.0780 | 0.6680 | 0.5436 | 0.5817 | 0.0000 |
0.0771 | 123.0 | 60147 | 0.0780 | 0.6667 | 0.5441 | 0.5798 | 0.0000 |
0.0773 | 124.0 | 60636 | 0.0780 | 0.6664 | 0.5436 | 0.5784 | 0.0000 |
0.0773 | 125.0 | 61125 | 0.0780 | 0.6660 | 0.5453 | 0.5777 | 0.0000 |
0.077 | 126.0 | 61614 | 0.0779 | 0.6632 | 0.5347 | 0.5726 | 0.0000 |
0.0774 | 127.0 | 62103 | 0.0780 | 0.6649 | 0.5324 | 0.5757 | 0.0000 |
0.0767 | 128.0 | 62592 | 0.0780 | 0.6662 | 0.5357 | 0.5765 | 0.0000 |
0.077 | 129.0 | 63081 | 0.0779 | 0.6664 | 0.5404 | 0.5773 | 0.0000 |
0.0773 | 130.0 | 63570 | 0.0781 | 0.6670 | 0.5409 | 0.5802 | 0.0000 |
0.0772 | 131.0 | 64059 | 0.0779 | 0.6686 | 0.5461 | 0.5828 | 0.0000 |
0.0772 | 132.0 | 64548 | 0.0779 | 0.6671 | 0.5430 | 0.5789 | 0.0000 |
0.077 | 133.0 | 65037 | 0.0780 | 0.6678 | 0.5418 | 0.5817 | 0.0000 |
0.0769 | 134.0 | 65526 | 0.0780 | 0.6670 | 0.5429 | 0.5796 | 0.0000 |
0.0766 | 135.0 | 66015 | 0.0779 | 0.6676 | 0.5453 | 0.5783 | 0.0000 |
0.0772 | 136.0 | 66504 | 0.0779 | 0.6646 | 0.5399 | 0.5750 | 0.0000 |
0.0772 | 137.0 | 66993 | 0.0780 | 0.6651 | 0.5299 | 0.5755 | 0.0000 |
0.0773 | 138.0 | 67482 | 0.0780 | 0.6664 | 0.5401 | 0.5793 | 0.0000 |
0.0771 | 139.0 | 67971 | 0.0780 | 0.6657 | 0.5310 | 0.5784 | 0.0000 |
Framework versions
- Transformers 4.48.0
- PyTorch 2.6.0+cu118
- Datasets 3.0.2
- Tokenizers 0.21.1
License
This project is licensed under the Apache-2.0 license.