🚀 detr-resnet-50_fine_tuned_loc-2023
This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on the loc_beyond_words dataset. It can be used for object detection tasks. On the evaluation set, it achieves a loss of 0.8784.
✨ Features
- Based on the `facebook/detr-resnet-50` base model.
- Fine-tuned on the `biglam/loc_beyond_words` dataset.
- Suitable for object detection tasks.
📦 Installation
No installation steps are provided in the original model card.
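To run the usage sketch in the next section, the 🤗 Transformers library is typically needed along with `torch` and `timm` (DETR models rely on a `timm` backbone), for example via `pip install transformers timm torch`. This is a general suggestion rather than an instruction from the original card.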
💻 Usage Examples
No code examples are provided in the original model card; a minimal usage sketch is shown below for illustration.
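The snippet below is an illustrative sketch only (not part of the original card) of how a fine-tuned DETR checkpoint such as this one is typically loaded for object detection with 🤗 Transformers. The repository id `your-username/detr-resnet-50_fine_tuned_loc-2023` and the image path `page_scan.jpg` are placeholders.

```python
# Minimal sketch; repo id and image path below are placeholders, not from the original card.
# Requires: pip install transformers timm torch pillow
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForObjectDetection

checkpoint = "your-username/detr-resnet-50_fine_tuned_loc-2023"  # placeholder repo id
processor = AutoImageProcessor.from_pretrained(checkpoint)
model = AutoModelForObjectDetection.from_pretrained(checkpoint)

image = Image.open("page_scan.jpg")  # placeholder input image

# Preprocess the image, run the model, and post-process the raw outputs
# into bounding boxes, labels, and scores in pixel coordinates.
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

target_sizes = torch.tensor([image.size[::-1]])  # (height, width)
results = processor.post_process_object_detection(
    outputs, threshold=0.5, target_sizes=target_sizes
)[0]

for score, label, box in zip(results["scores"], results["labels"], results["boxes"]):
    print(f"{model.config.id2label[label.item()]}: {score:.2f} at {box.tolist()}")
```

The `post_process_object_detection` step converts the raw model outputs into boxes, class labels, and confidence scores; the `threshold` of 0.5 is an arbitrary starting point and can be tuned.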
📚 Documentation
Model Information
| Property | Details |
|----------|---------|
| Model Type | Fine-tuned version of facebook/detr-resnet-50 |
| Training Data | biglam/loc_beyond_words dataset |
| Pipeline Tag | object-detection |
Training and Evaluation
Training Hyperparameters
The following hyperparameters were used during training (a configuration sketch follows the list):
- learning_rate: 0.0001
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9, 0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 100
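For reference, the hyperparameters above could be expressed as a 🤗 Transformers `TrainingArguments` configuration roughly as follows. This sketch is not taken from the original card, and the `output_dir` name is an assumption.

```python
from transformers import TrainingArguments

# Sketch only: maps the hyperparameters listed above onto TrainingArguments.
# output_dir is an assumed name, not confirmed by the original card.
training_args = TrainingArguments(
    output_dir="detr-resnet-50_fine_tuned_loc-2023",
    learning_rate=1e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=100,
)
```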
Training Results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 3.731 | 0.16 | 50 | 2.6356 |
| 2.4875 | 0.31 | 100 | 2.2348 |
| 2.1786 | 0.47 | 150 | 2.1148 |
| 1.9845 | 0.62 | 200 | 1.8847 |
| 1.8507 | 0.78 | 250 | 1.8331 |
| 1.6813 | 0.94 | 300 | 1.5620 |
| 1.5613 | 1.09 | 350 | 1.5898 |
| 1.4966 | 1.25 | 400 | 1.4161 |
| 1.4831 | 1.41 | 450 | 1.4831 |
| 1.4587 | 1.56 | 500 | 1.3218 |
| 1.433 | 1.72 | 550 | 1.3529 |
| 1.33 | 1.88 | 600 | 1.2453 |
| 1.2842 | 2.03 | 650 | 1.2956 |
| 1.2807 | 2.19 | 700 | 1.1993 |
| 1.1767 | 2.34 | 750 | 1.1557 |
| 1.2134 | 2.5 | 800 | 1.1393 |
| 1.1897 | 2.66 | 850 | 1.2016 |
| 1.1784 | 2.81 | 900 | 1.1235 |
| 1.2016 | 2.97 | 950 | 1.1378 |
| 1.06 | 3.12 | 1000 | 1.0803 |
| 1.1124 | 3.28 | 1050 | 1.1145 |
| 1.1191 | 3.44 | 1100 | 1.0523 |
| 1.0819 | 3.59 | 1150 | 1.0165 |
| 1.1196 | 3.75 | 1200 | 1.0349 |
| 1.0534 | 3.91 | 1250 | 1.0441 |
| 1.0365 | 4.06 | 1300 | 1.1177 |
| 0.9853 | 4.22 | 1350 | 1.0721 |
| 0.9984 | 4.38 | 1400 | 0.9923 |
| 0.9802 | 4.53 | 1450 | 1.0079 |
| 1.04 | 4.69 | 1500 | 1.0198 |
| 1.098 | 4.84 | 1550 | 0.9788 |
| 1.079 | 5.0 | 1600 | 1.0291 |
| 1.0664 | 5.16 | 1650 | 0.9691 |
| 0.9715 | 5.31 | 1700 | 0.9380 |
| 0.9723 | 5.47 | 1750 | 1.0164 |
| 1.0019 | 5.62 | 1800 | 1.0064 |
| 0.9895 | 5.78 | 1850 | 1.0364 |
| 0.9835 | 5.94 | 1900 | 0.9848 |
| 0.994 | 6.09 | 1950 | 0.9353 |
| 0.9693 | 6.25 | 2000 | 0.9425 |
| 0.9413 | 6.41 | 2050 | 0.9173 |
| 0.9375 | 6.56 | 2100 | 0.9663 |
| 0.952 | 6.72 | 2150 | 0.8951 |
| 0.8927 | 6.88 | 2200 | 0.9099 |
| 0.8777 | 7.03 | 2250 | 0.9238 |
| 0.8976 | 7.19 | 2300 | 0.9715 |
| 0.9451 | 7.34 | 2350 | 0.9373 |
| 0.8972 | 7.5 | 2400 | 0.8959 |
| 0.9393 | 7.66 | 2450 | 1.0062 |
| 0.9 | 7.81 | 2500 | 0.8920 |
| 0.915 | 7.97 | 2550 | 0.8833 |
| 0.9018 | 8.12 | 2600 | 0.8671 |
| 0.8272 | 8.28 | 2650 | 0.9304 |
| 0.943 | 8.44 | 2700 | 0.8593 |
| 0.8667 | 8.59 | 2750 | 0.8875 |
| 0.871 | 8.75 | 2800 | 0.8457 |
| 0.9023 | 8.91 | 2850 | 0.8448 |
| 0.8733 | 9.06 | 2900 | 0.8261 |
| 0.8686 | 9.22 | 2950 | 0.8489 |
| 0.8412 | 9.38 | 3000 | 0.8244 |
| 0.8385 | 9.53 | 3050 | 0.8830 |
| 0.891 | 9.69 | 3100 | 0.8349 |
| 0.8692 | 9.84 | 3150 | 0.8672 |
| 0.8247 | 10.0 | 3200 | 0.8811 |
| 0.799 | 10.16 | 3250 | 0.8784 |
Framework Versions
- Transformers 4.27.4
- Pytorch 2.0.0+cu118
- Datasets 2.11.0
- Tokenizers 0.13.3
📄 License
This project is licensed under the Apache-2.0 license.