Detr Resnet 50 Dc5 Grasshopper Testdata Finetuned2.0 Maxsteps 10000 Batchsize 2 Ilham
Model Overview
This model is an object detection model based on the DETR (Detection Transformer) architecture, specifically fine-tuned for detecting insects such as grasshoppers.
Model Features
Transformer-based Object Detection
Utilizes the DETR architecture, combining the strengths of Transformer and CNN for end-to-end object detection.
Optimized for Insect Detection
Specially fine-tuned for detecting insects like grasshoppers, suitable for agricultural pest monitoring scenarios.
Efficient Detection
Uses ResNet-50 (with a dilated C5 stage, DC5) as the backbone network, balancing detection quality and computational cost.
Model Capabilities
Object Detection
Insect Identification
Agricultural Pest Monitoring
Use Cases
Agricultural Monitoring
Grasshopper Population Monitoring
Used to monitor the density and distribution of grasshopper populations in farmland
Capable of detecting several rice leafhopper and planthopper species, such as Recilia dorsalis, Nephotettix malayanus, Sogatella furcifera, and Nilaparvata lugens
Pest Warning System
Integrated into agricultural warning systems to promptly detect the risk of pest outbreaks
detr-resnet-50-dc5-grasshopper-testdata-finetuned2.0-maxsteps-10000-batchsize-2-ilham
This model is a fine-tuned version of [facebook/detr-resnet-50-dc5](https://huggingface.co/facebook/detr-resnet-50-dc5) on an unspecified dataset. It reports a set of evaluation metrics that can be used to assess its performance on the target detection task.
Quick Start
This fine-tuned model is based on [facebook/detr-resnet-50-dc5](https://huggingface.co/facebook/detr-resnet-50-dc5). You can use it with the standard object-detection workflow of the transformers library, as shown in the sketch below.
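The following is a minimal inference sketch using the transformers object-detection API. The Hub repository id, image path, and score threshold are illustrative placeholders, since the exact repository path is not given in this card.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForObjectDetection

# Placeholder repo id; replace with the actual Hub path of this fine-tuned model.
model_id = "your-username/detr-resnet-50-dc5-grasshopper-testdata-finetuned2.0-maxsteps-10000-batchsize-2-ilham"

processor = AutoImageProcessor.from_pretrained(model_id)
model = AutoModelForObjectDetection.from_pretrained(model_id)
model.eval()

# Load a field image (placeholder path) and run a forward pass.
image = Image.open("field_photo.jpg").convert("RGB")
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Convert raw logits and boxes into thresholded (score, label, box) detections.
target_sizes = torch.tensor([image.size[::-1]])  # (height, width)
detections = processor.post_process_object_detection(
    outputs, threshold=0.5, target_sizes=target_sizes
)[0]

for score, label, box in zip(detections["scores"], detections["labels"], detections["boxes"]):
    print(f"{model.config.id2label[label.item()]}: {score.item():.2f} at {[round(v, 1) for v in box.tolist()]}")
```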
Features
- Fine-tuned: It is a fine-tuned version of [facebook/detr-resnet-50-dc5](https://huggingface.co/facebook/detr-resnet-50-dc5), potentially better adapted to specific tasks.
- Rich Evaluation Metrics: It provides multiple evaluation metrics such as loss, mAP, and mAR, which can comprehensively evaluate the model's performance.
Documentation
Model Performance
This model achieves the following results on the evaluation set (a note on the -1.0 entries and a metric-computation sketch follow the list):
- Loss: 3.0007
- Map: 0.0047
- Map 50: 0.0173
- Map 75: 0.0009
- Map Small: 0.0047
- Map Medium: -1.0
- Map Large: -1.0
- Mar 1: 0.0017
- Mar 10: 0.0117
- Mar 100: 0.0426
- Mar Small: 0.0426
- Mar Medium: -1.0
- Mar Large: -1.0
- Map Recilia dorsalis: 0.0113
- Mar 100 Recilia dorsalis: 0.1227
- Map Nephotettix malayanus: 0.0075
- Mar 100 Nephotettix malayanus: 0.0307
- Map Sogatella furcifera: 0.0
- Mar 100 Sogatella furcifera: 0.0
- Map Nilaparvata lugens: 0.0
- Mar 100 Nilaparvata lugens: 0.0171
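These are COCO-style detection metrics; a value of -1.0 indicates that no ground-truth boxes in that size range were present in the evaluation set, so the metric could not be computed. The sketch below shows how such per-class mAP/mAR values can be obtained with torchmetrics; the boxes, scores, and labels are made-up placeholders, not this model's real predictions.

```python
import torch
from torchmetrics.detection import MeanAveragePrecision

# class_metrics=True yields per-class AP/AR, analogous to the per-species rows above.
metric = MeanAveragePrecision(class_metrics=True)

# Placeholder predictions and ground truth for a single image (xyxy pixel coordinates).
preds = [{
    "boxes": torch.tensor([[10.0, 12.0, 55.0, 60.0]]),
    "scores": torch.tensor([0.72]),
    "labels": torch.tensor([0]),  # e.g. class 0 = Recilia dorsalis
}]
targets = [{
    "boxes": torch.tensor([[11.0, 14.0, 53.0, 62.0]]),
    "labels": torch.tensor([0]),
}]

metric.update(preds, targets)
results = metric.compute()
print(results["map"], results["map_50"], results["map_75"], results["mar_100"])
```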
Training Procedure
Training hyperparameters
The following hyperparameters were used during training (a configuration sketch follows the list):
- learning_rate: 1e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- training_steps: 10000
- mixed_precision_training: Native AMP
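Below is a hedged reconstruction of how these values could map onto a transformers TrainingArguments configuration; the output directory and any argument not listed above are placeholders, not values taken from the original training run.

```python
from transformers import TrainingArguments

# Sketch of the training configuration implied by the hyperparameters above.
training_args = TrainingArguments(
    output_dir="detr-resnet-50-dc5-grasshopper-finetuned",  # placeholder
    learning_rate=1e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    max_steps=10000,
    fp16=True,  # "Native AMP" mixed-precision training
)
```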
Training results
Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Recilia dorsalis | Mar 100 Recilia dorsalis | Map Nephotettix malayanus | Mar 100 Nephotettix malayanus | Map Sogatella furcifera | Mar 100 Sogatella furcifera | Map Nilaparvata lugens | Mar 100 Nilaparvata lugens |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
4.922 | 0.5952 | 50 | 4.2729 | 0.0001 | 0.0003 | 0.0 | 0.0001 | -1.0 | -1.0 | 0.0 | 0.0015 | 0.0085 | 0.0085 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0001 | 0.0177 | 0.0 | 0.0 | 0.0002 | 0.0165 |
4.1668 | 1.1905 | 100 | 3.9789 | 0.0 | 0.0001 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0008 | 0.0051 | 0.0051 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0001 | 0.0163 | 0.0 | 0.0 | 0.0 | 0.0041 |
4.5854 | 1.7857 | 150 | 3.7985 | 0.0001 | 0.0002 | 0.0 | 0.0001 | -1.0 | -1.0 | 0.0 | 0.0015 | 0.0055 | 0.0055 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0001 | 0.0163 | 0.0 | 0.0 | 0.0001 | 0.0059 |
4.3943 | 2.3810 | 200 | 3.7751 | 0.0 | 0.0001 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.001 | 0.004 | 0.004 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0042 | 0.0 | 0.0 | 0.0 | 0.0118 |
2.9732 | 2.9762 | 250 | 3.7922 | 0.0 | 0.0001 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0013 | 0.0023 | 0.0023 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0005 | 0.0 | 0.0 | 0.0001 | 0.0088 |
3.571 | 3.5714 | 300 | 3.8144 | 0.0 | 0.0001 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.001 | 0.0018 | 0.0018 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0001 | 0.0071 |
2.4278 | 4.1667 | 350 | 3.7715 | 0.0 | 0.0001 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0002 | 0.0005 | 0.0028 | 0.0028 | -1.0 | -1.0 | 0.0 | 0.0009 | 0.0 | 0.0009 | 0.0 | 0.0 | 0.0 | 0.0094 |
4.0768 | 4.7619 | 400 | 3.6700 | 0.0001 | 0.0004 | 0.0 | 0.0001 | -1.0 | -1.0 | 0.0001 | 0.0011 | 0.0053 | 0.0053 | -1.0 | -1.0 | 0.0001 | 0.0125 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0001 | 0.0088 |
4.4523 | 5.3571 | 450 | 3.6946 | 0.0001 | 0.0004 | 0.0 | 0.0001 | -1.0 | -1.0 | 0.0 | 0.0009 | 0.0074 | 0.0074 | -1.0 | -1.0 | 0.0002 | 0.0213 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0001 | 0.0082 |
3.5476 | 5.9524 | 500 | 3.7100 | 0.0 | 0.0002 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0016 | 0.0049 | 0.0049 | -1.0 | -1.0 | 0.0 | 0.0042 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0001 | 0.0153 |
3.5861 | 6.5476 | 550 | 3.6604 | 0.0001 | 0.0005 | 0.0 | 0.0001 | -1.0 | -1.0 | 0.0 | 0.0008 | 0.0089 | 0.0089 | -1.0 | -1.0 | 0.0003 | 0.0343 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0012 |
3.32 | 7.1429 | 600 | 3.6814 | 0.0003 | 0.0013 | 0.0 | 0.0003 | -1.0 | -1.0 | 0.0 | 0.003 | 0.0167 | 0.0167 | -1.0 | -1.0 | 0.001 | 0.0667 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
4.4471 | 7.7381 | 650 | 3.7570 | 0.0001 | 0.0005 | 0.0 | 0.0001 | -1.0 | -1.0 | 0.0003 | 0.0003 | 0.0073 | 0.0073 | -1.0 | -1.0 | 0.0003 | 0.0292 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
5.2238 | 8.3333 | 700 | 3.6898 | 0.0002 | 0.0015 | 0.0 | 0.0002 | -1.0 | -1.0 | 0.0 | 0.0014 | 0.0086 | 0.0086 | -1.0 | -1.0 | 0.0009 | 0.0338 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0006 |
3.8461 | 8.9286 | 750 | 3.7262 | 0.0002 | 0.001 | 0.0 | 0.0002 | -1.0 | -1.0 | 0.0 | 0.0014 | 0.0133 | 0.0133 | -1.0 | -1.0 | 0.0007 | 0.0532 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
3.2069 | 9.5238 | 800 | 3.7627 | 0.0001 | 0.0007 | 0.0 | 0.0001 | -1.0 | -1.0 | 0.0 | 0.0008 | 0.0082 | 0.0082 | -1.0 | -1.0 | 0.0005 | 0.0329 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
4.0347 | 10.1190 | 850 | 3.7339 | 0.0006 | 0.0028 | 0.0 | 0.0006 | -1.0 | -1.0 | 0.0 | 0.0027 | 0.0172 | 0.0172 | -1.0 | -1.0 | 0.0023 | 0.069 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
4.0595 | 10.7143 | 900 | 3.6200 | 0.0005 | 0.0025 | 0.0 | 0.0005 | -1.0 | -1.0 | 0.0 | 0.0019 | 0.0167 | 0.0167 | -1.0 | -1.0 | 0.0021 | 0.0667 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
4.2273 | 11.3095 | 950 | 3.6058 | 0.001 | 0.0059 | 0.0 | 0.001 | -1.0 | -1.0 | 0.0003 | 0.0027 | 0.0157 | 0.0157 | -1.0 | -1.0 | 0.0039 | 0.063 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
2.6219 | 11.9048 | 1000 | 3.5723 | 0.0011 | 0.0045 | 0.0 | 0.0012 | -1.0 | -1.0 | 0.0003 | 0.0031 | 0.0115 | 0.0115 | -1.0 | -1.0 | 0.0045 | 0.0458 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
3.5385 | 12.5 | 1050 | 3.5279 | 0.0018 | 0.0084 | 0.0002 | 0.0018 | -1.0 | -1.0 | 0.0002 | 0.0041 | 0.0233 | 0.0233 | -1.0 | -1.0 | 0.0072 | 0.0931 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
3.6905 | 13.0952 | 1100 | 3.5101 | 0.0008 | 0.0035 | 0.0001 | 0.0008 | -1.0 | -1.0 | 0.0 | 0.0022 | 0.0163 | 0.0163 | -1.0 | -1.0 | 0.0031 | 0.0653 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
3.0729 | 13.6905 | 1150 | 3.5698 | 0.0007 | 0.0032 | 0.0 | 0.0007 | -1.0 | -1.0 | 0.0 | 0.0024 | 0.0112 | 0.0112 | -1.0 | -1.0 | 0.0027 | 0.0449 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
3.6636 | 14.2857 | 1200 | 3.4950 | 0.0011 | 0.0052 | 0.0001 | 0.0011 | -1.0 | -1.0 | 0.0 | 0.0028 | 0.0199 | 0.0199 | -1.0 | -1.0 | 0.0043 | 0.0796 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
3.4026 | 14.8810 | 1250 | 3.4642 | 0.0019 | 0.0088 | 0.0001 | 0.0019 | -1.0 | -1.0 | 0.0003 | 0.0061 | 0.0266 | 0.0266 | -1.0 | -1.0 | 0.0074 | 0.1065 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
3.514 | 15.4762 | 1300 | 3.4471 | 0.0021 | 0.0074 | 0.0005 | 0.0021 | -1.0 | -1.0 | 0.0006 | 0.0047 | 0.0293 | 0.0293 | -1.0 | -1.0 | 0.0085 | 0.1171 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
4.4938 | 16.0714 | 1350 | 3.5414 | 0.0028 | 0.0095 | 0.0001 | 0.0028 | -1.0 | -1.0 | 0.0007 | 0.0043 | 0.023 | 0.023 | -1.0 | -1.0 | 0.0061 | 0.0898 | 0.005 | 0.0023 | 0.0 | 0.0 | 0.0 | 0.0 |
3.9956 | 16.6667 | 1400 | 3.4033 | 0.0031 | 0.012 | 0.0006 | 0.0031 | -1.0 | -1.0 | 0.0001 | 0.0084 | 0.0362 | 0.0362 | -1.0 | -1.0 | 0.0123 | 0.1449 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
4.1115 | 17.2619 | 1450 | 3.4297 | 0.0022 | 0.0107 | 0.0001 | 0.0022 | -1.0 | -1.0 | 0.0007 | 0.0046 | 0.0264 | 0.0264 | -1.0 | -1.0 | 0.0086 | 0.1056 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
3.0744 | 17.8571 | 1500 | 3.4763 | 0.0018 | 0.0071 | 0.0001 | 0.0018 | -1.0 | -1.0 | 0.0003 | 0.0063 | 0.028 | 0.028 | -1.0 | -1.0 | 0.0073 | 0.112 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
3.3757 | 18.4524 | 1550 | 3.4410 | 0.002 | 0.008 | 0.0004 | 0.002 | -1.0 | -1.0 | 0.0008 | 0.0066 | 0.0317 | 0.0317 | -1.0 | -1.0 | 0.008 | 0.1269 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
4.5711 | 19.0476 | 1600 | 3.4374 | 0.0024 | 0.0101 | 0.0003 | 0.0024 | -1.0 | -1.0 | 0.0001 | 0.0069 | 0.0322 | 0.0322 | -1.0 | -1.0 | 0.0096 | 0.1287 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
2.8704 | 19.6429 | 1650 | 3.4496 | 0.0018 | 0.0074 | 0.0004 | 0.0018 | -1.0 | -1.0 | 0.0007 | 0.0057 | 0.0252 | 0.0252 | -1.0 | -1.0 | 0.0073 | 0.1009 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
3.9042 | 20.2381 | 1700 | 3.3804 | 0.0029 | 0.0119 | 0.0004 | 0.0029 | -1.0 | -1.0 | 0.0002 | 0.0065 | 0.0306 | 0.0306 | -1.0 | -1.0 | 0.0117 | 0.1222 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
3.3289 | 20.8333 | 1750 | 3.4044 | 0.0021 | 0.0083 | 0.0 | 0.0021 | -1.0 | -1.0 | 0.001 | 0.0053 | 0.0238 | 0.0238 | -1.0 | -1.0 | 0.0086 | 0.0954 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
4.4952 | 21.4286 | 1800 | 3.4224 | 0.0014 | 0.0069 | 0.0001 | 0.0014 | -1.0 | -1.0 | 0.0009 | 0.003 | 0.0231 | 0.0231 | -1.0 | -1.0 | 0.0058 | 0.0926 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
3.7537 | 22.0238 | 1850 | 3.3736 | 0.0012 | 0.0055 | 0.0001 | 0.0012 | -1.0 | -1.0 | 0.0 | 0.0039 | 0.0223 | 0.0223 | -1.0 | -1.0 | 0.0047 | 0.0894 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
4.7113 | 22.6190 | 1900 | 3.5088 | 0.0011 | 0.0045 | 0.0002 | 0.0011 | -1.0 | -1.0 | 0.0 | 0.0038 | 0.0214 | 0.0214 | -1.0 | -1.0 | 0.0045 | 0.0856 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
3.6112 | 23.2143 | 1950 | 3.3937 | 0.0017 | 0.0097 | 0.0 | 0.0017 | -1.0 | -1.0 | 0.0008 | 0.0043 | 0.0227 | 0.0227 | -1.0 | -1.0 | 0.0066 | 0.0903 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0006 |
4.0428 | 23.8095 | 2000 | 3.3784 | 0.0013 | 0.0054 | 0.0001 | 0.0013 | -1.0 | -1.0 | 0.0003 | 0.0045 | 0.0233 | 0.0233 | -1.0 | -1.0 | 0.0052 | 0.0931 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
4.7735 | 24.4048 | 2050 | 3.3649 | 0.0025 | 0.0101 | 0.0009 | 0.0025 | -1.0 | -1.0 | 0.0017 | 0.0082 | 0.03 | 0.03 | -1.0 | -1.0 | 0.0085 | 0.1116 | 0.0015 | 0.006 | 0.0 | 0.0 | 0.0 | 0.0024 |
4.2583 | 25.0 | 2100 | 3.3489 | 0.0009 | 0.0043 | 0.0001 | 0.0009 | -1.0 | -1.0 | 0.0005 | 0.0032 | 0.0168 | 0.0168 | -1.0 | -1.0 | 0.0034 | 0.0671 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
3.383 | 25.5952 | 2150 | 3.3785 | 0.0014 | 0.0066 | 0.0001 | 0.0014 | -1.0 | -1.0 | 0.0015 | 0.0037 | 0.0221 | 0.0221 | -1.0 | -1.0 | 0.0054 | 0.0884 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
License
This model is released under the Apache-2.0 license.