beit-base-patch16-224-pt22k-ft22k-finetuned-stroke-binary
This model is a fine-tuned version of microsoft/beit-base-patch16-224-pt22k-ft22k on a "Binary Stroke Detection" dataset. It performs binary stroke detection in images for medical image analysis, reaching 92.2% accuracy on the evaluation set.
Quick Start
This model is a fine-tuned version of microsoft/beit-base-patch16-224-pt22k-ft22k on a "Binary Stroke Detection" dataset. A minimal usage sketch follows the evaluation results below.
It achieves the following results on the evaluation set:
- Loss: 0.2029
- Accuracy: 0.9222
- F1: 0.9214
- Precision: 0.9234
- Recall: 0.9222
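A minimal inference sketch using the transformers image-classification pipeline. The repository id and the input file name below are assumptions (taken from this card's model name), since the card does not state the exact Hub path or label names:

```python
from transformers import pipeline
from PIL import Image

# Load the fine-tuned BEiT checkpoint; replace the repo id with the actual
# Hub path if it differs (assumed here from the model name in this card).
classifier = pipeline(
    "image-classification",
    model="beit-base-patch16-224-pt22k-ft22k-finetuned-stroke-binary",
)

# Run binary stroke detection on a single image (placeholder file name).
image = Image.open("scan.png").convert("RGB")
for prediction in classifier(image):
    print(prediction["label"], round(prediction["score"], 4))
```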
Documentation
Model description
More information needed
Intended uses & limitations
More information needed
Training and evaluation data
More information needed
Training procedure
Training hyperparameters
The following hyperparameters were used during training (a configuration sketch follows the list):
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: AdamW (adamw_torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine_with_restarts
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 48
- mixed_precision_training: Native AMP
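For reference, the hyperparameters above map roughly onto the TrainingArguments sketch below. The output directory is a placeholder and not part of the original card; argument names follow the transformers API (tested against the 4.x series listed under Framework versions):

```python
from transformers import TrainingArguments

# Sketch of the reported training configuration; output_dir is a placeholder.
training_args = TrainingArguments(
    output_dir="beit-stroke-binary",
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=4,   # 16 * 4 = 64 effective train batch size
    optim="adamw_torch",             # AdamW with betas=(0.9, 0.999), eps=1e-8
    lr_scheduler_type="cosine_with_restarts",
    warmup_ratio=0.1,
    num_train_epochs=48,
    fp16=True,                       # native AMP mixed-precision training
)
```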
Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | Precision | Recall |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:--:|:---------:|:------:|
| 0.7256 | 1.2477 | 100 | 0.6913 | 0.5685 | 0.4823 | 0.4731 | 0.5685 |
| 0.6695 | 2.4954 | 200 | 0.6480 | 0.6210 | 0.5164 | 0.5987 | 0.6210 |
| 0.5963 | 3.7430 | 300 | 0.5882 | 0.6725 | 0.6118 | 0.6993 | 0.6725 |
| 0.518 | 4.9907 | 400 | 0.4990 | 0.7481 | 0.7167 | 0.7891 | 0.7481 |
| 0.4325 | 6.2477 | 500 | 0.4090 | 0.8073 | 0.7957 | 0.8232 | 0.8073 |
| 0.3848 | 7.4954 | 600 | 0.3703 | 0.8340 | 0.8257 | 0.8482 | 0.8340 |
| 0.3532 | 8.7430 | 700 | 0.3958 | 0.8313 | 0.8201 | 0.8564 | 0.8313 |
| 0.3297 | 9.9907 | 800 | 0.3257 | 0.8611 | 0.8558 | 0.8718 | 0.8611 |
| 0.3281 | 11.2477 | 900 | 0.3169 | 0.8666 | 0.8612 | 0.8791 | 0.8666 |
| 0.2938 | 12.4954 | 1000 | 0.2814 | 0.8865 | 0.8841 | 0.8900 | 0.8865 |
| 0.2866 | 13.7430 | 1100 | 0.2828 | 0.8869 | 0.8837 | 0.8943 | 0.8869 |
| 0.2884 | 14.9907 | 1200 | 0.2929 | 0.8847 | 0.8810 | 0.8936 | 0.8847 |
| 0.2808 | 16.2477 | 1300 | 0.2458 | 0.9014 | 0.8999 | 0.9034 | 0.9014 |
| 0.258 | 17.4954 | 1400 | 0.2351 | 0.9091 | 0.9080 | 0.9102 | 0.9091 |
| 0.2744 | 18.7430 | 1500 | 0.2516 | 0.9014 | 0.8994 | 0.9057 | 0.9014 |
| 0.261 | 19.9907 | 1600 | 0.2453 | 0.9068 | 0.9050 | 0.9107 | 0.9068 |
| 0.2519 | 21.2477 | 1700 | 0.2564 | 0.8987 | 0.8961 | 0.9051 | 0.8987 |
| 0.2595 | 22.4954 | 1800 | 0.2318 | 0.9095 | 0.9079 | 0.9129 | 0.9095 |
| 0.2548 | 23.7430 | 1900 | 0.2196 | 0.9136 | 0.9128 | 0.9142 | 0.9136 |
| 0.2327 | 24.9907 | 2000 | 0.2376 | 0.9068 | 0.9050 | 0.9110 | 0.9068 |
| 0.2563 | 26.2477 | 2100 | 0.2421 | 0.9028 | 0.9005 | 0.9083 | 0.9028 |
| 0.2348 | 27.4954 | 2200 | 0.2213 | 0.9109 | 0.9095 | 0.9132 | 0.9109 |
| 0.2427 | 28.7430 | 2300 | 0.2308 | 0.9077 | 0.9060 | 0.9116 | 0.9077 |
| 0.2166 | 29.9907 | 2400 | 0.2152 | 0.9141 | 0.9128 | 0.9165 | 0.9141 |
| 0.2345 | 31.2477 | 2500 | 0.2283 | 0.9068 | 0.9049 | 0.9114 | 0.9068 |
| 0.2355 | 32.4954 | 2600 | 0.2173 | 0.9118 | 0.9103 | 0.9149 | 0.9118 |
| 0.2291 | 33.7430 | 2700 | 0.2149 | 0.9127 | 0.9113 | 0.9155 | 0.9127 |
| 0.2319 | 34.9907 | 2800 | 0.2123 | 0.9141 | 0.9127 | 0.9167 | 0.9141 |
| 0.222 | 36.2477 | 2900 | 0.2053 | 0.9181 | 0.9171 | 0.9197 | 0.9181 |
| 0.2235 | 37.4954 | 3000 | 0.2121 | 0.9141 | 0.9127 | 0.9166 | 0.9141 |
| 0.2221 | 38.7430 | 3100 | 0.2013 | 0.9195 | 0.9188 | 0.9200 | 0.9195 |
| 0.2262 | 39.9907 | 3200 | 0.2029 | 0.9222 | 0.9214 | 0.9234 | 0.9222 |
| 0.2171 | 41.2477 | 3300 | 0.2075 | 0.9181 | 0.9170 | 0.9202 | 0.9181 |
| 0.2268 | 42.4954 | 3400 | 0.2045 | 0.9190 | 0.9180 | 0.9208 | 0.9190 |
| 0.2222 | 43.7430 | 3500 | 0.2050 | 0.9204 | 0.9194 | 0.9222 | 0.9204 |
| 0.2169 | 44.9907 | 3600 | 0.2070 | 0.9177 | 0.9165 | 0.9197 | 0.9177 |
| 0.2245 | 46.2477 | 3700 | 0.2064 | 0.9181 | 0.9170 | 0.9201 | 0.9181 |
| 0.2148 | 47.4954 | 3800 | 0.2066 | 0.9181 | 0.9170 | 0.9201 | 0.9181 |
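The accuracy, F1, precision, and recall columns can be reproduced with a Trainer compute_metrics hook like the sketch below. Weighted averaging is an assumption (it is consistent with recall equaling accuracy in every row), since the card does not state the averaging method:

```python
import numpy as np
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

def compute_metrics(eval_pred):
    """Metric hook for the Trainer; 'weighted' averaging is assumed."""
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)
    precision, recall, f1, _ = precision_recall_fscore_support(
        labels, predictions, average="weighted", zero_division=0
    )
    return {
        "accuracy": accuracy_score(labels, predictions),
        "f1": f1,
        "precision": precision,
        "recall": recall,
    }
```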
Framework versions
- Transformers 4.48.3
- Pytorch 2.6.0+cu124
- Datasets 3.4.0
- Tokenizers 0.21.0

License
The model is licensed under the Apache 2.0 license.