# wav2vec2-model2-torgo
This model was trained from scratch on an unspecified (`None`) dataset. It achieves the following results on the evaluation set:
- Loss: 4.9975
- Wer: 1.0

Note that a WER of 1.0 means no words in the evaluation set were transcribed correctly.
## Model description
This model, wav2vec2-model2-torgo, was trained from scratch. The dataset field of this auto-generated card is unset (`None`); the model name suggests the TORGO dysarthric-speech corpus, but the card itself does not confirm this.
## Intended uses & limitations
More information about the intended uses and limitations of this model is yet to be provided.
## Training and evaluation data
Details about the training and evaluation data are currently unavailable.
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.1
- train_batch_size: 1
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 4
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 1000
- num_epochs: 30
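With `train_batch_size: 1` and `gradient_accumulation_steps: 4`, gradients from four micro-batches are combined before each optimizer step, which is where the effective `total_train_batch_size: 4` comes from. A toy sketch of the mechanism (made-up gradient values, plain SGD instead of the actual Adam optimizer, for illustration only):

```python
def accumulate_then_step(param, lr, micro_batch_grads, accum_steps):
    """Average `accum_steps` micro-batch gradients, then apply one SGD step."""
    assert len(micro_batch_grads) == accum_steps
    avg_grad = sum(micro_batch_grads) / accum_steps
    return param - lr * avg_grad

# Four micro-batches of size 1 stand in for one effective batch of size 4.
new_param = accumulate_then_step(param=1.0, lr=0.1,
                                 micro_batch_grads=[0.2, 0.4, 0.1, 0.3],
                                 accum_steps=4)
print(new_param)  # 1.0 - 0.1 * 0.25 = 0.975
```

The update is identical to what a single batch of 4 would produce, at the memory cost of a batch of 1.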
### Training results
| Training Loss | Epoch | Step  | Validation Loss | Wer |
|:-------------:|:-----:|:-----:|:---------------:|:---:|
| 12.5453       | 0.76  | 500   | 14.6490         | 1.0 |
| 4.8036        | 1.53  | 1000  | 8.4523          | 1.0 |
| 5.0421        | 2.29  | 1500  | 5.4114          | 1.0 |
| 5.2055        | 3.05  | 2000  | 11.0507         | 1.0 |
| 4.6389        | 3.82  | 2500  | 4.6792          | 1.0 |
| 4.5523        | 4.58  | 3000  | 4.7855          | 1.0 |
| 4.7843        | 5.34  | 3500  | 11.2783         | 1.0 |
| 4.6066        | 6.11  | 4000  | 8.7807          | 1.0 |
| 4.7382        | 6.87  | 4500  | 2942.0220       | 1.0 |
| 130.5733      | 7.63  | 5000  | 5.8412          | 1.0 |
| 4.4972        | 8.4   | 5500  | 17.7038         | 1.0 |
| 4.5196        | 9.16  | 6000  | 11.4548         | 1.0 |
| 4.3198        | 9.92  | 6500  | 6.0885          | 1.0 |
| 4.4273        | 10.69 | 7000  | 6.7374          | 1.0 |
| 4.2783        | 11.45 | 7500  | 4.7276          | 1.0 |
| 4.2985        | 12.21 | 8000  | 6.1412          | 1.0 |
| 4.3262        | 12.98 | 8500  | 5.2621          | 1.0 |
| 4.1705        | 13.74 | 9000  | 5.2214          | 1.0 |
| 4.3176        | 14.5  | 9500  | 5.5359          | 1.0 |
| 3.9808        | 15.27 | 10000 | 4.1537          | 1.0 |
| 4.0228        | 16.03 | 10500 | 4.2962          | 1.0 |
| 4.0595        | 16.79 | 11000 | 7.6361          | 1.0 |
| 4.0088        | 17.56 | 11500 | 6.8715          | 1.0 |
| 3.8727        | 18.32 | 12000 | 8.8657          | 1.0 |
| 4.0073        | 19.08 | 12500 | 5.8170          | 1.0 |
| 3.8511        | 19.85 | 13000 | 13.9836         | 1.0 |
| 4.0899        | 20.61 | 13500 | 5.3287          | 1.0 |
| 3.8782        | 21.37 | 14000 | 8.0635          | 1.0 |
| 3.9235        | 22.14 | 14500 | 5.5129          | 1.0 |
| 3.7276        | 22.9  | 15000 | 5.0819          | 1.0 |
| 3.7908        | 23.66 | 15500 | 6.1458          | 1.0 |
| 3.9176        | 24.43 | 16000 | 4.6094          | 1.0 |
| 3.8477        | 25.19 | 16500 | 5.1406          | 1.0 |
| 3.6917        | 25.95 | 17000 | 4.5684          | 1.0 |
| 3.8568        | 26.72 | 17500 | 4.0306          | 1.0 |
| 3.7231        | 27.48 | 18000 | 5.6331          | 1.0 |
| 3.8145        | 28.24 | 18500 | 8.2997          | 1.0 |
| 3.7809        | 29.01 | 19000 | 5.7468          | 1.0 |
| 3.5995        | 29.77 | 19500 | 4.9975          | 1.0 |
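The `Wer` column reports word error rate: the word-level edit distance (substitutions + insertions + deletions) divided by the number of reference words. A constant value of 1.0 across all checkpoints means no reference word was ever recovered. A minimal, self-contained sketch of the computation (the actual card was likely generated with an evaluation library, not this function):

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level Levenshtein distance / reference length."""
    ref, hyp = reference.split(), hypothesis.split()
    # dp[i][j] = edit distance between ref[:i] and hyp[:j]
    dp = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        dp[i][0] = i
    for j in range(len(hyp) + 1):
        dp[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            sub = dp[i - 1][j - 1] + (ref[i - 1] != hyp[j - 1])
            dp[i][j] = min(sub, dp[i - 1][j] + 1, dp[i][j - 1] + 1)
    return dp[len(ref)][len(hyp)] / len(ref)

print(wer("the cat sat", "the cat sat"))  # 0.0
print(wer("the cat sat", "a dog"))        # 1.0: every reference word missed
```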
### Framework versions
- Transformers 4.17.0
- PyTorch 1.11.0
- Datasets 1.18.3
- Tokenizers 0.11.6
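To reproduce this environment, the versions above can be pinned in a `requirements.txt` (a sketch; the pip package names are assumed from the standard Hugging Face stack, where PyTorch is distributed as `torch`):

```
transformers==4.17.0
torch==1.11.0
datasets==1.18.3
tokenizers==0.11.6
```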