# roberta-base-culinary
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on an unspecified dataset (recorded as `None` in the training metadata). It achieves a loss of 1.1032 on the evaluation set and can serve as a starting point for related natural language processing tasks.
## 🚀 Quick Start
This section covers the basic information you need to get started with the roberta-base-culinary model.
## ✨ Features
- **Fine-tuned Model**: based on the `bert-base-uncased` model and fine-tuned on a specific dataset to better adapt to the relevant tasks.
- **Evaluated Performance**: achieves a loss of 1.1032 on the evaluation set, which can serve as a reference for model performance (see the note below).
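As a rough interpretive aid, and assuming the reported loss is a mean per-token cross-entropy (typical for a masked language modeling objective), it corresponds to a perplexity of exp(1.1032) ≈ 3.01.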
## 📦 Installation
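The original card does not list installation steps. Assuming a standard Hugging Face setup (consistent with the framework versions reported below), the required packages can be installed with `pip install transformers torch`, or pinned to the reported versions with `pip install transformers==4.22.1 datasets==2.4.0 tokenizers==0.12.1`.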
## 💻 Usage Examples
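The original card ships no usage code, so the snippet below is a hypothetical sketch. It assumes the model was trained with a masked language modeling objective (consistent with its `bert-base-uncased` base and the reported validation losses) and that the checkpoint is available on the Hugging Face Hub; the bare id `roberta-base-culinary` is a placeholder and may need a user or organization namespace.

```python
from transformers import pipeline

# Placeholder Hub id; adjust to the actual path, e.g. "<namespace>/roberta-base-culinary".
fill_mask = pipeline("fill-mask", model="roberta-base-culinary")

# bert-base-uncased checkpoints use the [MASK] token.
for prediction in fill_mask("Simmer the sauce until it [MASK]."):
    print(prediction["token_str"], round(prediction["score"], 4))
```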
## 📚 Documentation
### Model description
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on an unspecified dataset (recorded as `None` in the training metadata).
### Intended uses & limitations
More information needed.
### Training and evaluation data
More information needed.
## 🔧 Technical Details
### Training procedure
#### Training hyperparameters
The following hyperparameters were used during training (see the `TrainingArguments` sketch after this list):
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 16
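These values map directly onto Hugging Face `TrainingArguments`. The following is a minimal sketch of the reported configuration, not the actual training script: the `output_dir` is a placeholder, and the reported batch size is assumed to be per device.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="roberta-base-culinary",  # placeholder
    learning_rate=2e-05,
    per_device_train_batch_size=8,  # assumes the reported batch size is per device
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,                 # Adam settings as listed above
    adam_beta2=0.999,
    adam_epsilon=1e-08,
    lr_scheduler_type="linear",
    num_train_epochs=16,
)
```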
#### Training results
| Training Loss | Epoch | Step   | Validation Loss |
|:-------------:|:-----:|:------:|:---------------:|
| 1.5135        | 1.0   | 39823  | 1.4635          |
| 1.454         | 2.0   | 79646  | 1.3753          |
| 1.3924        | 3.0   | 119469 | 1.3375          |
| 1.3379        | 4.0   | 159292 | 1.2886          |
| 1.2969        | 5.0   | 199115 | 1.2595          |
| 1.2495        | 6.0   | 238938 | nan             |
| 1.1768        | 7.0   | 278761 | 1.2283          |
| 1.1687        | 8.0   | 318584 | 1.2109          |
| 1.2148        | 9.0   | 358407 | 1.1671          |
| 1.133         | 10.0  | 398230 | 1.1721          |
| 1.0882        | 11.0  | 438053 | 1.1624          |
| 1.0749        | 12.0  | 477876 | 1.1321          |
| 1.092         | 13.0  | 517699 | nan             |
| 1.0594        | 14.0  | 557522 | 1.1186          |
| 1.0292        | 15.0  | 597345 | 1.1074          |
| 0.9973        | 16.0  | 637168 | 1.1032          |
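The `nan` validation losses at epochs 6 and 13 appear to be isolated evaluation glitches rather than a training divergence, since finite losses resume in the following epochs.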
### Framework versions
- Transformers 4.22.1
- PyTorch 1.12.1+cu113
- Datasets 2.4.0
- Tokenizers 0.12.1
## 📄 License
This model is released under the Apache-2.0 license.