HuggingArtists Model - O.T (RUS)
This is a HuggingArtists model based on O.T (RUS). It generates lyrics in the style of O.T (RUS), and you can create your own bot using this model and the provided demo.
Quick Start
You can use this model directly with a pipeline for text generation:
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/ot-rus')
generator("I am", num_return_sequences=5)
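The pipeline returns a list of dictionaries, one per sequence, each with a generated_text field. A short sketch of printing the results (the max_length value here is only illustrative):

```python
# Each returned item is a dict whose 'generated_text' key holds the prompt
# plus the generated continuation.
outputs = generator("I am", num_return_sequences=5, max_length=50)  # max_length is illustrative
for out in outputs:
    print(out["generated_text"])
```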
Or with the Transformers library:
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/ot-rus")
model = AutoModelWithLMHead.from_pretrained("huggingartists/ot-rus")
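Once the tokenizer and model are loaded, text can be generated with the standard generate API. The sampling parameters below are illustrative choices, not values prescribed by this model card:

```python
import torch

# Encode a prompt, sample a continuation, and decode it back to text.
inputs = tokenizer("I am", return_tensors="pt")
with torch.no_grad():
    output_ids = model.generate(
        **inputs,
        max_length=50,                        # illustrative length limit
        do_sample=True,                       # sample rather than greedy decode
        top_p=0.95,
        pad_token_id=tokenizer.eos_token_id,  # GPT-2 has no pad token by default
    )
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```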
Features
- Artist-Specific Generation: Generate lyrics in the style of O.T (RUS).
- Easy-to-Use: Can be used with simple Python code.
Installation
You need the `transformers` and `datasets` libraries to use the model. You can install them with pip:
pip install transformers datasets
Usage Examples
Basic Usage
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/ot-rus')
generator("I am", num_return_sequences=5)
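Generation can also be tuned through the usual sampling parameters, which the pipeline passes through to the model; the values below are only illustrative:

```python
# Sample three longer, slightly more varied continuations.
generator(
    "I am",
    num_return_sequences=3,
    max_length=80,
    do_sample=True,
    temperature=0.9,
    top_k=50,
)
```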
Documentation
How does it work?
To understand how the model was developed, check the W&B report.
Training data
The model was trained on lyrics from O.T (RUS).
The dataset is available here and can be loaded with:
from datasets import load_dataset
dataset = load_dataset("huggingartists/ot-rus")
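A quick way to inspect the data is sketched below; it assumes the lyrics are stored in a "text" column of the "train" split, which may differ for this dataset:

```python
# Print the dataset structure and a preview of the first example.
print(dataset)
print(dataset["train"][0]["text"][:200])  # the "text" column name is an assumption
```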
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
Training procedure
The model is based on a pre-trained GPT-2, which is fine-tuned on O.T (RUS)'s lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
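The exact training script is not included here. As a rough sketch, a comparable fine-tuning run could be set up with the Trainer API along the following lines; the base checkpoint name, the "text" column name, and all hyperparameters are assumptions chosen for illustration, not the values actually used:

```python
from datasets import load_dataset
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

# Assumed base checkpoint; the actual run started from a pre-trained GPT-2.
tokenizer = AutoTokenizer.from_pretrained("gpt2")
tokenizer.pad_token = tokenizer.eos_token  # GPT-2 has no pad token by default
model = AutoModelForCausalLM.from_pretrained("gpt2")

dataset = load_dataset("huggingartists/ot-rus")

def tokenize(batch):
    # Assumes the lyrics live in a "text" column.
    return tokenizer(batch["text"], truncation=True, max_length=512)

tokenized = dataset["train"].map(
    tokenize, batched=True, remove_columns=dataset["train"].column_names
)

training_args = TrainingArguments(
    output_dir="ot-rus-gpt2",
    num_train_epochs=3,               # illustrative hyperparameters
    per_device_train_batch_size=4,
    learning_rate=5e-5,
    report_to="wandb",                # track the run with Weights & Biases
)

trainer = Trainer(
    model=model,
    args=training_args,
    train_dataset=tokenized,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)

trainer.train()
trainer.save_model("ot-rus-gpt2")     # save the final, versioned model
```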
Technical Details
The model uses the pre-trained GPT-2 architecture and fine-tunes it on O.T (RUS)'s lyrics. The training process is tracked using Weights & Biases to ensure transparency and reproducibility.
License
No license information is provided.
Limitations and bias
⚠️ Important Note
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
About
Built by Aleksey Korshuk


[Telegram](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
