HuggingArtists Model - Gunna
This is a HuggingArtists model based on Gunna's lyrics. It allows you to generate text in the style of Gunna. You can create your own bot using this model and explore the world of lyric generation.
🚀 Quick Start
You can use this model directly with a pipeline for text generation:
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/gunna')
generator("I am", num_return_sequences=5)
Or with the Transformers library:
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/gunna")
model = AutoModelWithLMHead.from_pretrained("huggingartists/gunna")
✨ Features
- Artist-Specific Style: Generate text in the style of Gunna, leveraging his unique lyrical patterns.
- Easy to Use: Can be used directly with the Transformers pipeline or library.
📦 Installation
No specific installation steps are provided in the original README. To use the model, you need the transformers and datasets libraries, which can be installed with the following commands:
pip install transformers
pip install datasets
💻 Usage Examples
Basic Usage
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/gunna')
generator("I am", num_return_sequences=5)
Advanced Usage
from transformers import AutoTokenizer, AutoModelWithLMHead

tokenizer = AutoTokenizer.from_pretrained("huggingartists/gunna")
model = AutoModelWithLMHead.from_pretrained("huggingartists/gunna")

input_text = "I am"
input_ids = tokenizer.encode(input_text, return_tensors='pt')

# Sampling must be enabled to return more than one sequence
output = model.generate(input_ids, max_length=100, num_return_sequences=5, do_sample=True)

for sequence in output:
    print(tokenizer.decode(sequence, skip_special_tokens=True))
📚 Documentation
How does it work?
To understand how the model was developed, check the W&B report.
Training data
The model was trained on lyrics from Gunna.
The dataset is available on the Hugging Face Hub as huggingartists/gunna and can be loaded with:
from datasets import load_dataset
dataset = load_dataset("huggingartists/gunna")
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
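As a quick way to look at the data locally, you can inspect the loaded dataset. This is a minimal sketch that assumes the usual HuggingArtists layout (a single "train" split with a "text" column):

from datasets import load_dataset

dataset = load_dataset("huggingartists/gunna")

# Show the available splits and their sizes
print(dataset)

# Print the first lyric entry (assumes a "train" split with a "text" column)
print(dataset["train"][0]["text"])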
Training procedure
The model is based on a pre-trained GPT-2, which is fine-tuned on Gunna's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
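The exact HuggingArtists training script is not reproduced here, but the following is a minimal sketch of this kind of fine-tuning. The hyperparameters, split name, and column name are illustrative assumptions, and report_to="wandb" requires the wandb package to be installed:

from datasets import load_dataset
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer, TrainingArguments)

# Start from pre-trained GPT-2
tokenizer = AutoTokenizer.from_pretrained("gpt2")
tokenizer.pad_token = tokenizer.eos_token
model = AutoModelForCausalLM.from_pretrained("gpt2")

# Tokenize the lyrics dataset (assumes a "train" split with a "text" column)
dataset = load_dataset("huggingartists/gunna")

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=128)

tokenized = dataset["train"].map(tokenize, batched=True, remove_columns=["text"])

# Fine-tune with causal language modeling; hyperparameters are examples only
trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="gunna-gpt2",
        num_train_epochs=1,
        per_device_train_batch_size=4,
        report_to="wandb",  # log the run to Weights & Biases
    ),
    train_dataset=tokenized,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()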
🔧 Technical Details
The model is built on top of the pre-trained GPT-2 architecture. By fine-tuning on Gunna's lyrics, it captures the unique language patterns and styles in his songs. The training process is tracked using Weights & Biases (W&B) to ensure transparency and reproducibility.
📄 License
No license information is provided in the original README.
Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
About
Built by Aleksey Korshuk.

For more details, visit the project repository.

| Property | Details |
|----------|---------|
| Model Type | Pre-trained GPT-2 fine-tuned on Gunna's lyrics |
| Training Data | Lyrics from Gunna, available at huggingartists/gunna |
⚠️ Important Note
The model suffers from the same limitations and bias as GPT-2, and the data in the training lyrics may affect the generated text.
💡 Usage Tip
You can adjust the parameters of the generate method, such as max_length and num_return_sequences, to get different text generation results.
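For example, here is a sketch of passing different generation parameters through the pipeline; the values below are arbitrary illustrations, not recommended settings:

from transformers import pipeline

generator = pipeline('text-generation', model='huggingartists/gunna')

# Longer outputs, fewer sequences, and explicit sampling with a higher temperature
generator("I am", max_length=150, num_return_sequences=3, do_sample=True, temperature=0.9)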