# Russian generation
## ruGPT3small (based on GPT-2)
A Russian pre-trained Transformer language model developed by the SberDevices team. It is based on the GPT-2 architecture, supports a 1024-token sequence length, and was trained on 80 billion tokens.
- Category: Large Language Model, Other
- Developer: ai-forever
- Downloads: 46.92k
- Likes: 42
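Since the entry above describes a GPT-2-style causal language model, text generation through the Hugging Face `transformers` library is the natural way to try it. The sketch below is illustrative only: the Hub id `ai-forever/rugpt3small_based_on_gpt2` is inferred from the developer and model names in this listing, and the sampling parameters are arbitrary examples.

```python
# Minimal generation sketch for a GPT-2-style Russian LM.
# Assumption: the model is published on the Hugging Face Hub under the id below.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ai-forever/rugpt3small_based_on_gpt2"  # assumed Hub id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

prompt = "Москва — столица"  # "Moscow is the capital"
inputs = tokenizer(prompt, return_tensors="pt")

outputs = model.generate(
    **inputs,
    max_length=100,                        # must stay within the 1024-token context window
    do_sample=True,                        # sample instead of greedy decoding
    top_k=50,                              # illustrative sampling settings
    top_p=0.95,
    pad_token_id=tokenizer.eos_token_id,   # GPT-2 has no dedicated pad token
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

The same loading-and-generate pattern applies to the second model in this listing, with its own Hub id substituted.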
## 2chan ruGPT3 small
ruGPT3-small is a small Russian language model trained on a partial corpus of 2chan posts, suitable for text-generation tasks.
- Category: Large Language Model
- Developer: TheBakerCat
- Downloads: 20
- Likes: 0