# Suitable for educational scenarios
## Tinyllama 42M Fp32
- **Author:** nickypro · **License:** MIT
- **Task:** Large Language Model · **Library:** Transformers
- **Downloads:** 517 · **Likes:** 3

A 42M-parameter Llama 2 architecture model in float32 precision, trained on the TinyStories dataset and suitable for simple text generation tasks.
## Tinyllama 110M
- **Author:** nickypro · **License:** MIT
- **Task:** Large Language Model · **Library:** Transformers
- **Downloads:** 1,472 · **Likes:** 5

A 110M-parameter Llama 2 architecture model trained on the TinyStories dataset, suitable for lightweight text generation tasks.
## Tinyllama 15M
- **Author:** nickypro · **License:** MIT
- **Task:** Large Language Model · **Library:** Transformers
- **Downloads:** 3,217 · **Likes:** 11

A 15M-parameter Llama 2 architecture model trained on the TinyStories dataset.
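All three TinyLlama cards target simple text generation with the Transformers library, so a minimal usage sketch may help. The repo ID below (`nickypro/tinyllama-15M`) is an assumption based on the author and model names above; the 42M and 110M variants would load the same way.

```python
# Minimal sketch: story generation with a TinyStories-trained TinyLlama checkpoint.
# The repo ID "nickypro/tinyllama-15M" is assumed from the card above; swap in the
# 42M or 110M variant the same way.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "nickypro/tinyllama-15M"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

prompt = "Once upon a time,"
inputs = tokenizer(prompt, return_tensors="pt")
# Sample a short continuation; these small models are best at simple stories.
outputs = model.generate(**inputs, max_new_tokens=64, do_sample=True, temperature=0.8)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```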
## T5 Base Korean Summarization
- **Author:** eenzeenee
- **Task:** Text Generation · **Tags:** Transformers, Korean
- **Downloads:** 148.32k · **Likes:** 25

A Korean text summarization model based on the T5 architecture, obtained by fine-tuning paust/pko-t5-base on multiple Korean datasets.
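A hedged sketch of seq2seq inference for this model follows. The repo ID `eenzeenee/t5-base-korean-summarization` and the `"summarize: "` task prefix are assumptions (the prefix is a common T5 convention); check the model card for the exact input format.

```python
# Minimal sketch: Korean summarization with a T5 seq2seq model.
# Repo ID and "summarize: " prefix are assumed; verify against the model card.
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model_id = "eenzeenee/t5-base-korean-summarization"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# Example input document (any Korean text would do).
korean_article = "오늘 서울에서 인공지능 학회가 열렸다. 연구자들은 소형 언어 모델의 활용 방안을 논의했다."
inputs = tokenizer("summarize: " + korean_article,
                   return_tensors="pt", truncation=True, max_length=512)
summary_ids = model.generate(**inputs, max_new_tokens=128, num_beams=4)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```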
## Indot5 Base Paraphrase
- **Author:** Wikidepia
- **Task:** Text Generation · **Tags:** Other
- **Downloads:** 296 · **Likes:** 1

An IndoT5-base model trained on the translated PAWS dataset for generating Indonesian paraphrases.
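A similar sketch for Indonesian paraphrasing, assuming the repo ID `Wikidepia/IndoT5-base-paraphrase` and a `"paraphrase: "` input prefix (a common convention for PAWS-tuned T5 paraphrasers); both should be verified against the model card.

```python
# Minimal sketch: generating Indonesian paraphrases with an IndoT5 model.
# Repo ID and "paraphrase: " prefix are assumed; check the model card.
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model_id = "Wikidepia/IndoT5-base-paraphrase"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

sentence = "Kucing itu duduk di atas tikar."  # "The cat sat on the mat."
inputs = tokenizer("paraphrase: " + sentence, return_tensors="pt")
# Beam search with several returned sequences yields candidate paraphrases.
outputs = model.generate(**inputs, max_new_tokens=64,
                         num_beams=5, num_return_sequences=3)
for ids in outputs:
    print(tokenizer.decode(ids, skip_special_tokens=True))
```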