# Pretrained Language Model
## Chronos T5 Tiny
- Publisher: autogluon · License: Apache-2.0 · Downloads: 318.45k · Likes: 12
- Tags: Climate Model, Transformers
- Chronos is a family of pretrained time series forecasting models based on language model architectures, trained by quantizing and scaling time series into token sequences.
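The Chronos recipe described above (scale a series, then quantize it into token sequences) can be sketched in a few lines. This is an illustrative toy only, not the actual Chronos tokenizer: the bin count, bin range, and mean-absolute scaling used here are assumptions.

```python
def tokenize_series(values, n_bins=10, low=-5.0, high=5.0):
    """Toy Chronos-style tokenizer (assumed parameters, not the real one):
    scale a series by its mean absolute value, then bin values into token IDs."""
    # Mean-absolute scaling; fall back to 1.0 for an all-zero series.
    scale = sum(abs(v) for v in values) / len(values) or 1.0
    scaled = [v / scale for v in values]
    # Uniform quantization of [low, high] into n_bins token IDs.
    width = (high - low) / n_bins
    tokens = []
    for v in scaled:
        clipped = min(max(v, low), high - 1e-9)  # clip out-of-range values
        tokens.append(int((clipped - low) / width))
    return tokens, scale

tokens, scale = tokenize_series([10.0, 12.0, 9.0, 11.0, 13.0])
# Each token is an integer bin index in [0, n_bins); the scale factor is
# kept so forecasts can be mapped back to the original units.
```

A decoder would invert the mapping (bin center times the stored scale) to turn predicted tokens back into real values.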
## Japanese GPT-NeoX 3.6B
- Publisher: rinna · License: MIT · Downloads: 34.74k · Likes: 99
- Tags: Large Language Model, Transformers, Supports Multiple Languages
- A Japanese GPT-NeoX model with 3.6 billion parameters, based on the Transformer architecture and trained on 312.5 billion tokens of Japanese text.
## BERT Base Portuguese Cased
- Publisher: neuralmind · License: MIT · Downloads: 257.25k · Likes: 181
- Tags: Large Language Model, Other
- A pretrained BERT model for Brazilian Portuguese that achieves state-of-the-art performance on multiple NLP tasks.
## German GPT-2
- Publisher: anonymous-german-nlp · License: MIT · Downloads: 176 · Likes: 1
- Tags: Large Language Model, German
- A German language model based on the GPT-2 architecture, optimized for German text generation tasks.
## KoBART Base V1
- Publisher: gogamza · License: MIT · Downloads: 2,077 · Likes: 1
- Tags: Large Language Model, Transformers, Korean
- KoBART is a Korean pretrained model based on the BART architecture, suitable for a range of Korean natural language processing tasks.
## FinancialBERT
- Publisher: ahmedrachid · Downloads: 3,784 · Likes: 27
- Tags: Large Language Model, Transformers, English
- FinancialBERT is a BERT model pretrained on a large corpus of financial text, intended to advance research and practice in financial natural language processing.