# Multi-scale Pretraining
## OpenELM 1.1B
OpenELM is a family of efficient open language models from Apple. It uses a layer-wise scaling strategy to allocate parameters more effectively across transformer layers, and ships both pretrained and instruction-tuned checkpoints ranging from 270M to 3B parameters.
Tags: Large Language Model, Transformers

Publisher: apple
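Since the entry is tagged with Transformers, loading a checkpoint would typically go through the Hugging Face `transformers` library. The sketch below assumes Hub repo ids of the form `apple/OpenELM-1_1B` (inferred from the publisher shown above, not confirmed by this page) and that the checkpoints ship custom modeling code requiring `trust_remote_code=True`:

```python
# Hypothetical sketch: loading an OpenELM checkpoint with Hugging Face
# transformers. Repo id pattern "apple/OpenELM-<size>" is an assumption.


def openelm_repo_id(size: str = "1_1B") -> str:
    """Build the assumed Hub repo id, e.g. "apple/OpenELM-1_1B"."""
    return f"apple/OpenELM-{size}"


def load_openelm(size: str = "1_1B"):
    """Download and return (model, tokenizer); requires network access."""
    # Deferred import so defining this sketch does not require transformers.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model = AutoModelForCausalLM.from_pretrained(
        openelm_repo_id(size),
        trust_remote_code=True,  # assumption: custom modeling code on the Hub
    )
    # Assumption: a compatible tokenizer is resolvable from the same repo id.
    tokenizer = AutoTokenizer.from_pretrained(openelm_repo_id(size))
    return model, tokenizer
```

Calling `load_openelm("1_1B")` would fetch the weights; the size suffixes (`270M`, `450M`, `1_1B`, `3B`) follow the parameter range quoted above but the exact spelling on the Hub is an assumption.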
## FlauBERT Small Cased
License: MIT
FlauBERT is a French BERT model pretrained on a large, diverse French corpus, developed with support from the French National Centre for Scientific Research (CNRS). Several model sizes are available to accommodate different compute budgets.
Tags: Large Language Model, Transformers, French

Publisher: flaubert
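As with the entry above, a minimal loading sketch via `transformers` is shown below. It assumes the Hub naming pattern `flaubert/flaubert_small_cased` (inferred from the publisher name and the "Small Cased" variant in the title); the `FlaubertModel` and `FlaubertTokenizer` classes are the dedicated FlauBERT classes in `transformers`:

```python
# Hypothetical sketch: loading a FlauBERT checkpoint for French text.
# Repo id pattern "flaubert/flaubert_<variant>" is an assumption.


def flaubert_repo_id(variant: str = "small_cased") -> str:
    """Build the assumed Hub repo id, e.g. "flaubert/flaubert_small_cased"."""
    return f"flaubert/flaubert_{variant}"


def load_flaubert(variant: str = "small_cased"):
    """Download and return (model, tokenizer); requires network access."""
    # Deferred import so defining this sketch does not require transformers.
    from transformers import FlaubertModel, FlaubertTokenizer

    model = FlaubertModel.from_pretrained(flaubert_repo_id(variant))
    tokenizer = FlaubertTokenizer.from_pretrained(flaubert_repo_id(variant))
    return model, tokenizer
```

Calling `load_flaubert()` would fetch the weights; the set of variant names beyond `small_cased` is an assumption about how the other sizes mentioned above are published.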