# English text completion
## Ablation Model Fineweb Edu
- Organization: HuggingFaceFW
- License: Apache-2.0
- Tags: Large Language Model · Transformers · English

This model is part of the FineWeb ablation experiments: 1.82 billion parameters, Llama architecture, trained on the FineWeb-Edu dataset, and suited to English text-completion tasks.
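A minimal sketch of using such a model for English text completion with the Transformers library. The repo id `HuggingFaceFW/ablation-model-fineweb-edu` is an assumption inferred from the organization and model names above, not stated in the listing.

```python
# Sketch: English text completion with a causal LM via the transformers
# library. The repo id below is an ASSUMPTION based on the listing's
# organization ("HuggingFaceFW") and model name; verify before use.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "HuggingFaceFW/ablation-model-fineweb-edu"  # assumed repo id

def complete(prompt: str, max_new_tokens: int = 32) -> str:
    """Return the prompt plus a generated English continuation."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID)
    inputs = tokenizer(prompt, return_tensors="pt")
    output_ids = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)

if __name__ == "__main__":
    print(complete("The water cycle begins when"))
```

The heavy download and generation are deferred into `complete()` and the `__main__` guard, so importing the module stays cheap.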
## Dbrx Base
- Organization: databricks
- License: Other
- Tags: Large Language Model · Transformers

A Mixture-of-Experts (MoE) large language model developed by Databricks, with 132 billion total parameters, 36 billion active parameters per token, and a 32K-token context window.
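The total-versus-active parameter gap follows from MoE routing: each token is sent to only a few experts, so most expert weights sit idle on any given forward pass. A small sketch of that arithmetic; the routing configuration (16 experts, 4 routed per token) is an assumption about DBRX-style fine-grained MoE, not taken from the listing.

```python
# Sketch: why an MoE model "uses" far fewer parameters per token than it
# stores. The 16-expert / 4-per-token routing below is an ASSUMPTION for
# illustration; the 132B total / 36B active figures come from the listing.
def active_fraction(num_experts: int, experts_per_token: int) -> float:
    """Fraction of expert parameters touched by each token."""
    return experts_per_token / num_experts

TOTAL_PARAMS_B = 132  # total parameters, in billions (from the listing)
ACTIVE_PARAMS_B = 36  # active parameters per token, in billions (from the listing)

# With 4-of-16 routing, only a quarter of the expert weights run per token,
# which is how a 132B-parameter model can cost roughly 36B parameters of
# compute per token once shared (non-expert) weights are included.
print(active_fraction(16, 4))  # 0.25
```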