Nous Consilience 40B
Nous Consilience 40B is a 40-billion-parameter generative text model, pre-trained from scratch in a decentralized manner, supporting multiple languages and representing diverse human creative output.
Large Language Model
Safetensors · Supports Multiple Languages · Multilingual text generation · Continuous pre-training · Decentralized architecture
Downloads: 44
Release Time: 5/9/2025
Model Overview
This is a decoder-only Transformer model based on the DeepSeek v3 architecture with MLA (Multi-head Latent Attention). Its pre-training data combines the FineWeb, FineWeb 2, and The Stack v2 datasets, totaling approximately 20T tokens.
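Since the weights are distributed as Safetensors, a standard Hugging Face transformers loading flow should apply. The sketch below is illustrative only: the repository id, dtype handling, and sampling settings are assumptions, not values taken from this page.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder repository id -- substitute the actual Hugging Face repo for this model.
MODEL_ID = "NousResearch/Consilience-40B"

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID,
    torch_dtype="auto",   # keep the dtype stored in the Safetensors shards
    device_map="auto",    # spread the 40B parameters across available GPUs
)

# Base (pre-trained, not instruction-tuned) model: prompt it with the start
# of a passage rather than a chat-style instruction.
prompt = "Decentralized pre-training of large language models"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128, do_sample=True, temperature=0.8)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```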
Model Features
Decentralized pre-training
The model is pre-trained from scratch over the internet in a decentralized manner, with automatic weight updates every 500 training steps (a minimal sketch of this update cadence follows the feature list).
Continuous training strategy
Uses a continuous training strategy with no final data-annealing phase, preserving the model's creativity and more interesting behaviors.
Multilingual support
Supports over 30 languages, covering major global languages.
Dual licensing
CC0 by default (dedicated to the public domain), with the option to instead use the model under the MIT license, which carries attribution and disclaimer terms.
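To make the update cadence concrete, here is a deliberately simplified training-loop sketch. It is not the actual decentralized training code behind this model; apart from the 500-step interval stated above, the function names, the HF-style `.loss` output, and the `publish_checkpoint` callback are assumptions for illustration.

```python
from typing import Callable, Iterable

SYNC_INTERVAL = 500  # the model card states updates happen every 500 training steps

def train_with_periodic_updates(model, optimizer, data_loader: Iterable,
                                publish_checkpoint: Callable) -> None:
    """Toy loop showing a 500-step update cadence; not the real coordination protocol."""
    for step, batch in enumerate(data_loader, start=1):
        optimizer.zero_grad()
        loss = model(**batch).loss   # assumes a HF-style model that returns .loss
        loss.backward()
        optimizer.step()
        if step % SYNC_INTERVAL == 0:
            # In the real decentralized run, contributors' updates would be merged and
            # broadcast here; this sketch simply hands the weights to a callback.
            publish_checkpoint(model.state_dict(), step)
```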
Model Capabilities
Multilingual text generation
Creative content generation
Large-scale text processing
Use Cases
Content creation
Multilingual article generation
Generate creative articles or technical documents in various languages
Creative writing assistance
Assist writers with story ideation and content creation
Education
Multilingual learning assistance
Provide multilingual text examples for language learners