
Mistral Nemo Base 2407

Developed by mistralai
Mistral-Nemo-Base-2407 is a 12-billion-parameter generative text pre-training model jointly trained by Mistral AI and NVIDIA, outperforming existing models of similar or smaller scale.
Downloads 44.76k
Release Date: 7/18/2024

Model Overview

This is a multilingual large language model trained on many natural languages and on code, featuring a 128k-token context window and designed as a drop-in replacement for Mistral 7B.

Model Features

Apache 2.0 License
Released under the Apache 2.0 license, permitting commercial use and modification.
128k Context Window
Supports a context window of up to 128k tokens, well suited to long-document processing tasks.
Multilingual Support
Supports 9 major languages, including Chinese.
Code Comprehension & Generation
Trained on a large amount of code data, able to understand and generate code.
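
A minimal loading and generation sketch with the Hugging Face transformers library, assuming a recent transformers release with Mistral-Nemo support, the accelerate package for device_map="auto", and enough GPU memory for the 12B weights in bfloat16; the prompt is purely illustrative:

```python
# Minimal sketch: load Mistral-Nemo-Base-2407 and generate text.
# Assumes a recent `transformers` release with Mistral-Nemo support and
# `accelerate` installed for device_map="auto".
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mistralai/Mistral-Nemo-Base-2407"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

# This is a base (non-instruct) model, so it behaves as a text completer.
prompt = "The three most widely spoken languages in the world are"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=50, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```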

Model Capabilities

Text generation
Multilingual processing
Code generation (see the sketch after this list)
Q&A systems
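
As a sketch of the code-generation capability listed above, a base (non-instruct) model can be given the start of a function and asked to continue it. The function below is made up for illustration, and loading mirrors the sketch in the previous section:

```python
# Sketch of code completion with the base model; illustrative only,
# not an official recipe. Loading is the same as in the earlier sketch.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mistralai/Mistral-Nemo-Base-2407"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

# Give the model the start of a function and let it write the body.
code_prompt = (
    "def is_palindrome(s: str) -> bool:\n"
    '    """Return True if s reads the same forwards and backwards."""\n'
)
inputs = tokenizer(code_prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=60, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```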

Use Cases

Natural Language Processing
Multilingual Chatbot
Building intelligent dialogue systems supporting multiple languages
Code Assistant
Assisting developers in understanding and generating code
Knowledge Q&A
Open-domain Q&A System
Answering knowledge questions across various domains
Achieved 73.8% accuracy on the TriviaQA benchmark (see the prompting sketch below)
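
For open-domain Q&A, a base model is usually prompted with a few question/answer pairs so that it continues the pattern. A minimal few-shot sketch, with made-up example questions (not taken from TriviaQA) and the same loading pattern as above:

```python
# Few-shot Q&A sketch with the base model; the example questions are
# invented for illustration. Loading matches the earlier sketches.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mistralai/Mistral-Nemo-Base-2407"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

prompt = (
    "Q: What is the capital of France?\n"
    "A: Paris\n"
    "Q: Which planet is known as the Red Planet?\n"
    "A: Mars\n"
    "Q: Who wrote the novel 'Don Quixote'?\n"
    "A:"
)
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=10, do_sample=False)

# Strip the prompt tokens and keep only the newly generated answer.
answer = tokenizer.decode(
    outputs[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True
)
print(answer.strip())
```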