
mGPT 13B

Developed by ai-forever
mGPT 13B is a multilingual language model that supports 61 languages across 25 language families. Trained on 600GB of text data, it offers strong multilingual processing capabilities.
Downloads 4,742
Release Time: 4/18/2023

Model Overview

mGPT 13B is a multilingual language model based on the GPT-3 architecture, focused on text generation and understanding in multilingual scenarios.

Model Features

Extensive multilingual support
Supports 61 languages across 25 language families, including many low-resource languages
Large-scale training data
Pre-trained on 600GB of text, drawn mainly from MC4 and Wikipedia
Data quality optimization
Training data quality is ensured through 64-bit hash deduplication and text compression ratio filtering (see the sketch after this list)
Strong perplexity results
Achieves perplexity scores between 2 and 10 for most supported languages
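The model card does not publish the exact filtering pipeline, but a minimal sketch of the two quality steps it names could look like the following: exact deduplication via a 64-bit content hash, and dropping documents whose compression ratio suggests highly repetitive text. The `min_ratio` threshold and the direction of the filter are assumptions for illustration.

```python
import hashlib
import zlib

def hash64(text: str) -> int:
    """64-bit content hash used to detect exact duplicate documents (illustrative)."""
    return int.from_bytes(hashlib.blake2b(text.encode("utf-8"), digest_size=8).digest(), "big")

def compression_ratio(text: str) -> float:
    """Compressed size over raw size; highly repetitive text compresses far below 1."""
    raw = text.encode("utf-8")
    return len(zlib.compress(raw)) / max(len(raw), 1)

def filter_corpus(docs, min_ratio=0.3):
    """Drop exact duplicates (by 64-bit hash) and overly repetitive documents."""
    seen, kept = set(), []
    for doc in docs:
        h = hash64(doc)
        if h in seen:
            continue                      # exact duplicate
        if compression_ratio(doc) < min_ratio:
            continue                      # repetitive / low-quality text
        seen.add(h)
        kept.append(doc)
    return kept

sample = ["hello world", "hello world", "spam " * 200, "A genuinely varied sentence."]
print(filter_corpus(sample))              # keeps "hello world" once and the varied sentence
```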

Model Capabilities

Multilingual text generation (see the example below)
Cross-lingual text understanding
Multilingual question answering
Translation assistance
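A minimal generation sketch with the Hugging Face `transformers` library is shown below. The repo id `ai-forever/mGPT-13B` and the sampling parameters are assumptions; a 13B-parameter checkpoint generally requires half precision and a large GPU.

```python
# pip install transformers accelerate torch
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ai-forever/mGPT-13B"        # assumed Hugging Face repo id for this model

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,          # half precision to fit the 13B weights
    device_map="auto",                  # requires the accelerate package
)

# Spanish prompt to exercise the multilingual coverage
prompt = "La inteligencia artificial es"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(
    **inputs,
    max_new_tokens=60,
    do_sample=True,
    top_p=0.95,
    temperature=0.8,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```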

Use Cases

Natural language processing
Multi-language chatbot
Build intelligent dialogue systems that support multiple languages
Handles user input in any of the 61 supported languages
Cross-language information retrieval
Enable semantic search across documents in different languages (see the sketch after this list)
Multilingual representations improve retrieval accuracy
Educational technology
Language learning assistant
Provides intelligent assistance for learners of multiple languages
Supports grammar analysis and example generation across 61 languages
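For the cross-language retrieval use case, one simple approach is to mean-pool the model's final hidden states into sentence vectors and rank documents by cosine similarity. This is a sketch under assumptions: the repo id is the same assumed one as above, and any smaller mGPT checkpoint would work the same way with far less memory.

```python
# pip install transformers accelerate torch
import torch
from transformers import AutoModel, AutoTokenizer

model_id = "ai-forever/mGPT-13B"        # assumed repo id; smaller checkpoints behave identically

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id, torch_dtype=torch.float16, device_map="auto")

def embed(text: str) -> torch.Tensor:
    """Mean-pool the final hidden states into a single sentence vector."""
    inputs = tokenizer(text, return_tensors="pt").to(model.device)
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state   # (1, seq_len, hidden_dim)
    return hidden.mean(dim=1).squeeze(0).float()

docs = [
    "The cat sits on the mat.",
    "Le chat est assis sur le tapis.",
    "Stock markets fell sharply today.",
]
query = embed("Где сидит кошка?")                     # Russian query against English/French docs
scores = [torch.cosine_similarity(query, embed(d), dim=0).item() for d in docs]
print(sorted(zip(scores, docs), reverse=True))        # cat-related documents should rank first
```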