XGLM-2.9B

Developed by Facebook AI
XGLM-2.9B is a multilingual autoregressive language model with 2.9 billion parameters, trained on a diverse and balanced corpus of 500 billion subword tokens spanning 30 languages.
Release date: 3/2/2022

Model Overview

XGLM-2.9B is a multilingual autoregressive language model covering 30 languages, suited to multilingual text generation and to few-shot and zero-shot natural language processing tasks.

Model Features

Multilingual Support
Supports 30 languages, covering multiple language families and low-resource languages.
Large-scale Training Data
Trained on a diverse and balanced corpus of 500 billion subword tokens.
Few-shot Learning Capability
Performs strongly on few-shot and zero-shot tasks without task-specific fine-tuning.

Model Capabilities

Multilingual Text Generation
Few-shot Learning
Zero-shot Inference
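The capabilities above can be sketched with a minimal generation example using the Hugging Face transformers library. This is an illustrative sketch, assuming the `facebook/xglm-2.9B` checkpoint id on the Hugging Face Hub; the helper name `generate_continuation` is our own, not part of any official API.

```python
# Sketch: multilingual text generation with XGLM-2.9B via Hugging Face transformers.
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

MODEL_ID = "facebook/xglm-2.9B"  # assumed Hub checkpoint id

def generate_continuation(prompt: str, max_new_tokens: int = 40) -> str:
    """Greedily continue a prompt in any of the model's 30 languages."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype=torch.float16)
    inputs = tokenizer(prompt, return_tensors="pt")
    with torch.no_grad():
        output = model.generate(
            **inputs, max_new_tokens=max_new_tokens, do_sample=False
        )
    # Keep only the newly generated tokens, dropping the prompt.
    new_tokens = output[0][inputs["input_ids"].shape[1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)
```

Because the model is autoregressive, the same call works for any supported language: pass a French or Hindi prompt and the continuation stays in that language.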

Use Cases

Natural Language Processing
Choice of Plausible Alternatives (COPA) Task
Evaluates the model's performance on the COPA task in English, Chinese, and Hindi.
In the published examples, the model predicted the correct alternative for every test case.
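Zero-shot COPA scoring is commonly done by comparing the log-likelihood the model assigns to each candidate continuation. The sketch below follows that approach; the prompt format in `format_copa` and the function names are our assumptions, and the original evaluation may join premise and choice differently.

```python
# Sketch of zero-shot COPA evaluation by log-likelihood comparison.
# Assumptions: prompt format (premise + lowercased choice) and helper
# names are illustrative, not an official evaluation script.
import torch
from transformers import XGLMTokenizer, XGLMForCausalLM  # classes exist in transformers


def format_copa(premise: str, choice: str) -> str:
    """Append a candidate (lowercased) to the premise as one sentence pair."""
    return premise.strip() + " " + choice[0].lower() + choice[1:]


def copa_predict(premise, choice1, choice2, model, tokenizer):
    """Return 0 or 1 for whichever alternative the model finds more likely."""

    def log_likelihood(text: str) -> float:
        ids = tokenizer(text, return_tensors="pt")["input_ids"]
        with torch.no_grad():
            loss = model(ids, labels=ids).loss  # mean per-token cross-entropy
        return -loss.item() * ids.size(1)       # approximate total log-probability

    scores = [log_likelihood(format_copa(premise, c)) for c in (choice1, choice2)]
    return 0 if scores[0] >= scores[1] else 1
```

The same loop works unchanged for the Chinese and Hindi COPA splits, since scoring only needs the tokenizer and a forward pass.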
Education
Multilingual Learning Assistance
Can be used to assist in multilingual learning and translation tasks.