
XGLM-7.5B

Developed by Facebook AI
XGLM-7.5B is a multilingual autoregressive language model with 7.5 billion parameters, trained on a balanced corpus of 500 billion subword tokens covering 30 diverse languages.
Downloads: 1,260
Release Time: 3/2/2022

Model Overview

This model focuses on multilingual text generation and understanding tasks, with special optimization for few-shot learning in low-resource languages.
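As a concrete starting point, here is a minimal generation sketch using the Hugging Face transformers library; the Hub checkpoint name facebook/xglm-7.5B, the French prompt, and the greedy decoding settings are assumptions made for this example rather than details from the listing above.

```python
# Minimal multilingual generation sketch with Hugging Face transformers.
# Assumes the checkpoint is published on the Hub as "facebook/xglm-7.5B";
# loading the full 7.5B-parameter model requires substantial RAM/VRAM.
import torch
from transformers import XGLMTokenizer, XGLMForCausalLM

tokenizer = XGLMTokenizer.from_pretrained("facebook/xglm-7.5B")
model = XGLMForCausalLM.from_pretrained("facebook/xglm-7.5B")
model.eval()

# A prompt in any supported language; French here, purely as an illustration.
prompt = "La capitale de la France est"
inputs = tokenizer(prompt, return_tensors="pt")

with torch.no_grad():
    output_ids = model.generate(
        **inputs,
        max_new_tokens=20,
        do_sample=False,  # greedy decoding keeps the sketch deterministic
    )

print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```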

Model Features

Extensive multilingual support
Supports 30 languages, including many low-resource languages
Balanced training data
Low-resource languages are upsampled during training to balance performance across languages
Few-shot learning capability
Designed to perform well from only a handful of in-context examples (see the prompt sketch below)
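To make the few-shot setting concrete, the sketch below assembles an in-context prompt from two English-to-French translation pairs; the pair contents, the "English:/French:" template, and the </s> separator are assumptions chosen for illustration, not a prescribed XGLM prompt format. The resulting string would be fed to the model exactly like the prompt in the generation sketch above.

```python
# Illustrative few-shot (in-context) prompt construction; the example pairs,
# the "English:/French:" template, and the </s> separator are assumptions
# made for this sketch, not a format required by XGLM.
few_shot_pairs = [
    ("I love my dog.", "J'adore mon chien."),
    ("The weather is nice today.", "Il fait beau aujourd'hui."),
]
query = "Where is the train station?"

prompt = ""
for english, french in few_shot_pairs:
    prompt += f"English: {english} French: {french} </s> "
prompt += f"English: {query} French:"

print(prompt)  # pass this string to tokenizer/model.generate as in the sketch above
```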

Model Capabilities

Multilingual text generation
Cross-lingual understanding
Few-shot learning
Text plausibility judgment

Use Cases

Language understanding
Plausibility selection (COPA)
Pick the more plausible cause or effect for a given premise, as in the COPA benchmark (see the scoring sketch below)
Demonstrates strong zero-shot performance across multiple languages
Content generation
Multilingual text generation
Generate coherent text in different languages
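Plausibility selection reduces to comparing the log-probabilities the model assigns to each candidate sentence. The sketch below scores each premise-plus-choice string and keeps the higher-scoring choice; because the premise is shared, the comparison is driven by the continuation. The COPA-style premise, the two choices, and the checkpoint name are illustrative assumptions, not data from this listing.

```python
# Zero-shot plausibility selection (COPA-style) sketch: score each candidate
# continuation by the total log-probability of "premise + choice" and pick
# the higher-scoring one. Checkpoint name assumed as "facebook/xglm-7.5B".
import torch
import torch.nn.functional as F
from transformers import XGLMTokenizer, XGLMForCausalLM

tokenizer = XGLMTokenizer.from_pretrained("facebook/xglm-7.5B")
model = XGLMForCausalLM.from_pretrained("facebook/xglm-7.5B")
model.eval()

def sentence_logprob(sentence: str) -> float:
    """Total log-probability the model assigns to `sentence`."""
    input_ids = tokenizer(sentence, return_tensors="pt").input_ids
    with torch.no_grad():
        logits = model(input_ids).logits
    log_probs = F.log_softmax(logits, dim=-1)
    # Logits at position t predict the token at position t + 1.
    target_ids = input_ids[:, 1:]
    token_log_probs = torch.gather(log_probs[:, :-1, :], 2, target_ids.unsqueeze(2))
    return token_log_probs.sum().item()

# Illustrative COPA-style item (hypothetical, not taken from the benchmark files).
premise = "The man broke his toe because"
choices = ["he got a hole in his sock.", "he dropped a hammer on his foot."]

best = max(choices, key=lambda c: sentence_logprob(f"{premise} {c}"))
print(best)  # expected to prefer the hammer continuation
```

For choices of very different lengths, a per-token normalization of the score is a common refinement, since longer continuations otherwise accumulate more negative log-probability.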