
XGLM-1.7B

Developed by Facebook
XGLM-1.7B is a multilingual autoregressive language model with 1.7 billion parameters, trained on a diverse and balanced corpus of 500 billion subword tokens.
Downloads: 1,514
Release date: 3/2/2022

Model Overview

XGLM-1.7B is a multilingual autoregressive language model supporting over 30 languages, suitable for various natural language processing tasks.

Model Features

Multilingual Support
Supports over 30 languages, covering multiple language families and low-resource languages.
Large-scale Training Data
Trained on a diverse and balanced corpus of 500 billion subword tokens.
Low-resource Language Optimization
Low-resource languages are upsampled during training to improve the model's performance on them.
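To make the upsampling idea concrete, here is a minimal sketch of temperature-based corpus sampling, a common recipe for balancing multilingual training data. The exponent `alpha` and the token counts are illustrative assumptions, not XGLM's published settings.

```python
# Sketch of temperature-based upsampling for a multilingual corpus.
# alpha and the token counts below are illustrative, not XGLM's
# published configuration.

def sampling_weights(token_counts, alpha=0.3):
    """Return per-language sampling probabilities p_i proportional to n_i**alpha.

    alpha < 1 flattens the distribution, so low-resource languages are
    sampled more often than their raw share of the corpus.
    """
    scaled = {lang: n ** alpha for lang, n in token_counts.items()}
    total = sum(scaled.values())
    return {lang: w / total for lang, w in scaled.items()}

# Hypothetical token counts: a high-resource and a low-resource language.
counts = {"en": 300e9, "sw": 1e9}
raw_share = counts["sw"] / sum(counts.values())
upsampled_share = sampling_weights(counts)["sw"]
print(raw_share, upsampled_share)  # the upsampled share is much larger
```

With `alpha = 1` the weights reduce to the raw corpus proportions; lowering `alpha` trades some high-resource data for better coverage of low-resource languages.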

Model Capabilities

Multilingual text generation
Language understanding
Zero-shot learning
Few-shot learning
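Zero- and few-shot use of an autoregressive model like this one amounts to building a single text prompt: zero or more worked examples followed by the unanswered query. A minimal sketch, where the `Q:`/`A:` template is an assumption for illustration rather than a format documented for XGLM:

```python
# Illustrative few-shot prompt builder for an autoregressive LM.
# The Q:/A: template is an assumed format, not one documented for XGLM.

def build_prompt(examples, query):
    """Concatenate labelled examples, then the query left unanswered
    so the model completes it."""
    lines = [f"Q: {q}\nA: {a}" for q, a in examples]
    lines.append(f"Q: {query}\nA:")
    return "\n\n".join(lines)

demos = [("Translate 'chat' into English.", "cat")]
print(build_prompt(demos, "Translate 'chien' into English."))
```

Calling `build_prompt([], query)` gives the zero-shot variant: the same query with no demonstrations.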

Use Cases

Natural Language Processing
Choice of Plausible Alternatives (COPA) Task
Evaluates the model on the English, Chinese, and Hindi COPA benchmarks.
The model performs well in zero-shot settings.
Multilingual Text Generation
Generates coherent text in multiple languages.
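The COPA task above is typically scored zero-shot by asking which alternative the model assigns higher likelihood to as a continuation of the premise. A minimal sketch of that selection logic, with `log_likelihood` stubbed out as a stand-in for a real model scorer (e.g., summing token log-probabilities under XGLM):

```python
# Sketch of zero-shot COPA-style selection: pick the alternative the
# language model considers more likely. `log_likelihood` is a
# hypothetical scorer; a toy stub below keeps the sketch runnable.

def choose_alternative(premise, alternatives, log_likelihood):
    """Return the alternative whose continuation the scorer prefers."""
    scored = [(log_likelihood(f"{premise} {alt}"), alt) for alt in alternatives]
    return max(scored)[1]

# Toy scorer: prefers shorter continuations, purely for illustration.
stub_scorer = lambda text: -len(text)
choice = choose_alternative(
    "The ice melted.",
    ["It was hot.", "It was snowing heavily."],
    stub_scorer,
)
print(choice)
```

In a real evaluation the stub would be replaced by the model's summed log-probabilities over the continuation tokens; the selection rule stays the same.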