GLM-2B

Developed by THUDM
GLM-2B is a general-purpose language model pre-trained with an autoregressive blank-filling objective, supporting a range of natural language understanding and generation tasks.
Downloads: 60
Release Date: 3/1/2023

Model Overview

GLM-2B is a general-purpose language model pre-trained with an autoregressive blank-filling objective; it can be fine-tuned for a variety of natural language understanding and generation tasks.

Model Features

Autoregressive Blank Filling
Pre-trained with an autoregressive blank-filling objective that strengthens both text comprehension and text generation.
Multi-task Adaptation
Can be fine-tuned for various natural language understanding and generation tasks, demonstrating broad application adaptability.
Multi-level Masking Strategy
Employs three mask tokens, one per task type: [MASK] for short blank filling, [sMASK] for sentence-level filling, and [gMASK] for left-to-right generation.
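The three mask tokens above determine which task the model performs. As a minimal sketch, the helper below shows how a prompt might be paired with the appropriate token; the token names come from this model card, but the helper itself is illustrative and not part of any official GLM API (in practice, the Hugging Face `transformers` loading path with `trust_remote_code` handles tokenization).

```python
# Illustrative sketch: selecting the GLM mask token for a given task.
# The tokens [MASK], [sMASK], [gMASK] are from the model card; this
# helper and its task names are hypothetical, for demonstration only.

MASK_TOKENS = {
    "short_fill": "[MASK]",    # short text blank filling
    "sentence_fill": "[sMASK]",  # sentence-level filling
    "generate": "[gMASK]",     # left-to-right text generation
}

def build_prompt(text: str, task: str) -> str:
    """Append the task-appropriate mask token to the input text."""
    if task not in MASK_TOKENS:
        raise ValueError(f"unknown task: {task!r}")
    return f"{text} {MASK_TOKENS[task]}"

print(build_prompt("Tsinghua University is located in", "generate"))
# → Tsinghua University is located in [gMASK]
```

During fine-tuning or inference, the model then fills the blank at the mask position (for [MASK]/[sMASK]) or continues generating from it (for [gMASK]).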

Model Capabilities

Text Generation
Text Understanding
Sequence-to-Sequence Task Processing
Language Modeling

Use Cases

Natural Language Processing
Text Filling
Performs short text filling tasks using the [MASK] token.
Sentence Generation
Performs left-to-right text generation using the [gMASK] token.
Long Text Understanding
Performs sentence-level filling and comprehension using the [sMASK] token.
© 2025 AIbase