
Chinese LERT Small

Developed by hfl
LERT is a linguistic theory-driven pre-trained language model designed to enhance performance by incorporating linguistic knowledge.
Downloads: 538
Release Time: 10/26/2022

Model Overview

LERT is a pre-trained language model based on linguistic theories, enhancing language understanding capabilities by incorporating linguistic knowledge.

Model Features

Linguistic Theory-Driven
The model's design draws on linguistic theory, incorporating linguistic knowledge to strengthen language understanding.
Pre-training Optimization
Model parameters are optimized during the pre-training phase, yielding strong performance across a variety of natural language processing tasks.

Model Capabilities

Natural Language Understanding
Text Classification
Named Entity Recognition
Semantic Similarity Calculation (see the sketch below)
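
The sketch below illustrates the semantic similarity capability with the Hugging Face Transformers library. The Hub ID hfl/chinese-lert-small, the mean-pooling strategy, and the cosine-similarity scoring are assumptions made for illustration, not an official usage recipe from this model card.

```python
# Minimal sketch: semantic similarity between two Chinese sentences using
# mean-pooled LERT encoder states. The Hub ID below is an assumption.
import torch
from transformers import AutoModel, AutoTokenizer

MODEL_ID = "hfl/chinese-lert-small"  # assumed Hugging Face Hub identifier

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModel.from_pretrained(MODEL_ID)
model.eval()

def embed(text: str) -> torch.Tensor:
    """Encode a sentence and mean-pool hidden states over non-padding tokens."""
    inputs = tokenizer(text, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state         # (1, seq_len, hidden)
    mask = inputs["attention_mask"].unsqueeze(-1).float()   # (1, seq_len, 1)
    return (hidden * mask).sum(dim=1) / mask.sum(dim=1)     # (1, hidden)

a = embed("今天天气很好")   # "The weather is nice today"
b = embed("今天是个晴天")   # "Today is a sunny day"
print(f"cosine similarity: {torch.nn.functional.cosine_similarity(a, b).item():.4f}")
```

The same embedding routine can back clustering or retrieval; classification and named entity recognition normally fine-tune the encoder with a task-specific head, as sketched under Use Cases below.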

Use Cases

Natural Language Processing
Text Classification
Used to classify Chinese text, for example sentiment analysis and topic classification (see the sketch after this list).
Named Entity Recognition
Used to identify named entities in text, such as person names, place names, and organization names.
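
As a sketch of the text classification use case (e.g. sentiment analysis), the example below attaches a two-label classification head to the pre-trained encoder. The Hub ID, label count, and example sentence are assumptions; the head is randomly initialized and must be fine-tuned on labeled Chinese data before its predictions are meaningful.

```python
# Minimal sketch: wiring the pre-trained encoder into a sentiment classifier.
# The checkpoint ID and two-label scheme are assumptions for illustration.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

MODEL_ID = "hfl/chinese-lert-small"  # assumed Hugging Face Hub identifier

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForSequenceClassification.from_pretrained(
    MODEL_ID, num_labels=2  # hypothetical labels: 0 = negative, 1 = positive
)
model.eval()

inputs = tokenizer("这家餐厅的菜非常好吃", return_tensors="pt")  # "The food here is delicious"
with torch.no_grad():
    probs = model(**inputs).logits.softmax(dim=-1)
print(probs)  # untrained head: scores are placeholders until fine-tuning
```

A named entity recognition pipeline would follow the same pattern with AutoModelForTokenClassification and per-token labels.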