
Chinese RoBERTa L-2 H-128

Developed by uer
This is the tiny version of the Chinese RoBERTa model series, pre-trained on CLUECorpusSmall, featuring 2 layers and a 128-dimensional hidden size (L-2, H-128), suitable for various Chinese natural language processing tasks.
Downloads 1,141
Release Time: 3/2/2022

Model Overview

This model is the tiny (L-2, H-128) version in a series of 24 Chinese RoBERTa miniature models. It was pre-trained with the Masked Language Modeling (MLM) objective and can be used for text feature extraction or fine-tuned for downstream NLP tasks.
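For example, the checkpoint can be queried directly through the standard Transformers fill-mask pipeline (the model ID uer/chinese_roberta_L-2_H-128 is inferred from the title above):

```python
from transformers import pipeline

# Masked-language-modeling inference with this checkpoint.
unmasker = pipeline("fill-mask", model="uer/chinese_roberta_L-2_H-128")

# "The capital of China is [MASK]jing." -> the model should rank "北" highly.
for candidate in unmasker("中国的首都是[MASK]京。"):
    print(candidate["token_str"], candidate["score"])
```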

Model Features

Efficient Pre-training
Uses a two-stage training strategy: pre-training first on short sequences (length 128), then continuing on long sequences (length 512), to improve training efficiency; see the sketch after this list
Multi-size Selection
Part of a family of 24 models at different parameter scales, from tiny (L-2, H-128) up to base (L-12, H-768)
Chinese Optimization
Specifically pre-trained and optimized for Chinese text
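As an illustration only (this is not UER-py's actual pipeline; the corpus file, step counts, batch size, and attention-head count below are placeholder assumptions), the two-stage idea could be sketched with the Hugging Face Trainer:

```python
from datasets import load_dataset
from transformers import (BertConfig, BertForMaskedLM, BertTokenizerFast,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)

# Placeholder corpus; the real model was pre-trained on CLUECorpusSmall.
raw = load_dataset("text", data_files={"train": "corpus.txt"})["train"]
tokenizer = BertTokenizerFast.from_pretrained("uer/chinese_roberta_L-2_H-128")
collator = DataCollatorForLanguageModeling(tokenizer, mlm_probability=0.15)

# Fresh L-2 H-128 model: 2 layers, 128-dim hidden states (2 heads assumed).
config = BertConfig(num_hidden_layers=2, hidden_size=128,
                    num_attention_heads=2, intermediate_size=512,
                    vocab_size=tokenizer.vocab_size)
model = BertForMaskedLM(config)

def mlm_stage(model, seq_length, steps, output_dir):
    """Run one MLM pre-training stage at a fixed sequence length."""
    tokenized = raw.map(
        lambda batch: tokenizer(batch["text"], truncation=True,
                                max_length=seq_length),
        batched=True, remove_columns=["text"])
    args = TrainingArguments(output_dir=output_dir, max_steps=steps,
                             per_device_train_batch_size=32)
    Trainer(model=model, args=args, train_dataset=tokenized,
            data_collator=collator).train()
    return model

# Stage 1: short sequences; Stage 2: continue on long sequences.
model = mlm_stage(model, seq_length=128, steps=1000, output_dir="stage1")
model = mlm_stage(model, seq_length=512, steps=250, output_dir="stage2")
```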

Model Capabilities

Masked Language Modeling
Text Feature Extraction (see the example after this list)
Sentiment Analysis
Text Classification
Sentence Matching
Natural Language Inference
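For feature extraction, the checkpoint can be loaded as a plain encoder via the standard Transformers API (model ID again inferred from the title):

```python
import torch
from transformers import BertModel, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("uer/chinese_roberta_L-2_H-128")
model = BertModel.from_pretrained("uer/chinese_roberta_L-2_H-128")

text = "这部电影非常精彩。"  # "This movie is excellent."
inputs = tokenizer(text, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Token-level features: shape (batch, seq_len, 128) for the L-2 H-128 model.
print(outputs.last_hidden_state.shape)
```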

Use Cases

Text Understanding
Sentiment Analysis: analyzes the sentiment tendency of user reviews; achieves 93.4% accuracy on Chinese sentiment analysis tasks (a fine-tuning sketch follows this section)
News Classification: automatically classifies news articles; achieves 65.1% accuracy on the CLUE news classification task
Semantic Understanding
Sentence Matching: determines the semantic similarity between two sentences; achieves 86.5% accuracy on sentence matching tasks
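Use cases such as sentiment analysis require adding a classification head and fine-tuning. A minimal sketch, assuming a generic labeled CSV with text and label columns (the file names and label count are placeholders, not the benchmark data behind the accuracy figures above):

```python
from datasets import load_dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

# Placeholder CSV files with "text" and "label" columns.
data = load_dataset("csv", data_files={"train": "reviews_train.csv",
                                       "test": "reviews_test.csv"})

tokenizer = AutoTokenizer.from_pretrained("uer/chinese_roberta_L-2_H-128")
model = AutoModelForSequenceClassification.from_pretrained(
    "uer/chinese_roberta_L-2_H-128", num_labels=2)  # binary sentiment assumed

data = data.map(lambda b: tokenizer(b["text"], truncation=True, max_length=128),
                batched=True)

args = TrainingArguments(output_dir="sentiment_ckpt", num_train_epochs=3,
                         per_device_train_batch_size=32)
trainer = Trainer(model=model, args=args, train_dataset=data["train"],
                  eval_dataset=data["test"], tokenizer=tokenizer)
trainer.train()
print(trainer.evaluate())  # reports eval loss; add compute_metrics for accuracy
```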