
Chinese RoBERTa L-8 H-256

Developed by uer
A Chinese RoBERTa model pretrained on CLUECorpusSmall, with 8 layers and a hidden size of 256, suitable for a wide range of Chinese NLP tasks.
Downloads: 15
Release Time: 3/2/2022

Model Overview

This model is the medium-sized member of the Chinese RoBERTa mini-model series. It follows the RoBERTa pretraining recipe and supports tasks such as masked language modeling and text feature extraction.

Model Features

Multiple Size Options
Offers 24 model variants with different parameter sizes to accommodate varying computational resource needs.
Chinese Optimization
Specifically pretrained and optimized for Chinese text.
Two-Stage Training
Employs a two-stage pretraining strategy: first with a sequence length of 128, then with a sequence length of 512.

Model Capabilities

Chinese Text Understanding
Masked Language Modeling
Text Feature Extraction
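To make the feature-extraction capability concrete, here is a minimal sketch using Hugging Face Transformers. The model id uer/chinese_roberta_L-8_H-256 is an assumption inferred from the model name and developer listed above.

```python
# Minimal sketch of Chinese text feature extraction with this model.
# Assumption: the Hugging Face model id is "uer/chinese_roberta_L-8_H-256",
# inferred from the model name and developer shown above.
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained("uer/chinese_roberta_L-8_H-256")
model = BertModel.from_pretrained("uer/chinese_roberta_L-8_H-256")

text = "北京是中国的首都。"  # "Beijing is the capital of China."
encoded_input = tokenizer(text, return_tensors="pt")
output = model(**encoded_input)

# last_hidden_state has shape (batch, seq_len, 256) for this H-256 variant.
print(output.last_hidden_state.shape)
```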

Use Cases

Text Understanding
Text Filling
Predict masked words or phrases
Accurately predicts masked words, e.g. filling in '中' for test sentences like 'Beijing is the capital of [MASK] country' (see the sketch below).
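A minimal sketch of this text-filling use case with the Transformers fill-mask pipeline, under the same model-id assumption; the Chinese sentence is the presumed original form of the quoted test case.

```python
# Sketch of the text-filling use case via the fill-mask pipeline.
# Assumption: model id "uer/chinese_roberta_L-8_H-256", as above.
from transformers import pipeline

unmasker = pipeline("fill-mask", model="uer/chinese_roberta_L-8_H-256")

# "Beijing is the capital of [MASK] country."; the expected fill is '中'.
for prediction in unmasker("北京是[MASK]国的首都。"):
    # Each prediction carries the filled token and its probability score.
    print(prediction["token_str"], round(prediction["score"], 3))
```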
Sentiment Analysis
Review Sentiment Judgment
Analyze text sentiment tendencies
Achieves a score of 88.7 on book review sentiment tasks.
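Scores like this come from fine-tuning the pretrained checkpoint on a downstream sentiment task. The sketch below shows one hedged way such fine-tuning could be set up with Transformers; the two toy sentences are purely illustrative and are not the benchmark data behind the 88.7 score.

```python
# Hedged sketch of fine-tuning for review sentiment classification.
# Assumptions: model id "uer/chinese_roberta_L-8_H-256" as above; the tiny
# in-memory dataset is illustrative only, not the book review benchmark.
import torch
from transformers import BertTokenizer, BertForSequenceClassification

model_id = "uer/chinese_roberta_L-8_H-256"
tokenizer = BertTokenizer.from_pretrained(model_id)
model = BertForSequenceClassification.from_pretrained(model_id, num_labels=2)

# Toy examples (1 = positive, 0 = negative).
texts = [
    "这本书写得非常好。",    # "This book is very well written."
    "情节拖沓，不值得读。",  # "The plot drags; not worth reading."
]
labels = torch.tensor([1, 0])
batch = tokenizer(texts, padding=True, return_tensors="pt")

# One illustrative training step; a real run would loop over a full dataset.
optimizer = torch.optim.AdamW(model.parameters(), lr=3e-5)
model.train()
outputs = model(**batch, labels=labels)  # cross-entropy loss computed internally
outputs.loss.backward()
optimizer.step()
```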