AIbase
# Chinese text understanding

## Chinese Roberta L 8 H 512
A Chinese RoBERTa model from uer, pre-trained on CLUECorpusSmall with 8 transformer layers and 512 hidden units; supports masked language modeling.
Tags: Large Language Model, Chinese
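The card above mentions masked language modeling (MLM), the pre-training objective these RoBERTa checkpoints support. As a minimal illustration of what that objective looks like, here is a sketch of BERT/RoBERTa-style input masking in plain Python. The `mask_tokens` helper, the 15% masking rate, and the 80/10/10 replacement split follow the original BERT recipe, not anything specific to these particular checkpoints.

```python
import random

def mask_tokens(tokens, mask_token="[MASK]", vocab=None, mask_prob=0.15, rng=None):
    """BERT/RoBERTa-style masking sketch.

    Roughly mask_prob of the positions are selected for prediction; of
    those, 80% are replaced with [MASK], 10% with a random vocabulary
    token, and 10% are left unchanged. The model is then trained to
    predict the original token at every selected position.
    """
    rng = rng or random.Random()
    vocab = vocab or tokens          # toy fallback vocabulary
    masked = list(tokens)
    labels = [None] * len(tokens)    # None = position not predicted
    for i, tok in enumerate(tokens):
        if rng.random() < mask_prob:
            labels[i] = tok          # remember the original as the target
            r = rng.random()
            if r < 0.8:
                masked[i] = mask_token
            elif r < 0.9:
                masked[i] = rng.choice(vocab)
            # else: keep the original token unchanged
    return masked, labels

masked, labels = mask_tokens(list("北京是中国的首都"), rng=random.Random(0))
```

At inference time, a model pre-trained this way can fill in a `[MASK]` slot directly, which is what the "masked language modeling tasks" in the card refers to.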
## Chinese Roberta L 6 H 256
A Chinese RoBERTa model from uer, pre-trained on CLUECorpusSmall with 6 transformer layers and 256 hidden units.
Tags: Large Language Model, Chinese
## Chinese Roberta L 2 H 256
A Chinese RoBERTa model from uer, pre-trained on CLUECorpusSmall with 2 transformer layers and 256 hidden units, suitable for a range of Chinese NLP tasks.
Tags: Large Language Model, Chinese
## Roberta Medium Word Chinese Cluecorpussmall
A medium-sized word-level Chinese RoBERTa model from uer, pre-trained on CLUECorpusSmall with 8 transformer layers and 512 hidden units. It outperforms the character-based models on multiple tasks.
Tags: Large Language Model, Chinese
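The word-level model above tokenizes text into words rather than single characters, which is the distinction the card's comparison rests on. Its actual tokenizer is not documented here; as a toy illustration of the general idea only, the following sketch performs greedy forward maximum-matching segmentation against a hypothetical mini-lexicon.

```python
def segment(text, lexicon, max_len=4):
    """Toy greedy forward maximum-matching word segmentation.

    At each position, try the longest candidate substring (up to
    max_len characters) that appears in the lexicon; fall back to a
    single character when nothing matches.
    """
    words, i = [], 0
    while i < len(text):
        for length in range(min(max_len, len(text) - i), 0, -1):
            cand = text[i:i + length]
            if length == 1 or cand in lexicon:
                words.append(cand)
                i += length
                break
    return words

# Hypothetical mini-lexicon for demonstration only.
lexicon = {"北京", "中国", "首都"}
print(segment("北京是中国的首都", lexicon))
# → ['北京', '是', '中国', '的', '首都']
```

A word-level vocabulary yields shorter input sequences than character-level tokenization, which is one common motivation for word-segmented Chinese models; the trade-off is a larger vocabulary and sensitivity to segmentation errors.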
© 2025 AIbase