
# Text Understanding Optimization

## Rbt4
**License:** Apache-2.0
A Chinese pretrained BERT model that uses a whole word masking strategy, released by the Harbin Institute of Technology-iFLYTEK Joint Laboratory to accelerate Chinese natural language processing research.
**Tags:** Large Language Model, Chinese
**Publisher:** hfl · **Downloads:** 22 · **Likes:** 6
## Chinese Bert Wwm Ext
**License:** Apache-2.0
A Chinese pretrained BERT model that employs a whole word masking strategy, aimed at accelerating Chinese natural language processing research.
**Tags:** Large Language Model, Chinese
**Publisher:** hfl · **Downloads:** 24.49k · **Likes:** 174
## Rbt6
**License:** Apache-2.0
A 6-layer RoBERTa-wwm-ext model retrained with the whole word masking technique for Chinese pretraining.
**Tags:** Large Language Model, Chinese
**Publisher:** hfl · **Downloads:** 796 · **Likes:** 9
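
All three models are published under the hfl organization on Hugging Face and, per hfl's model cards, are loaded with the standard BERT classes even for the RoBERTa-based variants. The following is a minimal sketch, assuming the `hfl/chinese-bert-wwm-ext` checkpoint ID and the `transformers` library; swap in `hfl/rbt4` or `hfl/rbt6` for the smaller retrained variants.

```python
# Minimal sketch: load a whole-word-masking Chinese BERT checkpoint
# and extract a simple sentence representation.
# Assumption: checkpoint ID "hfl/chinese-bert-wwm-ext" as hosted on
# Hugging Face; the hfl model cards recommend BertTokenizer/BertModel.
import torch
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained("hfl/chinese-bert-wwm-ext")
model = BertModel.from_pretrained("hfl/chinese-bert-wwm-ext")
model.eval()

# Whole word masking only changes pretraining; inference-time usage
# is identical to a standard Chinese BERT.
inputs = tokenizer("哈工大讯飞联合实验室发布了中文预训练模型。",
                   return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# last_hidden_state has shape (batch, seq_len, hidden_size);
# the [CLS] vector serves as a crude sentence embedding.
cls_embedding = outputs.last_hidden_state[:, 0, :]
print(cls_embedding.shape)  # torch.Size([1, 768]) for the base model
```

The same code works unchanged for the distilled `rbt4`/`rbt6` checkpoints, which trade depth (4 and 6 transformer layers) for faster inference while keeping the 768-dimensional hidden size.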