Chinese BERT WWM Ext UPOS
BERT model pre-trained on Chinese Wikipedia texts for POS tagging and dependency parsing.
Downloads 21
Release Time: 3/2/2022
Model Overview
This model is built on the pre-trained hfl/chinese-bert-wwm-ext and is fine-tuned for Chinese POS tagging (UPOS) and dependency parsing.
Model Features
Wikipedia pre-training
Pre-trained on Chinese Wikipedia texts (including Simplified and Traditional), offering broad language coverage.
Universal POS tagging (UPOS)
Adopts the Universal Dependencies (UD) POS tag set, so its output is consistent with cross-lingual UD annotation standards.
Supports dependency parsing
In addition to POS tagging, it can perform dependency parsing to understand sentence structure relationships.
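A dependency parse assigns each word a head word and a relation label, which together encode the sentence's structure. The sketch below represents a hand-written UD-style parse of 我爱你 ("I love you") and queries it; the parse itself is an illustrative assumption, not output of this model.

```python
# Hand-written UD-style dependency parse of "我爱你" ("I love you").
# Each token records its 1-based head index (0 = root) and relation label.
parse = [
    {"id": 1, "form": "我", "upos": "PRON", "head": 2, "deprel": "nsubj"},
    {"id": 2, "form": "爱", "upos": "VERB", "head": 0, "deprel": "root"},
    {"id": 3, "form": "你", "upos": "PRON", "head": 2, "deprel": "obj"},
]

def root(tokens):
    """Return the form of the sentence root (head == 0)."""
    return next(t["form"] for t in tokens if t["head"] == 0)

def dependents(tokens, head_id):
    """Return (form, deprel) pairs of tokens governed by head_id."""
    return [(t["form"], t["deprel"]) for t in tokens if t["head"] == head_id]

print(root(parse))           # -> 爱
print(dependents(parse, 2))  # -> [('我', 'nsubj'), ('你', 'obj')]
```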
Model Capabilities
Chinese POS tagging
Dependency parsing
Chinese text processing
Use Cases
Natural Language Processing
Chinese text analysis
Perform POS tagging and syntactic analysis on Chinese texts
Obtain POS tags for each word and sentence structure relationships
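UD-style analyses of this kind are conventionally serialized in the tab-separated CoNLL-U format, where the second column is the word form and the fourth is its UPOS tag. The sketch below pulls per-word POS tags out of a hand-written CoNLL-U fragment; the sentence and its annotation are an illustrative assumption, not output of this model.

```python
# Extract (word, UPOS) pairs from a CoNLL-U fragment. The fragment below
# (他学习中文, "He studies Chinese") is hand-written for illustration.
conllu = (
    "1\t他\t他\tPRON\t_\t_\t2\tnsubj\t_\t_\n"
    "2\t学习\t学习\tVERB\t_\t_\t0\troot\t_\t_\n"
    "3\t中文\t中文\tNOUN\t_\t_\t2\tobj\t_\t_"
)

def word_upos(text):
    """Return (FORM, UPOS) pairs, one per token line."""
    pairs = []
    for line in text.splitlines():
        cols = line.split("\t")
        pairs.append((cols[1], cols[3]))  # FORM is column 2, UPOS column 4
    return pairs

print(word_upos(conllu))  # -> [('他', 'PRON'), ('学习', 'VERB'), ('中文', 'NOUN')]
```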
Linguistic research
Used for Chinese grammar research and linguistic feature analysis
Provides standardized linguistic analysis data
© 2025 AIbase