zh_core_web_trf
A Transformer-based Chinese natural language processing pipeline that provides part-of-speech tagging, dependency parsing, named entity recognition, and more.
Downloads: 67
Release date: 3/2/2022
Model Overview
This is a Chinese natural language processing model built on a BERT Transformer architecture, supporting core NLP tasks such as part-of-speech tagging, dependency parsing, and named entity recognition.
Model Features
BERT-based Transformer architecture
Uses bert-base-chinese as the base model, with strong contextual understanding capabilities
Multi-task processing capability
A single model supports part-of-speech tagging, dependency parsing, and named entity recognition simultaneously
High-precision part-of-speech tagging
Part-of-speech tagging accuracy reaches 91.75%
Model Capabilities
Part-of-speech tagging
Dependency parsing
Named entity recognition
Sentence segmentation
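The model name suggests this pipeline is distributed as spaCy's zh_core_web_trf package; under that assumption, the capabilities listed above can be exercised with a few lines of Python. This is a minimal sketch, and the sample sentence and printed attributes are illustrative only.

```python
# Minimal sketch, assuming the model is installed as spaCy's zh_core_web_trf package,
# e.g. via: python -m spacy download zh_core_web_trf
import spacy

nlp = spacy.load("zh_core_web_trf")              # load the transformer pipeline
doc = nlp("苹果公司计划在北京开设新的研发中心。")       # illustrative sentence (assumed)

# Sentence segmentation
for sent in doc.sents:
    print("SENT:", sent.text)

# Part-of-speech tagging and dependency parsing
for token in doc:
    print(token.text, token.pos_, token.dep_, token.head.text)

# Named entity recognition
for ent in doc.ents:
    print("ENT:", ent.text, ent.label_)
```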
Use Cases
Text analysis
Chinese text parsing
Performs grammatical analysis and structural understanding of Chinese text
Can identify relationships between sentence constituents as well as each word's part of speech; see the sketch below
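As a sketch of how the parsing output can be read, the snippet below walks the dependency tree exposed by spaCy's token API (again assuming the zh_core_web_trf package); the sentence and labels shown are illustrative.

```python
# Sketch of inspecting the dependency tree, assuming the spaCy pipeline above.
import spacy

nlp = spacy.load("zh_core_web_trf")
doc = nlp("他昨天在上海买了一本书。")  # illustrative sentence

# Each token points to its syntactic head; the sentence root is labeled ROOT.
for token in doc:
    print(f"{token.text:>4} --{token.dep_}--> {token.head.text}")

# The root's direct dependents give the top-level sentence components.
root = next(t for t in doc if t.dep_ == "ROOT")
print("root:", root.text, "| direct dependents:", [c.text for c in root.children])
```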
Information extraction
Entity recognition
Identifies entities such as person names, locations, and organizations from text
NER F-score reaches 74.08%
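For the information-extraction use case, a short sketch of filtering recognized entities by type follows; it assumes the spaCy pipeline above and OntoNotes-style labels (PERSON, ORG, GPE, LOC) commonly used by spaCy's Chinese models, and the sentence and printed output are illustrative.

```python
# Sketch of entity extraction, assuming the spaCy pipeline and OntoNotes-style labels.
import spacy

nlp = spacy.load("zh_core_web_trf")
doc = nlp("李明于2021年加入阿里巴巴，常驻杭州。")  # illustrative sentence

wanted = {"PERSON", "ORG", "GPE", "LOC"}
entities = [(ent.text, ent.label_) for ent in doc.ents if ent.label_ in wanted]
print(entities)  # e.g. [('李明', 'PERSON'), ('阿里巴巴', 'ORG'), ('杭州', 'GPE')]
```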