BERT Large German UPOS
BERT model pre-trained on UD_German-HDT for German POS tagging and dependency parsing
Sequence Labeling
Transformers · Supports Multiple Languages · Open Source License: MIT · Tags: German POS Tagging, Dependency Parsing, BERT Large Model

Downloads: 41
Release Time: 3/11/2022
Model Overview
This is a BERT model based on gbert-large and pre-trained on the UD_German-HDT treebank, specifically designed for German Universal POS (UPOS) tagging and dependency parsing.
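As a minimal loading sketch: the card does not state the exact repository name, so the Hugging Face model ID below is an assumption to be replaced with the actual one. The model is loaded as a standard token-classification checkpoint whose label set corresponds to UD UPOS tags.

```python
from transformers import AutoTokenizer, AutoModelForTokenClassification

# Assumed Hugging Face model ID; substitute the actual repository name.
model_id = "KoichiYasuoka/bert-large-german-upos"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForTokenClassification.from_pretrained(model_id)

# id2label maps class indices to UD UPOS tags such as NOUN, VERB, ADJ.
print(model.config.id2label)
```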
Model Features
German-specific Model
BERT model optimized specifically for German, built on the gbert-large architecture
Universal POS Tagging
Supports Universal Dependencies (UD) standard POS tagging (UPOS)
Dependency Parsing
Capable of analyzing grammatical relationships between words in sentences
Model Capabilities
German text processing
POS tagging
Dependency parsing
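The POS-tagging capability can be exercised with the standard Transformers token-classification pipeline. The sketch below reuses the assumed model ID from above; the example sentence is illustrative, and the predicted labels are expected to be UD UPOS tags (e.g. DET, NOUN, VERB).

```python
from transformers import pipeline

# Assumed model ID; replace with the actual repository name.
tagger = pipeline(
    "token-classification",
    model="KoichiYasuoka/bert-large-german-upos",
    aggregation_strategy="simple",  # merge subword pieces into whole words
)

sentence = "Die Katze schläft auf dem Sofa."
for token in tagger(sentence):
    # entity_group holds the predicted UPOS tag for the aggregated word
    print(token["word"], token["entity_group"])
```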
Use Cases
Natural Language Processing
German Text Analysis
Performs grammatical analysis and POS tagging on German texts
Produces a POS tag for each word together with the dependency relationships that describe sentence structure
Linguistic Research
Used for studying German syntactic structures and POS distributions