
StructBERT Large Zh

Developed by junnyu
StructBERT is a novel model that extends BERT by incorporating linguistic structures into the pre-training process, leveraging two auxiliary tasks to fully exploit word-order and sentence-order structures.
Downloads 77
Release Time: 5/18/2022

Model Overview

StructBERT is an improved BERT model that enhances language understanding at both the word and sentence levels by incorporating linguistic structures during pre-training.

Model Features

Structure-aware Pre-training
Utilizes word and sentence order structures through two auxiliary tasks during pre-training
Deep Language Understanding
Better captures linguistic structures at both word and sentence levels
Large-scale Pre-training
Based on the BERT-large architecture with 330 million parameters
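To make the structure-aware pre-training idea concrete, the word-level auxiliary task can be sketched as follows: a trigram inside the input is shuffled, and the model must recover the original order. This is an illustrative stand-in written for this page, not the authors' actual implementation; the function name and interface are assumptions.

```python
import random

def make_trigram_shuffle_example(tokens, seed=0):
    """Sketch of a word-structural training example in the spirit of
    StructBERT's auxiliary objective: pick a random trigram, shuffle it,
    and keep the original sequence as the prediction target.
    (Illustrative only; not the official pre-training code.)"""
    rng = random.Random(seed)
    if len(tokens) < 3:
        return tokens, tokens  # too short to shuffle a trigram
    start = rng.randrange(len(tokens) - 2)
    target_span = tokens[start:start + 3]
    shuffled = target_span[:]
    while shuffled == target_span:  # ensure the order actually changes
        rng.shuffle(shuffled)
    corrupted = tokens[:start] + shuffled + tokens[start + 3:]
    return corrupted, tokens  # (model input, reconstruction target)
```

During pre-training, the model would be asked to predict the original token order of the corrupted span, which forces it to learn local word-order structure.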

Model Capabilities

Text Classification
Natural Language Inference
Semantic Similarity Calculation
Question Answering Systems
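For the semantic-similarity capability, a common recipe is to take a sentence embedding from the model (for example, a pooled output) and compare two sentences by cosine similarity. The sketch below shows only the comparison step with toy vectors standing in for real model outputs; the helper name is an assumption.

```python
import math

def cosine_similarity(u, v):
    """Cosine similarity between two embedding vectors.
    In practice, u and v would be sentence embeddings produced by
    StructBERT (e.g. a pooled hidden state); here they are toy
    stand-ins, not real model outputs."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)
```

A score near 1.0 indicates semantically similar sentences, while a score near 0 indicates unrelated ones; a threshold on this score then yields a similarity decision.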

Use Cases

Natural Language Processing
Text Classification
Used for tasks like news categorization
Achieved 68.67% accuracy on TNEWS dataset
Natural Language Inference
Determines logical relationships between sentences
Achieved 84.47% accuracy on CMNLI dataset