Megatron Bert Large Swedish Cased 165 Zero Shot
Swedish zero-shot classification model based on Megatron-BERT-large-165k architecture, fine-tuned sequentially on QNLI and MNLI tasks
Downloads: 70
Release Time: 12/14/2022
Model Overview
This model is designed for zero-shot classification of Swedish text: it can assign labels from an arbitrary candidate set without any task-specific training.
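The card includes no usage code; below is a minimal sketch using the Hugging Face `zero-shot-classification` pipeline. The model id is an assumption reconstructed from the card title, and the Swedish hypothesis template is illustrative; verify both against the actual model repository.

```python
def build_hypotheses(labels, template="Detta exempel handlar om {}."):
    """Turn candidate labels into Swedish NLI hypotheses ("This example is about {}.")."""
    return [template.format(label) for label in labels]

def classify(text, labels):
    # Import inside the function so the sketch loads even without
    # transformers installed; the model id is assumed from the card title.
    from transformers import pipeline
    clf = pipeline(
        "zero-shot-classification",
        model="KBLab/megatron-bert-large-swedish-cased-165-zero-shot",
    )
    return clf(
        text,
        candidate_labels=labels,
        hypothesis_template="Detta exempel handlar om {}.",
    )
```

Calling `classify("Sverige vann matchen i går.", ["sport", "politik", "ekonomi"])` would return the candidate labels ranked by score, with the full label list and per-label probabilities in the result dict.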
Model Features
Zero-shot classification capability
Can classify text without task-specific training
Swedish language optimization
Specifically trained and optimized for Swedish text
Large BERT architecture
Based on the powerful Megatron-BERT-large architecture
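Since the model was fine-tuned on NLI tasks (QNLI and MNLI), zero-shot classification works by treating each candidate label as an entailment hypothesis and normalizing the entailment scores across labels. A self-contained sketch of that final scoring step, using made-up entailment logits (the numbers are illustrative, not model outputs):

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of logits."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical entailment logits the NLI head might produce for one input
# text against three candidate-label hypotheses (illustrative values only).
entailment_logits = {"sport": 4.1, "politik": -1.3, "ekonomi": 0.2}

labels = list(entailment_logits)
probs = softmax(list(entailment_logits.values()))

# Rank candidate labels by normalized entailment probability.
ranked = sorted(zip(labels, probs), key=lambda pair: pair[1], reverse=True)
```

The highest-entailment hypothesis wins, so with these logits the text would be labeled "sport"; the softmax ensures the scores over the candidate set sum to 1.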
Model Capabilities
Swedish text classification
Zero-shot learning
Multi-category classification
Use Cases
Text classification
News categorization
Automatically categorize Swedish news into different topic categories
Achieved 92.7% accuracy in entertainment news classification
Content moderation
Identify and classify user-generated content