Bert Large Swedish Cased

Developed by AI-Nordics
A Swedish BERT large model implemented with the Megatron-LM framework, containing 340 million parameters and pre-trained on 85 GB of Swedish text.
Release date: 3/2/2022

Model Overview

This is a large-scale Swedish BERT model used primarily for masked language modeling and next-sentence prediction; it can also be fine-tuned for domain-specific tasks.
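The sketch below shows minimal masked language modeling with the Hugging Face transformers fill-mask pipeline. The repository id AI-Nordics/bert-large-swedish-cased is inferred from the developer and model name above and is an assumption.

# Minimal sketch: masked-token prediction with the transformers
# fill-mask pipeline. The repo id is assumed, not confirmed by the card.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="AI-Nordics/bert-large-swedish-cased")

# Swedish: "The capital of Sweden is [MASK]."
for prediction in fill_mask("Huvudstaden i Sverige är [MASK]."):
    print(prediction["token_str"], round(prediction["score"], 4))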

Model Features

Large-scale pre-training
Trained on a diverse 85 GB Swedish corpus covering domains such as politics, law, and healthcare
Deep model architecture
Uses a 24-layer Transformer with 16 attention heads and a context length of 1024 tokens (these values can be read from the model config, as sketched after this list)
Extensive data sources
Integrates more than ten data sources, including Wikipedia, government reports, literary works, and web data
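As a quick check of the architecture figures above, the sketch below reads them from the published model config instead of hard-coding them; the repository id is again an assumption.

# Sketch: reading architecture hyperparameters from the model config.
from transformers import AutoConfig

config = AutoConfig.from_pretrained("AI-Nordics/bert-large-swedish-cased")
print(config.num_hidden_layers)        # 24 Transformer layers, per the card
print(config.num_attention_heads)      # 16 attention heads, per the card
print(config.max_position_embeddings)  # 1024-token context, per the card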

Model Capabilities

Swedish text understanding
Masked language modeling
Next sentence prediction
Text feature extraction
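For text feature extraction, one common approach (sketched below; not prescribed by the model card) is to mean-pool the final hidden states into a sentence embedding. PyTorch and the assumed repository id are used.

# Sketch: sentence embedding via mean-pooling of the last hidden states.
import torch
from transformers import AutoModel, AutoTokenizer

name = "AI-Nordics/bert-large-swedish-cased"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(name)
model = AutoModel.from_pretrained(name)

inputs = tokenizer("Stockholm är Sveriges huvudstad.", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Mean-pool token states, ignoring padding via the attention mask.
mask = inputs["attention_mask"].unsqueeze(-1)
embedding = (outputs.last_hidden_state * mask).sum(dim=1) / mask.sum(dim=1)
print(embedding.shape)  # (1, 1024): BERT-large hidden size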

Use Cases

Text processing
Swedish text classification: classify Swedish text by fine-tuning the model (see the fine-tuning sketch after this list)
Question answering systems: serve as the base model for building Swedish question answering systems
Information extraction
Named entity recognition: identify specific entities in Swedish text
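The sketch below shows one way to fine-tune the model for Swedish text classification with the transformers Trainer API. The tiny inline dataset, label scheme, and output directory are placeholders for illustration only.

# Sketch: fine-tuning for binary Swedish text classification.
from datasets import Dataset
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

name = "AI-Nordics/bert-large-swedish-cased"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(name)
model = AutoModelForSequenceClassification.from_pretrained(name, num_labels=2)

# Placeholder two-example dataset; substitute a real labeled corpus.
train_data = Dataset.from_dict({
    "text": ["Mycket bra film.", "Riktigt dålig service."],
    "label": [1, 0],
}).map(
    lambda batch: tokenizer(
        batch["text"], truncation=True, padding="max_length", max_length=64
    ),
    batched=True,
)

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="bert-swedish-clf",  # placeholder output path
        num_train_epochs=1,
        per_device_train_batch_size=2,
    ),
    train_dataset=train_data,
)
trainer.train()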