
Bert Base Swedish Cased

Developed by KB
A Swedish BERT base model released by the National Library of Sweden / KBLab, trained on text from multiple sources
Downloads 11.16k
Release Time: 6/7/2022

Model Overview

A Swedish pre-trained language model based on the BERT base architecture, trained on diverse sources including books, news, and government publications.
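
Below is a minimal usage sketch for masked-token prediction with the Hugging Face transformers library. It assumes the checkpoint is published under the model id KB/bert-base-swedish-cased; adjust the id if the release uses a different name.

```python
# Minimal sketch: masked-token prediction with the Swedish BERT base model.
# The model id "KB/bert-base-swedish-cased" is an assumption based on the
# KB/KBLab attribution above; substitute the actual id if it differs.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="KB/bert-base-swedish-cased")

# "Stockholm is the capital of [MASK]." in Swedish.
for prediction in fill_mask("Stockholm är huvudstaden i [MASK]."):
    print(f"{prediction['token_str']:>12}  score={prediction['score']:.3f}")
```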

Model Features

Multi-source training data
Trained on approximately 15-20 GB of diverse Swedish text from books, news, government publications, and other sources
Whole word masking
Uses the Whole Word Masking technique during pre-training to improve language understanding
Case-sensitive
Preserves the original casing of the input text, making it suitable for applications that require case sensitivity (see the tokenizer sketch after this list)
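
A short sketch of the case-sensitive tokenizer behaviour, again assuming the model id KB/bert-base-swedish-cased. The exact sub-word splits depend on the released vocabulary, so the outputs are printed rather than asserted.

```python
# Sketch: a cased tokenizer keeps upper- and lower-case forms distinct, so
# "Sverige" and "sverige" may map to different sub-word sequences.
# Assumes the Hugging Face model id "KB/bert-base-swedish-cased".
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("KB/bert-base-swedish-cased")

for text in ["Sverige", "sverige", "KB", "kb"]:
    print(text, "->", tokenizer.tokenize(text))
```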

Model Capabilities

Text understanding
Named entity recognition
Semantic analysis

Use Cases

Information extraction
Named entity recognition
Identify entities such as person names, locations, and organizations in text
The NER model fine-tuned on the SUC 3.0 dataset can recognize 5 entity types (see the sketch at the end of this section)
Text analysis
Semantic understanding
Used for building question-answering systems or chatbots
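
A hedged sketch of the named entity recognition use case with transformers. The fine-tuned checkpoint id KB/bert-base-swedish-cased-ner is an assumption based on the SUC 3.0 fine-tune referenced above, and the label set depends on that checkpoint.

```python
# Sketch: NER with a Swedish BERT checkpoint fine-tuned on SUC 3.0.
# The model id "KB/bert-base-swedish-cased-ner" is an assumption; substitute
# the actual fine-tuned checkpoint if it is published under a different name.
from transformers import pipeline

ner = pipeline(
    "ner",
    model="KB/bert-base-swedish-cased-ner",
    aggregation_strategy="simple",  # merge word pieces into whole entities
)

# "Kalle Anka lives in Stockholm and works at the National Library."
text = "Kalle Anka bor i Stockholm och arbetar på Kungliga biblioteket."
for entity in ner(text):
    print(f"{entity['word']:<25} {entity['entity_group']:<8} {entity['score']:.3f}")
```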