
BERT Base Swedish Cased NER

Developed by KB
Swedish BERT base model released by the National Library of Sweden/KBLab, trained on multi-source texts
Downloads 20.77k
Release Time: 6/7/2022

Model Overview

A pretrained Swedish language model based on the BERT architecture, trained on a variety of text types including books, news, and government publications

Model Features

Multi-source Training Data
Training data is drawn from books, news, government publications, Wikipedia, and online forums, giving broad coverage of Swedish text
Whole Word Masking Training
Pretrained using the Whole Word Masking (WWM) technique
Case-sensitive
The model preserves the original casing of the input text

Model Capabilities

Text Representation Learning
Named Entity Recognition
Language Understanding
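
As a rough illustration of the text representation capability listed above, the sketch below loads the checkpoint with the Hugging Face transformers library and extracts contextual embeddings. The model identifier KB/bert-base-swedish-cased-ner and the example sentence are assumptions for illustration, not details taken from this card.

```python
# Minimal sketch: extracting contextual text representations with transformers.
# The model ID "KB/bert-base-swedish-cased-ner" is an assumed Hugging Face Hub
# identifier for this checkpoint; adjust it if your copy lives elsewhere.
import torch
from transformers import AutoModel, AutoTokenizer

model_id = "KB/bert-base-swedish-cased-ner"  # assumed Hub ID
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id)  # loads only the BERT encoder

text = "Kalle bor i Stockholm och arbetar på Kungliga biblioteket."
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# Token-level contextual embeddings: (batch, sequence_length, hidden_size)
token_embeddings = outputs.last_hidden_state
# A simple sentence representation: mean-pool the token embeddings
sentence_embedding = token_embeddings.mean(dim=1)
print(token_embeddings.shape, sentence_embedding.shape)
```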

Use Cases

Information Extraction
Named Entity Recognition
Identifying entities such as person names, locations, and organizations in text
The model, fine-tuned on the SUC 3.0 dataset, can recognize five entity types (see the sketch after this section)
Text Analysis
Semantic Understanding
Used for building advanced Swedish NLP applications
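
To make the named entity recognition use case concrete, here is a minimal sketch using the Hugging Face transformers pipeline API. The model identifier, the example sentence, and the printed labels are assumptions for illustration; the exact label set depends on the SUC 3.0 fine-tuning.

```python
# Hedged sketch of the NER use case, assuming the checkpoint is published on the
# Hugging Face Hub as "KB/bert-base-swedish-cased-ner".
from transformers import pipeline

ner = pipeline(
    "token-classification",
    model="KB/bert-base-swedish-cased-ner",  # assumed Hub ID
    aggregation_strategy="simple",           # merge word pieces into whole entities
)

text = "Anna Svensson arbetar på Volvo i Göteborg."
for entity in ner(text):
    # Each result carries the entity text, its predicted label, and a confidence score
    print(entity["word"], entity["entity_group"], round(entity["score"], 3))
```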