DanskBERT

Developed by vesteinn
DanskBERT is a language model optimized for Danish that performs strongly on the Danish ScandEval benchmark.
Downloads 151
Release Date: 11/23/2022

Model Overview

DanskBERT is a pre-trained language model based on the RoBERTa architecture. It is optimized specifically for Danish and performs well on masked language modeling tasks.
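For example, masked-word prediction can be run with the Hugging Face transformers library. The sketch below assumes the model is published on the Hub as vesteinn/DanskBERT and uses the RoBERTa-style <mask> token; the Danish example sentence is illustrative only.

from transformers import pipeline

# Load a fill-mask pipeline backed by DanskBERT (assumes the Hub id
# vesteinn/DanskBERT; downloads the model on first use).
fill_mask = pipeline("fill-mask", model="vesteinn/DanskBERT")

# RoBERTa-based models use "<mask>" as the mask token.
# "Der bor mange mennesker i <mask>." = "Many people live in <mask>."
for prediction in fill_mask("Der bor mange mennesker i <mask>."):
    print(f"{prediction['token_str']!r}: {prediction['score']:.3f}")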

Model Features

Danish Optimization
Trained and optimized specifically for Danish, yielding strong performance on Danish language tasks.
High Performance
Achieved the best performance on the Danish ScandEval benchmark.
Large-scale Training
Trained for two weeks using 16 V100 GPUs, totaling 500,000 steps.

Model Capabilities

Danish Text Understanding
Masked Language Prediction
Danish Natural Language Processing

Use Cases

Natural Language Processing
Text Completion
Predicts masked Danish words in sentences.
Accurately recovers the missing words.
Language Model Fine-tuning
Used as a base model for downstream Danish NLP tasks; see the sketch after this list.
Improves downstream task performance.
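The fine-tuning use case follows the standard transformers pattern. The sketch below is a minimal, hypothetical setup: it attaches a freshly initialized classification head to DanskBERT; the label count, example sentence, and training loop are assumptions that depend entirely on your dataset.

from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Load DanskBERT as the base encoder for a downstream classifier
# (assumes the Hub id vesteinn/DanskBERT; the classification head is
# randomly initialized and must be fine-tuned on labeled Danish data).
tokenizer = AutoTokenizer.from_pretrained("vesteinn/DanskBERT")
model = AutoModelForSequenceClassification.from_pretrained(
    "vesteinn/DanskBERT",
    num_labels=2,  # e.g. binary sentiment; adjust for your task
)

# A single forward pass over one example; in practice the model is
# trained with transformers.Trainer or a custom PyTorch loop.
inputs = tokenizer("En rigtig god film!", return_tensors="pt")
logits = model(**inputs).logits
print(logits.shape)  # torch.Size([1, 2])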