HPLT BERT Base Sk

Developed by HPLT
A monolingual Slovak BERT model released by the HPLT project, trained with the LTG-BERT architecture and suited to masked language modeling tasks.
Downloads: 23
Release date: 4/22/2024

Model Overview

This is a monolingual Slovak BERT model trained on the HPLT 1.2 data release using the improved LTG-BERT architecture, and it is primarily intended for masked language modeling.
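A minimal loading sketch with the Hugging Face transformers library is shown below. The repository id "HPLT/hplt_bert_base_sk" and the need for trust_remote_code=True (LTG-BERT ships custom modeling code) are assumptions, not details stated on this page.

```python
# Minimal loading sketch. The Hub repository id "HPLT/hplt_bert_base_sk" and
# trust_remote_code=True (for the custom LTG-BERT modeling code) are assumptions.
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("HPLT/hplt_bert_base_sk")
model = AutoModelForMaskedLM.from_pretrained(
    "HPLT/hplt_bert_base_sk", trust_remote_code=True
)
```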

Model Features

Monolingual Optimization
Trained specifically for Slovak, using the Slovak portion of the HPLT dataset.
Improved Architecture
Uses the enhanced LTG-BERT architecture, which offers performance improvements over standard BERT.
Intermediate Checkpoints
Ten intermediate checkpoints from training are released, allowing analysis of how the model evolves during training (see the loading sketch after this list).
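If the intermediate checkpoints are published as Git revisions (branches) of the same Hub repository, they could be loaded as sketched below; both the repository id and the revision name are hypothetical.

```python
# Sketch: loading one intermediate training checkpoint, assuming checkpoints
# are exposed as Git revisions of the Hub repository. The repository id and
# the revision name "step21875" are hypothetical.
from transformers import AutoModelForMaskedLM

checkpoint_model = AutoModelForMaskedLM.from_pretrained(
    "HPLT/hplt_bert_base_sk",
    revision="step21875",   # hypothetical intermediate-checkpoint branch
    trust_remote_code=True,
)
```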

Model Capabilities

Masked Language Modeling
Text Understanding
Sequence Classification
Token Classification
Question Answering
Multiple Choice Tasks

Use Cases

Natural Language Processing
Text Completion
Predicting masked words; for example, the model correctly predicted 'place' to complete a sentence (see the prediction sketch at the end of this list).
Text Classification
Classifying Slovak texts
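The following self-contained sketch shows masked-word prediction on a Slovak sentence. The repository id, the "[MASK]" token string, the trust_remote_code flag, and the example prompt are all assumptions made for illustration.

```python
# Self-contained sketch of masked-word prediction in Slovak. The repository id,
# the "[MASK]" token string, and trust_remote_code=True are assumptions.
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

repo = "HPLT/hplt_bert_base_sk"  # assumed Hub identifier
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForMaskedLM.from_pretrained(repo, trust_remote_code=True)
model.eval()

# "Bratislava is the capital [MASK] of Slovakia." -- the masked word should be "mesto" (city).
text = "Bratislava je hlavné [MASK] Slovenska."
inputs = tokenizer(text, return_tensors="pt")
mask_id = tokenizer.convert_tokens_to_ids("[MASK]")

with torch.no_grad():
    logits = model(**inputs).logits

# Take the highest-scoring token at each [MASK] position and decode it.
mask_positions = (inputs.input_ids == mask_id).nonzero(as_tuple=True)
predicted_ids = logits[mask_positions].argmax(dim=-1)
print(tokenizer.decode(predicted_ids))
```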