
RobBERTje 1 GB Shuffled

Developed by DTAI-KULeuven
RobBERTje is a collection of distilled Dutch language models based on RobBERT; this 74M-parameter variant was trained on a shuffled 1 GB sample of the OSCAR corpus
Downloads: 508
Release date: 3/2/2022

Model Overview

A distilled Dutch language model based on RobBERT, trained on a shuffled corpus and suitable for a range of Dutch NLP tasks
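As a minimal usage sketch, the snippet below runs masked-token prediction with the transformers fill-mask pipeline; the model ID DTAI-KULeuven/robbertje-1-gb-shuffled is an assumption and should be checked against the actual repository.

```python
# Minimal usage sketch, assuming the Hugging Face model ID
# "DTAI-KULeuven/robbertje-1-gb-shuffled" and a RoBERTa-style "<mask>" token.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="DTAI-KULeuven/robbertje-1-gb-shuffled")

# Dutch example: "Er loopt een <mask> in de tuin." ("A <mask> is walking in the garden.")
for prediction in fill_mask("Er loopt een <mask> in de tuin."):
    print(prediction["token_str"], round(prediction["score"], 3))
```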

Model Features

Corpus-optimized training
Trained on a shuffled version of the OSCAR corpus to improve the model's handling of non-contiguous text
Efficient distillation
Retains over 90% of the teacher model RobBERT's performance through knowledge distillation while using roughly 40% fewer parameters (a parameter-count sketch follows this list)
Multi-task adaptation
Performs well on downstream tasks such as DBRD sentiment analysis and named entity recognition (NER)
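As a quick sanity check of the reported size, the hedged snippet below loads the checkpoint and counts its parameters; the model ID is again an assumption, not confirmed by this page.

```python
# Hedged sanity check of the reported model size, assuming the model ID
# "DTAI-KULeuven/robbertje-1-gb-shuffled".
from transformers import AutoModel

model = AutoModel.from_pretrained("DTAI-KULeuven/robbertje-1-gb-shuffled")
n_params = sum(p.numel() for p in model.parameters())
print(f"{n_params / 1e6:.0f}M parameters")  # the card reports roughly 74M
```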

Model Capabilities

Dutch text understanding
Masked language modeling
Sentiment analysis
Named entity recognition
Part-of-speech tagging
Natural language inference

Use Cases

Text analysis
News sentiment analysis
Analyze sentiment tendencies in Dutch news comments (a fine-tuning sketch follows the use cases)
Achieved 92.5% accuracy on the DBRD dataset
Information extraction
Legal document processing
Extract key entities and relationships from legal documents
Reached an F1 score of 82.7 on the NER task
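For the sentiment-analysis use case, the following is a minimal fine-tuning sketch using the Hugging Face Trainer. The model ID DTAI-KULeuven/robbertje-1-gb-shuffled and the dataset ID benjaminvdb/DBRD are assumptions, not confirmed by this page, and the hyperparameters are illustrative only.

```python
# Hedged fine-tuning sketch for Dutch sentiment classification on DBRD.
# Assumptions: model ID "DTAI-KULeuven/robbertje-1-gb-shuffled" and dataset ID
# "benjaminvdb/DBRD" on the Hugging Face Hub; hyperparameters are illustrative.
from datasets import load_dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

model_id = "DTAI-KULeuven/robbertje-1-gb-shuffled"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id, num_labels=2)

dataset = load_dataset("benjaminvdb/DBRD")  # Dutch book reviews, binary sentiment

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=512)

tokenized = dataset.map(tokenize, batched=True)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="robbertje-dbrd", num_train_epochs=3,
                           per_device_train_batch_size=16),
    train_dataset=tokenized["train"],
    eval_dataset=tokenized["test"],
    tokenizer=tokenizer,  # enables dynamic padding via the default data collator
)
trainer.train()
print(trainer.evaluate())
```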