RoBERTa Polish KGR10
This is a Polish RoBERTa model pre-trained on the KGR10 corpus; the released checkpoint corresponds to roughly 5% of the planned training duration. Incremental versions will be released as training progresses.
Downloads: 34
Release date: 3/2/2022
Model Overview
This is a pre-trained Polish language model based on the RoBERTa architecture, suitable for a wide range of Polish natural language processing tasks.
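A minimal loading sketch with the Hugging Face transformers library is shown below; the repository identifier is a placeholder, since the exact model ID for this checkpoint is not stated here.

```python
# Minimal sketch: loading the checkpoint with Hugging Face transformers.
# "org/roberta-polish-kgr10" is a placeholder; substitute the actual model ID.
from transformers import AutoTokenizer, AutoModelForMaskedLM

model_id = "org/roberta-polish-kgr10"  # placeholder identifier
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForMaskedLM.from_pretrained(model_id)

# Encode a Polish sentence and obtain masked-language-model logits.
inputs = tokenizer("Warszawa jest stolicą Polski.", return_tensors="pt")
outputs = model(**inputs)
print(outputs.logits.shape)  # (batch, sequence_length, vocab_size)
```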
Model Features
Incremental releases
New incremental versions will be released as training progresses, making it easy for users to track and adopt the latest checkpoints.
Professional corpus
Pre-trained on the KGR10 corpus, a large collection of Polish text.
Model Capabilities
Polish text understanding
Polish text generation
General Polish language modeling tasks (a masked-token prediction sketch follows this list)
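The sketch below illustrates masked-token prediction with the fill-mask pipeline; the model ID is again a placeholder, and the mask token is read from the tokenizer rather than hard-coded.

```python
# Minimal sketch of masked-token prediction with the transformers fill-mask pipeline.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="org/roberta-polish-kgr10")  # placeholder ID
mask = fill_mask.tokenizer.mask_token  # usually "<mask>" for RoBERTa tokenizers
for prediction in fill_mask(f"Stolicą Polski jest {mask}."):
    print(prediction["token_str"], round(prediction["score"], 3))
```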
Use Cases
Natural Language Processing
Polish text classification
Can be used for classification tasks on Polish text (a fine-tuning sketch follows this section)
Polish question answering system
Can be used to build Polish question answering systems
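As a sketch of the classification use case, the pre-trained encoder can be given a sequence-classification head and fine-tuned on labelled Polish data. The model ID and the two-label setup below are illustrative assumptions, not part of the release.

```python
# Minimal sketch: adapting the pre-trained encoder for Polish text classification.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "org/roberta-polish-kgr10"  # placeholder identifier
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id, num_labels=2)

# The classification head is randomly initialised and must be fine-tuned
# on labelled Polish data (e.g. with the transformers Trainer) before use.
inputs = tokenizer("To jest świetny produkt!", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print(logits.softmax(dim=-1))
```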