
plT5 Base

Developed by Allegro
plT5 is a language model based on the T5 architecture, trained on Polish corpora and optimized for the original T5 denoising objective.
Downloads 4,979
Release Time: 3/2/2022

Model Overview

plT5 is a Polish language model based on the T5 architecture that can be fine-tuned for a range of natural language processing tasks, such as translation, summarization, question answering, and reading comprehension.
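A minimal loading sketch with Hugging Face Transformers is given below; the Hub identifier "allegro/plt5-base" is assumed here, and the snippet only illustrates instantiating the encoder-decoder checkpoint and its SentencePiece tokenizer.

from transformers import T5Tokenizer, T5ForConditionalGeneration

# Assumed Hub identifier for the base checkpoint.
MODEL_ID = "allegro/plt5-base"

tokenizer = T5Tokenizer.from_pretrained(MODEL_ID)
model = T5ForConditionalGeneration.from_pretrained(MODEL_ID)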

Model Features

Multicorpus Training
The model was trained on six Polish corpora, including CCNet, the National Corpus of Polish, Open Subtitles, Wikipedia, and Wolne Lektury (Free Readings).
Optimized Denoising Objective
The model is pretrained with the original T5 denoising (span-corruption) objective, which strengthens its text generation and comprehension capabilities; a format sketch follows this list.
Large Vocabulary
Uses a SentencePiece unigram model for subword segmentation with a vocabulary size of 50,000 tokens.
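The snippet below roughly illustrates the span-corruption input format used by the T5 denoising objective, along with a vocabulary-size check on the SentencePiece tokenizer. The sentinel token <extra_id_0> and the example Polish sentence are illustrative, and the untuned base checkpoint's completions are not expected to be polished.

from transformers import T5Tokenizer, T5ForConditionalGeneration

tokenizer = T5Tokenizer.from_pretrained("allegro/plt5-base")
model = T5ForConditionalGeneration.from_pretrained("allegro/plt5-base")

# SentencePiece unigram vocabulary; roughly 50,000 subword tokens.
print(len(tokenizer))

# T5-style denoising: a sentinel token marks the span the model should fill in
# ("Stolica Polski to <extra_id_0>." = "The capital of Poland is <mask>.").
inputs = tokenizer("Stolica Polski to <extra_id_0>.", return_tensors="pt")
ids = model.generate(**inputs, max_new_tokens=8)
print(tokenizer.decode(ids[0], skip_special_tokens=False))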

Model Capabilities

Text Generation
Text Summarization
Machine Translation
Question Answering
Reading Comprehension

Use Cases

Natural Language Processing
Polish Text Summarization
Generate concise summaries of Polish texts (a usage sketch follows this section).
Polish Question Answering System
Build a Polish-language question-answering system to respond to user queries.
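As a sketch of the summarization use case, the snippet below assumes a hypothetical checkpoint fine-tuned from plT5 Base for Polish summarization; the base model itself needs task-specific fine-tuning before it can summarize reliably.

from transformers import pipeline

# Hypothetical fine-tuned checkpoint name, shown for illustration only.
summarizer = pipeline("summarization", model="your-org/plt5-base-summarization")

article = "Tu wstaw dłuższy polski tekst do streszczenia..."  # placeholder Polish article
result = summarizer(article, max_length=60, min_length=10)
print(result[0]["summary_text"])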