
Pile-T5 Large

Developed by EleutherAI
Pile-T5 Large is an encoder-decoder model trained on The Pile dataset using the T5x library, primarily intended for English text-to-text generation tasks.
Release time: 9/1/2023

Model Overview

Pile-T5 Large is a Transformer-based language model trained with a masked language modeling objective, suitable for English text generation and feature extraction tasks.
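The masked language modeling objective used by T5-family models is span corruption: contiguous token spans in the input are replaced with sentinel markers, and the decoder learns to reproduce the masked spans. The sketch below illustrates the input/target format in plain Python; the `<extra_id_N>` sentinel naming follows T5 convention, and the `span_corrupt` helper is a hypothetical illustration, not part of any released Pile-T5 code.

```python
def span_corrupt(tokens, spans):
    """Illustrate T5-style span corruption.

    tokens: list of token strings.
    spans:  list of (start, end) half-open index ranges to mask,
            in ascending, non-overlapping order.
    Returns (input_tokens, target_tokens): the corrupted input with
    sentinel placeholders, and the target the decoder must produce.
    """
    inp, tgt = [], []
    cursor = 0
    for i, (start, end) in enumerate(spans):
        sentinel = f"<extra_id_{i}>"
        inp.extend(tokens[cursor:start])  # keep unmasked prefix
        inp.append(sentinel)              # placeholder in the input
        tgt.append(sentinel)              # marker in the target
        tgt.extend(tokens[start:end])     # masked span to reconstruct
        cursor = end
    inp.extend(tokens[cursor:])           # keep unmasked suffix
    return inp, tgt

# Masking "quick brown" out of a six-token sentence:
inp, tgt = span_corrupt("the quick brown fox jumps over".split(), [(1, 3)])
# inp → ['the', '<extra_id_0>', 'fox', 'jumps', 'over']
# tgt → ['<extra_id_0>', 'quick', 'brown']
```

At training time the model sees `inp` on the encoder side and is supervised to emit `tgt` from the decoder, which is why the same architecture transfers naturally to text-to-text generation.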

Model Features

Large-scale training data
Trained on the 825 GiB Pile dataset, which contains diverse English text sources.
Encoder-decoder architecture
Adopts a T5-style encoder-decoder structure, making it well suited to sequence-to-sequence tasks.
Long sequence processing capability
Supports sequence lengths of up to 512 tokens, suitable for processing longer texts.
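Documents longer than the 512-token limit must be split before encoding. A common approach is a sliding window with overlap so that context is not lost at chunk boundaries. This is a minimal sketch in plain Python operating on already-tokenized ids; the window and stride values are illustrative defaults, not settings prescribed by the model.

```python
def chunk_tokens(token_ids, max_len=512, stride=448):
    """Split a token-id sequence into windows of at most max_len.

    Consecutive windows start `stride` tokens apart, so each pair of
    neighbors shares (max_len - stride) tokens of overlapping context.
    """
    if len(token_ids) <= max_len:
        return [token_ids]
    chunks = []
    for start in range(0, len(token_ids), stride):
        chunks.append(token_ids[start:start + max_len])
        if start + max_len >= len(token_ids):
            break  # this window already reaches the end
    return chunks

# A 1000-token document becomes three overlapping 512-token windows:
windows = chunk_tokens(list(range(1000)))
# [len(w) for w in windows] → [512, 512, 104]
```

Each window can then be encoded independently, with the overlap giving the model some shared context across chunk boundaries.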

Model Capabilities

Text generation
Feature extraction
Masked language modeling

Use Cases

Research
Language model research
Used to study the behaviors and characteristics of large-scale language models.
Downstream task feature extraction
Serves as a base model for feature extraction in other NLP tasks.