Funnel Transformer Large

Developed by funnel-transformer
A Transformer model pre-trained on an English corpus with an ELECTRA-like objective, learning intrinsic representations of the English language through self-supervised training.
Downloads: 190
Release time: 3/2/2022

Model Overview

This model is a Transformer pre-trained on large-scale English text in a self-supervised manner, primarily used for extracting text features to support downstream tasks.

Model Features

Self-supervised pre-training
Pre-trained on raw text without manual annotation, using automated processes to generate inputs and labels from text.
ELECTRA-like objective function
A small generator language model corrupts the input text; the main model is then trained to predict, for each token, whether it is original or a replacement.
Efficient sequence processing
The funnel architecture progressively pools the token sequence, filtering out redundancy so that later layers operate on shorter sequences.
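The ELECTRA-like objective above can be sketched in plain Python. This is a minimal illustration, not the actual pre-training code: the real generator is a small masked language model, which is stood in for here by random replacement sampling, and all names (`corrupt_tokens`, the toy vocabulary) are hypothetical.

```python
import random

def corrupt_tokens(tokens, vocab, replace_rate=0.15, seed=0):
    """Simulate ELECTRA-style training-data creation: a fraction of
    tokens is replaced (here by random sampling, standing in for the
    generator model), and the discriminator's targets mark each token
    as original (0) or replaced (1)."""
    rng = random.Random(seed)
    corrupted, labels = [], []
    for tok in tokens:
        if rng.random() < replace_rate:
            # Stand-in for the generator: pick any other vocab token.
            replacement = rng.choice([v for v in vocab if v != tok])
            corrupted.append(replacement)
            labels.append(1)  # replaced
        else:
            corrupted.append(tok)
            labels.append(0)  # original
    return corrupted, labels
```

The discriminator (the model itself) is then trained on `(corrupted, labels)` pairs, learning to spot implausible tokens, which forces it to build strong contextual representations.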

Model Capabilities

Text feature extraction
Sequence classification
Token classification
Question answering tasks
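To illustrate how the funnel structure shortens the sequence between blocks, here is a minimal mean-pooling sketch in NumPy. It is an assumption-laden approximation (the function name `funnel_pool` and zero-padding choice are mine), not the model's implementation; the real model pools hidden states between encoder blocks and later up-samples for token-level tasks.

```python
import numpy as np

def funnel_pool(hidden, stride=2):
    """Mean-pool adjacent token vectors, shrinking the sequence by
    `stride` as in the funnel architecture's compression step.
    `hidden` has shape (seq_len, dim)."""
    seq_len, dim = hidden.shape
    pad = (-seq_len) % stride  # zero-pad so seq_len divides evenly
    if pad:
        hidden = np.vstack([hidden, np.zeros((pad, dim))])
    return hidden.reshape(-1, stride, dim).mean(axis=1)
```

Each pooling step roughly halves the cost of the following attention layers, which is where the efficiency gain comes from.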

Use Cases

Natural Language Processing
Text classification
Train a standard classifier on features extracted by the model.
Question answering system
Build a question answering system on top of the text features extracted by the model.
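As a sketch of the classification workflow described above: features extracted by the model can feed any standard classifier. The nearest-centroid classifier below and its toy feature vectors are hypothetical stand-ins chosen so the example stays self-contained; in practice the features would come from the model's hidden states.

```python
import numpy as np

def train_centroids(features, labels):
    """Fit a nearest-centroid classifier: one mean feature vector
    per class. A minimal stand-in for the 'standard classifier'
    trained on model-extracted features."""
    classes = sorted(set(labels))
    return {c: np.mean([f for f, l in zip(features, labels) if l == c], axis=0)
            for c in classes}

def predict(centroids, feature):
    """Assign the class whose centroid is closest in Euclidean distance."""
    return min(centroids, key=lambda c: np.linalg.norm(feature - centroids[c]))
```

Any off-the-shelf classifier (logistic regression, SVM, etc.) could replace the centroid rule; the point is that the Transformer supplies the features and a simple model does the labeling.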