
Small

Developed by funnel-transformer
A Transformer model pre-trained on English corpora with an ELECTRA-like objective, suitable for extracting text features and for fine-tuning on downstream tasks
Downloads: 6,084
Release Time: 3/2/2022

Model Overview

This model was pre-trained in a self-supervised manner on a large English corpus to learn internal representations of the English language; these representations can be used to extract features for downstream tasks or serve as the starting point for fine-tuning.
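A minimal feature-extraction sketch using the Hugging Face transformers library (assuming the checkpoint is published as funnel-transformer/small and that transformers and PyTorch are installed):

from transformers import AutoTokenizer, AutoModel

checkpoint = "funnel-transformer/small"  # assumed Hugging Face model id
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModel.from_pretrained(checkpoint)

inputs = tokenizer("Hello, my dog is cute.", return_tensors="pt")
outputs = model(**inputs)

# One contextual embedding per input token, usable as features downstream
features = outputs.last_hidden_state
print(features.shape)  # (batch_size, sequence_length, hidden_size)

Each position in last_hidden_state is a contextual embedding that can be fed to a downstream classifier or other task-specific model.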

Model Features

Funnel structure
The encoder progressively pools the sequence of hidden states, filtering out redundancy so that deeper blocks attend over shorter sequences and compute less; a decoder can restore full token-level resolution when needed (see the sketch after this list)
ELECTRA-style pretraining
Pre-trained with an ELECTRA-like replaced-token-detection objective: a small generator model corrupts some input tokens, and the model learns to predict which tokens are original and which were replaced
Case-insensitive
The model is uncased: input text is lowercased, so for example "english" and "English" are treated identically
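To make the funnel idea concrete, here is a purely illustrative stride-2 mean-pooling sketch in PyTorch (a hypothetical helper, not the model's actual code); the real checkpoint pools hidden states between blocks of encoder layers as described in the Funnel-Transformer paper and, for token-level tasks, upsamples back to full length with its decoder:

import torch

def stride2_mean_pool(hidden_states):
    # Halve the sequence length by averaging each pair of neighboring
    # positions -- a toy stand-in for the pooling a funnel block applies.
    batch, seq_len, hidden = hidden_states.shape
    return hidden_states.view(batch, seq_len // 2, 2, hidden).mean(dim=2)

x = torch.randn(1, 128, 768)              # one example, 128 tokens, hidden size 768
pooled = stride2_mean_pool(x)             # -> (1, 64, 768)
pooled_again = stride2_mean_pool(pooled)  # -> (1, 32, 768): later blocks attend over fewer positions
print(pooled.shape, pooled_again.shape)

Because self-attention cost grows with sequence length, letting the deeper blocks attend over these shorter, pooled sequences is what gives the funnel its efficiency.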

Model Capabilities

Text feature extraction
Sequence classification
Token classification
Question answering (see the loading sketch after this list)
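As a sketch of how the same pretrained backbone can be loaded under different task heads with transformers (the heads are randomly initialized until fine-tuned; the checkpoint name and label counts below are assumptions for illustration):

from transformers import (
    AutoModelForQuestionAnswering,
    AutoModelForSequenceClassification,
    AutoModelForTokenClassification,
)

checkpoint = "funnel-transformer/small"  # assumed Hugging Face model id

# Same pretrained encoder, three different task heads (heads start untrained)
classifier = AutoModelForSequenceClassification.from_pretrained(checkpoint, num_labels=2)
tagger = AutoModelForTokenClassification.from_pretrained(checkpoint, num_labels=9)
qa_model = AutoModelForQuestionAnswering.from_pretrained(checkpoint)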

Use Cases

Natural Language Processing
Text classification
Perform sentiment analysis or topic classification on text (a fine-tuning sketch follows this list)
Named entity recognition
Identify entities such as person names and locations in text
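A minimal, hypothetical fine-tuning sketch for the sentiment-analysis use case; the two example sentences and labels are made up, and a real run would iterate over a labeled dataset with an optimizer:

import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

checkpoint = "funnel-transformer/small"  # assumed Hugging Face model id
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSequenceClassification.from_pretrained(checkpoint, num_labels=2)

# Toy batch: two made-up reviews with made-up sentiment labels (1 = positive, 0 = negative)
batch = tokenizer(["great movie", "terrible plot"], padding=True, return_tensors="pt")
labels = torch.tensor([1, 0])

outputs = model(**batch, labels=labels)  # cross-entropy loss from the classification head
outputs.loss.backward()                  # an optimizer step would follow in a real training loop
print(outputs.logits.shape)              # torch.Size([2, 2])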