
Funnel Transformer Medium Base

Developed by funnel-transformer
A Transformer model pre-trained on an English corpus with ELECTRA-style self-supervised learning, trained to detect which input tokens have been replaced.
Downloads: 69
Release date: 3/2/2022

Model Overview

This model is pre-trained on a large English corpus through self-supervised learning. It is suitable for extracting text features or for fine-tuning on downstream tasks, especially tasks that need a summary of the whole sentence, such as sequence classification.
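As a sketch of what feature extraction might look like with the Hugging Face `transformers` library (the checkpoint id `funnel-transformer/medium-base` and the helper function below are assumptions for illustration, not part of this card):

```python
def compressed_length(n_tokens: int, n_poolings: int = 2) -> int:
    """Rough output length after the funnel's stride-2 poolings.

    The three-block funnel pools twice, so hidden states come out at
    about a quarter of the input length (exact for multiples of 4).
    """
    for _ in range(n_poolings):
        n_tokens = (n_tokens + 1) // 2  # stride-2 pooling, rounding up
    return n_tokens


def extract_features(text: str, checkpoint: str = "funnel-transformer/medium-base"):
    """Return compressed hidden states for `text`; downloads the model."""
    # Lazy imports: requires `pip install transformers torch`.
    import torch
    from transformers import AutoModel, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(checkpoint)
    model = AutoModel.from_pretrained(checkpoint)
    inputs = tokenizer(text, return_tensors="pt")
    with torch.no_grad():
        outputs = model(**inputs)
    # Shape: (1, ~seq_len / 4, hidden_size)
    return outputs.last_hidden_state
```

The returned hidden states are already compressed, which is why this architecture suits tasks that summarize a whole sentence rather than tasks needing one vector per input token.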

Model Features

Efficient Sequence Processing
Compresses the sequence length through a funnel structure, outputting hidden states at only a quarter of the input length to improve processing efficiency.
Self-supervised Pre-training
Pre-trained with an ELECTRA-style objective, learning language representations by detecting which tokens in the input have been replaced.
Case Insensitive
Treats words with different cases as the same token, simplifying text processing.
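A minimal, dependency-free sketch of the stride-2 pooling idea behind the funnel structure (illustrative only: the real model pools vector-valued hidden states inside attention, not scalars):

```python
def mean_pool(seq, stride=2):
    """Average neighbouring elements, halving the sequence length."""
    return [sum(seq[i:i + stride]) / len(seq[i:i + stride])
            for i in range(0, len(seq), stride)]


def funnel_compress(seq, n_poolings=2):
    """Apply stride-2 pooling twice, as between the funnel's three blocks."""
    for _ in range(n_poolings):
        seq = mean_pool(seq)
    return seq


tokens = list(range(8))           # stand-in for 8 token representations
pooled = funnel_compress(tokens)  # length 8 -> 4 -> 2
```

Two poolings across three blocks are what give the quarter-length output mentioned above.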

Model Capabilities

Text feature extraction
Sequence classification
Token classification
Question answering
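For the downstream tasks above, the pre-trained encoder is typically fine-tuned with a task head. A hedged sketch of sequence classification (the checkpoint id, two-label setup, and label names are illustrative assumptions; the head is randomly initialised until fine-tuned):

```python
def pick_label(logits, labels=("negative", "positive")):
    """Map raw classifier logits to the highest-scoring label name."""
    best = max(range(len(logits)), key=lambda i: logits[i])
    return labels[best]


def classify(text: str, checkpoint: str = "funnel-transformer/medium-base"):
    """Sequence-classification sketch; needs fine-tuning on labelled
    data before its predictions are meaningful."""
    # Lazy imports: requires `pip install transformers torch`.
    import torch
    from transformers import AutoModelForSequenceClassification, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(checkpoint)
    model = AutoModelForSequenceClassification.from_pretrained(
        checkpoint, num_labels=2
    )
    inputs = tokenizer(text, return_tensors="pt")
    with torch.no_grad():
        logits = model(**inputs).logits[0]
    return pick_label(logits.tolist())
```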

Use Cases

Text Analysis
Sentiment Analysis
Classifies the sentiment of sentences or paragraphs
Text Classification
Categorizes text into predefined classes
Information Extraction
Named Entity Recognition
Identifies entities such as person names, locations, and organizations in text