
# ELECTRA-style pre-training

**Large** (funnel-transformer) · Apache-2.0
A Transformer model pre-trained on an English corpus with an ELECTRA-like objective, learning intrinsic representations of the English language through self-supervision.
Tags: Large Language Model · Transformers · English
Downloads: 190 · Likes: 2
**Xlarge** (funnel-transformer) · Apache-2.0
Funnel Transformer is a self-supervised English pre-training model that adopts ELECTRA-like objectives and processes language efficiently by filtering out sequence redundancy.
Tags: Large Language Model · Transformers · English
Downloads: 31 · Likes: 1
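As a minimal usage sketch for these two Funnel Transformer checkpoints, the following loads the smaller one through the standard `transformers` Auto classes; the hub ID `funnel-transformer/large` is inferred from the card name and author, so verify it before relying on it:

```python
# pip install transformers torch
import torch
from transformers import AutoTokenizer, AutoModel

# Hub ID assumed from the card's name and author.
model_id = "funnel-transformer/large"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id)

inputs = tokenizer(
    "Funnel Transformer compresses the token sequence in deeper blocks.",
    return_tensors="pt",
)
with torch.no_grad():
    outputs = model(**inputs)

# The decoder blocks up-sample back to full sequence length, so the
# hidden states line up with the input tokens despite internal pooling.
print(outputs.last_hidden_state.shape)  # (1, seq_len, hidden_size)
```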
**Deberta V3 Xsmall Squad2** (nbroad)
DeBERTa v3 xsmall is an improved natural language understanding model from Microsoft that boosts performance with a disentangled attention mechanism and an enhanced mask decoder, surpassing RoBERTa on multiple NLU tasks.
Tags: Question Answering System · Transformers · English
Downloads: 17 · Likes: 0
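Since this card is tagged as a question-answering system, a short extractive-QA sketch may help; the hub ID `nbroad/deberta-v3-xsmall-squad2` is an assumption pieced together from the card title and author:

```python
# pip install transformers torch sentencepiece
from transformers import pipeline

# Model ID assumed from the card's author and title; check the hub first.
qa = pipeline("question-answering", model="nbroad/deberta-v3-xsmall-squad2")

result = qa(
    question="What does DeBERTa v3 use to improve attention?",
    context=(
        "DeBERTa v3 enhances performance through a disentangled attention "
        "mechanism and an enhanced mask decoder."
    ),
)
print(result["answer"], result["score"])
```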
**Deberta V3 Large** (microsoft) · MIT
DeBERTaV3 improves upon DeBERTa with ELECTRA-style pre-training and gradient-disentangled embedding sharing, excelling in natural language understanding tasks.
Tags: Large Language Model · Transformers · English
Downloads: 343.39k · Likes: 213
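Because ELECTRA-style (replaced-token-detection) pre-training leaves no masked-LM head to query directly, a typical next step is fine-tuning on an NLU task. A hedged sketch, assuming the hub ID `microsoft/deberta-v3-large` implied by the card's author and name:

```python
# pip install transformers torch sentencepiece
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "microsoft/deberta-v3-large"
tokenizer = AutoTokenizer.from_pretrained(model_id)
# num_labels=2 is an arbitrary illustration; the classification head is
# randomly initialized and must be fine-tuned before its logits mean anything.
model = AutoModelForSequenceClassification.from_pretrained(model_id, num_labels=2)

batch = tokenizer(
    ["ELECTRA-style pre-training swaps masked-LM for replaced-token detection."],
    return_tensors="pt",
)
with torch.no_grad():
    logits = model(**batch).logits
print(logits.shape)  # (1, 2)
```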