
# NLP Task Optimization

## Chinese Electra Small Generator

- Organization: hfl
- License: Apache-2.0
- Tags: Large Language Model, Transformers, Chinese
- Downloads: 16 · Likes: 0

Chinese ELECTRA is a pre-trained model developed by the Harbin Institute of Technology and iFLYTEK Joint Lab (HFL) based on Google's ELECTRA architecture. The small generator has only about 1/10 the parameters of BERT while achieving comparable performance on downstream tasks.
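The generator half of an ELECTRA model is trained as a small masked language model, so it can be exercised directly with a fill-mask pipeline. A minimal sketch, assuming the checkpoint is published on the Hugging Face Hub under the org and name shown on this card:

```python
from transformers import pipeline

# Hub ID assumed from the card's org ("hfl") and model name.
fill_mask = pipeline("fill-mask", model="hfl/chinese-electra-small-generator")

# The ELECTRA generator is a small masked language model, so it can
# propose and score candidate tokens for a masked position.
for pred in fill_mask("北京是中国的[MASK]都。"):
    print(pred["token_str"], round(pred["score"], 3))
```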
## Ptt5 Base Portuguese Vocab

- Organization: unicamp-dl
- License: MIT
- Tags: Large Language Model, Transformers, Other
- Downloads: 4,090 · Likes: 40

PTT5 is a T5 model pretrained on the BrWac corpus and optimized for Portuguese natural language processing tasks, offered in three sizes and with two vocabulary options.
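PTT5 ships as a pretrained checkpoint, so downstream use typically means fine-tuning with the standard T5 seq2seq objective. A minimal sketch, assuming the Hub ID unicamp-dl/ptt5-base-portuguese-vocab (built from the card's org and name); the "resumir:" task prefix and the example sentences are illustrative only:

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Hub ID assumed from the card's org ("unicamp-dl") and model name.
model_id = "unicamp-dl/ptt5-base-portuguese-vocab"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# PTT5 is a pretrained checkpoint, so it is normally fine-tuned on a
# downstream task; the seq2seq loss below is the standard starting point.
inputs = tokenizer("resumir: O PTT5 foi pré-treinado no corpus BrWac.",
                   return_tensors="pt")
labels = tokenizer("PTT5 foi treinado em BrWac.", return_tensors="pt").input_ids
loss = model(**inputs, labels=labels).loss
print(float(loss))
```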
## Chinese Electra 180g Base Generator

- Organization: hfl
- License: Apache-2.0
- Tags: Large Language Model, Transformers, Chinese
- Downloads: 28 · Likes: 0

An ELECTRA model trained on 180 GB of Chinese data. It keeps a small parameter count while delivering strong performance, making it well suited to Chinese natural language processing tasks.
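As a base-size encoder, the model can also serve as a feature extractor for Chinese text. A minimal sketch, assuming the Hub ID hfl/chinese-electra-180g-base-generator (built from the card's org and name):

```python
import torch
from transformers import AutoTokenizer, AutoModel

# Hub ID assumed from the card's org ("hfl") and model name.
model_id = "hfl/chinese-electra-180g-base-generator"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id)

# Encode a sentence and take the final hidden states as contextual features.
batch = tokenizer("哈工大讯飞联合实验室", return_tensors="pt")
with torch.no_grad():
    hidden = model(**batch).last_hidden_state  # (1, seq_len, hidden_size)
print(hidden.shape)
```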
## Superpal

- Organization: biu-nlp
- Tags: Text Classification, Transformers
- Downloads: 247 · Likes: 0

SuperPAL is a model for proposition-level alignment between summaries and their source texts, enabling fine-grained alignment between individual summary propositions and the source passages they derive from.
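The card tags SuperPAL as a text classification model, which suggests scoring candidate (summary span, source span) pairs. A minimal sketch, assuming the Hub ID biu-nlp/superpal and a binary aligned/not-aligned label scheme (both are assumptions, as are the example spans):

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Hub ID assumed from the card's org ("biu-nlp") and model name.
model_id = "biu-nlp/superpal"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

# Score one (summary span, source span) pair; in a full alignment pipeline
# the model is run over many candidate pairs and high-scoring ones are kept.
summary_span = "The company reported record profits."
source_span = "Profits reached an all-time high this quarter, the company said."
batch = tokenizer(summary_span, source_span, return_tensors="pt", truncation=True)
with torch.no_grad():
    probs = torch.softmax(model(**batch).logits, dim=-1)
print(probs)  # class probabilities; aligned/not-aligned label order assumed
```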