# ALBERT architecture

## Zero Shot Classify SSTuning ALBERT

**License:** MIT · **Author:** DAMO-NLP-SG · **Downloads:** 98 · **Likes:** 4
**Tags:** Text Classification, Transformers

A zero-shot text classification model trained with Self-Supervised Tuning (SSTuning) on top of the ALBERT-xxlarge-v2 architecture. It can be applied directly to tasks such as sentiment analysis and topic classification without task-specific fine-tuning.
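SSTuning models are plain sequence classifiers whose input lists the candidate labels as lettered options in front of the text, rather than NLI-style zero-shot models. The sketch below is a minimal illustration only: the repository ID `DAMO-NLP-SG/zero-shot-classify-SSTuning-ALBERT` and the exact option-prompt template are assumptions inferred from the model name, so the model card remains authoritative.

```python
# Hedged sketch: zero-shot classification with an SSTuning-style classifier.
# The repo ID and the option-prompt format below are assumptions; check the model card.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "DAMO-NLP-SG/zero-shot-classify-SSTuning-ALBERT"  # assumed repo ID
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

text = "I really enjoyed this movie, the acting was superb."
labels = ["negative", "positive"]

# SSTuning places the candidate labels as lettered options before the text;
# the classifier then predicts the index of the correct option.
options = " ".join(f"({chr(ord('A') + i)}) {lab}" for i, lab in enumerate(labels))
inputs = tokenizer(options + " " + text, return_tensors="pt", truncation=True)

with torch.no_grad():
    logits = model(**inputs).logits

# Assumption: only the first len(labels) logits correspond to real options here.
pred = logits[0, : len(labels)].argmax().item()
print(labels[pred])
```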
## Albert Base Chinese Ws

**License:** GPL-3.0 · **Author:** ckiplab · **Downloads:** 1,498 · **Likes:** 1
**Tags:** Sequence Labeling, Transformers, Chinese

A Traditional Chinese natural language processing model developed by the CKIP team at Academia Sinica, based on the ALBERT architecture and supporting tasks such as word segmentation and part-of-speech tagging.
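Word segmentation is exposed here as token classification over characters. A minimal sketch follows; the repository ID `ckiplab/albert-base-chinese-ws` is assumed from the listed name, and the CKIP project recommends pairing its models with `BertTokenizerFast` rather than an ALBERT tokenizer.

```python
# Hedged sketch: Traditional Chinese word segmentation as token classification.
# Repo ID assumed from the model name; CKIP docs suggest BertTokenizerFast.
from transformers import BertTokenizerFast, AutoModelForTokenClassification, pipeline

tokenizer = BertTokenizerFast.from_pretrained("bert-base-chinese")
model = AutoModelForTokenClassification.from_pretrained("ckiplab/albert-base-chinese-ws")

ws = pipeline("token-classification", model=model, tokenizer=tokenizer)

# Each character receives a begin-word / inside-word label,
# which can be folded back into word boundaries.
for token in ws("傅達仁今將執行安樂死"):
    print(token["word"], token["entity"])
```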
## Albert Squad V2

**License:** Apache-2.0 · **Author:** abhilash1910 · **Downloads:** 22 · **Likes:** 2
**Tags:** Question Answering System, Transformers, English

A question-answering model based on the ALBERT architecture and trained on the SQuAD v2 dataset for extractive question answering.
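Extractive QA checkpoints like this one plug directly into the `question-answering` pipeline; SQuAD v2 training also covers unanswerable questions, which the pipeline can surface. The repository ID `abhilash1910/albert-squad-v2` is inferred from the listing.

```python
# Hedged sketch: extractive question answering with an ALBERT model
# fine-tuned on SQuAD v2 (repo ID inferred from the listing).
from transformers import pipeline

qa = pipeline("question-answering", model="abhilash1910/albert-squad-v2")

result = qa(
    question="What does ALBERT share across layers?",
    context="ALBERT reduces memory use by sharing parameters across transformer layers.",
    handle_impossible_answer=True,  # let the model abstain, as SQuAD v2 allows
)
print(result["answer"], result["score"])
```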
## Albert Fa Base V2 Sentiment Multi

**License:** Apache-2.0 · **Author:** m3hrdadfi · **Downloads:** 39 · **Likes:** 1
**Tags:** Large Language Model, Transformers, Other

A lightweight BERT (ALBERT) model for self-supervised language representation learning in Persian.
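The model name suggests a multi-domain Persian sentiment classifier, so it should be callable through the ordinary `text-classification` pipeline. The repository ID `m3hrdadfi/albert-fa-base-v2-sentiment-multi` and the label set are assumptions based on the listed name.

```python
# Hedged sketch: Persian sentiment analysis with an ALBERT-fa checkpoint
# (repo ID and label set are assumptions based on the listed model name).
from transformers import pipeline

sentiment = pipeline(
    "text-classification",
    model="m3hrdadfi/albert-fa-base-v2-sentiment-multi",
)

# Example input: a positive Persian review ("This movie was really great").
print(sentiment("این فیلم واقعا عالی بود"))
```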
## Albert Base Chinese

**License:** GPL-3.0 · **Author:** ckiplab · **Downloads:** 280 · **Likes:** 11
**Tags:** Large Language Model, Transformers, Chinese

A Traditional Chinese Transformer model developed by the CKIP (Chinese Knowledge and Information Processing) team at Academia Sinica, part of a project covering architectures such as ALBERT, BERT, and GPT-2 along with natural language processing tools.
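As a base language model, the natural smoke test is masked-token prediction. The sketch below assumes the repository ID `ckiplab/albert-base-chinese` and again pairs the model with `BertTokenizerFast`, following the CKIP project's recommendation.

```python
# Hedged sketch: masked-token prediction with the base Traditional Chinese ALBERT.
# Repo ID assumed; CKIP pairs its models with BertTokenizerFast.
from transformers import BertTokenizerFast, AutoModelForMaskedLM, pipeline

tokenizer = BertTokenizerFast.from_pretrained("bert-base-chinese")
model = AutoModelForMaskedLM.from_pretrained("ckiplab/albert-base-chinese")

fill = pipeline("fill-mask", model=model, tokenizer=tokenizer)

# Ask the model to restore the masked character in "美麗的[MASK]家" (beautiful country).
for candidate in fill("台灣是一個美麗的[MASK]家。"):
    print(candidate["token_str"], round(candidate["score"], 3))
```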
## Albert Base Chinese Pos

**License:** GPL-3.0 · **Author:** ckiplab · **Downloads:** 1,095 · **Likes:** 1
**Tags:** Sequence Labeling, Transformers, Chinese

A Traditional Chinese natural language processing model developed by Academia Sinica's CKIP team, supporting tasks such as word segmentation and part-of-speech tagging.
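Part-of-speech tagging uses the same token-classification interface as the word-segmentation model above, and in the full CKIP toolchain it is typically run after segmentation. A minimal sketch, with the repository ID `ckiplab/albert-base-chinese-pos` assumed and the exact input convention deferred to the model card:

```python
# Hedged sketch: part-of-speech tagging as token classification
# (repo ID assumed; same BertTokenizerFast pairing as the other CKIP models).
from transformers import BertTokenizerFast, AutoModelForTokenClassification, pipeline

tokenizer = BertTokenizerFast.from_pretrained("bert-base-chinese")
model = AutoModelForTokenClassification.from_pretrained("ckiplab/albert-base-chinese-pos")

pos = pipeline("token-classification", model=model, tokenizer=tokenizer)

# In practice this is usually chained after the word-segmentation model;
# here raw text is passed only to illustrate the API.
for token in pos("傅達仁今將執行安樂死"):
    print(token["word"], token["entity"])
```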
## Albert Base V2 Finetuned Ner

**License:** Apache-2.0 · **Author:** ArBert · **Downloads:** 20 · **Likes:** 4
**Tags:** Sequence Labeling, Transformers

A Named Entity Recognition (NER) model fine-tuned from ALBERT-base-v2 on the CoNLL-2003 dataset, performing well on entity recognition tasks.
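CoNLL-2003 NER models work with the `token-classification` pipeline, where an aggregation strategy merges word-piece tags into whole entity spans. The repository ID `ArBert/albert-base-v2-finetuned-ner` is inferred from the listing.

```python
# Hedged sketch: English NER with an ALBERT-base-v2 model fine-tuned on CoNLL-2003
# (repo ID inferred from the listing).
from transformers import pipeline

ner = pipeline(
    "token-classification",
    model="ArBert/albert-base-v2-finetuned-ner",
    aggregation_strategy="simple",  # merge word pieces into whole entity spans
)

for entity in ner("Angela Merkel visited the Louvre in Paris last October."):
    print(entity["entity_group"], entity["word"], round(entity["score"], 3))
```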