# Bidirectional Transformer

## GLiNER Models

GLiNER is a general-purpose named entity recognition (NER) model built on a bidirectional Transformer encoder. It can identify arbitrary entity types specified at inference time, providing a practical alternative to traditional fixed-schema NER models while avoiding the high resource consumption of large language models. The variants below differ mainly in size and language coverage: Gliner ITA LARGE is optimized for Italian, Gliner Ko targets Korean, and Gliner Multi is multilingual.

| Model | License | Author | Downloads | Likes | Task | Tags |
|---|---|---|---|---|---|---|
| Gliner Large V2.5 | Apache-2.0 | gliner-community | 2,896 | 18 | Sequence Labeling | Other |
| Gliner Medium V2.5 | Apache-2.0 | gliner-community | 678 | 7 | Sequence Labeling | Other |
| Gliner Small V2.5 | Apache-2.0 | gliner-community | 2,252 | 6 | Sequence Labeling | PyTorch |
| Gliner ITA LARGE | Apache-2.0 | DeepMount00 | 65 | 7 | Sequence Labeling | Other |
| Gliner Large V2.1 | Apache-2.0 | urchade | 10.31k | 34 | Sequence Labeling | Other |
| Gliner Ko | N/A | taeminlee | 165 | 11 | Sequence Labeling | Korean |
| Gliner Base | N/A | urchade | 4,921 | 76 | Sequence Labeling | English |
| Gliner Multi | N/A | urchade | 1,459 | 128 | Sequence Labeling | Other |
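What distinguishes GLiNER from classic token-classification models is that the target entity types are passed in as plain strings at inference time, so no retraining is needed for a new label schema. A minimal sketch using the `gliner` Python package (the model ID `urchade/gliner_base`, the label set, and the 0.5 threshold are illustrative assumptions; substitute any checkpoint from the table above):

```python
# pip install gliner  (assumed environment)
from gliner import GLiNER

# Load one of the checkpoints listed above.
model = GLiNER.from_pretrained("urchade/gliner_base")

text = "Ada Lovelace worked with Charles Babbage on the Analytical Engine in London."

# Entity types are free-form strings chosen at inference time.
labels = ["person", "organization", "location", "invention"]

entities = model.predict_entities(text, labels, threshold=0.5)
for ent in entities:
    print(f'{ent["text"]} -> {ent["label"]} ({ent["score"]:.2f})')
```

Lowering the threshold trades precision for recall; the multilingual and language-specific variants above are used the same way, only the checkpoint name changes.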
## Masked Language Model Encoders

XLM-RoBERTa-XL is a multilingual encoder pre-trained on 2.5 TB of filtered CommonCrawl data covering 100 languages. RoBERTa-large is a large English model pre-trained with a masked language modeling objective using the improved BERT training recipe.

| Model | License | Author | Downloads | Likes | Task | Tags |
|---|---|---|---|---|---|---|
| Xlm Roberta Xl | MIT | facebook | 53.53k | 27 | Large Language Model | Transformers, Multilingual |
| Roberta Large | MIT | FacebookAI | 19.4M | 212 | Large Language Model | English |
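Both checkpoints are bidirectional encoders trained with masked language modeling: given a sentence with a masked position, the model predicts the missing token from context on both sides. A minimal sketch with the Hugging Face `transformers` fill-mask pipeline (the hub ID `FacebookAI/roberta-large` matches the entry above; note that RoBERTa uses `<mask>` as its mask token, unlike BERT's `[MASK]`):

```python
# pip install transformers torch  (assumed environment)
from transformers import pipeline

# Fill-mask pipeline around the RoBERTa-large checkpoint listed above.
unmasker = pipeline("fill-mask", model="FacebookAI/roberta-large")

# Print the top 3 predictions for the masked position.
for pred in unmasker("The capital of France is <mask>.", top_k=3):
    print(f'{pred["token_str"].strip()}: {pred["score"]:.3f}')
```

The same pipeline works for XLM-RoBERTa-XL by swapping the model ID, since both expose the masked-LM head through the same interface.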