# BERT Architecture
## Burmese-Bert (jojo-ai-mst)
Burmese-Bert is a bilingual masked language model based on bert-large-uncased, supporting both English and Burmese.
Tags: Large Language Model · Transformers · Supports Multiple Languages
Downloads: 20 · Likes: 2

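Masked language models such as this one are pre-trained by hiding a fraction of the input tokens (commonly around 15%) behind a `[MASK]` token and training the network to predict the originals. A minimal, dependency-free sketch of that masking step (the token list, rate, and seed are illustrative, not taken from any particular model):

```python
import random

def mask_tokens(tokens, mask_rate=0.15, mask_token="[MASK]", seed=0):
    """Randomly replace ~mask_rate of the tokens with mask_token.

    Returns the masked sequence plus a {position: original_token} map,
    which is exactly what the model is trained to recover.
    """
    rng = random.Random(seed)
    masked = list(tokens)
    targets = {}
    for i, tok in enumerate(tokens):
        if rng.random() < mask_rate:
            targets[i] = tok
            masked[i] = mask_token
    return masked, targets

# Exaggerated 50% rate so the toy example visibly masks something.
masked, targets = mask_tokens(["the", "cat", "sat", "on", "the", "mat"], mask_rate=0.5)
```

In real pre-training the masked positions are fed through the transformer and scored against the `targets` map with a cross-entropy loss.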
## M2 BERT 8k Retrieval Encoder V1 (hazyresearch)
License: Apache-2.0
M2-BERT-8K is an 80-million-parameter long-context retrieval model based on the architecture proposed in the paper "Benchmarking and Building Long-Context Retrieval Models with LoCo and M2-BERT".
Tags: Large Language Model · Transformers · English
Downloads: 52 · Likes: 4

## GTE Large GGUF (ChristianAzinn)
License: MIT
GGUF-format version of the General Text Embedding (GTE) model, suitable for tasks such as information retrieval and semantic text similarity.
Tags: Text Embedding · English
Downloads: 184 · Likes: 1

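Embedding models like the ones in this list are typically used by comparing their output vectors with cosine similarity. A self-contained sketch of that comparison (the short toy vectors stand in for the real 768- or 384-dimensional model outputs):

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two equal-length vectors: 1.0 for the
    same direction, 0.0 for orthogonal, -1.0 for opposite."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Parallel vectors score ~1.0; orthogonal vectors score 0.0.
same = cosine_similarity([1.0, 2.0], [2.0, 4.0])
orthogonal = cosine_similarity([1.0, 0.0], [0.0, 1.0])
```

In a retrieval system, a query embedding is scored this way against every document embedding and the top-scoring documents are returned.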
## Beto Sentiment Analysis Spanish (ignacio-ave)
A sentiment analysis model based on BETO (the Spanish version of BERT), supporting sentiment classification of Spanish text.
Tags: Text Classification · Transformers · Spanish
Downloads: 1,708 · Likes: 6

## Erya4FT (RUCAIBox)
License: Apache-2.0
Erya4FT is a fine-tuned Classical Chinese translation model based on the Erya model, specializing in translating Classical Chinese into Modern Chinese.
Tags: Machine Translation · Transformers · Chinese
Downloads: 13 · Likes: 4

## Evaluation Xlm Roberta Model (loutchy)
A sentence-transformers model that maps sentences and paragraphs into a 768-dimensional dense vector space, suitable for tasks such as sentence-similarity calculation and semantic search.
Tags: Text Embedding · Transformers
Downloads: 22 · Likes: 0

## Rubert Tiny Bviolet (pouxie)
A sentence-embedding model based on sentence-transformers that maps text into a 312-dimensional vector space, suitable for tasks such as semantic search and text-similarity calculation.
Tags: Text Embedding · Transformers
Downloads: 46 · Likes: 2

## My Awesome Qa Model (vnktrmnb)
License: Apache-2.0
A question-answering model based on bert-base-multilingual-cased, fine-tuned on the SQuAD dataset.
Tags: Question Answering System · Transformers
Downloads: 14 · Likes: 0

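Extractive QA models like this one score every context token as a potential answer start and as a potential answer end; the predicted answer is the highest-scoring valid span. A toy sketch of that span-selection step (the tokens and logit values are hand-written stand-ins for real model output):

```python
def best_span(start_logits, end_logits, max_len=15):
    """Pick (start, end) maximizing start_logits[s] + end_logits[e],
    subject to s <= e < s + max_len (answers must be short forward spans)."""
    best, best_score = (0, 0), float("-inf")
    for s, s_score in enumerate(start_logits):
        for e in range(s, min(s + max_len, len(end_logits))):
            score = s_score + end_logits[e]
            if score > best_score:
                best_score, best = score, (s, e)
    return best

context_tokens = ["bert", "was", "released", "in", "2018"]
start, end = best_span([0.1, 0.0, 0.2, 0.1, 3.0], [0.0, 0.1, 0.0, 0.2, 4.0])
answer = " ".join(context_tokens[start:end + 1])
```

Production implementations keep the top-k starts and ends rather than scanning all pairs, but the scoring rule is the same.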
## Tamil Bert (l3cube-pune)
A BERT model trained on publicly available monolingual Tamil datasets, suitable for Tamil natural language processing tasks.
Tags: Large Language Model · Transformers · Other
Downloads: 331 · Likes: 3

## Setfit Model (rajistics)
A sentence-transformers model that maps sentences and paragraphs into a 768-dimensional vector space, for sentence-similarity calculation and semantic search tasks.
Tags: Text Embedding · Transformers
Downloads: 18 · Likes: 1

## Setfit ST ICD10 L3 (rjac)
A sentence-transformers model that maps sentences and paragraphs into a 768-dimensional dense vector space, suitable for tasks such as sentence-similarity calculation and semantic search.
Tags: Text Embedding · Transformers
Downloads: 14 · Likes: 0

## Qqp Nli Training Paraphrase Multilingual MiniLM L12 V2 (TingChenChang)
A sentence-similarity model based on sentence-transformers that maps text into a 384-dimensional vector space, suitable for semantic search and clustering tasks.
Tags: Text Embedding · Transformers
Downloads: 13 · Likes: 0

## Bert Base Han Chinese Ws (ckiplab)
License: GPL-3.0
Provides word segmentation for Classical Chinese, with training datasets covering four historical periods of the Chinese language.
Tags: Sequence Labeling · Transformers · Chinese
Downloads: 14 · Likes: 2

## Bpr Gpl Robust04 Base Msmarco Distilbert Tas B (income)
A sentence-transformers model that maps sentences and paragraphs into a 768-dimensional dense vector space, suitable for tasks such as clustering and semantic search.
Tags: Text Embedding · Transformers
Downloads: 27 · Likes: 0

## Bert Fc Base (aajrami)
A language model based on the BERT architecture that uses first-character prediction as its pre-training objective.
Tags: Large Language Model · Transformers
Downloads: 15 · Likes: 0

## Bert4ner Base Chinese (shibing624)
License: Apache-2.0
A BERT-based Chinese named entity recognition model that achieves near state-of-the-art performance on the People's Daily dataset.
Tags: Sequence Labeling · Transformers · Supports Multiple Languages
Downloads: 439 · Likes: 31

## Bert Base Buddhist Sanskrit (Matej)
A BERT-based masked language model designed for processing Buddhist Sanskrit texts.
Tags: Large Language Model · Transformers
Downloads: 31 · Likes: 3

## Bert Base NER (optimum)
License: MIT
A BERT-base named entity recognition model that identifies four entity types, including locations, organizations, and person names.
Tags: Sequence Labeling · Transformers · English
Downloads: 69 · Likes: 2

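Token-level NER models like the two above emit one BIO tag per token (`B-` begins an entity, `I-` continues it, `O` is outside any entity); turning those tags into entity spans takes a small decoding pass. A minimal sketch (the sentence and tag types are invented, but the BIO scheme itself is the standard one):

```python
def decode_bio(tokens, tags):
    """Group BIO-tagged tokens into (entity_text, entity_type) spans."""
    entities = []
    current_tokens, current_type = [], None
    for token, tag in zip(tokens, tags):
        if tag.startswith("B-"):
            if current_tokens:  # close any entity already in progress
                entities.append((" ".join(current_tokens), current_type))
            current_tokens, current_type = [token], tag[2:]
        elif tag.startswith("I-") and current_type == tag[2:]:
            current_tokens.append(token)  # continue the open entity
        else:  # "O", or an I- tag that doesn't match the open entity
            if current_tokens:
                entities.append((" ".join(current_tokens), current_type))
            current_tokens, current_type = [], None
    if current_tokens:
        entities.append((" ".join(current_tokens), current_type))
    return entities

ents = decode_bio(
    ["Angela", "Merkel", "visited", "New", "York"],
    ["B-PER", "I-PER", "O", "B-LOC", "I-LOC"],
)
```

Libraries usually wrap this step in an aggregation option, but doing it by hand makes the tag scheme explicit.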
## Trec News Distilbert Tas B Gpl Self Miner (GPL)
A sentence-transformers model that maps sentences and paragraphs into a 768-dimensional dense vector space, suitable for tasks such as sentence-similarity calculation and semantic search.
Tags: Text Embedding · Transformers
Downloads: 29 · Likes: 0

## Bert2bert L 24 Wmt De En (google)
License: Apache-2.0
A BERT-based encoder-decoder model designed for German-to-English machine translation.
Tags: Machine Translation · Transformers · Supports Multiple Languages
Downloads: 1,120 · Likes: 8

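Encoder-decoder translation models like this one generate the target sentence autoregressively: each step feeds the tokens produced so far back into the decoder and picks a next token until an end-of-sequence marker appears. A toy greedy-decoding loop (the lookup table below is a deliberately simplified stand-in for a real decoder, which would condition on the full prefix and the encoded source sentence):

```python
def greedy_decode(next_token_scores, bos="<s>", eos="</s>", max_len=10):
    """Repeatedly append the highest-scoring next token until EOS or max_len."""
    output = [bos]
    while len(output) < max_len:
        scores = next_token_scores.get(output[-1], {eos: 0.0})
        next_tok = max(scores, key=scores.get)
        output.append(next_tok)
        if next_tok == eos:
            break
    if output[-1] == eos:
        output = output[:-1]
    return output[1:]  # drop the BOS marker

# Toy "decoder": last generated token -> {candidate next token: score}.
table = {
    "<s>": {"good": 2.1, "well": 0.4},
    "good": {"morning": 1.7, "evening": 0.9},
    "morning": {"</s>": 3.0, "morning": 0.1},
}
translation = greedy_decode(table)
```

Real systems usually replace the greedy argmax with beam search, keeping the k best prefixes at each step, but the generate-append-repeat loop is the same.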
## My Awesome Model (mishig)
A pre-trained Transformer language model, suitable for various natural language processing tasks.
Tags: Text Classification · Transformers
Downloads: 15 · Likes: 0

## Rubert Base Cased Nli Threeway (cointegrated)
A Russian natural language inference model fine-tuned from DeepPavlov/rubert-base-cased that predicts the logical relationship (entailment/contradiction/neutral) between two texts.
Tags: Text Classification · Transformers · Other
Downloads: 144.68k · Likes: 34

## Chinese Pert Base (hfl)
PERT is a Chinese pre-trained model based on BERT, focused on improving Chinese text-processing capabilities.
Tags: Large Language Model · Transformers · Chinese
Downloads: 131 · Likes: 13

## Bert Base Cased Finetuned Mnli (gchhablani)
License: Apache-2.0
A text classification model based on bert-base-cased, fine-tuned on the GLUE MNLI dataset for natural language inference.
Tags: Text Classification · Transformers · English
Downloads: 84 · Likes: 2

## Rubert Base Cased Dp Paraphrase Detection (cointegrated)
A paraphrase detector developed at DeepPavlov and ported to the Transformers format, used to detect whether two Russian texts are paraphrases of each other.
Tags: Text Classification · Transformers · Other
Downloads: 39 · Likes: 4

## Model Paraphrase Multilingual MiniLM L12 V2 10 Epochs (jfarray)
A sentence-transformers model that maps sentences and paragraphs into a 384-dimensional dense vector space, suitable for tasks such as clustering and semantic search.
Tags: Text Embedding · Transformers
Downloads: 11 · Likes: 0

## KoBERT-LM (monologg)
License: Apache-2.0
KoBERT-LM is a pre-trained language model optimized for Korean, based on the BERT architecture and further pre-trained specifically on Korean text.
Tags: Large Language Model · Korean
Downloads: 49 · Likes: 1

## Bert Base Arabic (asafaya)
A pretrained Arabic BERT-base language model supporting Modern Standard Arabic and some dialects.
Tags: Large Language Model · Arabic
Downloads: 14.40k · Likes: 38

## Scibert Scivocab Uncased Squad V2 (ktrapeznikov)
A BERT-based pre-trained language model for the scientific domain, trained with a scientific-literature vocabulary.
Tags: Question Answering System
Downloads: 20 · Likes: 0

## Hate Speech Detector (risingodegua)
A model derived from the bert-based-uncased-hatespeech-movies model, used to classify text as normal, offensive, or hate speech.
Tags: Text Classification · Transformers · English
Downloads: 16 · Likes: 2

## Bert Turkish Question Answering (lserinol)
A BERT-based Turkish question-answering model designed for Turkish Q&A tasks.
Tags: Question Answering System · Other
Downloads: 186 · Likes: 23

## Bert Base Cased Spell Correction (murali1996)
A BERT-based spelling-correction model for detecting and correcting spelling errors in text.
Tags: Large Language Model
Downloads: 24 · Likes: 7

## Biomednlp PubMedBERT Base Uncased Abstract Fulltext Finetuned Pubmedqa 1 (blizrys)
License: MIT
A question-answering model based on PubMedBERT, fine-tuned on biomedical text and focused on the PubMedQA task.
Tags: Question Answering System · Transformers
Downloads: 31 · Likes: 0

## Guwen Sent (ethanyt)
License: Apache-2.0
A BERT-based sentiment classifier for classical Chinese poetry, specialized in analyzing emotional tendencies in ancient texts.
Tags: Text Classification · Transformers · Chinese
Downloads: 20 · Likes: 4

## HebEMO Fear (avichr)
HebEMO is a tool for detecting sentiment polarity and extracting emotions from modern Hebrew user-generated content; it was trained on a unique COVID-19-related dataset and reports strong performance.
Tags: Text Classification · Transformers
Downloads: 111 · Likes: 1

## Bert Base Multilingual Xquad (alon-albalak)
A multilingual question-answering model based on bert-base-multilingual-uncased, fine-tuned on the XQuAD dataset.
Tags: Question Answering System · Transformers · Other
Downloads: 24 · Likes: 0

## Guwen Ner (ethanyt)
License: Apache-2.0
A named entity recognition tool designed for Classical Chinese, capable of identifying named entities in ancient texts.
Tags: Sequence Labeling · Chinese
Downloads: 52 · Likes: 5

## Model Bert Base Multilingual Uncased 10 Epochs (jfarray)
A sentence-embedding model based on sentence-transformers that maps text into a 256-dimensional vector space, suitable for semantic search and clustering tasks.
Tags: Text Embedding
Downloads: 10 · Likes: 0

## Bert Base Turkish Ner Cased (savasy)
A BERT-based Turkish named entity recognition model for entity recognition in Turkish texts.
Tags: Sequence Labeling · Other
Downloads: 1,269 · Likes: 18

## Bert Base Uncased Few Shot K 1024 Finetuned Squad Seed 2 (anas-awadalla)
License: Apache-2.0
A question-answering model based on the BERT base model, fine-tuned on the SQuAD dataset and suited to few-shot learning scenarios.
Tags: Question Answering System · Transformers
Downloads: 16 · Likes: 0
