# BERT architecture
## GoBERT
MIT · Protein Model · Safetensors · by MM-YY-WW · 479 downloads · 1 like

GoBERT is a model designed for general gene function prediction. It captures relationships between Gene Ontology (GO) functions by leveraging GO graph information.
## ModernBERT AI Detection
Apache-2.0 · Text Classification · English · by GeorgeDrayson · 16 downloads · 1 like

A machine-generated-text detection model built on the ModernBERT-base architecture, aimed at preventing language-model collapse.
## MorphBERT Large Morpheme Segmentation RU
Apache-2.0 · Sequence Labeling · Transformers · Other · by CrabInHoney · 16 downloads · 1 like

A large Transformer-based model for Russian morpheme segmentation, classifying each character of a Russian word into one of 25 morpheme categories.
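Per-character labeling of this kind is ordinarily run through the Transformers token-classification head. The sketch below is a hypothetical illustration only: the Hub id is assembled from the author and model name in this listing, and feeding the word as a list of characters is an assumed way to get one label per character, not the author's documented recipe.

```python
# Hypothetical sketch: per-character morpheme segmentation for a Russian word.
# The model id is inferred from this listing and may differ from the real one.
import torch
from transformers import AutoTokenizer, AutoModelForTokenClassification

model_id = "CrabInHoney/morphbert-large-morpheme-segmentation-ru"  # assumed id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForTokenClassification.from_pretrained(model_id)

word = "подводный"
# Pass the word as pre-split characters so each character receives a label.
inputs = tokenizer(list(word), is_split_into_words=True, return_tensors="pt")
with torch.no_grad():
    predictions = model(**inputs).logits.argmax(-1)[0]

for idx, word_id in enumerate(inputs.word_ids()):
    if word_id is not None:  # skip [CLS]/[SEP]
        print(word[word_id], model.config.id2label[predictions[idx].item()])
```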
## Log Classifier BERT V1
Text Classification · Transformers · English · by rahulm-selector · 25 downloads · 2 likes

A Transformers classification model built with BertForSequenceClassification, designed for network and device log mining.
## CuBERT 20210711 Python 1024
Apache-2.0 · Large Language Model · Transformers · Other · by claudios · 22 downloads · 1 like

CuBERT is a contextual embedding model for Python code, designed for source-code analysis tasks.
## Augment Sentiment Finnlp Th
Large Language Model · Transformers · by nlp-chula · 68 downloads · 3 likes

A Thai sentiment-analysis model fine-tuned from WangchanBERTa, achieving 74.15% accuracy on the evaluation set.
## DictaBERT
Large Language Model · Transformers · Other · by dicta-il · 50.83k downloads · 8 likes

A state-of-the-art BERT language-model suite for Modern Hebrew.
## Turkish Base BERT Uncased
Large Language Model · Transformers · Other · by ytu-ce-cosmos · 241 downloads · 16 likes

An uncased base BERT model for Turkish, primarily used for fill-mask tasks.
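Fill-mask usage typically goes through the Transformers pipeline. A minimal sketch, assuming the checkpoint is hosted on the Hub under an id assembled from this listing:

```python
# Fill-mask sketch for an uncased Turkish BERT; the model id is assumed.
from transformers import pipeline

fill = pipeline("fill-mask", model="ytu-ce-cosmos/turkish-base-bert-uncased")
for candidate in fill("bugün hava çok [MASK] ."):
    print(candidate["token_str"], round(candidate["score"], 3))
```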
## Zero-Shot Explicit Binary BERT
MIT · Text Classification · Transformers · English · by claritylab · 23 downloads · 0 likes

A zero-shot text-classification model based on the BERT architecture, trained for binary classification on the UTCD dataset using an explicit training strategy.
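The "explicit" binary strategy scores each candidate label independently as a (text, label) pair. The sketch below illustrates that idea under stated assumptions: the model id is assembled from this listing, and the authors ship their own toolkit, which may expect a different input format than this generic sequence-classification call.

```python
# Illustrative sketch of explicit zero-shot scoring: each (text, label) pair
# is fed to a binary classifier. Model id and positive-class index are assumed.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "claritylab/zero-shot-explicit-binary-bert"  # assumed id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

text = "I need to reset my account password."
for label in ["account help", "sports news", "cooking"]:
    inputs = tokenizer(text, label, return_tensors="pt")
    with torch.no_grad():
        probs = model(**inputs).logits.softmax(-1)[0]
    print(label, float(probs[1]))  # assumes index 1 is the "label applies" class
```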
## LatinBERT
Large Language Model · Transformers · by pnadel · 23 downloads · 1 like

LatinBERT is a pre-trained BERT model for the Latin language.
## BanglaBERT Finetuned SQuAD
Question Answering System · Transformers · by Naimul · 15 downloads · 0 likes

A version of BanglaBERT fine-tuned on the Bengali SQuAD dataset for question-answering tasks.
## RuBERT Large SQuAD 2
MIT · Question Answering System · Transformers · by Den4ikAI · 271 downloads · 4 likes

A Russian question-answering model trained from sberbank-ai/ruBert-base, suited to reading-comprehension tasks.
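Extractive QA models like this one are usually driven through the question-answering pipeline. A minimal sketch, with the Hub id assumed from this listing:

```python
# Extractive QA sketch; the model id is assembled from the listing and unverified.
from transformers import pipeline

qa = pipeline("question-answering", model="Den4ikAI/rubert_large_squad_2")
result = qa(
    question="Где расположена Эйфелева башня?",
    context="Эйфелева башня расположена в Париже, на Марсовом поле.",
)
print(result["answer"], round(result["score"], 3))
```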
## BERT Base Han Chinese POS
GPL-3.0 · Sequence Labeling · Transformers · Chinese · by ckiplab · 20 downloads · 1 like

This model provides POS tagging for Ancient Chinese, with training data covering four historical periods of the Chinese language.
## BERT Base Russian UPOS
Sequence Labeling · Transformers · Other · by KoichiYasuoka · 87 downloads · 4 likes

A BERT model pre-trained on UD_Russian for Russian POS tagging and dependency parsing.
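UPOS tagging maps directly onto the token-classification pipeline, one tag per token. A minimal sketch, assuming the Hub id assembled from this listing:

```python
# UPOS tagging sketch; model id assumed from author + model name in the listing.
from transformers import pipeline

tagger = pipeline("token-classification", model="KoichiYasuoka/bert-base-russian-upos")
for token in tagger("Мама мыла раму."):
    print(token["word"], token["entity"])
```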
## Model Paraphrase Multilingual MiniLM L12 V2 100 Epochs
Text Embedding · Transformers · by jfarray · 13 downloads · 0 likes

A sentence-transformers model that maps sentences and paragraphs into a 384-dimensional dense vector space, suitable for sentence-similarity computation and semantic search.
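The sentence-transformers workflow described above is a two-step encode-then-compare loop. The sketch below uses the base multilingual MiniLM checkpoint as a placeholder, since the exact Hub id of this fine-tune is not given in the listing; substitute the real id.

```python
# Sentence-similarity sketch with sentence-transformers (384-dim embeddings).
# The model id is a placeholder for the fine-tune described in the listing.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("sentence-transformers/paraphrase-multilingual-MiniLM-L12-v2")
sentences = ["A man is eating food.", "Un hombre está comiendo."]
embeddings = model.encode(sentences)  # shape: (2, 384)
print(util.cos_sim(embeddings[0], embeddings[1]))  # cosine similarity
```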
## RuBERT Base Cased Sentiment RuSentiment
Text Classification · Other · by blanchefort · 80.75k downloads · 12 likes

A sentiment-analysis model based on DeepPavlov/rubert-base-cased-conversational, trained on the RuSentiment dataset to label Russian text as neutral, positive, or negative.
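Three-way sentiment classification like this runs through the standard text-classification pipeline. A minimal sketch, with the Hub id assumed from this listing:

```python
# Sentiment classification sketch; model id assembled from the listing.
from transformers import pipeline

clf = pipeline(
    "text-classification",
    model="blanchefort/rubert-base-cased-sentiment-rusentiment",
)
print(clf("Очень понравился сервис, спасибо!"))  # expected: a POSITIVE-style label
```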
## BERT2BERT L-24 WMT En-De
Apache-2.0 · Machine Translation · Transformers · Multilingual · by google · 129 downloads · 5 likes

An encoder-decoder model based on the BERT architecture, designed for English-to-German machine translation.
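Because this is a BERT-to-BERT encoder-decoder, generation goes through the seq2seq API, with the BERT vocabulary's special tokens remapped at load time. The sketch below follows the pattern documented for Google's bert2bert WMT checkpoints; verify the id and token overrides against the model card before relying on it.

```python
# En->De translation sketch for a BERT2BERT encoder-decoder checkpoint.
# Special-token overrides map BERT's vocab onto seq2seq roles, per the model card.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "google/bert2bert_L-24_wmt_en_de"  # id assumed from the listing
tokenizer = AutoTokenizer.from_pretrained(
    model_id, pad_token="<pad>", eos_token="</s>", bos_token="<s>"
)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

sentence = "Would you like to grab a coffee with me this week?"
input_ids = tokenizer(sentence, return_tensors="pt", add_special_tokens=False).input_ids
output_ids = model.generate(input_ids)[0]
print(tokenizer.decode(output_ids, skip_special_tokens=True))
```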
## English PERT Base
Large Language Model · Transformers · English · by hfl · 37 downloads · 6 likes

PERT is a BERT-based pre-trained language model designed for English text-processing tasks.
## DanBERT Small Cased
Apache-2.0 · Large Language Model · Multilingual · by alexanderfalk · 18 downloads · 1 like

DanBERT is a Danish pre-trained model based on the BERT-Base architecture, trained on over two million Danish sentences.
## TILDE
Large Language Model · Transformers · by ielab · 134 downloads · 3 likes

TILDE is a BERT-based model used mainly for text retrieval and language modeling.
## SBERT Base Ja
Text Embedding · Japanese · by colorfulscoop · 537 downloads · 13 likes

A basic Sentence-BERT model for Japanese, fine-tuned from a BERT model for sentence-similarity computation.
## RuBERT Base Cased Sentiment RuReviews
Text Classification · Other · by blanchefort · 46 downloads · 5 likes

A sentiment-analysis model based on DeepPavlov/rubert-base-cased-conversational, trained on the RuReviews dataset for sentiment classification of Russian product reviews.
## SportsBERT
Large Language Model · by microsoft · 3,361 downloads · 24 likes

SportsBERT is a sports-domain BERT model trained on a corpus of sports news, suited to sports-related natural language processing tasks.
## Guwen Seg
Apache-2.0 · Sequence Labeling · Transformers · Chinese · by ethanyt · 20 downloads · 7 likes

A specialized tool for sentence segmentation in Classical Chinese, built on the BERT architecture.
## FlauBERT Base Uncased
MIT · Large Language Model · Transformers · French · by flaubert · 1,838 downloads · 3 likes

FlauBERT is a French BERT model trained on a large-scale French corpus, developed by the French National Centre for Scientific Research (CNRS).
## Ko-SBERT Multitask
Text Embedding · by jhgan · 7,030 downloads · 17 likes

A Korean sentence-embedding model based on sentence-transformers, mapping sentences and paragraphs into a 768-dimensional dense vector space.
## FERNET CC SK NER
Sequence Labeling · Transformers · Other · by crabz · 24 downloads · 1 like

A Slovak named-entity-recognition model fine-tuned on FERNET-CC_sk, recognizing three entity types: locations, persons, and organizations.
## BERT Base Chinese
GPL-3.0 · Large Language Model · Chinese · by ckiplab · 81.96k downloads · 26 likes

A Traditional Chinese BERT model developed by Academia Sinica's CKIP Lab, supporting natural language processing tasks.
## Chemical BERT Uncased TSDAE
Apache-2.0 · Text Embedding · Transformers · by recobo · 16 downloads · 0 likes

A chemistry-domain BERT model trained with TSDAE (Transformer-based Sequential Denoising Auto-Encoder), focused on sentence-similarity tasks.
## Model Dccuchile Bert Base Spanish Wwm Uncased 1 Epochs
Text Embedding · by jfarray · 8 downloads · 0 likes

A sentence-transformers embedding model that maps text into a 256-dimensional vector space, suitable for semantic search and clustering.
## IndoBERT Base P2
MIT · Large Language Model · Other · by indobenchmark · 25.89k downloads · 5 likes

IndoBERT is a state-of-the-art Indonesian language model based on BERT, trained with masked-language-modeling and next-sentence-prediction objectives.
## KoBERT
Apache-2.0 · Large Language Model · Korean · by monologg · 1.2M downloads · 14 likes

KoBERT is a Korean pre-trained language model based on the BERT architecture, suitable for various Korean natural language processing tasks.
## BERT Kor Base
Large Language Model · Korean · by kykim · 89.96k downloads · 31 likes

A Korean BERT base model trained on a 70GB Korean text dataset, using 42,000 lowercase subword units.
## BERT Base Chinese NER
GPL-3.0 · Sequence Labeling · Chinese · by ckiplab · 17.95k downloads · 117 likes

A Traditional Chinese Transformers model for named entity recognition, provided alongside CKIP Lab's natural language processing tools.
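NER is another token-classification task, with an aggregation step to merge word pieces back into whole entities. A minimal sketch, assuming the Hub id assembled from this listing (CKIP also publishes its own ckip-transformers toolkit):

```python
# Chinese NER sketch with entity grouping; model id assumed from the listing.
from transformers import pipeline

ner = pipeline(
    "token-classification",
    model="ckiplab/bert-base-chinese-ner",
    aggregation_strategy="simple",  # merge sub-tokens into whole entities
)
for entity in ner("中央研究院位於臺北市南港區。"):
    print(entity["entity_group"], entity["word"], round(entity["score"], 3))
```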
## RuBERTConv Toxic Classifier
Apache-2.0 · Text Classification · Transformers · Other · by IlyaGusev · 1,381 downloads · 13 likes

A Russian toxicity text classifier based on the rubert-base-cased-conversational model.
## BERT Base Fa QA
Question Answering System · by SajjadAyoubi · 115 downloads · 8 likes

A BERT-based question-answering model for Persian.
## BERT Base Thai UPOS
Apache-2.0 · Sequence Labeling · Transformers · Other · by KoichiYasuoka · 53.03k downloads · 1 like

A BERT model pre-trained on Thai Wikipedia text for POS tagging and dependency parsing.
## Danish BERT BotXO
Large Language Model · Other · by Maltehb · 1,306 downloads · 14 likes

A Danish BERT model developed by Certainly (formerly BotXO), supporting uncased Danish text processing.
## RuBERT Telegram Headlines
Apache-2.0 · Text Generation · Transformers · Other · by IlyaGusev · 241 downloads · 18 likes

A Russian news-headline generation model based on RuBERT, optimized for Telegram content.
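Headline generation with a RuBERT backbone implies a BERT encoder-decoder, so generation goes through EncoderDecoderModel rather than a plain language-model head. A minimal sketch under that assumption, with the Hub id assembled from this listing:

```python
# Headline-generation sketch for a RuBERT encoder-decoder; id assumed from the
# listing, and the checkpoint is assumed to ship a decoder_start_token_id.
from transformers import AutoTokenizer, EncoderDecoderModel

model_id = "IlyaGusev/rubert_telegram_headlines"  # assumed id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = EncoderDecoderModel.from_pretrained(model_id)

article = "Правительство объявило о новой программе поддержки малого бизнеса."
input_ids = tokenizer(article, return_tensors="pt", truncation=True, max_length=512).input_ids
output_ids = model.generate(input_ids, max_length=32, num_beams=5)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```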
## Chinese BERT WWM-Ext UPOS
Apache-2.0 · Sequence Labeling · Transformers · Multilingual · by KoichiYasuoka · 21 downloads · 8 likes

A BERT model pre-trained on Chinese Wikipedia texts for POS tagging and dependency parsing.