# Small-scale BERT
## Kyrgyzbert
A small-scale language model based on the BERT architecture, designed specifically for Kyrgyz natural language processing applications.
- **Author:** metinovadilet
- **License:** Apache-2.0
- **Task:** Large Language Model
- **Framework / Language:** Transformers · Other
- **Stats:** 79 · 2
## Vectorizer V1 S Multilingual
A multilingual vectorizer developed by Sinequa that generates embedding vectors for input paragraphs or queries, used for similarity calculation and information retrieval.
- **Author:** sinequa
- **Task:** Text Embedding
- **Framework / Language:** Transformers · Multilingual
- **Stats:** 322 · 0
## Vectorizer V1 S En
A vectorizer developed by Sinequa capable of generating embedding vectors from paragraphs or queries for sentence-similarity computation and feature extraction.
- **Author:** sinequa
- **Task:** Text Embedding
- **Framework / Language:** Transformers · English
- **Stats:** 304 · 0
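Both Sinequa vectorizers above map a paragraph or query to a fixed-length embedding vector; the "similarity calculation" such models support is typically cosine similarity between those vectors. A minimal sketch in plain NumPy (the vectors below are toy stand-ins for illustration, not real model output):

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Toy stand-ins for embeddings a vectorizer might produce.
query_vec = np.array([0.2, 0.8, 0.1])
doc_vecs = {
    "doc_a": np.array([0.1, 0.9, 0.0]),
    "doc_b": np.array([0.9, 0.1, 0.3]),
}

# Rank documents by similarity to the query, highest first.
ranked = sorted(
    doc_vecs,
    key=lambda d: cosine_similarity(query_vec, doc_vecs[d]),
    reverse=True,
)
print(ranked)  # doc_a points in nearly the same direction as the query
```

In a retrieval setting the same ranking step runs over precomputed document embeddings, so only the query needs to be vectorized at search time.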
## Bert Small Kor V1
A Korean foundation model based on the BERT architecture, trained on Korean text from the AI Hub web corpus (approximately 52 million texts).
- **Author:** bongsoo
- **License:** Apache-2.0
- **Task:** Large Language Model
- **Framework / Language:** Transformers · Multilingual
- **Stats:** 41 · 1
## Gn Bert Small Cased
A BERT model pretrained for Guarani (6 layers, case-sensitive), trained on Wikipedia and Wiktionary (approx. 800k tokens).
- **Author:** mmaguero
- **License:** MIT
- **Task:** Large Language Model
- **Framework / Language:** Transformers · Other
- **Stats:** 26 · 0
## Bertinho Gl Small Cased
A pretrained BERT model for Galician (6 layers, case-sensitive), trained on Wikipedia.
- **Author:** dvilares
- **Task:** Large Language Model
- **Language:** Other
- **Stats:** 56 · 2