# RoBERTa-large architecture models
## Polish Sts V2
A Polish-language sentence embedding model that maps sentences and paragraphs into a 1024-dimensional vector space, suitable for semantic search and clustering tasks.
Tags: Text Embedding · Transformers · Other

Maintainer: radlab
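A minimal usage sketch with the sentence-transformers library, assuming the checkpoint is published under the ID `radlab/polish-sts-v2` (the exact model ID is an assumption, not stated on this page):

```python
# Minimal sketch: encode Polish sentences and compare them by cosine similarity.
# The model ID "radlab/polish-sts-v2" is an assumption.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("radlab/polish-sts-v2")

sentences = [
    "Kot śpi na kanapie.",    # "The cat is sleeping on the couch."
    "Na sofie drzemie kot.",  # "A cat is napping on the sofa."
]

# Each sentence becomes a 1024-dimensional vector.
embeddings = model.encode(sentences)
print(embeddings.shape)  # (2, 1024)

# Semantically similar sentences should score close to 1.0.
print(util.cos_sim(embeddings[0], embeddings[1]))
```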
## Bertin Roberta Large Spanish
BERTIN is a series of Spanish language models based on BERT. This model follows the RoBERTa-large architecture and was trained from scratch with the Flax framework on the Spanish portion of the mC4 corpus.
Tags: Large Language Model · Spanish
Maintainer: flax-community
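Since this is a masked language model, a quick way to try it is the transformers fill-mask pipeline. A minimal sketch, assuming the model ID `flax-community/bertin-roberta-large-spanish`:

```python
# Minimal sketch: query the masked-LM head via the transformers pipeline.
# The model ID "flax-community/bertin-roberta-large-spanish" is an assumption.
from transformers import pipeline

fill_mask = pipeline(
    "fill-mask",
    model="flax-community/bertin-roberta-large-spanish",
)

# RoBERTa-style models use "<mask>" as the mask token.
for prediction in fill_mask("La capital de España es <mask>."):
    print(f'{prediction["token_str"]:>12}  {prediction["score"]:.3f}')
```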
## Roberta Large Bne Capitel Pos
License: Apache-2.0
A RoBERTa-large model trained on data from the Spanish National Library (BNE) and fine-tuned for Spanish part-of-speech (POS) tagging on the CAPITEL dataset.
Tags: Sequence Labeling · Transformers · Supports Multiple Languages

Maintainer: PlanTL-GOB-ES
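A tagger like this can be driven through the transformers token-classification pipeline. A minimal sketch, assuming the model ID `PlanTL-GOB-ES/roberta-large-bne-capitel-pos` (inferred from the names above, so treat it as an assumption):

```python
# Minimal sketch: Spanish POS tagging with the token-classification pipeline.
# The model ID "PlanTL-GOB-ES/roberta-large-bne-capitel-pos" is an assumption.
from transformers import pipeline

pos_tagger = pipeline(
    "token-classification",
    model="PlanTL-GOB-ES/roberta-large-bne-capitel-pos",
    aggregation_strategy="simple",  # merge sub-word pieces into whole words
)

for token in pos_tagger("El gato duerme en el sofá."):
    print(f'{token["word"]:>10}  {token["entity_group"]}')
```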