# Multi-task Support
## Seed Coder 8B Instruct V1
A general-purpose text generation model built on the Transformers library, supporting various natural language processing tasks.
Large Language Model · Transformers · wasmdashai · 57 · 2
## YOYO O1 32B
YOYO-O1-32B is a versatile large language model that integrates top-tier 32B reasoning and code models from the open-source community, built using SCE fusion technology.
Large Language Model · Transformers · YOYO-AI · 25 · 2
## Mxbai Rerank Large V2 Seq
A multilingual sentence-transformer model suited to text-ranking (reranking) tasks.
Apache-2.0 · Large Language Model · Transformers · Multilingual · michaelfeil · 210 · 8
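A reranker like this scores each (query, document) pair and sorts documents best-first. The sketch below assumes the model can be loaded as a sentence-transformers `CrossEncoder` and that the hub ID is `michaelfeil/mxbai-rerank-large-v2-seq` (inferred from the card's name and author; verify both before use):

```python
# Sketch: cross-encoder reranking. The model ID below is an assumption
# based on the card; check the hub for the exact identifier.

def rank_by_score(docs, scores):
    """Pair each document with its score and sort best-first."""
    return sorted(zip(docs, scores), key=lambda pair: pair[1], reverse=True)

def rerank(query, docs):
    # Heavy import kept local so the helper above stays dependency-free.
    from sentence_transformers import CrossEncoder
    model = CrossEncoder("michaelfeil/mxbai-rerank-large-v2-seq")
    scores = model.predict([(query, doc) for doc in docs])
    return rank_by_score(docs, scores)

if __name__ == "__main__":
    hits = rerank("how does self-attention work?",
                  ["Transformers compute attention over all tokens.",
                   "Pasta recipes for beginners."])
    for doc, score in hits:
        print(f"{score:.3f}  {doc}")
```

The relevant document should surface first; the raw scores are model-specific and only meaningful for ordering.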
## Phien Table Structure Recognition 143 V 1 1 All
A model based on the Transformers architecture; its specific purpose and functionality are not documented.
Large Language Model · Transformers · trungphien · 13 · 0
## Sam2 Hiera Small
SAM2-Hiera Small is a compact variant of Meta's SAM2 model for efficient mask-generation tasks.
Apache-2.0 · Image Segmentation · merve · 44 · 1
## Instantid
InstantID is an advanced, tuning-free method that achieves identity-preserving generation with just a single image, supporting multiple downstream tasks.
Apache-2.0 · Image Generation · English · InstantX · 86.99k · 783
## Has 820m
A privacy-protection model developed by Tencent Security Xuanwu Lab that safeguards user privacy by hiding sensitive information and restoring it in the output.
Large Language Model · Transformers · Multilingual · SecurityXuanwuLab · 2,730 · 24
## Chinese Roberta Wwm Ext Large
A Chinese pre-trained BERT model employing the whole-word-masking strategy, designed to accelerate Chinese natural language processing research.
Apache-2.0 · Large Language Model · Chinese · hfl · 30.27k · 200
## Bertin Base Xnli Es
A pre-trained model based on the Spanish RoBERTa-base architecture, fine-tuned on the XNLI dataset, with training data quality improved via Gaussian sampling.
Large Language Model · Transformers · Spanish · bertin-project · 20 · 1
## Hebert NER
HeBERT is a Hebrew pretrained language model based on the BERT architecture, supporting tasks such as polarity analysis and sentiment recognition.
Large Language Model · Transformers · avichr · 435 · 5
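A named-entity variant like this one is typically run through the Transformers `token-classification` pipeline, which tags spans in the input text. The hub ID `avichr/heBERT_NER` below is an assumption inferred from the card's author and name:

```python
# Sketch: Hebrew NER via the token-classification pipeline.
# The model ID is an assumption based on the card; verify it on the hub.

def group_entities(predictions):
    """Reduce pipeline output dicts to (word, entity label) pairs."""
    return [(p["word"], p["entity_group"]) for p in predictions]

def tag(text):
    # Heavy import kept local so the helper above stays dependency-free.
    from transformers import pipeline
    ner = pipeline("token-classification",
                   model="avichr/heBERT_NER",
                   aggregation_strategy="simple")
    return group_entities(ner(text))
```

With `aggregation_strategy="simple"`, subword pieces are merged so each returned pair covers a whole entity span rather than individual tokens.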
## Bert Base Greek Uncased V1
GreekBERT is a pre-trained language model for Greek, suitable for various Greek natural language processing tasks.
Large Language Model · Other · nlpaueb · 5,984 · 37
## Chinese Roberta Wwm Ext
A Chinese pretrained BERT model using whole-word-masking technology, designed to accelerate the development of Chinese natural language processing.
Apache-2.0 · Large Language Model · Chinese · hfl · 96.54k · 324
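A masked-language model like this is most directly exercised through the `fill-mask` pipeline, which predicts candidates for a `[MASK]` token. The hub ID `hfl/chinese-roberta-wwm-ext` below matches the card's author and name but should be confirmed on the hub:

```python
# Sketch: fill-mask with a whole-word-masking Chinese BERT.
# The model ID is inferred from the card; verify it before relying on it.

def top_tokens(predictions, k=3):
    """Keep the k highest-scoring candidate tokens from fill-mask output."""
    ranked = sorted(predictions, key=lambda p: p["score"], reverse=True)
    return [p["token_str"] for p in ranked[:k]]

def fill(text):
    # Heavy import kept local so the helper above stays dependency-free.
    from transformers import pipeline
    fill_mask = pipeline("fill-mask", model="hfl/chinese-roberta-wwm-ext")
    return top_tokens(fill_mask(text))
```

For example, `fill("今天天气很[MASK]。")` would return the model's top candidate characters for the masked position.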