# Downstream Task Adaptation
- **Qwen 0.6B Gptq** · Wfiles · Large Language Model · Transformers · 29 downloads · 1 like
  A transformers model hosted on the Hugging Face Hub; no detailed information about its specific functions and uses is provided.
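The "Gptq" in the name refers to post-training weight quantization. As a simplified illustration only (not the actual GPTQ algorithm, which additionally minimizes layer output error against calibration data), here is a per-row round-to-nearest 4-bit quantize/dequantize sketch in NumPy; all names are our own:

```python
import numpy as np

def quantize_rtn_4bit(w):
    """Per-row symmetric round-to-nearest 4-bit quantization.

    A simplified stand-in for GPTQ-style weight quantization; real GPTQ
    also minimizes layer output error using calibration activations.
    """
    scale = np.abs(w).max(axis=1, keepdims=True) / 7.0  # map row max to int4 level 7
    q = np.clip(np.round(w / scale), -8, 7).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    return q.astype(np.float32) * scale

rng = np.random.default_rng(0)
w = rng.standard_normal((4, 16)).astype(np.float32)  # toy weight matrix
q, scale = quantize_rtn_4bit(w)
w_hat = dequantize(q, scale)
print(float(np.abs(w - w_hat).max()))  # per-element error is at most scale/2
```

The weights are stored as int8 here for simplicity; a real 4-bit format would pack two values per byte.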
- **Qwen2.5 7b SFT Three Subtasks 3epoch** · mjfmark · Large Language Model · Transformers · 97 downloads · 1 like
  A model based on the 🤗 transformers library; its specific functions and purposes are not yet clearly stated.
- **Test Transformers Model Hogehoge** · fztkm · Large Language Model · Transformers · 14 downloads · 0 likes
  A transformers model hosted on the Hugging Face Hub; its specific functions and uses are not clearly stated.
- **C-RADIOv2-G** · nvidia · Other license · 648 downloads · 11 likes
  C-RADIOv2 is a visual feature extraction model developed by NVIDIA, offered in multiple sizes and suitable for image understanding and dense vision tasks.
- **C-RADIOv2-B** · nvidia · Other license · 404 downloads · 8 likes
  C-RADIOv2 is a visual feature extraction model developed by NVIDIA, offered in multiple sizes and suitable for image understanding and dense vision tasks.
- **Dac 44khz** · descript · Large Language Model · Transformers · 192.61k downloads · 6 likes
  A feature extraction model based on 🤗 transformers; its specific functions and uses require further information.
- **Accent Classifier** · ylacombe · Large Language Model · Transformers · 131 downloads · 4 likes
  An automatically generated 🤗 transformers model card hosted on the Hugging Face Hub.
- **C-RADIO** · nvidia · Other license · 398 downloads · 14 likes
  A visual feature extraction model developed by NVIDIA for generating image embeddings, supporting downstream tasks such as image classification.
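One minimal way to use image embeddings like C-RADIO's for downstream classification is a nearest-centroid probe: average the embeddings per class, then assign new images to the most cosine-similar centroid. A sketch on synthetic vectors standing in for real model outputs (all helper names are our own):

```python
import numpy as np

def fit_centroids(embeddings, labels):
    """Mean embedding per class; a cheap linear probe over frozen features."""
    classes = np.unique(labels)
    centroids = np.stack([embeddings[labels == c].mean(axis=0) for c in classes])
    return classes, centroids

def predict(embeddings, classes, centroids):
    """Assign each embedding to the class with the most cosine-similar centroid."""
    e = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    c = centroids / np.linalg.norm(centroids, axis=1, keepdims=True)
    return classes[(e @ c.T).argmax(axis=1)]

rng = np.random.default_rng(1)
# synthetic "image embeddings": two well-separated clusters
x0 = rng.normal(0.0, 0.1, (20, 8)) + 1.0
x1 = rng.normal(0.0, 0.1, (20, 8)) - 1.0
X = np.vstack([x0, x1])
y = np.array([0] * 20 + [1] * 20)

classes, cents = fit_centroids(X, y)
pred = predict(X, classes, cents)
print((pred == y).mean())  # accuracy on the toy data
```

With real embeddings one would typically fit on a held-out training split; the point is only that frozen feature extractors pair with very simple classifiers.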
- **Dart V2 Vectors** · p1atdev · Apache-2.0 · Large Language Model · Transformers · 71 downloads · 1 like
  An automatically generated 🤗 Transformers model card hosted on the Hugging Face Hub.
- **Vits Tts Thai** · chuubjak · Large Language Model · Transformers · 64 downloads · 2 likes
  A transformers model hosted on the Hugging Face Hub; its specific functions and usage require additional information.
- **Distilbert Feature Extraction** · julien-c · Large Language Model · Transformers · 2,223 downloads · 2 likes
  DistilBERT is a lightweight distilled version of BERT, retaining 97% of BERT's performance while being 40% smaller.
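A feature-extraction model like DistilBERT emits one vector per token; downstream tasks usually pool these into a single embedding, masking out padding. A minimal mean-pooling sketch over synthetic arrays (the helper is our own, not part of transformers; with the real model you would pool `last_hidden_state` under the attention mask):

```python
import numpy as np

def mean_pool(token_embeddings, attention_mask):
    """Average token vectors, ignoring padded positions.

    token_embeddings: (batch, seq_len, dim); attention_mask: (batch, seq_len)
    with 1 for real tokens and 0 for padding.
    """
    mask = attention_mask[:, :, None].astype(np.float32)
    summed = (token_embeddings * mask).sum(axis=1)
    counts = np.clip(mask.sum(axis=1), 1e-9, None)  # avoid division by zero
    return summed / counts

# synthetic stand-in for a model's last_hidden_state
emb = np.arange(24, dtype=np.float32).reshape(2, 3, 4)
mask = np.array([[1, 1, 0],   # second sequence has one padding token
                 [1, 1, 1]])
pooled = mean_pool(emb, mask)
print(pooled.shape)  # (2, 4): one fixed-size embedding per input
```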
- **Distilroberta Base V2** · typeform · Apache-2.0 · Large Language Model · English · 22 downloads · 0 likes
  DistilRoBERTa is a lightweight distilled version of RoBERTa, retaining most of its performance with fewer parameters, making it suitable for efficient text processing tasks.
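Distilled models such as DistilRoBERTa and DistilBERT are typically trained with a knowledge-distillation objective: KL divergence between temperature-softened teacher and student distributions, scaled by T² (Hinton et al.'s formulation). A minimal NumPy sketch of that loss, not the actual training code for any model listed here:

```python
import numpy as np

def softmax(z, T=1.0):
    """Temperature-softened softmax over the last axis."""
    z = z / T
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, T=2.0):
    """KL(teacher || student) on softened distributions, scaled by T**2."""
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    kl = (p * (np.log(p + 1e-12) - np.log(q + 1e-12))).sum(axis=-1)
    return (T ** 2) * kl.mean()

teacher = np.array([[2.0, 0.5, -1.0]])
student = np.array([[1.5, 0.7, -0.5]])
print(distillation_loss(student, teacher))       # positive: distributions differ
print(distillation_loss(teacher, teacher))       # 0.0: identical distributions
```

In practice this term is combined with the ordinary cross-entropy on hard labels (and, for DistilBERT, a hidden-state cosine loss).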