# Mixed precision training
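
The models listed under this tag are associated with mixed precision training, i.e. running most forward/backward arithmetic in float16 or bfloat16 while keeping numerically sensitive operations and master weights in float32 to save memory and speed up training. As an illustration only, and not tied to any model listed below, here is a minimal PyTorch sketch of such a loop; the toy model, synthetic data, and hyperparameters are placeholders.

```python
# Minimal mixed-precision training sketch (PyTorch). The model, data, and
# hyperparameters are illustrative placeholders, not any model on this page.
import torch
import torch.nn as nn

device = "cuda" if torch.cuda.is_available() else "cpu"

model = nn.Sequential(nn.Linear(128, 256), nn.ReLU(), nn.Linear(256, 10)).to(device)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# GradScaler scales the loss so small fp16 gradients do not underflow.
scaler = torch.cuda.amp.GradScaler(enabled=(device == "cuda"))

for step in range(100):
    x = torch.randn(32, 128, device=device)
    y = torch.randint(0, 10, (32,), device=device)

    optimizer.zero_grad(set_to_none=True)

    # autocast runs eligible ops in reduced precision while keeping
    # numerically sensitive ops in float32.
    with torch.autocast(device_type=device, enabled=(device == "cuda")):
        logits = model(x)
        loss = loss_fn(logits, y)

    scaler.scale(loss).backward()   # backward pass on the scaled loss
    scaler.step(optimizer)          # unscales gradients, then steps the optimizer
    scaler.update()                 # adjusts the loss scale for the next step
```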

## Kanana Nano 2.1B Instruct GGUF
Kanana Nano 2.1B Instruct is a 2.1-billion-parameter instruction-tuned language model developed by Kakao, supporting English and Korean text-generation tasks.
Tags: Large Language Model, Supports Multiple Languages
Author: Melvin56

## My Frugal Audio Model
License: Apache-2.0
An audio-processing model fine-tuned from facebook/wav2vec2-base, mainly used for speech-related tasks.
Tags: Audio Classification, Transformers
Author: hsalehILB

## Fillmaskmodel
License: MIT
A fill-mask model fine-tuned from xlm-roberta-base for predicting masked text segments.
Tags: Large Language Model, Transformers
Author: Okyx

## Vit Base Patch16 224 Wi2
License: Apache-2.0
A Vision Transformer fine-tuned from google/vit-base-patch16-224, suitable for image-classification tasks.
Tags: Image Classification, Transformers
Author: Imene

## Vit Base Patch16 224 In21k Wr
License: Apache-2.0
A Vision Transformer fine-tuned from google/vit-base-patch16-224-in21k on an unspecified dataset, primarily used for image-classification tasks.
Tags: Image Classification, Transformers
Author: Imene