AIbase

# Efficient Low-resource Training

## Llama Bodo Translation Model
License: Apache-2.0
A 4-bit quantized version of Meta-Llama-3.1-8B, fine-tuned for bidirectional Bodo-English translation and optimized with Unsloth for faster training.
Tags: Large Language Model · Transformers · Supports Multiple Languages
Author: Luson045 · 27 · 1
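
The 4-bit quantization mentioned in the card above can be illustrated with a minimal sketch. This is a generic symmetric round-to-nearest scheme for illustration only, not the model's actual quantization (Unsloth typically delegates 4-bit loading to bitsandbytes); all function names below are hypothetical.

```python
import numpy as np

def quantize_4bit(weights: np.ndarray):
    """Symmetric 4-bit quantization: map floats to integers in [-8, 7].
    Illustrative sketch, not the scheme used by the model above."""
    scale = np.abs(weights).max() / 7.0  # one scale per tensor
    q = np.clip(np.round(weights / scale), -8, 7).astype(np.int8)
    return q, scale

def dequantize_4bit(q: np.ndarray, scale: float) -> np.ndarray:
    """Recover approximate float weights from 4-bit integers."""
    return q.astype(np.float32) * scale

rng = np.random.default_rng(0)
w = rng.normal(size=(4, 4)).astype(np.float32)
q, scale = quantize_4bit(w)
w_hat = dequantize_4bit(q, scale)
# Round-to-nearest bounds the reconstruction error by half a step.
assert np.abs(w - w_hat).max() <= scale / 2 + 1e-6
```

The point of schemes like this is memory: each weight needs 4 bits plus a shared scale, so an 8B-parameter model fits in roughly 4 GB instead of 16 GB at float16.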
## Suzume Llama 3 8B Japanese
License: Other
A Japanese fine-tuned model based on Llama 3, optimized for Japanese dialogue.
Tags: Large Language Model · Transformers
Author: lightblue · 2,011 · 24
## Distilbert Base Squad2 Custom Dataset
A DistilBERT-base model fine-tuned on SQuAD2.0 and a custom Q&A dataset, focused on efficient question-answering tasks.
Tags: Question Answering System · Transformers
Author: superspray · 17 · 0
© 2025 AIbase