Arcana Qwen3 2.4B A0.6B

Developed by suayptalha
This is a Mixture of Experts (MoE) model based on Qwen3. It has 2.4 billion parameters in total, comprising four expert models of 0.6 billion parameters each, and is designed to deliver more accurate results with higher efficiency and lower memory usage.
Downloads: 199
Release Time: 5/12/2025

Model Overview

This model is a Mixture of Experts (MoE) model that combines expert models from four domains: code, mathematics, medicine, and instruction-following, making it suitable for multi-domain text generation tasks.
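As a minimal usage sketch, the model can be loaded with the Hugging Face transformers library. The repository ID below is an assumption inferred from the model name and may not match the actual Hub listing; the precision and device options are ordinary transformers settings.

# A minimal loading sketch; the repo ID is an assumption based on the model name.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "suayptalha/Arcana-Qwen3-2.4B-A0.6B"  # assumed repository ID

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",      # use the checkpoint's stored precision
    device_map="auto",       # place weights on the available devices
    trust_remote_code=True,  # the repo may ship custom MoE routing code
)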

Model Features

Mixture of Experts Model
Combines expert models from four domains (code, mathematics, medicine, and instruction-following) to provide more efficient and accurate text generation.
Efficient Inference
Dynamically selects experts through a routing model, reducing memory usage and computational cost (a simplified routing sketch follows this list).
Multi-Domain Support
Covers various tasks such as code generation, mathematical reasoning, medical Q&A, and general instruction-following.
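To make the routing idea concrete, here is a simplified top-1 gating layer of the kind commonly used in MoE models: a small linear router scores the four experts for each token, and only the highest-scoring expert is executed. This is an illustrative sketch, not the model's actual routing implementation.

# Illustrative top-1 MoE routing sketch (not this model's actual code).
import torch
import torch.nn as nn

class Top1Router(nn.Module):
    def __init__(self, hidden_size: int, num_experts: int = 4):
        super().__init__()
        self.gate = nn.Linear(hidden_size, num_experts)  # scores each expert
        self.experts = nn.ModuleList(
            nn.Linear(hidden_size, hidden_size) for _ in range(num_experts)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (tokens, hidden_size)
        scores = self.gate(x).softmax(dim=-1)    # (tokens, num_experts)
        weight, expert_idx = scores.max(dim=-1)  # best expert per token
        out = torch.zeros_like(x)
        for i, expert in enumerate(self.experts):
            mask = expert_idx == i
            if mask.any():                       # run only the selected expert
                out[mask] = expert(x[mask]) * weight[mask].unsqueeze(-1)
        return out

Because only one expert runs per token, the compute and activation memory per step scale with a single 0.6B expert rather than all four, which is the efficiency claim above.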

Model Capabilities

Code Generation
Mathematical Reasoning
Medical Q&A
Instruction-Following
Conditional Reasoning
Text Generation

Use Cases

Healthcare
Medical Q&A
Answer medical questions, such as symptom analysis and treatment recommendations.
Programming
Code Generation
Generate code snippets or solve programming problems based on requirements.
Mathematics
Mathematical Reasoning
Solve mathematical problems or perform logical reasoning.
General
Instruction-Following
Generate text responses that follow user instructions.
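As a hedged example tying these use cases together, the snippet below sends a single-turn prompt through the tokenizer's chat template and generates a response, reusing the model and tokenizer from the loading sketch above. The same pattern applies to code, math, medical, or general instructions.

# A generation sketch; `model` and `tokenizer` come from the loading example above.
prompt = "Write a Python function that checks whether a number is prime."
messages = [{"role": "user", "content": prompt}]

input_ids = tokenizer.apply_chat_template(
    messages,
    add_generation_prompt=True,
    return_tensors="pt",
).to(model.device)

output_ids = model.generate(input_ids, max_new_tokens=256)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))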