
# Gated IQ Reasoning Enhancement

**Model:** Llama3.1 MOE 4X8B Gated IQ Multi Tier Deep Reasoning 32B GGUF
**License:** Apache-2.0
**Description:** A Mixture-of-Experts (MoE) model built on the Llama 3.1 architecture, combining four 8B experts with gated-IQ routing and multi-tier deep reasoning. It supports a 128k-token context window and multiple languages.
**Tags:** Large Language Model · Supports Multiple Languages
**Author:** DavidAU
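
Because the model is distributed in GGUF format, it can be run locally with llama.cpp or its bindings. Below is a minimal sketch using llama-cpp-python; the file name and quantization level are assumptions, not part of this listing, so substitute whichever quant of the release you actually downloaded.

```python
from llama_cpp import Llama

# A minimal sketch of loading and querying the GGUF model locally.
llm = Llama(
    # Hypothetical file name; use the actual quant file you downloaded.
    model_path="Llama3.1-MOE-4X8B-Gated-IQ-Multi-Tier-Deep-Reasoning-32B-Q4_K_M.gguf",
    n_ctx=131072,     # the listing advertises a 128k context window
    n_gpu_layers=-1,  # offload all layers to GPU when one is available
)

out = llm(
    "Explain, step by step, why the sky appears blue.",
    max_tokens=512,
    temperature=0.7,
)
print(out["choices"][0]["text"])
```

Note that a full 128k context at this parameter count requires substantial RAM/VRAM; lowering `n_ctx` is a reasonable trade-off on smaller machines.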