
# Long context reasoning

- **Qwen3 8B NEO Imatrix Max GGUF** — Apache-2.0 · DavidAU · Large Language Model · 178 downloads · 1 like
  A NEO Imatrix quantized version of the Qwen3-8B model, supporting a 32K-token context window with enhanced reasoning.
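The 32K context figure above is a hard budget shared by the prompt and the generated output. A toy helper for reasoning about that budget (`max_new_tokens` is illustrative, not part of any inference library):

```python
# Toy context-window budgeting for a 32K-context model such as the
# Qwen3-8B quant above. `max_new_tokens` is a hypothetical helper,
# not part of any inference library's API.
CTX_LEN = 32 * 1024  # 32K tokens

def max_new_tokens(prompt_tokens: int, ctx: int = CTX_LEN, reserve: int = 0) -> int:
    """Tokens left for generation once the prompt (and an optional
    safety reserve) are subtracted from the context window."""
    return max(ctx - prompt_tokens - reserve, 0)

print(max_new_tokens(30_000))   # → 2768
print(max_new_tokens(40_000))   # → 0 (prompt already overflows the window)
```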
- **Qwen3 0.6B** — Apache-2.0 · Qwen · Large Language Model · Transformers · 497.09k downloads · 264 likes
  Qwen3-0.6B is the latest-generation 0.6B-parameter large language model in the Tongyi Qianwen (Qwen) series. It supports switching between thinking and non-thinking modes and offers strong reasoning, instruction-following, and agent capabilities.
- **Smart Lemon Cookie 7B GGUF** — backyardai · Large Language Model · Transformers · 811 downloads · 9 likes
  An uncensored role-playing model distributed in GGUF format, with strong reasoning and context-tracking abilities, suited to local AI chat applications.
- **C4ai Command R Plus** — CohereLabs · Large Language Model · Transformers · Multilingual · 8,002 downloads · 1,719 likes
  Command R+ is an open-weights research model from Cohere Labs with 104 billion parameters. It offers retrieval-augmented generation (RAG) and tool use, and supports multilingual output and multi-step task automation.
© 2025 AIbase