AIbase
# Multi-expert fusion

## Eagle X4 8B Plus

Eagle is a vision-centric, high-resolution multimodal large language model family that strengthens the perception ability of multimodal LLMs by fusing multiple vision encoders and multiple input resolutions.

- Tags: Multimodal Fusion, Transformers
- Publisher: NVEagle
- 1,699 · 4
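The fusion of multiple vision encoders described above can be sketched as channel-level feature concatenation: each encoder embeds the same image, the per-token features are concatenated, and a projection maps them into the LLM's hidden width. This is a minimal illustrative sketch, not Eagle's actual architecture; the encoder class, sizes, and token counts are all assumptions.

```python
# Hypothetical sketch of multi-encoder visual feature fusion:
# several encoders embed the same image, their features are concatenated
# along the channel dimension, then projected to the LLM width.
# All names and dimensions here are illustrative, not Eagle's real config.
import torch
import torch.nn as nn

class ToyEncoder(nn.Module):
    """Stand-in for a real vision encoder (CLIP/ConvNeXt-style), illustrative only."""
    def __init__(self, out_dim, tokens=16):
        super().__init__()
        self.out_dim, self.tokens = out_dim, tokens
        self.net = nn.Linear(3 * 32 * 32, out_dim * tokens)

    def forward(self, image):
        b = image.shape[0]
        return self.net(image.flatten(1)).view(b, self.tokens, self.out_dim)

class FusedVisionTower(nn.Module):
    def __init__(self, encoders, llm_dim):
        super().__init__()
        self.encoders = nn.ModuleList(encoders)
        total_dim = sum(e.out_dim for e in encoders)
        # One linear projection from the fused channels to the LLM width.
        self.proj = nn.Linear(total_dim, llm_dim)

    def forward(self, image):
        # Each encoder yields (batch, tokens, channels); fuse on channels.
        feats = [enc(image) for enc in self.encoders]
        fused = torch.cat(feats, dim=-1)
        return self.proj(fused)

tower = FusedVisionTower([ToyEncoder(64), ToyEncoder(96)], llm_dim=128)
out = tower(torch.randn(2, 3, 32, 32))
print(out.shape)  # torch.Size([2, 16, 128])
```

Concatenation keeps each encoder's information intact and lets the projection learn how to weight them, which is why it is a common baseline for combining heterogeneous vision backbones.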
## Noro Hermes 3x7B

Noro-Hermes-3x7B is a Mixture-of-Experts (MoE) model built with the LazyMergekit toolkit, combining three 7B-parameter Mistral variants to cover intelligent assistance, creative role-play, and general task processing.

- Tags: Large Language Model, Transformers
- License: Apache-2.0
- Publisher: ThomasComics
- 16 · 1
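A merged 3-expert MoE of this kind routes each token through a learned gate that picks the top-scoring experts and mixes their outputs. The sketch below shows that routing pattern in miniature; the layer sizes, top-2 choice, and expert MLP shape are assumptions for illustration, not the model's real 7B configuration.

```python
# Minimal sketch of token-level MoE routing over three experts:
# a gate scores each expert per token, the top-2 are selected, and their
# outputs are blended with softmax-renormalized weights.
# Dimensions are toy-sized, not the actual 3x7B configuration.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MoELayer(nn.Module):
    def __init__(self, dim, num_experts=3, top_k=2):
        super().__init__()
        self.gate = nn.Linear(dim, num_experts, bias=False)
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(dim, 4 * dim), nn.SiLU(), nn.Linear(4 * dim, dim))
            for _ in range(num_experts)
        )
        self.top_k = top_k

    def forward(self, x):                     # x: (tokens, dim)
        scores = self.gate(x)                 # (tokens, num_experts)
        weights, idx = scores.topk(self.top_k, dim=-1)
        weights = F.softmax(weights, dim=-1)  # renormalize over chosen experts
        out = torch.zeros_like(x)
        for k in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = idx[:, k] == e         # tokens routed to expert e in slot k
                if mask.any():
                    out[mask] += weights[mask, k, None] * expert(x[mask])
        return out

layer = MoELayer(dim=32)
y = layer(torch.randn(5, 32))
print(y.shape)  # torch.Size([5, 32])
```

Because only the top-k experts run per token, inference cost stays close to a single expert's while total parameter count is roughly three times larger.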
© 2025 AIbase