
miqu-1-120b

Developed by wolfram
A 120B hybrid large language model created by interleaving layers of miqu-1-70b-sf with the mergekit tool, based on miqu-1-70b.
Downloads: 15
Release Date: 2/3/2024

Model Overview

This is a 120B-parameter large language model enhanced through merging techniques, supporting multilingual processing, and excelling in long-context understanding and complex dialogue interactions.

Model Features

Massive Parameter Scale
120B parameters provide stronger comprehension and generation capabilities
Long Context Support
Supports a 32764-token context window while maintaining strong contextual understanding
Multilingual Capabilities
Supports English, German, French, Spanish, and Italian
Diverse Quantized Versions
Offers multiple EXL2 and GGUF quantized versions to meet different hardware requirements
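The interleaving merge described above is typically expressed as a mergekit passthrough configuration that stacks overlapping layer ranges of the base model. The sketch below illustrates the general shape of such a config; the repository name and the exact layer ranges are assumptions for illustration, not the published recipe.

```yaml
# Illustrative mergekit passthrough config for a 120B frankenmerge
# built from a 70B (80-layer) base. Layer ranges are assumed, not
# taken from the official miqu-1-120b recipe.
merge_method: passthrough
slices:
  - sources:
      - model: 152334H/miqu-1-70b-sf   # assumed source repo
        layer_range: [0, 20]
  - sources:
      - model: 152334H/miqu-1-70b-sf
        layer_range: [10, 30]
  - sources:
      - model: 152334H/miqu-1-70b-sf
        layer_range: [20, 40]
  - sources:
      - model: 152334H/miqu-1-70b-sf
        layer_range: [30, 50]
  - sources:
      - model: 152334H/miqu-1-70b-sf
        layer_range: [40, 60]
  - sources:
      - model: 152334H/miqu-1-70b-sf
        layer_range: [50, 70]
  - sources:
      - model: 152334H/miqu-1-70b-sf
        layer_range: [60, 80]
dtype: float16
```

With overlapping 20-layer slices like these, seven slices of an 80-layer 70B model yield a 140-layer stack, which is roughly where the ~120B parameter count comes from.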

Model Capabilities

Long Text Generation
Multilingual Dialogue
Complex Instruction Understanding
Context Retention
Creative Writing

Use Cases

Dialogue Systems
AI Assistant
Can serve as a ChatGPT alternative for personal assistance
User evaluations place its comprehension close to GPT-4 level
Content Creation
Creative Writing
Generates coherent long-form creative content
Maintains coherence and stylistic consistency in long texts