
Goku 8x22B V0.1

Developed by MaziyarPanahi
A multilingual large language model fine-tuned from Mixtral-8x22B-v0.1, with 141B total parameters and 35B active parameters per token
Downloads: 35
Release date: 4/12/2024
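The efficiency implied by the figures above can be quantified with simple arithmetic: only about a quarter of the model's weights participate in any single forward pass. A minimal sketch, using the parameter counts quoted on this page:

```python
# Back-of-the-envelope check of the sparsity figures quoted above:
# 141B total parameters, 35B active per token (numbers from this card).
total_params = 141e9
active_params = 35e9

active_fraction = active_params / total_params
print(f"{active_fraction:.1%} of parameters active per token")  # ~24.8%
```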

Model Overview

This is a Mixture-of-Experts (MoE) model fine-tuned on a guanaco-sharegpt-style dataset, supporting multilingual text-generation tasks

Model Features

Mixture-of-Experts Architecture
Combines 8 expert networks per layer, routing each token to only a small subset of them during inference for efficient computation
Multilingual Support
Native support for French, Italian, German, Spanish, and English
Instruction Fine-tuning
Optimized on the guanaco-sharegpt-style dataset to enhance dialogue and instruction-following capabilities
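The routing behavior described under the architecture feature can be sketched in a few lines. This is a toy illustration of top-k expert routing as used in Mixtral-style MoE layers, not the model's actual implementation; the expert count (8) matches this card, while the hidden size and top_k=2 are illustrative assumptions:

```python
# Toy sketch of Mixture-of-Experts top-k routing: a gating network scores
# all experts, but only the top-k are executed for each token.
import numpy as np

rng = np.random.default_rng(0)

NUM_EXPERTS = 8   # experts per MoE layer, as stated on this card
TOP_K = 2         # experts activated per token (illustrative assumption)
D_MODEL = 16      # toy hidden size, not the real model dimension

# Each "expert" is reduced to a single linear map for illustration.
experts = [rng.normal(size=(D_MODEL, D_MODEL)) for _ in range(NUM_EXPERTS)]
router = rng.normal(size=(D_MODEL, NUM_EXPERTS))  # gating network

def moe_forward(x):
    """Route token vector x to its top-k experts and mix their outputs."""
    logits = x @ router                    # router score per expert
    top = np.argsort(logits)[-TOP_K:]      # indices of the top-k experts
    # Softmax over only the selected experts' scores.
    w = np.exp(logits[top] - logits[top].max())
    w /= w.sum()
    # Weighted sum of the chosen experts' outputs; the others stay idle.
    return sum(wi * (x @ experts[i]) for wi, i in zip(w, top)), top

token = rng.normal(size=D_MODEL)
out, chosen = moe_forward(token)
print(sorted(chosen.tolist()))  # only TOP_K of the 8 experts ran for this token
```

Because only the selected experts run, compute per token scales with the active parameter count rather than the total, which is why a 141B-parameter model can infer at roughly the cost of a much smaller dense model.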

Model Capabilities

Multilingual Text Generation
Long Text Understanding
Programming Code Generation
Basic Reasoning
Story Creation

Use Cases

Content Creation
Story Generation
Generates coherent long-form narrative texts
As demonstrated in the Dragon Ball-themed story example
Technical Applications
Code Assistance
Generates and explains programming code