
CapyTessBorosYi-34B-200K-DARE-Ties

Developed by brucethemoose
This is a 34B-parameter large language model built on the Yi-34B-200K architecture and produced with mergekit's DARE Ties merge method. It combines three models: Nous-Capybara-34B, Tess-M-v1.3, and airoboros-3_1-yi-34b-200k.
Released: 11/28/2023

Model Overview

The model targets text generation. Its DARE Ties merge is reported to improve perplexity over a standard Ties merge, and it supports a context length of up to 200K tokens, which suits workloads that require long-text processing.

Model Features

DARE Ties Merging Technique
Uses the experimental DARE Ties merging method, which is reported to yield lower perplexity than a traditional Ties merge.
Long Context Support
Supports up to 200K context length, making it suitable for processing long documents and complex dialogue scenarios.
Multi-model Capability Fusion
Integrates the dialogue capabilities of Nous-Capybara-34B, the general capabilities of Tess-M-v1.3, and the long text processing capabilities of airoboros-3_1-yi-34b-200k.
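The merge procedure named above can be illustrated on toy arrays. The sketch below is a simplified reading of the two published ideas, not mergekit's actual implementation: DARE randomly drops elements of each model's task vector (its difference from the base model) and rescales the survivors by 1/(1-p), and TIES then elects a majority sign per parameter and averages only the deltas that agree with it. The function name, the per-model weights, and the final averaging rule are illustrative assumptions.

```python
import numpy as np

def dare_ties_merge(base, finetuned, weights, drop_p=0.5, seed=0):
    """Toy DARE Ties merge over flat parameter vectors (illustrative only).

    base      -- base model parameters (1-D numpy array)
    finetuned -- list of fine-tuned parameter arrays, same shape as base
    weights   -- per-model merge weights (assumed scheme, not mergekit's)
    drop_p    -- DARE drop probability; surviving entries are rescaled
    """
    rng = np.random.default_rng(seed)
    deltas = []
    for ft, w in zip(finetuned, weights):
        delta = ft - base                          # task vector
        mask = rng.random(delta.shape) >= drop_p   # DARE: random drop
        delta = delta * mask / (1.0 - drop_p)      # rescale survivors
        deltas.append(w * delta)
    deltas = np.stack(deltas)
    # TIES: elect the majority sign per parameter, keep agreeing deltas
    elected = np.sign(deltas.sum(axis=0))
    agree = np.sign(deltas) == elected
    kept = np.where(agree, deltas, 0.0)
    counts = np.maximum(agree.sum(axis=0), 1)      # avoid divide-by-zero
    return base + kept.sum(axis=0) / counts
```

For example, merging three hypothetical checkpoints might look like `merged = dare_ties_merge(base, [m1, m2, m3], [0.3, 0.3, 0.4])`; the real merge operates per weight tensor across a 34B-parameter model rather than on flat toy vectors.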

Model Capabilities

Long Text Generation
Dialogue Systems
Text Continuation
Instruction Following

Use Cases

Content Creation
Novel Writing
Leverages the 200K context length advantage for coherent long-form story creation
Generates ultra-long texts with consistent plotlines
Dialogue Systems
Complex Dialogue Scenarios
Handles complex dialogues with extensive context
Understands and responds to details within long conversation histories