
DareBeagle-7B

Developed by shadowml
DareBeagle-7B is a 7B-parameter large language model created by merging mlabonne/NeuralBeagle14-7B and mlabonne/NeuralDaredevil-7B with LazyMergekit. The merged model performs strongly across multiple benchmarks.
Release date: January 16, 2024

Model Overview

DareBeagle-7B is a merged model that combines the strengths of NeuralBeagle14-7B and NeuralDaredevil-7B. It targets text generation tasks and scores competitively on the Open LLM Leaderboard.
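Merges of this kind are declared in a mergekit YAML file. The sketch below shows the general shape of a SLERP merge between the two parent models; the interpolation values and layer range are illustrative placeholders, not the model's actual published configuration:

```yaml
slices:
  - sources:
      - model: mlabonne/NeuralBeagle14-7B
        layer_range: [0, 32]
      - model: mlabonne/NeuralDaredevil-7B
        layer_range: [0, 32]
merge_method: slerp
base_model: mlabonne/NeuralBeagle14-7B
parameters:
  t:
    # Illustrative values: interpolation weight per layer group,
    # with separate schedules for attention and MLP sub-layers.
    - filter: self_attn
      value: [0, 0.5, 0.3, 0.7, 1]
    - filter: mlp
      value: [1, 0.5, 0.7, 0.3, 0]
    - value: 0.5
dtype: bfloat16
```

The `filter` entries are what allow different merging parameters for the self_attn and mlp layers, as described under Model Features below.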

Model Features

Model Merging Technique
Merges the two parent models with the SLERP (spherical linear interpolation) method, combining their respective strengths
High Performance
Achieves strong results across multiple benchmarks, with an average score of 74.58
Flexible Layer Configuration
Applies different interpolation parameters to the self_attn and mlp layers to optimize model performance
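Unlike plain weight averaging, SLERP interpolates along the arc between two parameter tensors, preserving their magnitudes' geometry. A minimal, self-contained sketch of the underlying math, operating on plain Python lists rather than real model weights, is:

```python
import math

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two weight vectors.

    t=0 returns v0, t=1 returns v1; intermediate t moves along the
    great-circle arc between the directions of v0 and v1.
    """
    # Cosine of the angle between the two vectors.
    norm0 = math.sqrt(sum(x * x for x in v0))
    norm1 = math.sqrt(sum(x * x for x in v1))
    dot = sum(a * b for a, b in zip(v0, v1)) / (norm0 * norm1)
    dot = max(-1.0, min(1.0, dot))  # guard against rounding error

    theta = math.acos(dot)  # angle between the vectors
    if abs(theta) < eps:
        # Nearly parallel vectors: fall back to linear interpolation.
        return [(1 - t) * a + t * b for a, b in zip(v0, v1)]

    sin_theta = math.sin(theta)
    w0 = math.sin((1 - t) * theta) / sin_theta
    w1 = math.sin(t * theta) / sin_theta
    return [w0 * a + w1 * b for a, b in zip(v0, v1)]

# Example: interpolate halfway between two toy "weight" vectors.
merged = slerp(0.5, [1.0, 0.0], [0.0, 1.0])
```

In an actual merge, mergekit applies this interpolation tensor by tensor across both checkpoints, with `t` chosen per layer group via the config's `filter` rules.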

Model Capabilities

Text Generation
Question Answering
Reasoning Tasks
Knowledge QA

Use Cases

Education
Knowledge QA
Answers questions across a range of academic disciplines
65.03% accuracy on the MMLU benchmark
Research
Reasoning Tasks
Solves complex reasoning problems
71.67% normalized accuracy on the AI2 Reasoning Challenge (ARC)
Business Applications
Math Problem Solving
Handles mathematical calculation and reasoning problems
71.49% accuracy on the GSM8K benchmark