
BoreanGale-70B

Developed by alchemonaut
BoreanGale-70B is a 70B-parameter large language model merged with a custom algorithm (NearSwap) that combines the miqu-1-70b-sf and WinterGoddess-1.4x-70B-L2 models.
Release Date: February 2, 2024

Model Overview

BoreanGale-70B is a high-performance large language model focused on text generation that performs strongly across several standard benchmarks.

Model Features

High-performance Text Generation
Excels on multiple benchmarks, including the AI2 Reasoning Challenge (ARC), HellaSwag, and MMLU.
Model Merging Technique
Uses the custom NearSwap algorithm to merge two high-performance models, aiming to combine their strengths.
Multi-quantization Version Support
The community provides various quantization versions, including GGUF and exl2 formats, for easy deployment across different hardware environments.
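The page does not reproduce the NearSwap algorithm itself, but the general idea behind this family of merges is a per-element interpolation that pulls the base model's weights toward the second model only where the two tensors nearly agree. The sketch below is illustrative, assuming a threshold parameter `t` and a function name `nearswap`; it is not claimed to be the author's exact implementation.

```python
import numpy as np

def nearswap(v0: np.ndarray, v1: np.ndarray, t: float) -> np.ndarray:
    """Blend v1 into v0, but only where the two tensors nearly agree.

    t caps the interpolation weight: elements whose absolute difference
    is below t are pulled strongly toward v1, while elements that differ
    by more than t keep most of their v0 value. (Illustrative sketch,
    not the model author's exact code.)
    """
    diff = np.abs(v0 - v1)
    with np.errstate(divide="ignore"):
        lweight = t / diff                      # large where tensors agree
    lweight = np.nan_to_num(lweight, nan=1.0, posinf=1.0)
    lweight = np.clip(lweight, 0.0, 1.0)        # interpolation weight in [0, 1]
    return v0 * (1.0 - lweight) + v1 * lweight  # elementwise lerp toward v1

# A merge would apply this tensor-by-tensor over the two checkpoints'
# state dicts with a small t (e.g. on the order of 1e-3).
```

With a small `t`, weights where the two models already agree are effectively swapped to the secondary model, while strongly divergent weights are left close to the base model, which is what makes this gentler than a uniform linear merge.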

Model Capabilities

Text Generation
Reasoning Ability
Common Sense Understanding
Mathematical Calculation

Use Cases

Academic Research
Scientific Problem Solving
Answers complex scientific questions
Achieved a normalized accuracy of 73.89 on the AI2 Reasoning Challenge (ARC)
Common Sense Reasoning
Common Sense Q&A
Answers questions requiring common sense judgment
Achieved a normalized accuracy of 89.37 on HellaSwag
Mathematical Problem Solving
Mathematical Calculation
Solves mathematical problems
Achieved an accuracy of 67.32 on GSM8K
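The "normalized accuracy" figures above come from likelihood-based multiple-choice evaluation: the model scores each candidate answer, and the score is normalized by the answer's length so longer options are not penalized simply for containing more tokens. The function below is a minimal sketch of that selection step; the function name and the exact normalization (dividing by length) are assumptions for illustration, not the evaluation harness's actual code.

```python
def pick_choice(choice_logprobs: list[float], choice_lengths: list[int]) -> int:
    """Return the index of the answer with the highest
    length-normalized log-likelihood.

    choice_logprobs: summed log-probability of each answer continuation.
    choice_lengths:  length of each continuation (e.g. bytes or tokens),
                     used to normalize so long answers aren't penalized
                     merely for having more tokens.
    """
    scores = [lp / max(n, 1) for lp, n in zip(choice_logprobs, choice_lengths)]
    return max(range(len(scores)), key=scores.__getitem__)
```

For example, a long answer with total log-probability -12.0 over 6 tokens (average -2.0) beats a short answer at -6.0 over 2 tokens (average -3.0), even though its raw log-probability is lower.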