E.star.7.b
A 7B-parameter large language model based on the Mistral architecture, efficiently trained with the Unsloth and TRL libraries, with strong results across multiple benchmarks.
Downloads 86
Release Time: 3/24/2024
Model Overview
This is a text generation model fine-tuned from yam-peleg/Experiment26-7B, focused on delivering high-quality text generation.
Model Features
Efficient Training
Trained with the Unsloth and TRL libraries, achieving roughly a 2x training speedup
Multitask Performance
Strong results on multiple benchmarks, including the AI2 Reasoning Challenge (ARC) and HellaSwag
Open Source License
Released under the Apache 2.0 license, permitting commercial use
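The training setup described above (Unsloth plus TRL for roughly 2x faster fine-tuning) can be sketched as follows. This is a minimal illustration, not the model's actual training script: the dataset schema, prompt format, LoRA hyperparameters, and sequence length are assumptions, and the exact `SFTTrainer` argument names vary across TRL versions.

```python
# Minimal sketch of an Unsloth + TRL fine-tuning setup, as described above.
# Hyperparameters, dataset schema, and field names are illustrative assumptions.

def format_example(example: dict) -> str:
    """Turn an instruction/response pair into plain SFT text (assumed schema)."""
    return (
        f"### Instruction:\n{example['instruction']}\n\n"
        f"### Response:\n{example['response']}"
    )

def train_sketch(train_dataset):
    # Imports are local so the sketch is readable without these libraries installed.
    from unsloth import FastLanguageModel
    from trl import SFTTrainer
    from transformers import TrainingArguments

    # Load the base model in 4-bit; Unsloth patches it for faster training.
    model, tokenizer = FastLanguageModel.from_pretrained(
        model_name="yam-peleg/Experiment26-7B",  # base model named in this card
        max_seq_length=2048,  # assumed value
        load_in_4bit=True,
    )
    # Attach LoRA adapters (rank and alpha are assumptions, not the card's values).
    model = FastLanguageModel.get_peft_model(model, r=16, lora_alpha=16)

    trainer = SFTTrainer(
        model=model,
        tokenizer=tokenizer,  # argument name differs in newer TRL versions
        train_dataset=train_dataset,
        formatting_func=format_example,
        args=TrainingArguments(output_dir="outputs", per_device_train_batch_size=2),
    )
    trainer.train()
```

The LoRA-plus-4-bit combination is what lets a 7B model fine-tune on a single consumer GPU; the speedup claim in this card comes from Unsloth's patched attention and kernel implementations, not from anything in this sketch.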
Model Capabilities
Text Generation
Question Answering
Reasoning Tasks
Knowledge QA
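A question-answering call against a model like this might look like the following sketch using the Hugging Face transformers `pipeline` API. The Hub repo id placeholder and the prompt template are assumptions; check the model page for the actual repo id and chat format.

```python
# Hedged sketch: text generation / QA with this model via transformers.

MODEL_ID = "E.star.7.b"  # placeholder — substitute the model's actual Hub repo id

def build_prompt(question: str) -> str:
    """Wrap a question in a simple instruction template (assumed format)."""
    return f"### Question:\n{question}\n\n### Answer:\n"

def answer(question: str, max_new_tokens: int = 64) -> str:
    # Import locally so the helper above is usable without transformers installed.
    from transformers import pipeline

    # Loads ~14 GB of fp16 weights on first call; a GPU is strongly recommended.
    generator = pipeline("text-generation", model=MODEL_ID, device_map="auto")
    out = generator(build_prompt(question), max_new_tokens=max_new_tokens)
    return out[0]["generated_text"]
```

For reasoning tasks, the same call applies; only the prompt changes, e.g. passing a logic puzzle as the question.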
Use Cases
Education
Knowledge QA
Answering questions across various academic disciplines
Achieved 63.44% accuracy on the MMLU benchmark
Research
Reasoning Tasks
Solving logical reasoning problems
Achieved 63.91% normalized accuracy on the AI2 Reasoning Challenge
Business
Content Generation
Generating various kinds of business text content
© 2025 AIbase