
Tito 7B Slerp

Developed by Stopwolf
Tito-7B-slerp is a large language model created by merging the YugoGPT and AlphaMonarch-7B models with the mergekit tool; it performs strongly on both Serbian and English tasks.
Downloads: 22
Release Date: 2/28/2024

Model Overview

This model uses spherical linear interpolation (slerp) to combine the strengths of YugoGPT and AlphaMonarch-7B, delivering robust text generation and particularly strong results on Serbian-language tasks.

Model Features

Bilingual capability
Performs excellently in both English and Serbian tasks, with special optimization for Serbian language performance.
Model merging technology
Merges YugoGPT and AlphaMonarch-7B via spherical linear interpolation (slerp), combining the strengths of both parent models.
High performance
Achieves an average score of 70.13 on the Open LLM Leaderboard, surpassing the base model YugoGPT.
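The slerp method interpolates matched parameter tensors of the two parent models along an arc on the sphere rather than along a straight line, which better preserves the magnitude of the merged weights. A minimal sketch of the idea, assuming NumPy tensors; this is an illustration, not mergekit's actual implementation, and the function name and fallback threshold are assumptions:

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two parameter tensors.

    Illustrative sketch of a slerp merge; t=0 returns v0, t=1 returns v1.
    """
    v0_flat = v0.ravel().astype(np.float64)
    v1_flat = v1.ravel().astype(np.float64)
    n0 = np.linalg.norm(v0_flat)
    n1 = np.linalg.norm(v1_flat)
    # Angle between the two parameter vectors (clipped for numerical safety)
    dot = np.clip(np.dot(v0_flat / n0, v1_flat / n1), -1.0, 1.0)
    theta = np.arccos(dot)
    if theta < eps:
        # Nearly parallel vectors: fall back to plain linear interpolation
        return (1 - t) * v0 + t * v1
    # Standard slerp weights along the great-circle arc
    s0 = np.sin((1 - t) * theta) / np.sin(theta)
    s1 = np.sin(t * theta) / np.sin(theta)
    return (s0 * v0_flat + s1 * v1_flat).reshape(v0.shape)
```

In an actual merge this interpolation would be applied layer by layer across both checkpoints, with the interpolation factor t chosen per layer or per tensor group.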

Model Capabilities

Text generation
Question answering
Reasoning tasks
Multilingual processing

Use Cases

Education
Language learning assistant
Assists in learning Serbian and English
Achieved an average score of 47.82 in Serbian language evaluations.
Research
Benchmark testing
Used to evaluate multilingual model performance
Scored 68.09 in the AI2 Reasoning Challenge.