
UNA Xaberius 34B v1-BETA

Developed by fblgit
An experimental 34B model based on LLaMa-Yi-34B, trained on multiple datasets with SFT, DPO, and UNA techniques, with outstanding performance on the HuggingFace Open LLM Leaderboard.
Downloads: 97
Released: 12/5/2023

Model Overview

XABERIUS 34B v1-BETA is a 34B-parameter large language model developed by Juanako.AI. Trained with Unified Neural Alignment (UNA), it set new records for 34B models across multiple benchmarks.

Model Features

Unified Neural Alignment technology
Uses the novel UNA (Unified Neural Alignment) training method to substantially improve model performance
34B parameter scale
Delivers performance that surpasses larger models at the 34B parameter size
Multi-stage training
Combines SFT (Supervised Fine-Tuning), DPO (Direct Preference Optimization), and UNA during training
Leaderboard leadership
Achieved an average score of 74.21 on the HuggingFace Open LLM Leaderboard, ranking first at release
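The UNA technique itself is not publicly documented, but the DPO stage of the pipeline follows a well-known objective: push the policy to assign a higher (reference-adjusted) log-probability to the preferred response than to the rejected one. A minimal sketch of that loss for a single preference pair, with `beta=0.1` as a typical default (not a confirmed value for this model):

```python
import math

def dpo_loss(policy_logp_chosen, policy_logp_rejected,
             ref_logp_chosen, ref_logp_rejected, beta=0.1):
    """Direct Preference Optimization loss for one preference pair.

    Inputs are summed log-probabilities of the chosen/rejected responses
    under the policy being trained and under a frozen reference model.
    """
    chosen_reward = beta * (policy_logp_chosen - ref_logp_chosen)
    rejected_reward = beta * (policy_logp_rejected - ref_logp_rejected)
    margin = chosen_reward - rejected_reward
    # -log(sigmoid(margin)): shrinks as the policy prefers the chosen answer
    return -math.log(1.0 / (1.0 + math.exp(-margin)))
```

When the policy matches the reference exactly, the margin is zero and the loss is log 2; widening the gap in favor of the chosen response drives it toward zero.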

Model Capabilities

Text generation
Question answering
Dialogue
Reasoning tasks
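For the capabilities above, a minimal usage sketch with the `transformers` library, assuming the model is published on the Hugging Face Hub under the id `fblgit/una-xaberius-34b-v1beta` (verify the exact repo id, and note the prompt template here is an assumption; check the model card for the trained format):

```python
MODEL_ID = "fblgit/una-xaberius-34b-v1beta"  # assumed Hub repo id

def build_prompt(question: str) -> str:
    # Simple instruction-style prompt; the exact template the model was
    # trained with is an assumption -- confirm it on the model card.
    return f"### Instruction:\n{question}\n### Response:\n"

def generate(question: str, max_new_tokens: int = 256) -> str:
    # Imported lazily so the prompt helper works without transformers installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")
    inputs = tokenizer(build_prompt(question), return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, skipping the prompt.
    return tokenizer.decode(output[0][inputs["input_ids"].shape[1]:],
                            skip_special_tokens=True)
```

A 34B model in 16-bit precision needs roughly 70 GB of accelerator memory, so quantized loading is the practical route on a single GPU.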

Use Cases

Knowledge Q&A
AI2 Reasoning Challenge
Performs strongly on the AI2 Reasoning Challenge (ARC-Challenge) dataset
Normalized accuracy of 70.39 (25-shot)
General dialogue
Open-domain dialogue
Supports high-quality open-domain dialogue interactions
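The ARC-Challenge score above was measured 25-shot, meaning each test question is preceded by 25 solved demonstrations in the prompt. A sketch of how such a few-shot prompt is assembled; the exact rendering (labels, "Question:"/"Answer:" markers) is an assumption modeled on common evaluation-harness formats:

```python
def format_example(question, choices, answer=None):
    """Render one ARC-style multiple-choice item.

    `choices` is a list of (label, text) pairs; `answer` is the label of the
    correct choice, omitted for the item actually being evaluated.
    """
    lines = [f"Question: {question}"]
    lines += [f"{label}. {text}" for label, text in choices]
    lines.append(f"Answer: {answer}" if answer is not None else "Answer:")
    return "\n".join(lines)

def build_few_shot_prompt(demos, target, k=25):
    """Concatenate up to k solved demonstrations before the target question."""
    shots = [format_example(q, c, a) for q, c, a in demos[:k]]
    shots.append(format_example(*target))  # target has no answer attached
    return "\n\n".join(shots)
```

The model's completion after the final "Answer:" is then compared against the choice labels, with accuracy normalized by answer length for the acc_norm metric.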