Progenitor V3.3 LLaMa 70B
Developed by Tarek07
This project aims to produce a better-performing language model by merging multiple pre-trained 70B-scale models. It is built on the Llama 3.3 instruct model and uses the Linear DELLA merge method to combine them.
Downloads: 101
Release Date: 2/9/2025
Model Overview
This is a model-merging project based on Llama 3.3-70B-Instruct. By merging multiple high-performance 70B models, it aims to improve overall language-processing ability.
Model Features
Multi-model merging
Combines six different 70B-scale models, including Negative_LLAMA_70B and Anubis-70B-v1.
Advanced merge method
Uses the Linear DELLA merge method to combine the source models (see the configuration sketch after this list).
High-performance foundation
Based on the meta-llama/Llama-3.3-70B-Instruct model
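As a rough illustration of how such a merge might be set up, the sketch below writes a mergekit-style della_linear (Linear DELLA) configuration. The organization prefixes on the two source models, the weights, densities, and the epsilon/lambda values are illustrative assumptions, not the actual recipe used for Progenitor V3.3.

```python
# Sketch of a mergekit-style Linear DELLA (della_linear) configuration.
# Repo prefixes and all numeric parameters are illustrative assumptions,
# not the actual Progenitor V3.3 recipe.
import yaml  # PyYAML

merge_config = {
    "merge_method": "della_linear",  # Linear DELLA merge
    "base_model": "meta-llama/Llama-3.3-70B-Instruct",
    "models": [
        # Two of the six source models named in this card; repo prefixes assumed.
        {"model": "SicariusSicariiStuff/Negative_LLAMA_70B",
         "parameters": {"weight": 0.20, "density": 0.7}},
        {"model": "TheDrummer/Anubis-70B-v1",
         "parameters": {"weight": 0.20, "density": 0.7}},
        # ...the remaining four 70B source models would be listed the same way.
    ],
    "parameters": {"epsilon": 0.2, "lambda": 1.1},  # DELLA pruning/scaling knobs (assumed values)
    "dtype": "bfloat16",
}

with open("della_linear.yml", "w") as f:
    yaml.safe_dump(merge_config, f, sort_keys=False)

# The merge itself would then be run with mergekit's CLI, e.g.:
#   mergekit-yaml della_linear.yml ./Progenitor-V3.3 --cuda
```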
Model Capabilities
Text generation
Instruction understanding
Natural language processing
Use Cases
Text generation
Creative writing
Generate high-quality creative content such as articles and stories
Dialogue system
Build an intelligent dialogue assistant
Research application
Language model research
Used for research and experiments on model-merging techniques
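For the dialogue and creative-writing use cases above, a minimal inference sketch with Hugging Face transformers might look like the following. The repository id is inferred from the model name and may differ, and loading a 70B model this way requires substantial GPU memory or quantization.

```python
# Minimal text-generation sketch with Hugging Face transformers.
# The repo id below is inferred from the model name and may differ;
# a 70B model needs multiple GPUs or quantization to load like this.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Tarek07/Progenitor-V3.3-LLaMa-70B"  # assumed repository id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",  # spread the 70B weights across available GPUs
)

# Llama 3.3 instruct models use a chat template, so format the prompt with it.
messages = [
    {"role": "system", "content": "You are a helpful creative-writing assistant."},
    {"role": "user", "content": "Write a short opening paragraph for a mystery story."},
]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=256, do_sample=True, temperature=0.8)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```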