13B Thorns L2
Developed by CalderaAI
13B-Thorns is an instruction-following merge built on LLaMAv2-13B using the Alpaca prompt format, combining the strengths of several parent models to provide strong language processing capabilities.
Downloads: 386
Release Time: 9/6/2023
Model Overview
This model merges several carefully selected parent models via Spherical Linear Interpolation (SLERP), aiming to deliver more powerful and diverse language processing with particular attention to balancing logic and creativity.
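To make the merge method concrete, below is a minimal sketch of SLERP between two weight tensors, the interpolation step used when blending layers from two parent models. The function name, tensor arguments, and interpolation factor are illustrative assumptions and not CalderaAI's exact merge recipe.

```python
import torch

def slerp(w_a: torch.Tensor, w_b: torch.Tensor, t: float, eps: float = 1e-8) -> torch.Tensor:
    """Spherically interpolate between two weight tensors (t=0 -> w_a, t=1 -> w_b)."""
    a = w_a.flatten().float()
    b = w_b.flatten().float()
    # Angle between the two weight directions
    a_dir = a / (a.norm() + eps)
    b_dir = b / (b.norm() + eps)
    omega = torch.acos(torch.clamp(torch.dot(a_dir, b_dir), -1.0, 1.0))
    if omega.abs() < eps:
        # Nearly parallel weights: fall back to plain linear interpolation
        return (1.0 - t) * w_a + t * w_b
    so = torch.sin(omega)
    blended = (torch.sin((1.0 - t) * omega) / so) * a + (torch.sin(t * omega) / so) * b
    return blended.reshape(w_a.shape).to(w_a.dtype)

# Example: blend one layer's weights, weighting model A at 60%
# merged_layer = slerp(model_a_layer, model_b_layer, t=0.4)
```

Compared with plain linear averaging, interpolating along the arc between the two weight directions preserves the magnitude and geometry of each parent's weights more faithfully, which is why SLERP is a common choice for model merging.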
Model Features
Multi-model integration
Merges multiple carefully selected models using Spherical Linear Interpolation (SLERP), giving a smooth transition in weight space (see the sketch above)
Balance between logic and creativity
A segmented layer design optimizes separately for logical reasoning and for creative output
Low-rank adapter fusion
LoRAs are strategically fused into the parent models that benefit most from them, strengthening specific capabilities (see the sketch after this list)
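The LoRA fusion step amounts to folding a low-rank update into a base weight matrix. Below is a minimal sketch of that operation; the shapes, scaling convention, and variable names are illustrative assumptions and do not reproduce the specific adapters merged into 13B-Thorns.

```python
import torch

def merge_lora(base_w: torch.Tensor,   # (out_features, in_features) base weight
               lora_a: torch.Tensor,   # (rank, in_features) down-projection
               lora_b: torch.Tensor,   # (out_features, rank) up-projection
               alpha: float,
               rank: int) -> torch.Tensor:
    """Fold the LoRA update W + (alpha / rank) * B @ A into the base weights."""
    scale = alpha / rank
    delta = (lora_b.float() @ lora_a.float()) * scale
    return (base_w.float() + delta).to(base_w.dtype)

# Example with a hypothetical attention projection:
# merged = merge_lora(w_q_proj, lora_a_q, lora_b_q, alpha=16.0, rank=8)
```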
Model Capabilities
Instruction following
Text generation
Logical reasoning
Creative writing
Use Cases
Research purposes
Language model research
Studying the effects of multi-model merging methods and low-rank adapters
Text generation
Creative writing
Generate diverse content using the model's creative segments
Instruction response
Follows complex instructions and generates coherent, logical responses (an Alpaca prompt-format sketch follows this list)
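Since the model was trained to the Alpaca instruction format, prompts should be wrapped in that template. Below is a minimal sketch of a prompt builder; the wrapper function and example instruction are illustrative, while the "### Instruction:" / "### Response:" layout follows the standard Alpaca convention named in the model description.

```python
def build_alpaca_prompt(instruction: str, user_input: str = "") -> str:
    """Wrap an instruction (and optional input) in the Alpaca prompt template."""
    if user_input:
        return (
            "Below is an instruction that describes a task, paired with an input "
            "that provides further context. Write a response that appropriately "
            "completes the request.\n\n"
            f"### Instruction:\n{instruction}\n\n"
            f"### Input:\n{user_input}\n\n"
            "### Response:\n"
        )
    return (
        "Below is an instruction that describes a task. Write a response that "
        "appropriately completes the request.\n\n"
        f"### Instruction:\n{instruction}\n\n"
        "### Response:\n"
    )

# prompt = build_alpaca_prompt("Write a short poem about thorns.")
```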