Skywork SWE 32B - A Community Model by Skywork
Part of the LM Studio Community models highlights program, which showcases new and remarkable models from the community. Join the discussion on Discord.
This model aims to provide high-performance solutions for software engineering tasks, leveraging advanced code-generation capabilities.
🚀 Quick Start
No quick-start guide is provided in the original model card; a hedged example follows below.
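As a minimal sketch only: the example below assumes the model is loaded in LM Studio with its OpenAI-compatible local server running on the default port (1234). The model identifier string is an assumption; use the exact name shown in your local model list.

```python
# Hedged sketch: query the model through LM Studio's OpenAI-compatible
# local server (default base URL http://localhost:1234/v1).
from openai import OpenAI

client = OpenAI(base_url="http://localhost:1234/v1", api_key="lm-studio")

response = client.chat.completions.create(
    model="skywork-swe-32b",  # assumed identifier; copy the name from your model list
    messages=[
        {"role": "system", "content": "You are a careful software engineering assistant."},
        {"role": "user", "content": "Write a Python function that reverses a singly linked list."},
    ],
    temperature=0.2,
)
print(response.choices[0].message.content)
```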
✨ Features
- Quantization: Quantized by bartowski based on llama.cpp release b5697.
- Context Length: Supports a context length of 32k tokens.
- Benchmark Performance: Achieves 38.0% pass@1 accuracy on the SWE-bench Verified benchmark, outperforming previous open-source SoTA LLMs. With test-time scaling techniques, the accuracy can reach 47.0%.
📦 Installation
No installation guide is provided in the original model card; a hedged download sketch follows below.
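GGUF quantizations are typically fetched from the quantization provider's Hugging Face repository. The `repo_id` and `filename` below are assumptions; check bartowski's Hugging Face page for the exact repository name and the quantization level you want.

```python
# Hedged sketch: download one GGUF quantization file with huggingface_hub.
# repo_id and filename are assumptions; verify them on the provider's page.
from huggingface_hub import hf_hub_download

model_path = hf_hub_download(
    repo_id="bartowski/Skywork_Skywork-SWE-32B-GGUF",  # assumed repository name
    filename="Skywork_Skywork-SWE-32B-Q4_K_M.gguf",    # assumed quant filename
    local_dir="models",
)
print(f"Saved to: {model_path}")
```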
💻 Usage Examples
No code examples are provided in the original model card; a hedged usage sketch follows below.
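The sketch assumes the GGUF file downloaded above and the llama-cpp-python bindings; adjust the model path, context size, and GPU offload to your hardware.

```python
# Hedged sketch: run the quantized model locally with llama-cpp-python.
# The model path is an assumption; n_ctx matches the model's 32k context window.
from llama_cpp import Llama

llm = Llama(
    model_path="models/Skywork_Skywork-SWE-32B-Q4_K_M.gguf",  # assumed path
    n_ctx=32768,      # full 32k context window
    n_gpu_layers=-1,  # offload all layers to GPU when available
)

result = llm.create_chat_completion(
    messages=[
        {"role": "user", "content": "Add type hints and a docstring to: def add(a, b): return a + b"},
    ],
    max_tokens=512,
    temperature=0.2,
)
print(result["choices"][0]["message"]["content"])
```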
📚 Documentation
Model Information
| Property | Details |
|---|---|
| Model Creator | Skywork |
| Original Model | [Skywork-SWE-32B](https://huggingface.co/Skywork/Skywork-SWE-32B) |
| GGUF Quantization Provider | bartowski |
| Quantization Base | llama.cpp release b5697 |
| Pipeline Tag | text-generation |
| Base Model | Skywork/Skywork-SWE-32B |
| License | apache-2.0 |
| Base Model Relation | quantized |
| Tags | swe-bench |
| Metrics | pass@1 |
Model Description
Skywork-SWE-32B is a code agent model developed by Skywork AI, specifically designed for software engineering (SWE) tasks.
🔧 Technical Details
- Context Length: Supports a context length of 32k tokens.
- Benchmark Performance: Skywork-SWE-32B attains 38.0% pass@1 accuracy on the SWE-bench Verified benchmark, outperforming previous open-source SoTA [Qwen2.5-Coder-32B-based](https://huggingface.co/Qwen/Qwen2.5-Coder-32B) LLMs built on the [OpenHands](https://github.com/All-Hands-AI/OpenHands) agent framework. When combined with test-time scaling techniques, performance further improves to 47.0% accuracy, surpassing the previous SoTA results for sub-32B parameter models. (A sketch of the pass@k estimator behind this metric follows after this list.)
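For readers unfamiliar with the metric, the sketch below shows the standard unbiased pass@k estimator commonly used to report such scores; pass@1 reduces to the fraction of tasks solved on a single attempt. The sample counts are illustrative and are not drawn from the Skywork evaluation.

```python
# Hedged sketch of the standard unbiased pass@k estimator:
# per task, draw n samples, count c correct, and average 1 - C(n-c, k) / C(n, k).
from math import comb


def pass_at_k(n: int, c: int, k: int) -> float:
    """Unbiased estimate of pass@k for one task with n samples, c of them correct."""
    if n - c < k:
        return 1.0
    return 1.0 - comb(n - c, k) / comb(n, k)


# Illustrative only: three tasks, 4 samples each, with 2, 0, and 1 correct samples.
results = [(4, 2), (4, 0), (4, 1)]
score = sum(pass_at_k(n, c, k=1) for n, c in results) / len(results)
print(f"pass@1 = {score:.3f}")  # (0.5 + 0.0 + 0.25) / 3 = 0.250
```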
📄 License
The model is licensed under the apache-2.0 license.
Special thanks
Special thanks to Georgi Gerganov and the whole team working on llama.cpp for making all of this possible.
Disclaimers
⚠️ Important Note
LM Studio is not the creator, originator, or owner of any Model featured in the Community Model Program. Each Community Model is created and provided by third parties. LM Studio does not endorse, support, represent or guarantee the completeness, truthfulness, accuracy, or reliability of any Community Model. You understand that Community Models can produce content that might be offensive, harmful, inaccurate or otherwise inappropriate, or deceptive. Each Community Model is the sole responsibility of the person or entity who originated such Model. LM Studio may not monitor or control the Community Models and cannot, and does not, take responsibility for any such Model. LM Studio disclaims all warranties or guarantees about the accuracy, reliability or benefits of the Community Models. LM Studio further disclaims any warranty that the Community Model will meet your requirements, be secure, uninterrupted or available at any time or location, or error-free, viruses-free, or that any errors will be corrected, or otherwise. You will be solely responsible for any damage resulting from your use of or access to the Community Models, your downloading of any Community Model, or use of any other Community Model provided by or through LM Studio.