
Dolly V2 12b

Developed by Databricks
A 12-billion-parameter instruction-following large language model from Databricks, fine-tuned from Pythia-12b on roughly 15,000 instruction records, with commercial use permitted
Downloads: 3,270
Release Date: 4/11/2023

Model Overview

A causal language model based on Pythia-12b, fine-tuned on an instruction dataset written by Databricks employees and capable of following natural-language instructions
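
As a rough usage sketch, the model can be loaded through the Hugging Face transformers pipeline published on the databricks/dolly-v2-12b model card. The settings below (bfloat16 weights, device_map="auto", trust_remote_code=True to pick up the bundled instruction-following pipeline) are one reasonable configuration, not the only supported one.

```python
# Minimal sketch: load dolly-v2-12b via the Hugging Face transformers pipeline.
# Assumes transformers and accelerate are installed and a GPU with enough memory
# (the 12B model needs roughly 24 GB in bfloat16); the 3B/7B variants can be
# used instead by changing the model id.
import torch
from transformers import pipeline

generate_text = pipeline(
    model="databricks/dolly-v2-12b",
    torch_dtype=torch.bfloat16,
    trust_remote_code=True,   # loads the model's bundled instruction-following pipeline
    device_map="auto",
)

result = generate_text("Explain the difference between nuclear fission and nuclear fusion.")
print(result[0]["generated_text"])
```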

Model Features

Business-friendly License
Released under the MIT open-source license, which explicitly permits commercial use
Instruction Fine-tuning
Fine-tuned on roughly 15,000 human-generated instruction records covering seven task categories
Multiple Size Options
Available in 3B, 7B, and 12B parameter versions

Model Capabilities

Instruction Following
Brainstorming
Text Classification
Closed-ended QA
Content Generation
Information Extraction
Open-ended QA
Text Summarization
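
All of these capabilities are exercised through plain natural-language instructions; before generation, the instruction pipeline wraps the user text in a fixed prompt template. The sketch below reconstructs that template from the Dolly repository's instruct_pipeline, so the exact wording should be treated as an assumption and checked against the official code.

```python
# Sketch of the instruction prompt format used by Dolly's instruct pipeline
# (template text is an assumption; verify against instruct_pipeline.py in the
# databricks/dolly repository).
INTRO = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request."
)

def build_prompt(instruction: str) -> str:
    """Wrap a raw instruction in a Dolly-style prompt."""
    return f"{INTRO}\n\n### Instruction:\n{instruction}\n\n### Response:\n"

# Example: a text-classification instruction is built the same way as any other task.
print(build_prompt("Classify the following review as positive or negative: 'Great battery life.'"))
```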

Use Cases

Knowledge QA
Scientific Concept Explanation
Explain the difference between nuclear fission and nuclear fusion
Generates scientifically accurate comparative explanations
Information Processing
Person Information Extraction
Extract key personal information from a given text
Accurately identifies and summarizes the key points of a person's biography
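
As a usage example for the information-extraction case, the same generate_text pipeline defined in the loading sketch above can be called with an extraction instruction; the biography below is invented example input.

```python
# Hypothetical information-extraction call reusing the generate_text pipeline
# defined in the earlier loading sketch; the biography is illustrative input only.
bio = (
    "Ada Lovelace (1815-1852) was an English mathematician who worked with "
    "Charles Babbage on the Analytical Engine and is often credited with "
    "publishing the first computer program."
)

prompt = (
    "Extract the person's name, birth and death years, nationality, and main "
    f"contribution from the following text:\n\n{bio}"
)

result = generate_text(prompt)
print(result[0]["generated_text"])
```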