
OPT-350M

Developed by Facebook
OPT is an open-source pre-trained Transformer language model developed by Meta AI, with parameter scales ranging from 125 million to 175 billion, designed to advance research in large-scale language models.
Downloads: 314.14k
Release time: 5/11/2022

Model Overview

OPT is a family of decoder-only pre-trained Transformer models trained with a causal language modeling objective. The models support text generation and can be fine-tuned for downstream tasks.
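As a minimal sketch of using this checkpoint for text generation, the snippet below loads facebook/opt-350m through the Hugging Face transformers library (assumed installed, along with torch; the `generate` helper name is illustrative). The import is done lazily inside the function so the sketch stays cheap to load.

```python
def generate(prompt: str, max_new_tokens: int = 30) -> str:
    """Greedy decoding with OPT-350m; downloads the checkpoint on first call."""
    # Imported lazily so the sketch can be defined without transformers present.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained("facebook/opt-350m")
    model = AutoModelForCausalLM.from_pretrained("facebook/opt-350m")
    inputs = tokenizer(prompt, return_tensors="pt")
    output_ids = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)

if __name__ == "__main__":
    print(generate("Large language models are"))
```

The same code works for any OPT size by swapping the checkpoint name, since all OPT models share the decoder-only architecture and tokenizer family.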

Model Features

Open Research Orientation
Aims to lower the barriers to large language model research, promoting reproducibility and community engagement
GPT-3 Level Performance
Model scale and performance are comparable to GPT-3, but with more efficient data collection and training methods
Multiple Scale Options
Offers model choices ranging from 125 million to 175 billion parameters

Model Capabilities

Text Generation
Zero-shot Learning
Few-shot Learning
Downstream Task Fine-tuning
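Zero- and few-shot use amounts to conditioning the model on a task description or labeled examples in the prompt before the query. A hypothetical prompt-building helper (the function name and the sentiment-classification framing are illustrative, not part of the OPT release) might look like:

```python
def build_few_shot_prompt(examples, query):
    """Format (text, label) pairs as in-context examples, then append the query."""
    lines = [f"Review: {text}\nSentiment: {label}" for text, label in examples]
    lines.append(f"Review: {query}\nSentiment:")  # model completes the label
    return "\n\n".join(lines)

examples = [
    ("A wonderful, heartfelt film.", "positive"),
    ("Dull and far too long.", "negative"),
]
prompt = build_few_shot_prompt(examples, "An instant classic.")
print(prompt)
```

The resulting string is passed to the model as-is; the causal LM then continues the pattern, and the first generated token(s) serve as the predicted label.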

Use Cases

Text Generation
Content Creation
Generate articles, stories, or dialogue content
Can produce coherent text paragraphs
Educational Research
Language Model Research
Study biases, robustness, and other issues in large language models