GPT-NeoX-20B

Developed by EleutherAI
GPT-NeoX-20B is an open-source autoregressive language model with 20 billion parameters, modeled on the GPT-3 architecture and trained on The Pile dataset.
Downloads: 345.06k
Release date: 4/7/2022

Model Overview

GPT-NeoX-20B is a large English language model primarily intended for research purposes, capable of generating coherent text and learning internal language representations.
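"Autoregressive" means the model predicts each token from the tokens that came before it. A minimal greedy-decoding sketch over a toy bigram table can illustrate the loop; the vocabulary and probabilities below are invented for illustration and have nothing to do with the actual model's weights:

```python
# Toy bigram "language model": next-token scores conditioned only on the
# previous token. Purely illustrative, not GPT-NeoX-20B's actual distribution.
BIGRAM = {
    "the": {"model": 0.6, "pile": 0.4},
    "model": {"generates": 0.9, "the": 0.1},
    "generates": {"text": 1.0},
}

def greedy_decode(prompt_token, max_new_tokens=3):
    tokens = [prompt_token]
    for _ in range(max_new_tokens):
        dist = BIGRAM.get(tokens[-1])
        if dist is None:  # no known continuation: stop generating
            break
        # Autoregressive step: pick the most likely next token
        # given the sequence so far (here, just the last token).
        tokens.append(max(dist, key=dist.get))
    return tokens

# greedy_decode("the") -> ["the", "model", "generates", "text"]
```

A real model replaces the lookup table with a Transformer that conditions on the whole preceding sequence, but the generation loop has the same shape.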

Model Features

Large-scale parameters
With 20 billion parameters, it provides powerful language understanding and generation capabilities
Open-source model
Released under the Apache 2.0 license, allowing for both research and commercial use
GPT-3 architecture
Uses a decoder-only Transformer architecture closely following GPT-3's design
Rotary Position Embedding
Encodes token positions with Rotary Position Embeddings (RoPE) instead of learned absolute position embeddings
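RoPE encodes position by rotating pairs of embedding dimensions by a position-dependent angle, so the attention score between two rotated vectors depends only on their relative offset. A minimal NumPy sketch (the half-split pairing and base frequency below follow the common RoPE formulation; this is illustrative, not the model's exact implementation):

```python
import numpy as np

def rope(x, positions, base=10000.0):
    """Apply rotary position embeddings to x of shape [seq, dim] (dim even)."""
    seq, dim = x.shape
    half = dim // 2
    # One rotation frequency per dimension pair, geometrically spaced.
    freqs = base ** (-np.arange(half) / half)
    angles = positions[:, None] * freqs[None, :]   # [seq, half]
    cos, sin = np.cos(angles), np.sin(angles)
    x1, x2 = x[:, :half], x[:, half:]
    # Rotate each (x1_i, x2_i) pair by its position-dependent angle.
    return np.concatenate([x1 * cos - x2 * sin, x1 * sin + x2 * cos], axis=-1)
```

Because each pair is rotated by `position * freq`, the dot product of a rotated query at position p with a rotated key at position m depends only on p - m, and the rotation preserves vector norms.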

Model Capabilities

English text generation
Language understanding
Text completion
Linguistic feature extraction

Use Cases

Research
Language model research
Used to study the behavior and characteristics of large-scale language models
Downstream task feature extraction
Serves as a base model for extracting features for other NLP tasks
Application development
Text generation applications
Can be fine-tuned for developing text generation applications
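For the use cases above, the checkpoint is published on the Hugging Face Hub as `EleutherAI/gpt-neox-20b` and loads through the standard `transformers` API. A hedged sketch of text generation (note the full model is roughly 40 GB of weights, so this is illustrative; the sampling settings are arbitrary example values):

```python
MODEL_ID = "EleutherAI/gpt-neox-20b"  # published checkpoint on the Hugging Face Hub

def generate_text(prompt: str, max_new_tokens: int = 64) -> str:
    # Imported lazily so the sketch can be read without transformers installed;
    # calling this function downloads ~40 GB of weights on first use.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID)
    inputs = tokenizer(prompt, return_tensors="pt")
    output_ids = model.generate(
        **inputs,
        max_new_tokens=max_new_tokens,
        do_sample=True,       # sample rather than greedy-decode
        temperature=0.8,      # arbitrary example value
    )
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)

# Usage: generate_text("GPT-NeoX-20B is a language model that")
```

For feature extraction instead of generation, the same model can be loaded with `output_hidden_states=True` passed to the forward call to obtain per-layer representations.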