SimCTG Wikitext-103

Developed by cambridgeltl
A GPT-2 language model trained with the SimCTG framework that uses contrastive search to generate more coherent text
Downloads 19
Release Time: 3/2/2022

Model Overview

This model is a GPT-2 variant trained on the Wikitext-103 dataset with the SimCTG contrastive training framework. It improves neural text generation quality and is especially suited to open-domain text generation tasks.

Model Features

Contrastive Search Generation
Uses the contrastive search decoding algorithm to generate text, balancing diversity and coherence
Improved Text Quality
Generates more coherent and contextually relevant text compared to standard GPT-2
Easy to Use
Provides clear API interfaces and example code for quick integration and usage, as in the sketch below this list
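
A minimal sketch of loading the model and decoding with contrastive search through the Hugging Face transformers generate() API, which runs contrastive search when penalty_alpha > 0 and top_k > 1: at each step it selects the candidate token that maximizes (1 - alpha) times its model probability minus alpha times its maximum representation similarity to the tokens already generated. The model ID is taken from this listing; the alpha/k values and the prefix are illustrative assumptions, and the original SimCTG repository also provides its own decoding helpers.

from transformers import AutoTokenizer, AutoModelForCausalLM

# Model ID from this listing; decoding hyperparameters are illustrative.
model_name = "cambridgeltl/simctg_wikitext103"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)
model.eval()

prefix = "The game received generally favorable reviews from critics ,"
input_ids = tokenizer(prefix, return_tensors="pt").input_ids

# penalty_alpha > 0 together with top_k > 1 switches generate() to
# contrastive search: the degeneration penalty discourages tokens whose
# representations are too similar to the already-generated context.
output_ids = model.generate(
    input_ids,
    penalty_alpha=0.6,
    top_k=8,
    max_new_tokens=128,
)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))

Compared with greedy or beam search, which tend to repeat, and with nucleus sampling, which can drift off topic, contrastive search is intended to keep the continuation both fluent and non-repetitive, matching the coherence claims above.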

Model Capabilities

Open-domain text generation
Text continuation
Language modeling

Use Cases

Content Creation
Article Continuation
Automatically generates coherent subsequent content based on given text prefixes
Produces fluent text consistent with the input context
Game Development
Game Review Generation
Automatically generates game review content
For example, generating Tetris game reviews, as shown in the original examples (see the continuation sketch below)
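
Reusing the model and tokenizer loaded in the earlier sketch, a continuation for a review-style prefix is produced the same way; the Tetris prefix here is an illustrative placeholder rather than the exact prompt from the original examples.

prefix = "Tetris is a puzzle video game in which"  # illustrative prefix
input_ids = tokenizer(prefix, return_tensors="pt").input_ids
continuation_ids = model.generate(input_ids, penalty_alpha=0.6, top_k=8, max_new_tokens=64)
# Decode only the newly generated tokens, i.e. the continuation of the prefix.
print(tokenizer.decode(continuation_ids[0][input_ids.shape[1]:], skip_special_tokens=True))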