
GPT-2 Turkish 900M

Developed by cenkersisman
A Turkish large language model based on the GPT-2 architecture, designed specifically for Turkish text generation tasks
Downloads: 246
Release Time: 8/15/2023

Model Overview

This model is a Turkish large language model built on the GPT-2 architecture. It uses a purpose-built Turkish tokenizer and generates human-like continuations of a given initial prompt.
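
As a concrete illustration, the following is a minimal generation sketch using the Hugging Face transformers library. The model id cenkersisman/gpt2-turkish-900m is an assumption based on the developer and model name above, and the sampling settings are illustrative defaults, not values from the model card.

# Minimal generation sketch (assumed Hugging Face model id:
# "cenkersisman/gpt2-turkish-900m").
from transformers import AutoTokenizer, AutoModelForCausalLM

model_id = "cenkersisman/gpt2-turkish-900m"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

prompt = "Türkiye'nin başkenti"  # "The capital of Turkey"
inputs = tokenizer(prompt, return_tensors="pt")

# The model's maximum sequence length is 128 tokens, so cap generation there.
outputs = model.generate(
    **inputs,
    max_length=128,
    do_sample=True,
    top_k=50,
    top_p=0.95,
    pad_token_id=tokenizer.eos_token_id,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))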

Model Features

Turkish language optimization
Uses a tokenizer that follows Turkish spelling rules and is optimized specifically for Turkish text (see the tokenizer sketch after this list)
Limited-length generation
The maximum sequence length is 128 tokens, which makes the model best suited to shorter texts
Localized training
Trained on a Turkish Wikipedia corpus of 900 million characters
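
To make the tokenizer feature concrete, here is a short sketch (same assumed model id as above) that segments a Turkish sentence containing dotted and dotless i characters. The exact token strings it prints depend on the tokenizer's actual vocabulary.

# Tokenizer sketch: inspect how Turkish-specific characters are segmented.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("cenkersisman/gpt2-turkish-900m")
tokens = tokenizer.tokenize("Iğdır'da ılık bir rüzgâr esiyor.")
print(tokens)                                   # subword pieces
print(tokenizer.convert_tokens_to_ids(tokens))  # their vocabulary ids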

Model Capabilities

Turkish text generation
Context continuation
Question-answer generation
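
The question-answer capability listed above can be exercised through plain prompt continuation. The "Soru:/Cevap:" prompt format below is an illustrative assumption, since the model card does not specify one.

# Question-answer style prompting via plain continuation (prompt format is
# an assumption; this is a generic causal LM, not an instruction model).
from transformers import pipeline

generator = pipeline("text-generation", model="cenkersisman/gpt2-turkish-900m")
result = generator(
    "Soru: Türkiye'nin en kalabalık şehri hangisidir? Cevap:",
    max_length=64,
    do_sample=True,
    top_p=0.95,
)
print(result[0]["generated_text"])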

Use Cases

Education
Language learning assistance
Provides example sentences and practice materials for Turkish language learners
Content creation
Creative writing
Assists writers in generating creative Turkish text fragments