
Distilgpt2 Base Pretrained He

Developed by Norod78
A compact Hebrew text generation model based on the GPT-2 architecture, trained on TPU and GPU
Downloads 1,632
Release Time: 3/2/2022

Model Overview

This text generation model is optimized specifically for Hebrew. It is based on a distilled, fine-tuned GPT-2 architecture and is suitable for Hebrew natural language processing tasks.

Model Features

Hebrew optimization
Specifically trained and optimized for Hebrew, capable of generating fluent Hebrew text
Distilled architecture
Based on a distilled GPT-2 architecture, which reduces model size while preserving performance
Multi-source training
Trained on multiple Hebrew data sources, including the OSCAR corpus, CC-100, Twitter, and Wikipedia

Model Capabilities

Hebrew text generation
Context understanding
Language model fine-tuning
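
To make the capabilities above concrete, here is a minimal sketch of generating Hebrew text with the Hugging Face transformers pipeline. The repository id Norod78/distilgpt2-base-pretrained-he, the sampling settings, and the Hebrew rendering of the prompt are assumptions inferred from the model name, author, and examples on this page, not documented values.

```python
# Minimal sketch: Hebrew text generation with the transformers pipeline.
# The repo id below is assumed from the model name and author; verify before use.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="Norod78/distilgpt2-base-pretrained-he",  # assumed repo id
)

# Generate a short Hebrew continuation for a prompt ("Hello, my name is").
prompt = "שלום, קוראים לי"
outputs = generator(prompt, max_new_tokens=50, do_sample=True, top_p=0.95)
print(outputs[0]["generated_text"])
```

Because the model is distilled, it is small enough that short generations like this typically run acceptably on CPU.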

Use Cases

Content creation
Story continuation
Continue a story based on a given Hebrew opening
The example shows generated continuations of the Hebrew prompt 'The last person on Earth sat alone in a room when suddenly there was a knock at the door' (a code sketch for this use case appears after this section)
Dialogue systems
Dialogue generation
Generate Hebrew dialogue responses
The example shows a generated Hebrew response to a dialogue starting with 'Hello, my name is'
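
For the story-continuation use case, the sketch below loads the tokenizer and model explicitly and samples a continuation. The repository id, the sampling parameters, and the Hebrew rendering of the example prompt are illustrative assumptions, not values documented on this page.

```python
# Sketch of the story-continuation use case with explicit tokenizer/model loading.
# Repo id, sampling settings, and the Hebrew prompt wording are assumptions.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Norod78/distilgpt2-base-pretrained-he"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Hebrew rendering of the example opening:
# "The last person on Earth sat alone in a room when suddenly there was a knock at the door"
prompt = "האדם האחרון עלי אדמות ישב לבד בחדר, כשלפתע נשמעה דפיקה בדלת"
inputs = tokenizer(prompt, return_tensors="pt")

with torch.no_grad():
    output_ids = model.generate(
        **inputs,
        max_new_tokens=80,
        do_sample=True,
        top_k=50,
        top_p=0.95,
        pad_token_id=tokenizer.eos_token_id,  # avoids a padding warning for GPT-2
    )

print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```

Sampling (do_sample with top_k/top_p) rather than greedy decoding generally gives more varied story continuations; the exact values here are only a starting point.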