DistilRoBERTa Base Fine-tuned on WikiText-2
A version of the distilroberta-base model fine-tuned on the WikiText-2 dataset, suitable for masked language modeling and other text-processing tasks
Release date: 3/2/2022
Model Overview
This model is a fine-tuned version of the distilled RoBERTa base model (distilroberta-base) trained on the WikiText-2 dataset, primarily intended for masked language modeling and as a starting point for other natural language processing tasks.
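A minimal usage sketch with the transformers fill-mask pipeline is shown below. The Hub model id is an assumption (fine-tuned checkpoints like this are typically published under a user namespace), so substitute the actual repository id or a local path to your checkpoint.

```python
from transformers import pipeline

fill_mask = pipeline(
    "fill-mask",
    model="distilroberta-base-finetuned-wikitext2",  # assumed Hub id / local path
)

# RoBERTa-style models use <mask> as the mask token.
for prediction in fill_mask("The capital of France is <mask>."):
    print(f"{prediction['token_str']!r}: {prediction['score']:.3f}")
```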
Model Features
Efficient Distilled Model
A distilled version of RoBERTa that roughly halves the model size (about 82M parameters versus 125M for roberta-base) while retaining most of its performance; see the size-comparison sketch after this list
Domain-Specific Fine-tuning
Fine-tuned on the WikiText-2 dataset (drawn from Wikipedia articles), so it may be better suited to encyclopedic, Wikipedia-style text
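To illustrate the size reduction, the following sketch loads the two public base checkpoints and compares parameter counts (expect roughly 82M versus 125M):

```python
from transformers import AutoModel

for name in ("distilroberta-base", "roberta-base"):
    model = AutoModel.from_pretrained(name)
    n_params = sum(p.numel() for p in model.parameters())
    print(f"{name}: {n_params / 1e6:.0f}M parameters")
```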
Model Capabilities
Text Understanding
Masked Token Prediction (Fill-Mask)
Language Model Fine-tuning
Use Cases
Text Processing
Text Classification
Can be adapted for text classification by adding a classification head and fine-tuning on labeled data
Language Model Fine-tuning
Can serve as a base checkpoint for further task-specific fine-tuning; a sketch follows below
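As one example of task-specific fine-tuning, the sketch below trains a sequence-classification head on top of this checkpoint using transformers' Trainer. The checkpoint id and the imdb example dataset are illustrative assumptions, not part of this model card.

```python
from datasets import load_dataset
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

checkpoint = "distilroberta-base-finetuned-wikitext2"  # assumed Hub id
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
# The masked-LM head is discarded; a fresh classification head is initialized.
model = AutoModelForSequenceClassification.from_pretrained(checkpoint, num_labels=2)

dataset = load_dataset("imdb")  # example binary-classification dataset

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=256)

tokenized = dataset.map(tokenize, batched=True)

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="clf-out",
        num_train_epochs=1,
        per_device_train_batch_size=16,
    ),
    # Small subsets keep the demo quick; use the full splits for real training.
    train_dataset=tokenized["train"].shuffle(seed=42).select(range(2000)),
    eval_dataset=tokenized["test"].select(range(500)),
    tokenizer=tokenizer,
)
trainer.train()
```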