
Distil Large V3

Developed by distil-whisper
Distil-Whisper is a knowledge-distilled version of Whisper large-v3 for English automatic speech recognition, offering substantially faster inference while maintaining accuracy close to the original model.
Downloads: 417.11k
Release Time: 3/21/2024

Model Overview

This is the third version of the Distil-Whisper English series, trained with large-scale pseudo-labelled knowledge distillation and optimized for long-form transcription accuracy, with significant performance improvements over previous versions.

Model Features

Efficient Inference
6.3x faster than the original Whisper large-v3 model and 1.1x faster than the previous distil-large-v2
Long-Form Transcription Optimization
Uses sequential long-form algorithms to provide superior long-form transcription accuracy
Compatibility with Mainstream Libraries
Designed to be compatible with popular libraries such as Whisper.cpp, Faster-Whisper, and OpenAI Whisper
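As one illustration of that compatibility, the checkpoint can be loaded through the Hugging Face Transformers `pipeline` API. This is a minimal sketch, not the card's official usage snippet: the function name and the `sample.wav` path are placeholders, and only the checkpoint id `distil-whisper/distil-large-v3` comes from this card.

```python
# Sketch: load distil-large-v3 with the Transformers pipeline API.
# Assumes `transformers` and `torch` are installed; the checkpoint is
# downloaded on first call. Imports are deferred so that merely defining
# the function needs no heavy dependencies.
def build_asr_pipeline(model_id="distil-whisper/distil-large-v3"):
    import torch
    from transformers import pipeline

    device = "cuda:0" if torch.cuda.is_available() else "cpu"
    dtype = torch.float16 if device != "cpu" else torch.float32
    return pipeline(
        "automatic-speech-recognition",
        model=model_id,
        torch_dtype=dtype,
        device=device,
    )

# Usage (placeholder audio path):
#   asr = build_asr_pipeline()
#   print(asr("sample.wav")["text"])
```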

Model Capabilities

English speech recognition
Short-form audio transcription
Long-form audio transcription
Timestamp generation

Use Cases

Speech Transcription
Meeting Minutes
Convert meeting recordings into text transcripts
Accuracy close to the original Whisper large-v3 model
Podcast Transcription
Convert long-form podcast content into text
4.8% more accurate than distil-large-v2 when using the sequential long-form algorithm
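A hedged sketch of long-form transcription with segment-level timestamps, assuming a recent Transformers release; the function name and `podcast.mp3` are placeholders. In Transformers, passing the full audio to a Whisper-family pipeline without `chunk_length_s` uses the sequential long-form algorithm this card recommends, while setting `chunk_length_s` (e.g. 25) switches to the faster chunked algorithm.

```python
# Sketch: long-form transcription with segment-level timestamps.
# Assumes `transformers` is installed; "podcast.mp3" is a placeholder.
def transcribe_long_form(audio_path, model_id="distil-whisper/distil-large-v3"):
    from transformers import pipeline  # deferred: heavy dependency

    asr = pipeline("automatic-speech-recognition", model=model_id)
    # No `chunk_length_s`: sequential long-form decoding.
    # `return_timestamps=True` adds (start, end) timestamps per segment.
    return asr(audio_path, return_timestamps=True)

# Usage (placeholder file):
#   out = transcribe_long_form("podcast.mp3")
#   print(out["text"])
#   print(out["chunks"])  # list of {"timestamp": (start, end), "text": ...}
```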