
DistilBERT Base En-Fr-Ar Cased

Developed by Geotrend
This is a smaller version of distilbert-base-multilingual-cased that handles English, French, and Arabic while producing exactly the same representations as the original model, thereby preserving its accuracy.
Release Date: 3/2/2022

Model Overview

This model is a smaller version of distilbert-base-multilingual-cased, covering three languages: English, French, and Arabic. It generates exactly the same representations as the original model and therefore preserves the original accuracy.
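A minimal usage sketch with the Hugging Face transformers library, assuming the model is published on the Hub under the id Geotrend/distilbert-base-en-fr-ar-cased:

```python
# Load the tokenizer and model, then encode a sentence in one of the
# supported languages. Assumes the Hub id "Geotrend/distilbert-base-en-fr-ar-cased".
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("Geotrend/distilbert-base-en-fr-ar-cased")
model = AutoModel.from_pretrained("Geotrend/distilbert-base-en-fr-ar-cased")

inputs = tokenizer("Bonjour le monde", return_tensors="pt")
outputs = model(**inputs)

# outputs.last_hidden_state holds one hidden vector per input token.
print(outputs.last_hidden_state.shape)
```

The same call works unchanged for English or Arabic input, since all three languages share the trimmed multilingual vocabulary.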

Model Features

Multilingual support
Handles three languages: English, French, and Arabic
Preserves original accuracy
Generates exactly the same representations as the original model, so accuracy is unchanged
Distilled architecture
Built on the DistilBERT architecture, making it lighter than a full BERT model

Model Capabilities

Multilingual text understanding
Multilingual text representation generation

Use Cases

Natural Language Processing
Cross-lingual text classification
Classify texts written in English, French, or Arabic with a single model
Multilingual information retrieval
Match queries and documents across the three supported languages
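The retrieval use case comes down to comparing sentence embeddings by cosine similarity. A minimal sketch of similarity-based ranking, using stand-in vectors in place of real model output (in practice the vectors would come from pooling the model's hidden states; the ones below are purely illustrative):

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def rank_documents(query_vec, doc_vecs):
    """Return document indices sorted by similarity to the query, best first."""
    scores = [cosine_similarity(query_vec, d) for d in doc_vecs]
    return sorted(range(len(doc_vecs)), key=lambda i: scores[i], reverse=True)

# Stand-in embeddings (hypothetical; real ones would be e.g. mean-pooled
# hidden states from the model, one vector per query/document).
query = np.array([1.0, 0.0, 0.0])
docs = [
    np.array([0.9, 0.1, 0.0]),  # most similar to the query
    np.array([0.0, 1.0, 0.0]),  # orthogonal to the query
    np.array([0.5, 0.5, 0.0]),
]

print(rank_documents(query, docs))  # → [0, 2, 1]
```

Because the model produces the same representations for all three languages, the query and documents can be in different languages and still land in a shared embedding space.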