
Bert Base 15lang Cased

Developed by Geotrend
This is a smaller version of bert-base-multilingual-cased, supporting 15 languages while preserving the original model's accuracy.
Downloads: 21
Release Time: 3/2/2022

Model Overview

This model is a compact version of bert-base-multilingual-cased that covers 15 languages. It fully retains the representations generated by the original model, so the original accuracy is preserved.
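
Since bert-base-multilingual-cased is named above as the parent model, this claim can be checked empirically. A minimal sketch, assuming the transformers and torch libraries and the Hugging Face hub id Geotrend/bert-base-15lang-cased (the exact hub id is not stated on this page):

```python
# Sketch: check that the 15-language model yields the same sentence
# representation as full multilingual BERT for a supported language.
# Hub ids below are assumptions; adjust them if your copies live elsewhere.
import torch
from transformers import AutoModel, AutoTokenizer

small_id = "Geotrend/bert-base-15lang-cased"
full_id = "bert-base-multilingual-cased"

text = "Paris est la capitale de la France."

outputs = {}
with torch.no_grad():
    for name, model_id in [("15lang", small_id), ("mBERT", full_id)]:
        tokenizer = AutoTokenizer.from_pretrained(model_id)
        model = AutoModel.from_pretrained(model_id).eval()
        inputs = tokenizer(text, return_tensors="pt")
        outputs[name] = model(**inputs).last_hidden_state

same_shape = outputs["15lang"].shape == outputs["mBERT"].shape
print("same shape :", same_shape)
if same_shape:
    # Identical token-level representations indicate the reduced model
    # reproduces the parent model's outputs for this language.
    print("same values:", torch.allclose(outputs["15lang"], outputs["mBERT"], atol=1e-5))
```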

Model Features

Multilingual support
Supports 15 languages, including English, French, Spanish, German, and Chinese.
Maintains original accuracy
Fully retains the representations generated by the original model, preserving its accuracy.
Compact model size
Reduced parameter count and memory usage compared to the original model, with faster loading times.
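
To see the size reduction directly, the sketch below (using the same assumed hub ids as above) compares the parameter counts of the two checkpoints; the exact counts are not given on this page:

```python
# Sketch: compare parameter counts of the 15-language model and the full
# multilingual model. Hub ids are assumptions; adjust if needed.
from transformers import AutoModel

small = AutoModel.from_pretrained("Geotrend/bert-base-15lang-cased")
full = AutoModel.from_pretrained("bert-base-multilingual-cased")

def n_params(model):
    # Total number of trainable parameters in the encoder.
    return sum(p.numel() for p in model.parameters())

print(f"15-lang model: {n_params(small):,} parameters")
print(f"full mBERT   : {n_params(full):,} parameters")
```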

Model Capabilities

Multilingual text understanding
Masked language modeling (see the fill-mask sketch after this list)
Text classification
Named entity recognition
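
Of the capabilities above, masked language modeling works out of the box because the pretrained MLM head ships with the checkpoint; text classification and named entity recognition require a fine-tuned task head. A minimal fill-mask sketch, again assuming the hub id Geotrend/bert-base-15lang-cased:

```python
# Sketch: fill-mask with the pretrained MLM head across several of the
# supported languages. The hub id is an assumption; adjust if needed.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="Geotrend/bert-base-15lang-cased")

for text in [
    "Paris is the [MASK] of France.",
    "Paris est la [MASK] de la France.",
    "París es la [MASK] de Francia.",
]:
    # Take the single highest-scoring prediction for the masked token.
    top = fill_mask(text, top_k=1)[0]
    print(f"{text} -> {top['token_str']} (score={top['score']:.3f})")
```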

Use Cases

Natural language processing
Multilingual text classification
Can be used for classification tasks on multilingual texts.
Named entity recognition
Can be used to identify named entities in multilingual texts.
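
As with the parent model, the checkpoint ships without task-specific heads, so both use cases require fine-tuning on labelled data. A minimal sketch of attaching a token-classification head for NER; the hub id and the label set are assumptions made for illustration, and the head is randomly initialised until fine-tuned:

```python
# Sketch: attach a token-classification head for NER on top of the encoder.
# The head must be fine-tuned before it produces useful entity predictions.
from transformers import AutoTokenizer, AutoModelForTokenClassification

labels = ["O", "B-PER", "I-PER", "B-ORG", "I-ORG", "B-LOC", "I-LOC"]  # assumed label set
model_id = "Geotrend/bert-base-15lang-cased"  # assumed hub id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForTokenClassification.from_pretrained(
    model_id,
    num_labels=len(labels),
    id2label=dict(enumerate(labels)),
    label2id={label: i for i, label in enumerate(labels)},
)

# Tokenise a multilingual sentence; after fine-tuning, the argmax over the
# logits gives one label per sub-word token.
inputs = tokenizer("Angela Merkel a visité Paris.", return_tensors="pt")
logits = model(**inputs).logits
print(logits.shape)  # (1, sequence_length, len(labels))
```

The same pattern applies to the multilingual text classification use case, with AutoModelForSequenceClassification and num_labels set to the number of target classes.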