DistilBERT Base Multi Cased NER
This is a multilingual named entity recognition (NER) model based on DistilBERT. It supports 10 languages and is quantized for use with Transformers.js.
Sequence Labeling
Tags: Transformers, Multilingual Support, Multilingual NER, Lightweight Distillation Model, ONNX Quantization

Downloads: 16
Release Time: 11/29/2023
Model Overview
This model is a multilingual variant of DistilBERT designed for named entity recognition. Quantization reduces its size, making it well suited to running in browser environments.
Model Features
Multilingual Support
Supports named entity recognition in 10 languages
Quantization
Quantized to ONNX with reduced-precision weights, resulting in a smaller model size suitable for front-end deployment
Transformers.js Compatibility
Can be used directly in browser environments; a usage sketch follows this feature list
Based on DistilBERT
Uses the lightweight DistilBERT architecture, maintaining performance while reducing computational resource requirements
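A minimal usage sketch, assuming the `@xenova/transformers` package; the model ID below is a placeholder and should be replaced with this model's actual repository name:

```typescript
// Minimal sketch: load the quantized model with Transformers.js and run NER.
import { pipeline } from '@xenova/transformers';

// 'token-classification' is the Transformers.js task name for NER.
// Quantized ONNX weights are loaded by default, keeping the download small.
const ner = await pipeline(
  'token-classification',
  'Xenova/distilbert-base-multilingual-cased-ner' // assumed/placeholder model ID
);

// Works the same from a browser <script type="module"> or from Node.js.
const output = await ner('Angela Merkel besuchte Paris im Juni.');
console.log(output);
// Each element is roughly of the form:
// { entity: 'B-PER', score: 0.99, index: 1, word: 'Angela' }
```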
Model Capabilities
Text Entity Recognition
Multilingual Processing
Browser-side Inference
Use Cases
Information Extraction
Multilingual Document Analysis
Extracts entity information such as person names, locations, and organization names from multilingual documents; see the post-processing sketch after this use case
Accurately identifies named entities in 10 languages
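As a sketch of the document-analysis step, the token-level tags returned by the pipeline can be merged into whole person/location/organization entities. The `Token` and `Entity` shapes below are assumptions about the pipeline output, not something this model card specifies:

```typescript
// Merge token-level BIO tags (B-PER / I-PER, B-LOC / I-LOC, B-ORG / I-ORG)
// into whole entities for downstream document analysis.
interface Token { entity: string; word: string; score: number; }
interface Entity { type: string; text: string; }

function groupEntities(tokens: Token[]): Entity[] {
  const entities: Entity[] = [];
  let current: Entity | null = null;

  for (const t of tokens) {
    const [prefix, type] = t.entity.split('-'); // e.g. 'B-PER' -> ['B', 'PER']
    if (prefix === 'O' || !type) {
      // Outside any entity: close the one in progress, if any.
      if (current) { entities.push(current); current = null; }
      continue;
    }
    // WordPiece continuations start with '##' and attach without a space.
    const piece = t.word.startsWith('##') ? t.word.slice(2) : t.word;
    if (current && current.type === type && (prefix === 'I' || t.word.startsWith('##'))) {
      current.text += t.word.startsWith('##') ? piece : ' ' + piece;
    } else {
      if (current) entities.push(current);
      current = { type, text: piece };
    }
  }
  if (current) entities.push(current);
  return entities;
}

// Example: groupEntities(await ner('Angela Merkel besuchte Paris.') as Token[])
// might yield [{ type: 'PER', text: 'Angela Merkel' }, { type: 'LOC', text: 'Paris' }]
```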
Content Classification
News Classification
Classifies news content based on the entities it mentions; an illustrative heuristic follows
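A purely illustrative heuristic (not part of the model or this card) showing how extracted entities could drive a coarse news category; `Entity` matches the shape used in the previous sketch:

```typescript
// Assign a rough category based on which entity types dominate an article.
interface Entity { type: string; text: string; }

function categorizeByEntities(entities: Entity[]): string {
  const counts: Record<string, number> = {};
  for (const e of entities) counts[e.type] = (counts[e.type] ?? 0) + 1;

  const org = counts['ORG'] ?? 0;
  const per = counts['PER'] ?? 0;
  const loc = counts['LOC'] ?? 0;

  // Many organization mentions suggest business coverage; many locations
  // suggest world/regional news; otherwise fall back to people-centric news.
  if (org > per && org > loc) return 'business';
  if (loc > per) return 'world';
  if (per > 0) return 'people';
  return 'uncategorized';
}
```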