
Distilbert Base Uncased Squad2 With Ner With Neg With Multi

Developed by andi611
A multi-task DistilBERT model for question answering and named entity recognition, fine-tuned on SQuAD 2.0 and the conll2003 dataset
Downloads 20
Release date: 3/2/2022

Model Overview

This is a multi-task model built on the DistilBERT architecture that supports both extractive question answering and named entity recognition, with NER fine-tuned on the conll2003 dataset.

Model Features

Multi-task Processing
Handles question answering and named entity recognition in a single model
Lightweight Architecture
Built on DistilBERT, which is smaller and faster than the full BERT model
Domain-specific Optimization
Fine-tuned on the conll2003 dataset, making it well suited to named entity recognition in the news domain

Model Capabilities

Text Question Answering
Named Entity Recognition
Text Understanding

Use Cases

Information Extraction
News Entity Recognition
Identify entities such as person names, locations, and organization names from news texts
Performs well on the conll2003 dataset
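The conll2003 dataset labels tokens with the BIO scheme (e.g. B-PER, I-PER, O). Turning such token-level tags into entity spans can be sketched as below; this is a minimal illustration, not the model's actual post-processing.

```python
# Minimal BIO-tag decoder: groups token-level NER tags (as used by conll2003)
# into (entity_type, text) spans. Illustrative only; real pipelines also
# handle sub-word tokens and character offsets.
def decode_bio(tokens, tags):
    entities = []
    current_tokens, current_type = [], None
    for token, tag in zip(tokens, tags):
        if tag.startswith("B-"):
            # A B- tag always starts a new entity, closing any open one.
            if current_tokens:
                entities.append((current_type, " ".join(current_tokens)))
            current_tokens, current_type = [token], tag[2:]
        elif tag.startswith("I-") and current_type == tag[2:]:
            current_tokens.append(token)
        else:
            # An O tag (or an inconsistent I- tag) ends the current entity.
            if current_tokens:
                entities.append((current_type, " ".join(current_tokens)))
            current_tokens, current_type = [], None
    if current_tokens:
        entities.append((current_type, " ".join(current_tokens)))
    return entities

tokens = ["Angela", "Merkel", "visited", "Paris", "."]
tags = ["B-PER", "I-PER", "O", "B-LOC", "O"]
print(decode_bio(tokens, tags))  # → [('PER', 'Angela Merkel'), ('LOC', 'Paris')]
```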
Intelligent Question Answering
Document QA System
Answer user questions based on given documents
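Extractive QA models of this kind score each token as a potential answer start and end; SQuAD 2.0-style models (the "with-neg" in the model name) additionally allow a "no answer" prediction. The selection logic can be sketched as follows, with made-up scores for illustration; the model's own decoding may differ.

```python
# Sketch of answer-span selection for extractive QA with a "no answer"
# option, in the spirit of SQuAD 2.0-style models. Scores are illustrative;
# a real model produces one start and one end score per sub-word token.
def pick_answer(tokens, start_scores, end_scores, max_len=15):
    # Convention: both pointers on the [CLS] position (index 0) means
    # "the question is unanswerable given this context".
    null_score = start_scores[0] + end_scores[0]
    best_score, best_span = null_score, None
    for s in range(1, len(tokens)):
        for e in range(s, min(s + max_len, len(tokens))):
            score = start_scores[s] + end_scores[e]
            if score > best_score:
                best_score, best_span = score, (s, e)
    if best_span is None:
        return ""  # the null answer wins
    s, e = best_span
    return " ".join(tokens[s:e + 1])

tokens = ["[CLS]", "Paris", "is", "the", "capital", "of", "France"]
start = [1.0, 0.2, 0.1, 0.1, 0.1, 0.1, 4.0]
end = [1.0, 0.1, 0.1, 0.1, 0.2, 0.1, 4.5]
print(pick_answer(tokens, start, end))  # → France
```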