Bert Large Uncased Whole Word Masking Squad2 With Ner Conll2003 With Neg With Repeat
Developed by andi611
A BERT-large model fine-tuned on the SQuAD 2.0 and CoNLL-2003 datasets, supporting both question answering and named entity recognition
Downloads 18
Release Time: 3/2/2022
Model Overview
This model is a fine-tuned version of BERT-large on the squad_v2 and conll2003 datasets, intended primarily for question answering and named entity recognition.
Model Features
Multi-task Capability
Handles both question answering and named entity recognition within a single model
Whole Word Masking Pre-training
Pre-trained with whole word masking, which masks all WordPiece tokens of a word together rather than individually, improving language comprehension (see the sketch after this list)
Large Model Architecture
Built on the BERT-large architecture, which provides stronger representational capacity than the base variant
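To make the whole-word-masking idea concrete, here is a small illustrative sketch. It uses the standard `bert-large-uncased-whole-word-masking` tokenizer (an assumption; this fine-tuned model shares the same uncased vocabulary) and shows how all pieces of a multi-piece word are masked together:

```python
# Illustrative sketch of whole word masking, not the training code itself.
from transformers import AutoTokenizer

tok = AutoTokenizer.from_pretrained("bert-large-uncased-whole-word-masking")

pieces = tok.tokenize("unaffordable housing")
# A long word splits into several WordPiece tokens, e.g.
# ['una', '##ffo', '##rda', '##ble', 'housing'] (exact split may vary).

# Plain token-level masking could hide a single piece such as '##rda'.
# Whole word masking instead masks every piece of the chosen word together:
first_word_len = next(i for i, p in enumerate(pieces[1:], start=1)
                      if not p.startswith("##"))
masked = [tok.mask_token] * first_word_len + pieces[first_word_len:]
print(masked)  # e.g. ['[MASK]', '[MASK]', '[MASK]', '[MASK]', 'housing']
```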
Model Capabilities
Question Answering
Named Entity Recognition
Text Understanding
Context Analysis
Use Cases
Information Extraction
Document Question Answering
Extracts precise answers to questions from documents; because SQuAD 2.0 includes unanswerable questions, the model can also abstain when the context contains no answer
Performs well on the SQuAD 2.0 dataset
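A minimal usage sketch with the Hugging Face transformers pipeline. The repo id is inferred from this card's title and should be verified on the Hub before use:

```python
# Minimal question-answering sketch; repo id inferred from the card title.
from transformers import pipeline

MODEL_ID = (
    "andi611/bert-large-uncased-whole-word-masking-"
    "squad2-with-ner-conll2003-with-neg-with-repeat"
)

qa = pipeline("question-answering", model=MODEL_ID)

context = (
    "The Amazon rainforest spans nine countries, with the largest share "
    "located in Brazil."
)
result = qa(question="Which country holds the largest share of the Amazon?",
            context=context)

# SQuAD 2.0 fine-tuning means the model can abstain: a near-empty answer
# with a low score signals the question is unanswerable from the context.
print(result["answer"], round(result["score"], 3))
```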
Text Analysis
Entity Recognition
Identifies entities such as person names, locations, and organization names in text
Performs well on the CoNLL-2003 dataset
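The card does not document how the NER task is exposed. Given the single QA-style checkpoint name, one plausible pattern (an assumption, not a documented interface) is to query entity types as questions through the same question-answering head; the question phrasing below is hypothetical:

```python
# Hypothetical NER-as-QA sketch: the card states the model handles NER,
# but the interface is undocumented. This assumes entities are retrieved
# through the QA head; the prompt wording is illustrative only.
from transformers import pipeline

MODEL_ID = (
    "andi611/bert-large-uncased-whole-word-masking-"
    "squad2-with-ner-conll2003-with-neg-with-repeat"
)

qa = pipeline("question-answering", model=MODEL_ID)

text = "Tim Cook announced the partnership at Apple headquarters in Cupertino."

# CoNLL-2003 covers four entity types: person, location, organization, misc.
for question in ("What is the person name?",
                 "What is the location?",
                 "What is the organization?"):
    result = qa(question=question, context=text)
    print(question, "->", result["answer"], round(result["score"], 3))
```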