
Dpr Question Encoder Bert Uncased L 2 H 128 A 2

Developed by nlpconnect
DPR question encoder model based on BERT architecture for dense passage retrieval tasks
Downloads 21
Release Time : 3/2/2022

Model Overview

This model is the question-encoder component of a Dense Passage Retrieval (DPR) system built on the BERT architecture. It encodes natural-language questions into dense vector representations that are matched by similarity against the vectors produced by the companion passage encoder.
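A minimal sketch of encoding a question into a dense vector with Hugging Face Transformers. The repo id below is an assumption inferred from the title of this card; verify the exact spelling on the hub before use.

```python
import torch
from transformers import DPRQuestionEncoder, DPRQuestionEncoderTokenizer

# Assumed repo id, inferred from the model title on this card.
REPO = "nlpconnect/dpr-question_encoder_bert_uncased_L-2_H-128_A-2"

tokenizer = DPRQuestionEncoderTokenizer.from_pretrained(REPO)
encoder = DPRQuestionEncoder.from_pretrained(REPO)

with torch.no_grad():
    inputs = tokenizer("who wrote the origin of species?", return_tensors="pt")
    question_vector = encoder(**inputs).pooler_output  # dense question embedding

# With a 128-dimensional hidden size and no projection layer, the expected shape is (1, 128).
print(question_vector.shape)
```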

Model Features

Lightweight BERT Architecture
Uses a compact BERT backbone with 2 Transformer layers, a 128-dimensional hidden size, and 2 attention heads, giving high computational efficiency (see the configuration check after this list)
Dense Retrieval Capability
Optimized for dense passage retrieval tasks, capable of effectively encoding question semantics
High Performance
Achieves 60.53% recall@10 (R@10) on the Natural Questions (NQ) development set
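A quick way to confirm the lightweight architecture described above; this is a sketch, and the repo id is the same assumed spelling as in the earlier example.

```python
from transformers import AutoConfig

# Assumed repo id; adjust to the exact name on the hub.
config = AutoConfig.from_pretrained("nlpconnect/dpr-question_encoder_bert_uncased_L-2_H-128_A-2")

print(config.num_hidden_layers)    # expected: 2
print(config.hidden_size)          # expected: 128
print(config.num_attention_heads)  # expected: 2
```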

Model Capabilities

Natural language question encoding
Semantic vector generation
Dense retrieval support

Use Cases

Question Answering Systems
Open-Domain Question Answering
Used to build retrieval components in open-domain question answering systems
Outperforms baseline models on the NQ dataset
Information Retrieval
Document Retrieval
Used to retrieve the document passages most relevant to a user query (a retrieval sketch follows this section)
Achieves 49.68% recall@10 (R@10) on BEIR test data
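The retrieval use cases above can be sketched end to end by pairing this question encoder with a DPR passage (context) encoder and ranking passages by inner product. The context-encoder repo id below follows the usual DPR naming and is an assumption; substitute whichever matching passage encoder was trained alongside this model.

```python
import torch
from transformers import (
    DPRContextEncoder,
    DPRContextEncoderTokenizer,
    DPRQuestionEncoder,
    DPRQuestionEncoderTokenizer,
)

# Assumed repo ids; verify the exact names on the hub.
Q_REPO = "nlpconnect/dpr-question_encoder_bert_uncased_L-2_H-128_A-2"
CTX_REPO = "nlpconnect/dpr-ctx_encoder_bert_uncased_L-2_H-128_A-2"

q_tok = DPRQuestionEncoderTokenizer.from_pretrained(Q_REPO)
q_enc = DPRQuestionEncoder.from_pretrained(Q_REPO)
c_tok = DPRContextEncoderTokenizer.from_pretrained(CTX_REPO)
c_enc = DPRContextEncoder.from_pretrained(CTX_REPO)

passages = [
    "The Eiffel Tower is located in Paris, France.",
    "Python is a popular programming language.",
    "Dense Passage Retrieval encodes questions and passages into the same vector space.",
]

with torch.no_grad():
    # Encode the passages once (this is the document index).
    p_inputs = c_tok(passages, padding=True, truncation=True, return_tensors="pt")
    p_vecs = c_enc(**p_inputs).pooler_output          # (num_passages, 128)

    # Encode the user question with the model described on this card.
    q_inputs = q_tok("Where is the Eiffel Tower?", return_tensors="pt")
    q_vec = q_enc(**q_inputs).pooler_output           # (1, 128)

# DPR ranks passages by the inner product between question and passage vectors.
scores = (q_vec @ p_vecs.T).squeeze(0)
top = torch.topk(scores, k=2).indices.tolist()
print([passages[i] for i in top])
```

Metrics such as R@10 are computed the same way at scale: encode the full corpus, take the top 10 passages by inner product for each question, and count the fraction of questions whose answer-bearing passage appears in that top 10.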