
SPAR Wiki BM25 LexModel Context Encoder

Developed by Facebook AI Research
A dense retriever based on the BERT-base architecture, trained on Wikipedia articles to mimic BM25 behavior
Release date: September 21, 2022

Model Overview

This model is the context encoder for the Wiki BM25 lexical model (Λ) in the SPAR paper, designed for dense retrieval tasks and capable of mimicking the behavior of the sparse retriever BM25.
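As a sketch of how the context encoder can be loaded and used, the snippet below encodes a passage with Hugging Face `transformers` and takes the [CLS] token representation as the passage embedding. The CLS pooling shown here follows common usage for SPAR-style BERT encoders; the example passage is illustrative.

```python
from transformers import AutoTokenizer, AutoModel

model_name = "facebook/spar-wiki-bm25-lexmodel-context-encoder"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name)

passages = ["Paris is the capital and most populous city of France."]
inputs = tokenizer(passages, padding=True, truncation=True, return_tensors="pt")

# Use the [CLS] token's final hidden state as the passage embedding
# (768 dimensions for a BERT-base model).
embeddings = model(**inputs).last_hidden_state[:, 0, :]
print(embeddings.shape)
```

Queries would be embedded the same way with the corresponding query encoder, and passages ranked by the inner product between query and passage embeddings.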

Model Features

Mimics BM25 Behavior
Trained on Wikipedia articles to mimic the behavior of the sparse retriever BM25
Works with Query Encoder
Designed to be used with a corresponding query encoder, similar to the DPR model
Combines with Dense Retrievers
Can be combined with standard dense retrievers (e.g., DPR, Contriever) to build a dense retriever proficient in both lexical and semantic matching
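The combination with a standard dense retriever can be sketched numerically: concatenating the dense and lexical embeddings on both the query and passage side makes the inner-product score decompose into the sum of a dense similarity and a (weighted) lexical similarity. The embeddings and weight below are random stand-ins, not outputs of the actual encoders.

```python
import numpy as np

# Hypothetical pre-computed embeddings; in practice these would come from
# a dense retriever (e.g., DPR) and the SPAR lexical-model encoders.
rng = np.random.default_rng(0)
dim = 4
q_dense, q_lex = rng.normal(size=dim), rng.normal(size=dim)
c_dense, c_lex = rng.normal(size=dim), rng.normal(size=dim)

mu = 0.5  # illustrative weight on the lexical component

# Concatenate embeddings on both sides; the query carries the weight.
q = np.concatenate([q_dense, mu * q_lex])
c = np.concatenate([c_dense, c_lex])

# The single inner product equals dense similarity plus weighted lexical similarity.
score_concat = q @ c
score_sum = q_dense @ c_dense + mu * (q_lex @ c_lex)
print(np.isclose(score_concat, score_sum))
```

This is why the lexical model can be bolted onto an existing dense retriever without changing its index structure: only the embedding dimension grows.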

Model Capabilities

Feature Extraction
Dense Retrieval
Lexical Matching

Use Cases

Open-Domain Question Answering
Wikipedia Information Retrieval
Retrieves passages relevant to a query from Wikipedia articles
Mimics BM25 behavior to provide accurate lexical-match results
Information Retrieval
Combined with DPR
Integrated with the DPR model to build a retriever proficient in both lexical and semantic matching
Improves retrieval performance by combining the strengths of lexical and semantic matching