
SPAR Wiki BM25 LexModel Query Encoder

Developed by Facebook (Meta AI)
A dense retriever based on the BERT-base architecture, trained on Wikipedia articles to emulate BM25 behavior
Downloads: 80
Released: September 21, 2022

Model Overview

This model is the query encoder of the lexical model (Λ) proposed in the SPAR paper. Paired with its context encoder, it approximates the behavior of the sparse retriever BM25. The model learns lexical matching patterns and can be combined with a standard dense retriever to improve retrieval performance.
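As a minimal sketch of generating query embeddings with this encoder, assuming the checkpoint is published on the Hugging Face Hub under the id `facebook/spar-wiki-bm25-lexmodel-query-encoder` and that CLS-token pooling is used (common for BERT-based dense retrievers; verify against the official model card):

```python
import torch
from transformers import AutoTokenizer, AutoModel

# Hub id assumed from the model name on this page
MODEL_ID = "facebook/spar-wiki-bm25-lexmodel-query-encoder"

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModel.from_pretrained(MODEL_ID)
model.eval()

queries = ["Where was Marie Curie born?"]
inputs = tokenizer(queries, padding=True, truncation=True, return_tensors="pt")

with torch.no_grad():
    # CLS-token pooling: take the hidden state of the first token
    embeddings = model(**inputs).last_hidden_state[:, 0, :]

print(embeddings.shape)  # (1, 768) for a BERT-base encoder
```

Passages would be embedded the same way with the matching context encoder, and relevance scored by the dot product between query and passage vectors.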

Model Features

BM25 Behavior Emulation
Trained to emulate the behavior of traditional BM25 sparse retrievers, preserving lexical matching advantages
Complementary to Dense Retrievers
Can be combined with dense retrievers like DPR to simultaneously possess lexical and semantic matching capabilities
BERT-base Architecture
Based on the widely-used BERT-base architecture, facilitating integration into existing systems
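The complementarity with dense retrievers can be illustrated numerically: concatenating the lexical model's embedding with a dense retriever's embedding (on both the query and passage side) yields a single vector whose dot product equals the sum of the two retrievers' scores. A toy NumPy sketch with random stand-ins for the real encoder outputs (the SPAR paper may additionally weight the lexical component; omitted here):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins for real encoder outputs (768-dim, as for BERT-base)
q_dense, q_lex = rng.standard_normal(768), rng.standard_normal(768)  # query
p_dense, p_lex = rng.standard_normal(768), rng.standard_normal(768)  # passage

# Concatenate the two representations per side, then score once
q_combined = np.concatenate([q_dense, q_lex])
p_combined = np.concatenate([p_dense, p_lex])
combined_score = q_combined @ p_combined

# Identical to computing the dense and lexical scores separately and summing
separate_sum = q_dense @ p_dense + q_lex @ p_lex
print(np.isclose(combined_score, separate_sum))  # True
```

This is why the combined retriever can reuse a standard single-vector index: lexical and semantic signals are captured in one concatenated embedding.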

Model Capabilities

Text Embedding Generation
Lexical Feature Extraction
Document Retrieval
QA System Support

Use Cases

Information Retrieval
Open-domain QA
Used for building QA systems that retrieve relevant answers from large-scale document collections
When combined with DPR, it can capture both lexical and semantic matching signals
Document Retrieval
Used for relevant document retrieval in search engines or knowledge bases
Emulates BM25 behavior, retaining advantages of traditional retrieval systems