
MS MARCO TinyBERT L-2

Developed by: cross-encoder
A lightweight cross-encoder trained on the MS MARCO passage ranking task for query-passage relevance scoring in information retrieval.
Downloads: 71.76k
Released: 3/2/2022

Model Overview

This model is designed for information retrieval: it scores the relevance of a passage to a query, making it well suited for reranking search engine results. Built on the compact TinyBERT architecture, it retains strong ranking quality while offering very fast inference.

Model Features

Efficient and Lightweight
Optimized TinyBERT architecture with throughput of up to ~9,000 passages per second on a V100 GPU
Precise Ranking
Strong performance on the MS MARCO and TREC DL benchmarks, reaching an NDCG@10 of 69.84
Plug-and-Play
Compatible with HuggingFace Transformers and SentenceTransformers ecosystems

Model Capabilities

Query-passage relevance scoring
Search result reranking
Information retrieval

Use Cases

Search Engine Optimization
Search Result Reranking
Rerank the initial candidate list returned by a search engine such as Elasticsearch by query relevance
Improves the quality of relevance ranking in search results
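The reranking step above can be sketched in plain Python. The `overlap_score` function here is a toy stand-in (an assumption, not part of the model); a real system would score each pair with the cross-encoder instead.

```python
def rerank(query, passages, score_fn, top_k=None):
    """Reorder candidate passages by descending relevance score."""
    ranked = sorted(passages, key=lambda p: score_fn(query, p), reverse=True)
    return ranked[:top_k] if top_k else ranked

def overlap_score(query, passage):
    # Toy stand-in scorer: fraction of query tokens found in the passage.
    # A production system would call the cross-encoder's predict() here.
    tokens = lambda s: {t.strip(".,?") for t in s.lower().split()}
    q, p = tokens(query), tokens(passage)
    return len(q & p) / len(q)

candidates = [
    "Munich hosts the annual Oktoberfest.",
    "Berlin is the capital of Germany.",
    "The capital of France is Paris.",
]
top = rerank("what is the capital of germany", candidates, overlap_score, top_k=2)
print(top[0])  # -> "Berlin is the capital of Germany."
```

A common pattern is to fetch the top 100 candidates from Elasticsearch cheaply, then apply the cross-encoder only to that shortlist, keeping latency low.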
Question Answering Systems
Answer Passage Filtering
Select the most relevant passage from a set of candidate answer passages
Enhances the accuracy of question answering systems
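For answer filtering, a sketch of the thresholding step, assuming the cross-encoder returns one unnormalized logit per pair (a sigmoid then maps it to a 0-1 relevance probability). The logit values and the 0.5 threshold below are illustrative assumptions, not figures from the model card.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def filter_answers(candidates, logits, threshold=0.5):
    # Pair each passage with its relevance probability, keep only
    # those above the threshold, and return them best-first.
    scored = sorted(zip(candidates, map(sigmoid, logits)),
                    key=lambda pair: pair[1], reverse=True)
    return [p for p, prob in scored if prob >= threshold]

# Hypothetical logits as the cross-encoder might return for three candidates.
passages = ["Water boils at 100 °C at sea level.",
            "The boiling point drops at higher altitude.",
            "Paris is the capital of France."]
logits = [6.2, 1.3, -4.8]
print(filter_answers(passages, logits))  # keeps the two boiling-point passages
```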
© 2025 AIbase