Polish Reranker Roberta V2
An improved Polish re-ranking model based on sdadas/polish-roberta-large-v2, trained with the RankNet loss function and supporting Flash Attention 2 acceleration
Downloads 961
Release Time: 9/20/2024
Model Overview
This is a text re-ranking model optimized for Polish. It is primarily used for result re-ranking in information retrieval systems, improving the relevance ordering of search results.
Model Features
Knowledge Distillation Optimization
Trained via knowledge distillation, using relevance predictions from BAAI/bge-reranker-v2.5-gemma2-lightweight as the teacher
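The RankNet loss named above is a pairwise objective: for each pair of passages where the teacher prefers one over the other, the student is penalized unless it scores the preferred passage higher. A minimal pure-Python sketch (illustrative only; the actual training code is not shown on this card):

```python
import math

def ranknet_loss(s_i: float, s_j: float) -> float:
    """Pairwise RankNet loss when passage i should rank above passage j.

    Equals -log(sigmoid(s_i - s_j)): the loss shrinks as the student
    scores the preferred passage increasingly higher than the other.
    """
    return math.log(1.0 + math.exp(-(s_i - s_j)))

# In distillation, the teacher's scores decide which passage of a pair
# is "preferred"; the student is trained to reproduce that ordering.
loss_good = ranknet_loss(2.0, -1.0)  # student orders the pair correctly -> small loss
loss_bad = ranknet_loss(-1.0, 2.0)   # student ranks the wrong passage higher -> large loss
```

Because only score *differences* matter, the student does not need to match the teacher's absolute scores, only its pairwise orderings.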
Efficient Attention Mechanism
Supports Flash Attention 2 acceleration; loading requires setting trust_remote_code=True and the attn_implementation parameter
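A loading sketch using the standard Hugging Face Transformers API (an assumption on my part; the card only names the two parameters). The load itself needs a GPU, the flash-attn package, and network access, so it is wrapped in a function here; the pair-building helper shows the input shape a cross-encoder reranker expects:

```python
from typing import List, Tuple

def build_pairs(query: str, passages: List[str]) -> List[Tuple[str, str]]:
    # A cross-encoder reranker scores each (query, passage) pair jointly.
    return [(query, p) for p in passages]

def load_reranker():
    # Sketch only: requires a CUDA GPU, flash-attn installed, and network access.
    import torch
    from transformers import AutoModelForSequenceClassification, AutoTokenizer

    name = "sdadas/polish-reranker-roberta-v2"
    tokenizer = AutoTokenizer.from_pretrained(name)
    model = AutoModelForSequenceClassification.from_pretrained(
        name,
        trust_remote_code=True,                   # run custom code from the model repo
        attn_implementation="flash_attention_2",  # enable Flash Attention 2
        torch_dtype=torch.bfloat16,
    ).eval().to("cuda")
    return tokenizer, model

pairs = build_pairs("przykladowe zapytanie", ["fragment A", "fragment B"])
```

Tokenizing each pair and passing it through the model yields one relevance logit per pair.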
Compact Model Size
About 21x smaller than comparable models, with only 435 million parameters
Model Capabilities
Text Relevance Scoring
Search Result Re-ranking
Cross-language Information Retrieval
Use Cases
Information Retrieval
Search Engine Result Optimization
Re-ranks search engine results to move the most relevant documents higher in the ranking
Achieves an NDCG@10 of 65.30 on the PIRB benchmark
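For context, NDCG@10 measures how well the top 10 ranked results match an ideal relevance ordering, on a 0-1 scale (the benchmark score above is reported x100). A minimal implementation of the standard formula (illustrative; PIRB's exact evaluation setup may differ):

```python
import math
from typing import List

def dcg_at_k(relevances: List[float], k: int) -> float:
    # Discounted cumulative gain: relevance discounted by log2 of rank position.
    return sum(rel / math.log2(i + 2) for i, rel in enumerate(relevances[:k]))

def ndcg_at_k(relevances: List[float], k: int = 10) -> float:
    # Normalize by the DCG of the ideal (descending-relevance) ordering,
    # so a perfect ranking scores exactly 1.0.
    ideal = sorted(relevances, reverse=True)
    denom = dcg_at_k(ideal, k)
    return dcg_at_k(relevances, k) / denom if denom > 0 else 0.0

perfect = ndcg_at_k([3, 2, 1, 0])  # already in ideal order -> 1.0
```

Swapping any two unequally relevant results lowers the score, with mistakes near the top of the ranking penalized most.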
Question Answering Systems
Ranks candidate answers by relevance in question-answering systems
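Once the reranker has produced one score per candidate, the re-ranking step itself is just a sort by descending score. A small sketch with hypothetical scores (any real scores would come from the model):

```python
from typing import List

def rerank(candidates: List[str], scores: List[float]) -> List[str]:
    # Order candidates by descending relevance score; Python's stable sort
    # keeps the original order for tied scores.
    order = sorted(range(len(candidates)), key=lambda i: -scores[i])
    return [candidates[i] for i in order]

# Hypothetical reranker scores for three candidate answers.
ranked = rerank(["answer A", "answer B", "answer C"], [0.12, 0.87, 0.45])
```

In a typical pipeline, a fast retriever (e.g. BM25 or a bi-encoder) supplies the candidate list, and the cross-encoder reranker reorders only that short list.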