
gte-reranker-modernbert-base

Developed by Alibaba-NLP
An English text re-ranking model from Alibaba's Tongyi Lab, built on the ModernBERT pre-training architecture and supporting long-text inputs of up to 8192 tokens.
Downloads 17.69k
Release Time: 1/20/2025

Model Overview

This model is designed specifically for text re-ranking: it scores the relevance of text pairs (typically a query and a candidate document) so that retrieval systems can reorder their initial results more accurately.
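As a concrete starting point, the sketch below scores two query-document pairs through the standard Hugging Face sequence-classification interface, which is how cross-encoder re-rankers are commonly exposed; treating this checkpoint that way, and the example query and passages themselves, are assumptions for illustration.

```python
# Minimal sketch: scoring query-document pairs with a cross-encoder re-ranker.
# Assumes the checkpoint loads as a sequence-classification model that emits
# one relevance logit per pair (higher = more relevant).
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "Alibaba-NLP/gte-reranker-modernbert-base"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

pairs = [
    ["what is the capital of France?", "Paris is the capital and largest city of France."],
    ["what is the capital of France?", "The Rhine is one of the major rivers of Europe."],
]

with torch.no_grad():
    inputs = tokenizer(pairs, padding=True, truncation=True,
                       return_tensors="pt", max_length=8192)
    scores = model(**inputs).logits.view(-1).float()

print(scores.tolist())  # first pair should receive the higher score
```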

Model Features

Long Text Processing Capability
Supports input lengths of up to 8192 tokens, suitable for long-document retrieval tasks
Efficient Attention Mechanism
Optionally supports Flash Attention 2 to accelerate inference (see the loading sketch after this list)
Multi-task Optimization
Performs strongly across multiple benchmarks, including MTEB, LoCo, and COIR
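Enabling Flash Attention 2 typically goes through the generic transformers loading argument shown below; this is a sketch that assumes the flash-attn package and a compatible CUDA GPU are available, and falls back to the default attention implementation otherwise.

```python
# Minimal sketch: opting into Flash Attention 2 at load time via the standard
# transformers `attn_implementation` argument (not model-specific behavior).
import torch
from transformers import AutoModelForSequenceClassification

model_id = "Alibaba-NLP/gte-reranker-modernbert-base"
try:
    model = AutoModelForSequenceClassification.from_pretrained(
        model_id,
        torch_dtype=torch.float16,
        attn_implementation="flash_attention_2",
    ).to("cuda")
except (ImportError, ValueError, RuntimeError):
    # flash-attn missing or hardware unsupported: load with default attention.
    model = AutoModelForSequenceClassification.from_pretrained(model_id)
```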

Model Capabilities

Text Relevance Scoring
Retrieval Result Re-ranking
Long Document Processing

Use Cases

Information Retrieval
Document Retrieval System Optimization
Re-ranks initial retrieval results to improve the ranking of relevant documents (see the re-ranking sketch after this list)
Achieved an overall score of 90.68 on the LoCo long-document retrieval benchmark
Question Answering System
Evaluates the relevance between questions and candidate answers
Scored over 96 in multiple subtasks on the COIR code retrieval benchmark
Code Retrieval
Code Search
Scores the relevance between natural-language queries and code snippets
Scored over 98 in some subtasks on the COIR benchmark
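To illustrate the document re-ranking use case above, the sketch below reorders a handful of made-up first-stage retrieval candidates using the sentence-transformers CrossEncoder wrapper; the wrapper's compatibility with this checkpoint, the query, and the candidate texts are all assumptions for illustration.

```python
# Minimal sketch: re-ranking first-stage retrieval results by reranker score.
from sentence_transformers import CrossEncoder

reranker = CrossEncoder("Alibaba-NLP/gte-reranker-modernbert-base")

query = "how do I fine-tune a re-ranking model?"
candidates = [
    "A tutorial on fine-tuning cross-encoders for passage re-ranking.",
    "A recipe blog post about baking sourdough bread.",
    "Release notes for a vector database.",
]

# Score each (query, document) pair, then sort candidates by descending score.
scores = reranker.predict([(query, doc) for doc in candidates])
reranked = sorted(zip(candidates, scores), key=lambda x: x[1], reverse=True)
for doc, score in reranked:
    print(f"{score:.4f}  {doc}")
```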