
GTE-small

Developed by Supabase
GTE-small is a general text embedding model trained by Alibaba DAMO Academy, built on the BERT framework and suited to tasks such as information retrieval and semantic textual similarity.
Downloads: 481.27k
Release Time: 8/1/2023

Model Overview

The GTE model series is trained on a large-scale corpus of relevance text pairs spanning a wide range of domains, and can be applied to downstream tasks such as information retrieval, semantic textual similarity, and text re-ranking.

Model Features

Multi-domain Applicability
Trained on large-scale relevant text pairs, covering multi-domain scenarios.
High Efficiency
Achieves an overall average score of 61.36 on the MTEB benchmark.
Lightweight
At roughly 0.07 GB (about 70 MB), the model is well suited to resource-constrained environments.

Model Capabilities

Text Feature Extraction
Semantic Text Similarity Calculation
Information Retrieval
Text Re-ranking
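
The capabilities above come down to encoding text into fixed-size vectors and comparing them. Below is a minimal sketch, assuming the sentence-transformers library and the Hugging Face checkpoint "Supabase/gte-small" (a hosted copy of thenlper/gte-small); the example sentences are illustrative only.

# Minimal sketch: feature extraction and semantic similarity with GTE-small.
# The checkpoint name and example sentences are assumptions, not part of this listing.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("Supabase/gte-small")

sentences = [
    "The cat sits on the mat.",
    "A feline is resting on a rug.",
    "Stock markets fell sharply today.",
]

# Encode each sentence into a 384-dimensional embedding, normalized so that
# the dot product equals cosine similarity.
embeddings = model.encode(sentences, normalize_embeddings=True)

# Pairwise cosine similarities; the paraphrase pair scores noticeably higher
# than the unrelated sentence.
scores = util.cos_sim(embeddings, embeddings)
print(scores)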

Use Cases

Information Retrieval
Document Search
Used to build efficient document search engines, improving the relevance of search results (see the retrieval sketch below).
Semantic Analysis
Text Similarity Calculation
Computes the semantic similarity between two pieces of text; the model scores 82.07 on STS tasks.
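
As an illustration of the document-search use case, here is a minimal sketch, again assuming sentence-transformers and the "Supabase/gte-small" checkpoint; the corpus, query, and top_k value are hypothetical.

# Illustrative sketch: ranking documents against a query with GTE-small embeddings.
# The corpus and query below are made up for the example.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("Supabase/gte-small")

corpus = [
    "How to reset a forgotten password.",
    "Steps to configure two-factor authentication.",
    "Pricing plans and billing FAQ.",
]
corpus_embeddings = model.encode(corpus, normalize_embeddings=True)

query = "I can't remember my login password"
query_embedding = model.encode(query, normalize_embeddings=True)

# Rank the corpus by cosine similarity to the query and print the top matches.
hits = util.semantic_search(query_embedding, corpus_embeddings, top_k=2)[0]
for hit in hits:
    print(f"{hit['score']:.3f}  {corpus[hit['corpus_id']]}")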