
Multi Qa MiniLM BERT Tiny Distill L 2 H 128 A Cos V1

Developed by: rawsh
This is a lightweight sentence-embedding model based on BERT-Tiny, designed for semantic search and sentence-similarity tasks, with a model size of only 5MB.
Downloads: 43
Released: 6/5/2023

Model Overview

This model maps sentences and paragraphs into a 128-dimensional dense vector space, suitable for tasks such as clustering or semantic search. It is based on the nreimers/BERT-Tiny_L-2_H-128_A-2 model and learns through knowledge distillation from the multi-qa-MiniLM-L6-cos-v1 teacher model.
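Because the model maps text into a 128-dimensional space trained for cosine similarity, semantic search reduces to a nearest-neighbor lookup by cosine score. A minimal sketch with stand-in vectors (a real pipeline would obtain the embeddings from the model via the sentence-transformers library; the random vectors below are placeholders):

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for model-produced embeddings: each row is a 128-dim sentence vector.
corpus_emb = rng.normal(size=(5, 128))
# A query close in meaning to corpus item 2, simulated as a small perturbation.
query_emb = corpus_emb[2] + 0.01 * rng.normal(size=128)

def cosine_search(query, corpus, top_k=3):
    """Rank corpus rows by cosine similarity to the query vector."""
    q = query / np.linalg.norm(query)
    c = corpus / np.linalg.norm(corpus, axis=1, keepdims=True)
    scores = c @ q
    idx = np.argsort(-scores)[:top_k]
    return list(zip(idx.tolist(), scores[idx].tolist()))

hits = cosine_search(query_emb, corpus_emb)
print(hits)  # the perturbed item (index 2) ranks first with a score near 1.0
```

The same ranking logic applies whether the vectors come from this 5MB student model or its larger teacher; only the encoding step changes.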

Model Features

Lightweight Design
The model size is only 5MB, making it suitable for resource-constrained environments.
Knowledge Distillation
Learns from the more powerful multi-qa-MiniLM-L6-cos-v1 teacher model.
Efficient Semantic Representation
Maps text to a 128-dimensional vector space while preserving semantic information.
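The distillation setup can be sketched as minimizing the distance between student and teacher sentence embeddings on the same inputs. Note that multi-qa-MiniLM-L6-cos-v1 emits 384-dimensional vectors while this student emits 128-dimensional ones, so some dimension matching is required; the fixed random projection below is an illustrative assumption (the actual recipe might instead use PCA or a trained projection head):

```python
import numpy as np

rng = np.random.default_rng(1)

batch = 8
teacher_dim, student_dim = 384, 128  # teacher: multi-qa-MiniLM-L6-cos-v1

# Stand-ins for encoder outputs on the same batch of sentences.
teacher_emb = rng.normal(size=(batch, teacher_dim))
student_emb = rng.normal(size=(batch, student_dim))

# Dimension-matching projection (assumed for illustration only).
proj = rng.normal(size=(teacher_dim, student_dim)) / np.sqrt(teacher_dim)
target = teacher_emb @ proj

# Distillation objective: mean-squared error between the student's
# embeddings and the (projected) teacher embeddings.
mse = np.mean((student_emb - target) ** 2)
print(f"distillation MSE: {mse:.4f}")
```

Training then backpropagates this loss through the student only, pulling its 128-dimensional space toward the teacher's semantic geometry.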

Model Capabilities

Sentence Similarity Calculation
Text Feature Extraction
Semantic Search
Text Clustering

Use Cases

Information Retrieval
QA Systems
Used to match user questions with candidate answers in a knowledge base.
Achieved a cosine similarity Pearson coefficient of 0.7336 on the STS-dev dataset.
Content Recommendation
Similar Content Recommendation
Recommends related content based on semantic similarity.
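The STS-dev figure quoted above (Pearson 0.7336) is obtained by correlating the model's cosine similarities for sentence pairs against human similarity judgments. A sketch of that evaluation with placeholder data (the embeddings and gold labels below are synthetic stand-ins, not real STS data):

```python
import numpy as np

rng = np.random.default_rng(2)

n, dim = 50, 128
emb_a = rng.normal(size=(n, dim))                 # embeddings for sentence A of each pair
emb_b = emb_a + 0.5 * rng.normal(size=(n, dim))   # noisy partners for sentence B

def cosine(a, b):
    """Row-wise cosine similarity between two embedding matrices."""
    return np.sum(a * b, axis=1) / (
        np.linalg.norm(a, axis=1) * np.linalg.norm(b, axis=1)
    )

pred = cosine(emb_a, emb_b)                # model similarity scores
gold = pred + 0.02 * rng.normal(size=n)    # placeholder for human STS labels

# Pearson correlation between predicted similarities and gold scores.
pearson = np.corrcoef(pred, gold)[0, 1]
print(f"Pearson r: {pearson:.4f}")
```

On the real benchmark, `pred` would come from encoding the STS-dev sentence pairs with the model and `gold` from the dataset's annotated scores.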