
HPD TinyBERT F128

Developed by Xuandong
A sentence embedding model compressed via homomorphic projective distillation (HPD), containing only 14 million parameters in a 55 MB model, suited to semantic retrieval tasks
Downloads: 24
Release Date: 5/10/2022

Model Overview

This is a lightweight sentence embedding model based on TinyBERT. It learns compressed sentence representations through homomorphic projective distillation, significantly reducing model size while maintaining semantic retrieval quality

Model Features

Homomorphic Projective Distillation
A training method that produces compact representations through a learnable projection layer while mimicking a large pre-trained language model
Lightweight and Efficient
Only 14 million parameters and a 55 MB model size, suitable for deployment in resource-constrained environments
Performance Retention
Despite its small size, it achieves an average score of 81.02 on STS tasks, approaching the performance of large models
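The distillation idea described above can be sketched as follows. This is a minimal, hypothetical illustration (not the authors' code): a learnable projection maps a large teacher embedding (assumed 768-dim here) down to the student's compact 128-dim space, and the student is trained to match it, shown with a simple MSE loss (the paper's exact objective may differ).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: a large teacher (e.g. 768-dim) and the
# compact 128-dim student space suggested by the "F128" model name.
TEACHER_DIM, STUDENT_DIM = 768, 128

# Stand-ins for one sentence's embedding from each model.
teacher_emb = rng.normal(size=TEACHER_DIM)
student_emb = rng.normal(size=STUDENT_DIM)

# Learnable projection layer (randomly initialized here) that maps the
# teacher embedding into the student's compact space.
W = rng.normal(size=(STUDENT_DIM, TEACHER_DIM)) / np.sqrt(TEACHER_DIM)
projected_teacher = W @ teacher_emb  # shape: (128,)

# Distillation loss: push the student embedding toward the projected
# teacher embedding.
mse_loss = float(np.mean((student_emb - projected_teacher) ** 2))
print(projected_teacher.shape, mse_loss)
```

In training, both the student encoder and the projection weights would be optimized jointly; the sketch only shows how the dimensionality gap between teacher and student is bridged.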

Model Capabilities

Sentence Embedding Generation
Semantic Similarity Calculation
Text Clustering
Semantic Search
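Semantic similarity between two sentence embeddings is typically computed as cosine similarity. A minimal sketch with hypothetical low-dimensional vectors standing in for model outputs:

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity in [-1, 1]; higher means more similar."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Hypothetical embeddings (the real model would emit higher-dim vectors).
emb_a = np.array([1.0, 0.0, 1.0, 0.0])
emb_b = np.array([1.0, 0.0, 1.0, 0.0])
emb_c = np.array([-1.0, 0.0, -1.0, 0.0])

print(cosine_similarity(emb_a, emb_b))  # identical vectors -> 1.0
print(cosine_similarity(emb_a, emb_c))  # opposite vectors -> -1.0
```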

Use Cases

Information Retrieval
Document Similarity Search
Used to build efficient document retrieval systems
Quickly find semantically similar documents
Intelligent Customer Service
Question Matching
Match user questions with standard questions in the knowledge base
Improve response accuracy of customer service systems
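The question-matching flow above reduces to embedding the user's question and returning the nearest knowledge-base entry by cosine similarity. A sketch using hypothetical precomputed embeddings (in practice these would come from the model):

```python
import numpy as np

# Hypothetical knowledge base with made-up 4-dim embeddings;
# the real model would produce higher-dimensional vectors.
kb_questions = [
    "How do I reset my password?",
    "What are your shipping fees?",
    "How can I cancel an order?",
]
kb_embs = np.array([
    [0.9, 0.1, 0.0, 0.0],
    [0.0, 0.9, 0.1, 0.0],
    [0.0, 0.0, 0.9, 0.1],
])

def best_match(query_emb: np.ndarray) -> str:
    # Normalize rows so a dot product equals cosine similarity.
    kb_norm = kb_embs / np.linalg.norm(kb_embs, axis=1, keepdims=True)
    q_norm = query_emb / np.linalg.norm(query_emb)
    scores = kb_norm @ q_norm
    return kb_questions[int(np.argmax(scores))]

# A user question whose (hypothetical) embedding is closest to entry 0.
query_emb = np.array([0.8, 0.2, 0.0, 0.0])
print(best_match(query_emb))  # -> "How do I reset my password?"
```

For larger knowledge bases, the same normalized-dot-product scoring is usually handed off to a vector index rather than a full matrix product.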