BGE-M3 Distill 8L

Developed by altaidevorg
An 8-layer embedding model distilled from BAAI/bge-m3, achieving a 2.5x speed improvement while maintaining retrieval performance.
Downloads 249
Release Date: January 19, 2025

Model Overview

This model compresses the original 24-layer BAAI/bge-m3 down to 8 layers through knowledge distillation. At 366 million parameters, it is suited to semantic similarity calculation and retrieval tasks.
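In use, the model maps sentences to dense vectors that are compared with cosine similarity. Below is a minimal pure-Python sketch of that comparison; the model id and the `sentence-transformers` encode call in the comments are assumptions about typical usage, not confirmed by this card, and the toy vectors stand in for real embeddings.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# In practice the vectors would come from the model, e.g. (assumed usage):
#   from sentence_transformers import SentenceTransformer
#   model = SentenceTransformer("altaidevorg/bge-m3-distill-8l")  # hypothetical id
#   emb_a, emb_b = model.encode(["first sentence", "second sentence"])

# Toy vectors standing in for real embeddings:
emb_a = [1.0, 0.0, 0.0]
emb_b = [1.0, 0.0, 0.0]
print(cosine_similarity(emb_a, emb_b))  # identical vectors -> 1.0
```

Scores close to 1.0 indicate semantically similar texts; orthogonal vectors score 0.0.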

Model Features

Efficient Compression
Distilled from 24 layers down to 8 (a 67% reduction in layer count), improving inference speed by 2.5x
Performance Retention
Maintains a Spearman cosine similarity of 0.965 on STS test sets, a negligible difference from the original model
Long Text Support
Supports sequences up to 8192 tokens, suitable for long document processing
Cross-language Capability
Although primarily trained on Turkish data, it also performs well in English and other languages
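The card does not publish the distillation objective. A common choice for embedding-model distillation, shown here purely as an assumption, is training the 8-layer student to reproduce the 24-layer teacher's sentence vectors, e.g. with a mean-squared-error loss:

```python
def embedding_distill_loss(student_vec, teacher_vec):
    """MSE between student and teacher sentence embeddings (assumed objective)."""
    assert len(student_vec) == len(teacher_vec)
    return sum((s - t) ** 2 for s, t in zip(student_vec, teacher_vec)) / len(student_vec)

teacher = [0.2, -0.1, 0.4]   # toy teacher (24-layer) embedding
student = [0.2, -0.1, 0.4]   # toy student (8-layer) embedding
print(embedding_distill_loss(student, teacher))  # perfectly matched student -> 0.0
```

Driving this loss toward zero is what lets the much smaller student keep the teacher's similarity scores nearly intact.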

Model Capabilities

Semantic Similarity Calculation
Text Embedding Generation
Cross-language Text Retrieval
Long Text Processing
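Documents longer than the 8192-token limit must be split before encoding. A minimal sketch of a sliding-window chunker, using whitespace tokens as a stand-in for the model's real tokenizer; the window and overlap sizes are illustrative assumptions:

```python
def chunk_tokens(tokens, max_len=8192, overlap=256):
    """Split a token list into overlapping windows of at most max_len tokens."""
    if max_len <= overlap:
        raise ValueError("max_len must exceed overlap")
    chunks, start = [], 0
    while start < len(tokens):
        chunks.append(tokens[start:start + max_len])
        if start + max_len >= len(tokens):
            break
        start += max_len - overlap  # slide forward, keeping some overlap for context
    return chunks

tokens = "a very long document ...".split()  # stand-in for real tokenizer output
print(chunk_tokens(tokens, max_len=4, overlap=1))
```

Each chunk is then embedded separately, and the per-chunk vectors are indexed (or pooled) for retrieval.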

Use Cases

Information Retrieval
Semantic Search System
Build a search engine based on semantic matching
Improves the relevance of search results
Recommendation System
Content Recommendation
Build a recommendation engine based on content similarity
Increases recommendation accuracy
RAG Applications
Retrieval-Augmented Generation
Provides retrieved context to LLMs
Enhances the relevance of generated content
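The RAG use case above reduces to three steps: embed the query, rank a store of pre-embedded documents by cosine similarity, and prepend the top hits to the LLM prompt. A self-contained sketch with toy vectors; in a real system the embeddings would come from this model, and the prompt template is an assumption:

```python
import math

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b)))

def retrieve_top_k(query_vec, doc_store, k=2):
    """doc_store: list of (text, embedding); returns the k texts most similar to the query."""
    ranked = sorted(doc_store, key=lambda d: cosine(query_vec, d[1]), reverse=True)
    return [text for text, _ in ranked[:k]]

# Toy embeddings standing in for model output.
docs = [
    ("cats are mammals", [1.0, 0.0]),
    ("stocks fell today", [0.0, 1.0]),
    ("dogs are mammals", [0.9, 0.1]),
]
query_vec = [1.0, 0.0]  # would be the model's embedding of the question
context = retrieve_top_k(query_vec, docs, k=2)
prompt = "Context:\n" + "\n".join(context) + "\n\nQuestion: are cats mammals?"
print(context)  # the two most similar documents
```

The assembled `prompt` string is what gets passed to the generator; only the retrieval step involves the embedding model.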