Japanese E5 Mistral 7B Slerp
A Japanese text embedding model created by merging E5 Mistral 7B Instruct and Japanese StableLM Base Gamma 7B, focusing on sentence similarity tasks
Release Time: 1/4/2024
Model Overview
This model combines the embedding generation capability of E5 Mistral 7B Instruct with the Japanese language understanding of Japanese StableLM Base Gamma 7B, making it suitable for computing semantic similarity and generating embedding representations for Japanese text
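The sketch below shows one way to obtain embeddings and a cosine similarity score from the merged model, assuming it keeps the tokenizer and last-token pooling convention of E5 Mistral 7B Instruct. The model id oshizo/japanese-e5-mistral-7b_slerp, the instruction prefix, and the pooling helper are illustrative assumptions, not an official usage recipe.

```python
# Minimal sketch: Japanese sentence embeddings with last-token pooling.
# Assumptions: the merged model keeps E5 Mistral's tokenizer and pooling
# convention; the model id and instruction prefix below are illustrative.
import torch
import torch.nn.functional as F
from transformers import AutoModel, AutoTokenizer

MODEL_ID = "oshizo/japanese-e5-mistral-7b_slerp"  # assumed Hugging Face model id

device = "cuda" if torch.cuda.is_available() else "cpu"
tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
tokenizer.padding_side = "right"                  # required by the pooling below
if tokenizer.pad_token is None:
    tokenizer.pad_token = tokenizer.eos_token
model = AutoModel.from_pretrained(
    MODEL_ID, torch_dtype=torch.float16 if device == "cuda" else torch.float32
).to(device).eval()

def last_token_pool(hidden_states: torch.Tensor, attention_mask: torch.Tensor) -> torch.Tensor:
    # Use the hidden state of the last non-padding token of each sequence.
    last_idx = attention_mask.sum(dim=1) - 1
    batch_idx = torch.arange(hidden_states.size(0), device=hidden_states.device)
    return hidden_states[batch_idx, last_idx]

texts = [
    "Instruct: Retrieve passages relevant to the query.\nQuery: 週末に京都を観光するおすすめの場所は？",
    "京都では清水寺や嵐山が人気の観光スポットです。",
]

batch = tokenizer(texts, padding=True, truncation=True, max_length=512, return_tensors="pt").to(device)
with torch.no_grad():
    hidden = model(**batch).last_hidden_state
embeddings = F.normalize(last_token_pool(hidden, batch["attention_mask"]), p=2, dim=1)

# Cosine similarity between the query and the document.
print(float(embeddings[0] @ embeddings[1]))
```

E5-style models typically prepend a task instruction to the query but not to the documents; whether the merged model benefits from the same prefix is assumed here.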
Model Features
Bilingual model fusion
Combines the embedding generation capability of E5 Mistral with the strong Japanese language understanding of Japanese StableLM
Layered parameter fusion
Weights layers near the input more toward the Japanese model and layers near the output more toward the embedding model to optimize performance
Optimized merging method
Uses the slerp merging method, chosen as the best-performing configuration after comparing several merging approaches (see the sketch after this list)
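As a rough illustration of what slerp merging does, the sketch below interpolates two weight tensors along the arc between them, with a layer-dependent interpolation factor. The function names, the fallback to linear interpolation for near-parallel tensors, and the linear t schedule are assumptions for illustration; the released model was produced with a merging toolkit's slerp implementation and its own per-layer schedule, not this code.

```python
# Sketch of spherical linear interpolation (slerp) between two weight tensors,
# with a layer-dependent interpolation factor. Illustrative only.
import torch

def slerp(w_a: torch.Tensor, w_b: torch.Tensor, t: float, eps: float = 1e-8) -> torch.Tensor:
    """Interpolate from w_a (t=0) toward w_b (t=1) along the arc between them."""
    a, b = w_a.flatten().float(), w_b.flatten().float()
    cos_omega = torch.clamp((a / (a.norm() + eps)) @ (b / (b.norm() + eps)), -1.0, 1.0)
    omega = torch.arccos(cos_omega)
    if omega < eps:  # nearly parallel tensors: fall back to linear interpolation
        merged = (1 - t) * a + t * b
    else:
        merged = (torch.sin((1 - t) * omega) * a + torch.sin(t * omega) * b) / torch.sin(omega)
    return merged.reshape(w_a.shape).to(w_a.dtype)

def layer_t(layer_idx: int, num_layers: int) -> float:
    # Assumed schedule: early layers keep more of model A (the Japanese LM),
    # later layers keep more of model B (the embedding model).
    return layer_idx / max(num_layers - 1, 1)

# Example per-layer merge of hypothetical weight tensors:
# merged_w = slerp(japanese_layer_w, e5_layer_w, layer_t(i, num_layers))
```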
Model Capabilities
Japanese text embedding generation
Sentence similarity calculation
Semantic search
Use Cases
Information retrieval
Japanese document similarity matching
Finding semantically similar documents in a collection of Japanese documents
Question answering systems
Japanese FAQ matching
Semantically matching user questions against an FAQ database (see the ranking sketch after this list)
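A minimal sketch of FAQ matching by cosine similarity follows. Random unit vectors stand in for real embeddings; in practice each row would be an L2-normalized embedding produced as in the Model Overview sketch, and the rank_faq helper is a hypothetical name.

```python
# Illustrative FAQ matching by cosine similarity over stand-in embeddings.
import numpy as np

def rank_faq(query_vec: np.ndarray, faq_matrix: np.ndarray, top_k: int = 3) -> list[int]:
    """Return indices of the top_k FAQ entries; all vectors are assumed L2-normalized."""
    scores = faq_matrix @ query_vec            # dot product of unit vectors == cosine similarity
    return np.argsort(-scores)[:top_k].tolist()

# Toy demonstration with random stand-in embeddings.
rng = np.random.default_rng(0)
faq = rng.normal(size=(5, 4096))
faq /= np.linalg.norm(faq, axis=1, keepdims=True)
query = faq[2] + 0.1 * rng.normal(size=4096)   # a query close to FAQ entry 2
query /= np.linalg.norm(query)
print(rank_faq(query, faq))                    # entry 2 should rank first
```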