
Nomic Embed Text V2 MoE

Developed by nomic-ai
Nomic Embed v2 is a high-performance multilingual Mixture of Experts (MoE) text embedding model supporting approximately 100 languages, excelling in multilingual retrieval tasks.
Downloads: 242.32k
Release Time: 2/7/2025

Model Overview

This model adopts a Mixture of Experts architecture combined with Matryoshka embedding technology, offering flexible embedding dimension choices and achieving leading performance in multilingual text similarity and retrieval tasks.
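The Matryoshka idea behind this flexibility is that the leading dimensions of an embedding carry most of the useful signal, so a full 768-dimensional vector can simply be truncated and re-normalized to a smaller size. A minimal numpy sketch of that operation (using a random vector as a stand-in for a real model output, not the model's own code):

```python
import numpy as np

def truncate_embedding(vec: np.ndarray, dim: int) -> np.ndarray:
    """Keep the first `dim` components and re-normalize to unit length."""
    truncated = vec[:dim]
    return truncated / np.linalg.norm(truncated)

# Stand-in for a 768-d embedding produced by the model.
rng = np.random.default_rng(0)
full = rng.standard_normal(768)
full /= np.linalg.norm(full)

small = truncate_embedding(full, 256)
print(small.shape)  # (256,)
```

Storing the 256-dimensional prefix instead of the full vector cuts index storage to one third while keeping the vector unit-normalized for cosine-similarity search.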

Model Features

Multilingual Mixture of Experts Architecture
Utilizes an 8-expert MoE architecture to enhance multilingual performance while maintaining efficient inference
Matryoshka Embedding Technology
Supports flexible embedding dimension selection from 768 to 256 dimensions, significantly reducing storage costs
Extensive Language Support
Supports approximately 100 languages with over 1.6 billion training pairs
Open-Source Transparency
Fully open-sourced model weights, training code, and training data
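The 8-expert design follows the usual top-k MoE routing pattern: a gating layer scores all experts per token, only the highest-scoring few are evaluated, and their outputs are combined with softmax weights. A toy numpy sketch of that pattern (hypothetical shapes and a top-2 choice for illustration; this is the generic technique, not Nomic's implementation):

```python
import numpy as np

def moe_forward(x, gate_w, experts, top_k=2):
    """Route vector x to the top_k highest-scoring experts.

    gate_w: (d, n_experts) gating weights.
    experts: list of (d, d) expert matrices (toy stand-ins for expert FFNs).
    """
    logits = x @ gate_w                  # one gating score per expert
    top = np.argsort(logits)[-top_k:]    # indices of the top_k experts
    weights = np.exp(logits[top])
    weights /= weights.sum()             # softmax over the selected experts only
    return sum(w * (x @ experts[i]) for w, i in zip(weights, top))

rng = np.random.default_rng(0)
d, n_experts = 16, 8
x = rng.standard_normal(d)
gate_w = rng.standard_normal((d, n_experts))
experts = [rng.standard_normal((d, d)) for _ in range(n_experts)]

y = moe_forward(x, gate_w, experts)
print(y.shape)  # (16,)
```

Because only the selected experts run per token, total parameter count grows with the number of experts while per-token compute stays close to a dense model of expert size.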

Model Capabilities

Multilingual Text Embedding
Sentence Similarity Calculation
Feature Extraction
Cross-Language Retrieval

Use Cases

Information Retrieval
Cross-Language Document Retrieval
Establishes semantic relationships between documents in different languages for cross-language search
Achieved 65.80 points on the MIRACL benchmark
Semantic Analysis
Multilingual Text Similarity Calculation
Computes semantic similarity between texts in different languages
Supports similarity analysis for approximately 100 languages
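Both retrieval and similarity use cases reduce to the same primitive: embed query and documents into the shared multilingual space, then rank documents by cosine similarity. A minimal numpy sketch with 2-d toy vectors standing in for real embeddings:

```python
import numpy as np

def rank_by_cosine(query: np.ndarray, docs: np.ndarray):
    """Return document indices sorted by descending cosine similarity."""
    q = query / np.linalg.norm(query)
    d = docs / np.linalg.norm(docs, axis=1, keepdims=True)
    sims = d @ q                     # cosine similarity of each doc to the query
    return np.argsort(-sims), sims

# Toy 2-d "embeddings"; real vectors would be 256-768 dimensions.
docs = np.array([[1.0, 0.0],
                 [0.7, 0.7],
                 [0.0, 1.0]])
query = np.array([1.0, 0.1])

order, sims = rank_by_cosine(query, docs)
print(order.tolist())  # [0, 1, 2]
```

Because the model maps all supported languages into one space, the same ranking works when the query and the documents are in different languages.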