
Nomic Embed Text V2 MoE GGUF

Developed by: nomic-ai
A multilingual mixture-of-experts (MoE) text embedding model that supports roughly 100 languages and delivers strong multilingual retrieval performance.
Downloads: 14.06k
Released: April 30, 2025

Model Overview

This is a multilingual text embedding model built on a Mixture of Experts (MoE) architecture. It supports a wide range of text similarity and feature extraction tasks.
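Since this release ships as GGUF weights, one common way to run it locally is through llama.cpp bindings. Below is a minimal sketch using llama-cpp-python; the quantization file name is an assumption, so substitute whichever GGUF variant you downloaded. Nomic's embedding models use task prefixes such as "search_document: " and "search_query: "; check the upstream model card for the exact conventions.

```python
from llama_cpp import Llama

# Load the GGUF weights in embedding mode. The quantized file name below is
# an assumption; use whichever GGUF variant you actually downloaded.
llm = Llama(model_path="nomic-embed-text-v2-moe.Q4_K_M.gguf", embedding=True)

# Nomic embedding models expect a task prefix, e.g. "search_document: " for
# passages to be indexed (see the upstream model card for exact prefixes).
vector = llm.embed("search_document: MoE embeddings route each token through "
                   "a small subset of expert feed-forward networks.")
print(len(vector))  # output dimensionality (768 for this model)
```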

Model Features

Multilingual Support
Covers approximately 100 languages and performs strongly on multilingual retrieval benchmarks.
High Performance
Achieves state-of-the-art multilingual performance among models of its size class (roughly 300M active parameters) and remains competitive with models twice its size.
Flexible Embedding Dimension
Uses Matryoshka (nested) embeddings, so output vectors can be truncated to a smaller dimension, cutting storage costs by roughly 3x with minimal performance loss (see the sketch after this list).
Completely Open Source
The model weights, code, and training data are publicly available.
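A minimal sketch of how nested embeddings are typically consumed: keep only the leading dimensions of the full vector and re-normalize. The 768-to-256 truncation below is an illustrative choice; any prefix length the model was trained for works the same way.

```python
import numpy as np

def truncate_embedding(vec, dim: int = 256) -> np.ndarray:
    """Keep the first `dim` dimensions of a nested (Matryoshka) embedding.

    Nested embeddings are trained so the leading dimensions carry most of
    the signal; truncating 768 -> 256 cuts storage roughly 3x. Re-normalize
    so cosine similarity remains meaningful on the shorter vectors.
    """
    v = np.asarray(vec, dtype=np.float32)[:dim]
    return v / np.linalg.norm(v)

full = np.random.rand(768).astype(np.float32)  # stand-in for a model output
short = truncate_embedding(full)
print(short.shape)  # (256,)
```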

Model Capabilities

Text Embedding
Sentence Similarity Calculation
Multilingual Feature Extraction
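Sentence similarity over embeddings usually reduces to cosine similarity. The helper below is a generic sketch; the two random vectors merely stand in for embeddings you would obtain from the model.

```python
import numpy as np

def cosine_similarity(a, b) -> float:
    """Cosine similarity between two embedding vectors."""
    a = np.asarray(a, dtype=np.float32)
    b = np.asarray(b, dtype=np.float32)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Stand-ins for two sentence embeddings produced by the model:
emb_a = np.random.rand(768).astype(np.float32)
emb_b = np.random.rand(768).astype(np.float32)
print(cosine_similarity(emb_a, emb_b))
```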

Use Cases

Information Retrieval
Multilingual Document Retrieval: retrieve relevant content from documents written in different languages. The model performs strongly on multilingual retrieval tasks.
Semantic Search
Cross-lingual Semantic Search: search for semantically similar texts across languages, with coverage of roughly 100 languages (see the sketch below).
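A hedged end-to-end sketch of cross-lingual search with the GGUF weights via llama-cpp-python: documents in several languages are embedded with the "search_document: " prefix, the query with "search_query: ", and results are ranked by cosine score. The model path and prefixes are assumptions drawn from Nomic's usage conventions; verify them against the upstream model card.

```python
import numpy as np
from llama_cpp import Llama

# File name is an assumption; substitute your downloaded GGUF variant.
llm = Llama(model_path="nomic-embed-text-v2-moe.Q4_K_M.gguf", embedding=True)

def embed(text: str) -> np.ndarray:
    """Embed one string and L2-normalize so dot products are cosine scores."""
    v = np.asarray(llm.embed(text), dtype=np.float32)
    return v / np.linalg.norm(v)

docs = [
    "The Eiffel Tower is in Paris.",       # English
    "Der Eiffelturm steht in Paris.",      # German
    "La tour Eiffel se trouve à Paris.",   # French
]
doc_vecs = np.stack([embed("search_document: " + d) for d in docs])

query = embed("search_query: Where is the Eiffel Tower located?")
scores = doc_vecs @ query  # cosine scores, since all vectors are unit-length
for i in np.argsort(-scores):
    print(f"{scores[i]:.3f}  {docs[i]}")
```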