# Code Search Optimization

## Codesearch ModernBERT Snake
- License: Apache-2.0
- Author: Shuu12121
- Tags: Text Embedding, English

A sentence transformer model designed specifically for code search. Built on the ModernBERT architecture, it supports long-sequence processing of up to 8192 tokens.
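
Since the entry above describes a sentence-transformers embedding model, a natural use is natural-language-to-code retrieval. The sketch below is a minimal illustration of that workflow; the checkpoint identifier is an assumption and should be replaced with the actual model ID.

```python
# Minimal sketch: retrieving code snippets for a natural-language query
# with a sentence-transformers code-search model. The model ID below is
# an assumption; substitute the checkpoint you actually use.
from sentence_transformers import SentenceTransformer, util

MODEL_ID = "Shuu12121/CodeSearch-ModernBERT-Snake"  # assumed identifier

model = SentenceTransformer(MODEL_ID)

query = "read a JSON file and return it as a dictionary"
snippets = [
    "def load_json(path):\n    import json\n    with open(path) as f:\n        return json.load(f)",
    "def add(a, b):\n    return a + b",
]

# Encode the query and candidate code into the same embedding space.
query_emb = model.encode(query, convert_to_tensor=True)
code_embs = model.encode(snippets, convert_to_tensor=True)

# Rank snippets by cosine similarity to the query.
scores = util.cos_sim(query_emb, code_embs)[0]
best = int(scores.argmax())
print(f"best match (score {scores[best].item():.3f}):\n{snippets[best]}")
```

Because the model accepts sequences of up to 8192 tokens, whole functions or small files can usually be encoded without chunking.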
## Codemorph ModernBERT
- License: Apache-2.0
- Author: Shuu12121
- Tags: Large Language Model, Other

A model pre-trained from scratch for code search and code understanding tasks. It supports sequences of up to 2048 tokens and performs particularly well on Python code search.
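
This entry describes a plain pre-trained encoder rather than a packaged sentence-transformers model, so one common way to obtain code embeddings from it is mean pooling over the encoder's token outputs. The sketch below assumes a Hugging Face checkpoint name; treat both the ID and the pooling choice as illustrative rather than the model's documented usage.

```python
# Minimal sketch: deriving a code embedding from a pre-trained encoder
# with the transformers library and mean pooling. The checkpoint name is
# an assumption; replace it with the model you actually use.
import torch
from transformers import AutoModel, AutoTokenizer

MODEL_ID = "Shuu12121/CodeMorph-ModernBERT"  # assumed identifier

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModel.from_pretrained(MODEL_ID)
model.eval()

code = "def fibonacci(n):\n    return n if n < 2 else fibonacci(n - 1) + fibonacci(n - 2)"

# Truncate to the 2048-token limit stated in the entry above.
inputs = tokenizer(code, truncation=True, max_length=2048, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# Mean-pool the token embeddings, ignoring padding positions.
mask = inputs["attention_mask"].unsqueeze(-1)
embedding = (outputs.last_hidden_state * mask).sum(dim=1) / mask.sum(dim=1)
print(embedding.shape)  # (1, hidden_size)
```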
## Codeberta Small V1
- Author: claudios
- Tags: Large Language Model, Transformers, Other

CodeBERTa is a code understanding model based on the RoBERTa architecture. It is trained on multiple programming languages and handles code-related tasks efficiently.
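
CodeBERTa-small-v1 was originally pre-trained with masked-language modelling on code, so a quick way to probe it is the fill-mask pipeline. The model ID below is an assumption (the checkpoint is commonly published as huggingface/CodeBERTa-small-v1; adjust it for the mirror listed here).

```python
# Minimal sketch: masked-token prediction with a RoBERTa-style code model.
# The model ID is an assumption; point it at the checkpoint you actually use.
from transformers import pipeline

MODEL_ID = "huggingface/CodeBERTa-small-v1"  # assumed identifier

fill_mask = pipeline("fill-mask", model=MODEL_ID)

# RoBERTa tokenizers use "<mask>" as the mask token.
code = "def <mask>(path):\n    with open(path) as f:\n        return f.read()"
for prediction in fill_mask(code)[:3]:
    print(prediction["token_str"], round(prediction["score"], 3))
```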
## Sentence T5 Base Nlpl Code Search Net
- Author: krlvi
- Tags: Text Embedding

A sentence-transformers model that maps sentences and paragraphs into a 768-dimensional dense vector space, suitable for tasks such as clustering and semantic search.
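
The 768-dimensional embeddings mentioned above can feed standard clustering, which gives a rough illustration of the "clustering or semantic search" use case. The model ID and the scikit-learn K-means setup below are assumptions, not documented usage.

```python
# Minimal sketch: clustering short code descriptions using the 768-dimensional
# embeddings described in the entry above. The model ID is an assumption.
from sentence_transformers import SentenceTransformer
from sklearn.cluster import KMeans

MODEL_ID = "krlvi/sentence-t5-base-nlpl-code_search_net"  # assumed identifier

model = SentenceTransformer(MODEL_ID)

texts = [
    "parse a CSV file into a list of rows",
    "read comma separated values from disk",
    "sort an array in ascending order",
    "order the elements of a list from smallest to largest",
]

embeddings = model.encode(texts)  # shape: (len(texts), 768)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(embeddings)

for text, label in zip(texts, labels):
    print(label, text)
```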