# Code Completion

## Seed Coder 8B Base
License: MIT
Seed-Coder is an open-source family of 8B-parameter code models with base, instruct, and reasoning variants, focused on code generation and completion.
Tags: Large Language Model, Transformers
ByteDance-Seed · 1,837 · 41

## CodeBERTa Small V1
CodeBERTa is a code-understanding model based on the RoBERTa architecture, pre-trained on multiple programming languages to handle code-related tasks efficiently.
Tags: Large Language Model, Transformers, Other
claudios · 16 · 1

## CodeGemma 1.1 2B
CodeGemma is a collection of lightweight open-source code models built on Gemma, supporting code completion, generation, and dialogue.
Tags: Large Language Model, Transformers
google · 426 · 18

## CodeGemma 7B It
CodeGemma is a lightweight open-source collection of code models based on Gemma, specializing in code generation, completion, and conversational tasks.
Tags: Large Language Model, Transformers
google · 3,286 · 217

## CodeGemma 7B
CodeGemma is a series of lightweight open-source code models based on Gemma, focusing on code completion and generation.
Tags: Large Language Model, Transformers
google · 15.29k · 186

## CodeGemma 7B It GGUF
CodeGemma is a lightweight open-source code model series based on Gemma, focusing on code completion, generation, and conversational tasks.
Tags: Large Language Model
google · 46 · 56

## CodeGemma 2B GGUF
CodeGemma is a lightweight open-source code model series based on Gemma, comprising text-to-text and text-to-code decoder models that specialize in code completion and generation.
Tags: Large Language Model
google · 31 · 25

## DeepSeek Coder 1.3B Base OV INT8
License: MIT
A 1.3-billion-parameter code generation model with multi-head attention, trained on 1 trillion tokens and supporting code completion with a 16K context window.
Tags: Large Language Model, Transformers, English
Intel · 52 · 3
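
Even a 16K context window has to be respected when feeding long files as completion context. A minimal sketch of left-truncating the prefix to a token budget; the whitespace tokenization and budget here are illustrative assumptions, not the model's real tokenizer:

```python
def trim_prefix(tokens: list[str], max_tokens: int) -> list[str]:
    """Keep only the most recent tokens so the prefix fits the context window.

    For code completion, the end of the prefix (the cursor area) matters most,
    so tokens are dropped from the start, not the end.
    """
    if len(tokens) <= max_tokens:
        return tokens
    return tokens[-max_tokens:]

# Illustrative only: a real deployment would count tokens with the model's tokenizer.
tokens = "def f ( x ) : return x + 1".split()
trimmed = trim_prefix(tokens, 5)  # keeps the 5 tokens closest to the cursor
print(trimmed)
```

In practice the budget is the window size minus the number of tokens reserved for generation.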

## Replit Code V1 3B
A 2.7-billion-parameter code generation model developed by Replit, supporting 20 programming languages.
Tags: Large Language Model, Transformers, Other
replit · 605 · 733

## CodeBERT Base MLM
CodeBERT is a pre-trained model for programming and natural languages, based on the RoBERTa architecture and trained with the masked language modeling (MLM) objective.
Tags: Large Language Model
microsoft · 8,848 · 46
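
An MLM-trained model like CodeBERT predicts a masked token from its surrounding code rather than continuing a prefix. A minimal sketch of preparing such a query, assuming RoBERTa's `<mask>` token spelling (actually scoring the prediction requires loading the checkpoint, which is omitted here):

```python
MASK_TOKEN = "<mask>"  # RoBERTa-family mask token; CodeBERT uses RoBERTa's vocabulary

def mask_span(code: str, target: str) -> str:
    """Replace the first occurrence of `target` with the mask token,
    producing an input suitable for a fill-mask query."""
    if target not in code:
        raise ValueError(f"{target!r} not found in code")
    return code.replace(target, MASK_TOKEN, 1)

query = mask_span("if x is None: return 0", "None")
print(query)  # "if x is <mask>: return 0"
```

With the `transformers` library, a string like this can be passed to a `fill-mask` pipeline built from the `microsoft/codebert-base-mlm` checkpoint to rank candidate tokens for the gap.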
© 2025 AIbase