# 128K Long Text Processing
## Viper Coder V1.5 R999

**Author:** prithivMLmods · **License:** Apache-2.0 · **Tags:** Large Language Model, Transformers, Supports Multiple Languages

Viper-Coder-v1.5-r999 is a large language model based on the Qwen 2.5 14B architecture, optimized for coding and reasoning tasks, with strong chain-of-thought reasoning and logical problem-solving capabilities.
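The listing gives no usage instructions, so the sketch below shows one plausible way to query a coding model like this with the Hugging Face `transformers` library. The repo id `prithivMLmods/Viper-Coder-v1.5-r999`, the dtype, and the example prompt are assumptions taken from the listing above, not from official documentation.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "prithivMLmods/Viper-Coder-v1.5-r999"  # assumed repo id from the listing
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # half precision to keep the 14B weights manageable
    device_map="auto",
)

# Ask a coding question; the chat template shipped with the model formats the turn.
messages = [
    {"role": "user", "content": "Write a Python function that checks whether a string is a palindrome."}
]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(inputs, max_new_tokens=512)
# Print only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True))
```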
## Chocolatine Fusion 14B

**Author:** FINGU-AI · **License:** MIT · **Tags:** Large Language Model, Transformers

Chocolatine-Fusion-14B is a merged model that combines the strengths of the Chocolatine-2 series, enhancing reasoning capabilities and multi-turn dialogue performance through optimized fusion.
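Since the entry highlights multi-turn dialogue, here is a minimal two-turn sketch using the `transformers` text-generation pipeline, under the assumption that the model ships a standard chat template. The repo id `FINGU-AI/Chocolatine-Fusion-14B` and the prompts are illustrative.

```python
import torch
from transformers import pipeline

chat = pipeline(
    "text-generation",
    model="FINGU-AI/Chocolatine-Fusion-14B",  # assumed repo id from the listing
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

# First turn.
messages = [{"role": "user", "content": "Explain model merging in two sentences."}]
result = chat(messages, max_new_tokens=128)
messages = result[0]["generated_text"]  # full conversation with the assistant reply appended

# Second turn, reusing the history to exercise multi-turn dialogue.
messages.append({"role": "user", "content": "What are the trade-offs compared with fine-tuning?"})
result = chat(messages, max_new_tokens=256)
print(result[0]["generated_text"][-1]["content"])
```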
## Buddhi 128K Chat 7B

**Author:** aiplanet · **License:** Apache-2.0 · **Tags:** Large Language Model, Transformers, English

Buddhi-128k-Chat is a general-purpose chat model with a 128K context window. It is fine-tuned from Mistral 7B Instruct and uses the YaRN technique to extend the usable context length to 128,000 tokens.
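As the one 128K-context entry in this section, a long-context sketch is more useful here than a plain chat example. The repo id `aiplanet/buddhi-128k-chat-7b`, the file name, and the prompt are assumptions; the YaRN rope-scaling parameters are expected to come from the model's own config, so nothing extra is passed at load time.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "aiplanet/buddhi-128k-chat-7b"  # assumed repo id from the listing
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

# A long input document; with the YaRN-extended window the prompt can approach
# 128K tokens, memory permitting. "report.txt" is a placeholder file name.
long_document = open("report.txt", encoding="utf-8").read()
messages = [
    {"role": "user", "content": f"Summarize the key points of this document:\n\n{long_document}"}
]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
print(f"Prompt length: {inputs.shape[-1]} tokens")

output = model.generate(inputs, max_new_tokens=512)
print(tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True))
```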