# Bangla Llama 13b Instruct V0.1
A 13-billion-parameter Bangla large language model built on the LLaMA-2 architecture, supporting bilingual instruction-following tasks.
Large Language Model · Transformers · Supports Multiple Languages

By BanglaLLM
# Merlyn Education Corpus Qa V2 GPTQ
License: Apache-2.0
Merlyn Education Corpus Q&A v2 is a 13-billion-parameter decoder-style transformer model for the education domain. It is fine-tuned from the llama2-13b base model and tailored to answer questions based on a given context.
Large Language Model · Transformers

By TheBloke