# Precise Instruction Response
## Falcon H1 1.5B Deep Instruct

Falcon-H1 is a causal decoder-only model developed by the UAE's Technology Innovation Institute. It features a hybrid Transformer and Mamba architecture and supports English and multilingual tasks.

- **License:** Other
- **Tags:** Large Language Model, Transformers
- **Author:** tiiuae
- **Stats:** 987 · 10
## Fireblossom 32K 7B

A 7B-parameter language model merged from Mistral 7B v0.1, combining multiple fine-tuned models via task arithmetic. It supports a 32K context length and balances creativity with reasoning.

- **Tags:** Large Language Model, Transformers
- **Author:** grimjim
- **Stats:** 21 · 3
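The Fireblossom entry above mentions combining fine-tuned models via task arithmetic: each fine-tune defines a "task vector" (its weight delta from the shared base model), and scaled task vectors are summed back onto the base. A minimal sketch with toy scalar parameters standing in for full tensors; the parameter names, stand-in values, and mixing weights below are illustrative, not Fireblossom's actual recipe:

```python
def task_arithmetic_merge(base, finetunes, weights):
    """Merge fine-tuned checkpoints onto a shared base via task vectors:
    merged = base + sum(w_i * (ft_i - base))."""
    merged = {}
    for name, base_param in base.items():
        # Each fine-tune contributes its scaled delta from the base.
        delta = sum(w * (ft[name] - base_param)
                    for ft, w in zip(finetunes, weights))
        merged[name] = base_param + delta
    return merged

# Toy "checkpoints": one scalar parameter each (illustrative only).
base = {"layer.weight": 1.0}
ft_a = {"layer.weight": 1.4}   # e.g. a creative-writing fine-tune
ft_b = {"layer.weight": 0.8}   # e.g. a reasoning fine-tune

merged = task_arithmetic_merge(base, [ft_a, ft_b], weights=[0.5, 0.5])
print(merged["layer.weight"])  # ≈ 1.1, i.e. 1.0 + 0.5*0.4 + 0.5*(-0.2)
```

Because the deltas are computed relative to a common base, fine-tunes trained independently can be mixed without retraining, and the weights control how strongly each one influences the merge.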
## Locutusquexfelladrin TinyMistral248M Instruct

This model was created by merging Locutusque/TinyMistral-248M-Instruct and Felladrin/TinyMistral-248M-SFT-v4 with the mergekit tool, combining the strengths of both. It offers programming and reasoning capabilities while maintaining low hallucination rates and strong instruction following.

- **License:** Apache-2.0
- **Tags:** Large Language Model, Transformers, English
- **Author:** Locutusque
- **Stats:** 97 · 7
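The entry above merges two TinyMistral fine-tunes with mergekit. One of mergekit's simplest methods is a linear merge, a weighted average of corresponding parameters. Which method this particular merge used is not stated, so the sketch below only illustrates the linear case, with toy one-parameter dicts standing in for the real checkpoints:

```python
def linear_merge(models, weights):
    """Weighted average of parameter dicts that share the same keys."""
    total = sum(weights)
    merged = {}
    for name in models[0]:
        merged[name] = sum(m[name] * w
                           for m, w in zip(models, weights)) / total
    return merged

# Stand-ins for the two checkpoints being merged (illustrative values).
instruct = {"embed.weight": 1.0}   # TinyMistral-248M-Instruct
sft_v4   = {"embed.weight": 0.5}   # TinyMistral-248M-SFT-v4

merged = linear_merge([instruct, sft_v4], weights=[0.5, 0.5])
print(merged["embed.weight"])  # 0.75
```

Equal weights give a 50/50 blend; skewing the weights biases the merged model toward one parent, which is how such merges trade off, say, instruction following against the SFT model's other strengths.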
## Synatra 42dot 1.3B

Synatra-42dot-1.3B is a 1.3B-parameter large language model supporting an 8K context window, suitable for dialogue generation tasks.

- **Tags:** Large Language Model, Transformers
- **Author:** maywell
- **Stats:** 3,919 · 7