
# Multi-turn chat optimization

H2O-Danube3-500M-Chat
Apache-2.0
A dialogue fine-tuned model with 500M parameters, developed by H2O.ai. Based on the Llama 2 architecture, with support for Chinese dialogue.
Large Language Model · Transformers · English
h2oai · 3,728 · 36
TinyLlama-1.1B-Chat-v0.4-GGUF
Apache-2.0
TinyLlama-1.1B is a compact large language model with 1.1 billion parameters, based on the Llama 2 architecture and optimized for compute- and memory-constrained scenarios.
Large Language Model · English
afrideva · 65 · 4
© 2025 AIbase