
H2O Danube2 1.8B Chat

Developed by H2O.ai
A 1.8B-parameter chat model fine-tuned by H2O.ai, based on an adapted Llama 2 architecture with support for a context length of 8192 tokens
Downloads: 948
Release date: 4/5/2024

Model Overview

This is a chat model fine-tuned with SFT and DPO for dialogue generation tasks; it uses the Mistral tokenizer

Model Features

Long context support
Supports a context length of 8192 tokens, suitable for long conversations
Efficient inference
The 1.8B-parameter scale enables efficient inference while maintaining strong performance
Multi-stage fine-tuning
Trained in two stages: SFT (Supervised Fine-Tuning) followed by DPO (Direct Preference Optimization)
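As a rough illustration of the second training stage, the DPO objective compares a preferred ("chosen") response against a rejected one using log-probability ratios between the policy being trained and a frozen reference model. The function below is a minimal sketch of that loss for a single preference pair, not H2O.ai's actual training code; argument names and the beta value are illustrative.

```python
import math

def dpo_loss(policy_chosen_logp: float, policy_rejected_logp: float,
             ref_chosen_logp: float, ref_rejected_logp: float,
             beta: float = 0.1) -> float:
    """DPO loss for one preference pair (illustrative sketch).

    Each argument is the summed log-probability of a full response
    under either the policy being trained or the frozen reference model.
    """
    chosen_ratio = policy_chosen_logp - ref_chosen_logp
    rejected_ratio = policy_rejected_logp - ref_rejected_logp
    margin = beta * (chosen_ratio - rejected_ratio)
    # -log(sigmoid(margin)), written with log1p for numerical stability
    return math.log1p(math.exp(-margin))
```

When the policy favors the chosen response more strongly than the reference model does, the margin is positive and the loss shrinks toward zero; a negative margin is penalized.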

Model Capabilities

Dialogue generation
Text completion
Question answering
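The capabilities above can be exercised through a standard Hugging Face workflow. The sketch below is a hedged example: the repo id and the <|prompt|>/<|answer|> tags are assumptions based on H2O.ai's Danube chat models, and in practice the tokenizer's own chat template should be treated as the source of truth.

```python
# Assumed single-turn prompt format for H2O.ai Danube chat models; the
# tokenizer's apply_chat_template is the authoritative rendering.
PROMPT_TEMPLATE = "<|prompt|>{question}</s><|answer|>"

def build_prompt(question: str) -> str:
    """Render one user turn into the assumed chat prompt format."""
    return PROMPT_TEMPLATE.format(question=question)

if __name__ == "__main__":
    # Heavy dependency kept behind the guard so the module imports cleanly;
    # the repo id "h2oai/h2o-danube2-1.8b-chat" is assumed from the model name.
    from transformers import pipeline

    generator = pipeline("text-generation", model="h2oai/h2o-danube2-1.8b-chat")
    out = generator(build_prompt("Why is the sky blue?"),
                    max_new_tokens=256, do_sample=False, return_full_text=False)
    print(out[0]["generated_text"])
```

Greedy decoding (do_sample=False) is used here for reproducibility; sampling parameters can be swapped in for more varied replies.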

Use Cases

Customer service chatbot
Handling customer inquiries and FAQ responses
Educational assistance
Helping students with study questions and concept explanations