
Hebrew Mistral 7B

Developed by yam-peleg
An open-source large language model with 7 billion parameters, based on Mistral-7B-v0.1 and continually pre-trained on both Hebrew and English text
Downloads: 5,532
Release Time: 4/26/2024

Model Overview

Hebrew Mistral 7B extends Mistral-7B with a 64,000-token Hebrew tokenizer and is continually pre-trained on English and Hebrew text, yielding a strong general-purpose bilingual language model
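
A minimal sketch of inspecting the extended tokenizer with the Hugging Face transformers library. The repository id yam-peleg/Hebrew-Mistral-7B is assumed from the author and model names above and may differ from the actual hub path.

```python
from transformers import AutoTokenizer

# Assumed Hugging Face repo id (inferred from the model/author names above).
MODEL_ID = "yam-peleg/Hebrew-Mistral-7B"

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)

# The extended tokenizer should report a 64,000-token vocabulary.
print("vocab size:", tokenizer.vocab_size)

# Hebrew text should split into relatively few tokens thanks to the extension.
print(tokenizer.tokenize("שלום עולם"))
```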

Model Features

Bilingual Support
Pre-trained on both Hebrew and English, and processes text in either language
Extended Tokenizer
Includes a Hebrew tokenizer with 64,000 tokens
Efficient Inference
Can be loaded in 4-bit precision to reduce hardware requirements (see the sketch after this list)
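
A minimal sketch of loading the model in 4-bit precision with transformers and bitsandbytes. The repo id and the bfloat16 compute dtype are illustrative assumptions, not details taken from this listing.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

# Assumed Hugging Face repo id (inferred from the model/author names above).
MODEL_ID = "yam-peleg/Hebrew-Mistral-7B"

# 4-bit quantization keeps the 7B model within consumer-GPU memory budgets.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_compute_dtype=torch.bfloat16,  # assumed compute dtype
)

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID,
    quantization_config=bnb_config,
    device_map="auto",
)
```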

Model Capabilities

Text generation
Natural language understanding
Multilingual processing

Use Cases

Language Processing
Hebrew Content Generation
Generates high-quality Hebrew text content (see the generation sketch after this list)
Bilingual Translation Assistance
Provides translation assistance between English and Hebrew
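
A minimal Hebrew generation sketch using the transformers generate API. The repo id, dtype, prompt, and sampling parameters are illustrative assumptions; since this is a base (non-instruction-tuned) model, it continues the prompt rather than following instructions.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed Hugging Face repo id (inferred from the model/author names above).
MODEL_ID = "yam-peleg/Hebrew-Mistral-7B"

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

# Hebrew story opening ("Once upon a time, many years ago,") for the model to continue.
prompt = "פעם אחת, לפני שנים רבות,"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

outputs = model.generate(
    **inputs,
    max_new_tokens=100,
    do_sample=True,
    temperature=0.7,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```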