
Meltemi 7B V1.5

Developed by ilsp
Meltemi is a foundation large language model for Greek, developed by the Institute for Language and Speech Processing (ILSP) of the Athena Research and Innovation Center. It is built on Mistral 7B and focuses on Greek language processing.
Downloads: 106
Release Date: 7/31/2024

Model Overview

Meltemi is a large language model for Greek. It extends the Greek language capabilities of Mistral 7B, supports long-context processing, and handles both Greek and English.

Model Features

Greek vocabulary expansion
The tokenizer's vocabulary has been extended with Greek tokens, substantially improving Greek tokenization efficiency (from 6.80 to 1.52 tokens per word); see the tokenizer sketch after this list.
Long context support
Supports a context length of 8192 tokens, suitable for processing long documents.
Bilingual ability
Trained on Greek and English corpora, the model can process and generate text in both languages.
Efficient training
Version 1.5 reaches better performance with fewer training steps than the previous release.
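
Tokenizer comparison sketch
The following is a minimal, illustrative Python sketch (not official usage code) showing how the vocabulary expansion can be observed with the Hugging Face transformers library. The repository ids "mistralai/Mistral-7B-v0.1" and "ilsp/Meltemi-7B-v1.5" and the sample sentence are assumptions for illustration.

# Minimal sketch: compare token counts for the same Greek sentence
# under the base Mistral tokenizer and the Meltemi tokenizer.
# Repo ids below are assumptions; adjust them to the actual model pages.
from transformers import AutoTokenizer

greek_text = "Το Αιγαίο έχει πολλά νησιά με πλούσια ιστορία."  # sample Greek sentence

base_tok = AutoTokenizer.from_pretrained("mistralai/Mistral-7B-v0.1")
meltemi_tok = AutoTokenizer.from_pretrained("ilsp/Meltemi-7B-v1.5")

base_ids = base_tok.encode(greek_text, add_special_tokens=False)
meltemi_ids = meltemi_tok.encode(greek_text, add_special_tokens=False)

words = len(greek_text.split())
print(f"Mistral 7B: {len(base_ids)} tokens ({len(base_ids) / words:.2f} tokens/word)")
print(f"Meltemi:    {len(meltemi_ids)} tokens ({len(meltemi_ids) / words:.2f} tokens/word)")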

Model Capabilities

Greek text generation (see the usage sketch after this list)
English text generation
Long document processing
Multilingual interaction
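
Usage sketch
The following is a minimal, illustrative Python sketch (not official usage code) of loading the base model for Greek text generation with the Hugging Face transformers library. The repository id "ilsp/Meltemi-7B-v1.5", the prompt, and the generation settings are assumptions for illustration.

# Minimal sketch: generate a short Greek continuation with the base model.
# The repo id is an assumption; float16 and device_map="auto" assume a GPU is available.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ilsp/Meltemi-7B-v1.5"  # assumed Hugging Face repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.float16, device_map="auto"
)

prompt = "Η Ελλάδα είναι γνωστή για"  # "Greece is known for"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

outputs = model.generate(**inputs, max_new_tokens=64, do_sample=True, temperature=0.7)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))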

Use Cases

Education
Greek language learning assistance
Helps learners understand and generate Greek content
Research
Greek text analysis
Supports Greek academic research and text processing