
GaMS-27B-Instruct

Developed by cjvt
GaMS-27B-Instruct is a multilingual large language model built on Google's Gemma 2 series and further trained with a focus on Slovenian and other languages of the Balkan region.
Downloads 4,492
Release Time: 4/4/2025

Model Overview

GaMS-27B-Instruct is the 27B-parameter supervised fine-tuned model in the GaMS series. It supports Slovenian, English, and several other languages of the Balkan region, and is suited to tasks such as text generation and translation.
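The sketch below shows one plausible way to run the model. It is a minimal sketch, assuming the model is published on Hugging Face under the identifier cjvt/GaMS-27B-Instruct and accepts standard chat-style messages; the identifier, prompt, and generation settings are illustrative assumptions, not taken from official documentation.

```python
# Minimal usage sketch, assuming the model is hosted on Hugging Face as
# "cjvt/GaMS-27B-Instruct" and follows the standard chat message format.
import torch
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="cjvt/GaMS-27B-Instruct",  # assumed repository id
    device_map="auto",               # spread the 27B weights across available GPUs
    torch_dtype=torch.bfloat16,
)

# Example Slovenian instruction: "Briefly describe the history of Ljubljana."
messages = [{"role": "user", "content": "Na kratko opiši zgodovino Ljubljane."}]

result = generator(messages, max_new_tokens=256)
print(result[0]["generated_text"][-1]["content"])  # assistant reply
```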

Model Features

Multilingual support
Specifically optimized for Slovenian, while also supporting English and the major languages of the Balkan region.
Continued pre-training
Uses a two-stage training strategy: alignment on parallel corpora first, followed by reinforcement on monolingual corpora.
Supervised fine-tuning optimization
Fine-tuned on more than 25,000 instruction samples to improve task-execution ability.
High-performance computing
Trained on the Leonardo HPC system on an A100 GPU cluster, with optimization using the NeMo framework.

Model Capabilities

Slovenian text generation
Multilingual machine translation
Instruction following
Knowledge question answering
Mathematical problem solving

Use Cases

Content creation
Slovenian content generation
Generates marketing copy, press releases, and similar content that follows local language conventions.
Outperforms comparable open-source models in the SloBench evaluation.
Education and research
Slovenian history question answering
Answers complex questions about Slovenian history and culture.
Achieves an average score of 0.76 on the SuperGLUE benchmark.
Mathematical competition problem solving
Analyzes Slovenian mathematical competition problems.
Fine-tuned on 150 real competition questions.
Language services
English-Slovenian translation
Translates documents in specialized professional domains; a prompt sketch follows this list.
Achieves a BERTScore of 0.8734, outperforming most open-source translation models.
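As an illustration of the translation use case, the sketch below reuses the `generator` pipeline from the earlier example. The prompt wording is an assumption for demonstration, not an official GaMS prompt format.

```python
# Hypothetical English-to-Slovenian translation prompt, reusing the `generator`
# pipeline created in the earlier sketch. The instruction wording is illustrative.
prompt = (
    "Translate the following sentence into Slovenian:\n"
    "\"The contract must be signed by both parties.\""
)
messages = [{"role": "user", "content": prompt}]

reply = generator(messages, max_new_tokens=128)
print(reply[0]["generated_text"][-1]["content"])  # translated sentence
```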