🚀 Viking 13B
Viking 13B is a decoder-only transformer with 13 billion parameters. It is pretrained on Finnish, English, Swedish, Danish, Norwegian, and Icelandic, as well as code. As of this release, it has been trained on 1.3 trillion tokens out of a planned 2 trillion. It is a fully open-source model available under the Apache 2.0 License.
This model was created through a collaboration between the TurkuNLP group of the University of Turku, SiloGen from Silo AI, and High Performance Language Technologies (HPLT). The training was carried out on the [LUMI supercomputer](https://www.lumi-supercomputer.eu/), using compute resources provided by CSC – IT Center for Science, Finland.
This project is part of an ongoing effort to create open-source large language models for non-English and especially low-resource languages such as Finnish. The model is fluent in Finnish, English, and the Scandinavian languages, and can perform basic translation between them. It can also understand and generate code.
✨ Features
- Multilingual Capability: Fluent in Finnish, English, Swedish, Danish, Norwegian, and Icelandic, with basic translation ability between these languages.
- Code Understanding and Generation: Can understand and generate code.
- Open Source: Fully open-source and available under the Apache 2.0 License.
📦 Installation
No dedicated installation is required; the model can be loaded directly with the Hugging Face `transformers` library together with `torch` (for example, `pip install torch transformers`), as shown in the usage examples below.
💻 Usage Examples
Basic Usage
```python
import torch
import transformers

# Load the checkpoint trained for 1700B tokens; the main branch always points
# to the latest available checkpoint.
branch = "1700B"
model = transformers.AutoModelForCausalLM.from_pretrained(
    "LumiOpen/Viking-13B",
    torch_dtype=torch.bfloat16 if torch.cuda.is_bf16_supported() else torch.float16,
    revision=branch,
)
```
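For completeness, a minimal end-to-end generation sketch built on the same loading pattern; the prompt, `max_new_tokens`, and greedy decoding are illustrative choices, not recommendations:

```python
import torch
import transformers

branch = "1700B"  # any released checkpoint branch; "main" points to the latest
device = "cuda" if torch.cuda.is_available() else "cpu"

tokenizer = transformers.AutoTokenizer.from_pretrained("LumiOpen/Viking-13B", revision=branch)
model = transformers.AutoModelForCausalLM.from_pretrained(
    "LumiOpen/Viking-13B",
    torch_dtype=torch.bfloat16 if torch.cuda.is_bf16_supported() else torch.float16,
    revision=branch,
).to(device)

# Viking is a base model, so plain text continuation is the natural usage pattern.
inputs = tokenizer("Suomen pääkaupunki on", return_tensors="pt").to(device)
outputs = model.generate(**inputs, max_new_tokens=20, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```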
📚 Documentation
Model Family
Viking is the second set of models released by LumiOpen and is available in three parameter counts: 7B, 13B, and 33B.
Model Overview
NOTE: This is a base model which needs further fine-tuning for most use cases.
Viking is a generative pretrained transformer using a LLaMA-like GPT architecture. It makes use of rotary positional embeddings and flash attention.
| Property | Details |
|---|---|
| n_parameters | 14B |
| n_layers | 40 |
| n_heads | 40 |
| d_model | 5120 |
| vocab_size | 131072 |
| sequence_length | 4096 |
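These values can also be checked programmatically from the model configuration; a minimal sketch, assuming the standard Llama-style `transformers` config attribute names:

```python
import transformers

# Loads only the configuration file; no model weights are downloaded.
config = transformers.AutoConfig.from_pretrained("LumiOpen/Viking-13B")

# Attribute names assumed from the usual Llama-style config layout.
print(config.num_hidden_layers)          # expected: 40
print(config.num_attention_heads)        # expected: 40
print(config.hidden_size)                # expected: 5120
print(config.vocab_size)                 # expected: 131072
print(config.max_position_embeddings)    # expected: 4096
```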
Training
Viking 13B was trained on the LUMI supercomputer using 512 AMD MI250X GPUs. Each MI250X GPU has two Graphics Complex Dies (GCDs), resulting in a world size of 1024 during training. Training used activation checkpointing, a micro-batch size of 1, gradient accumulation of 16, and a 3D parallelism strategy of TP=2, PP=4, DP=128.
Training began in September 2023 using a custom fork of the Megatron-DeepSpeed framework.
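As a quick sanity check on the parallel layout (plain arithmetic only, no framework code):

```python
# 3D parallelism degrees used during training
tensor_parallel = 2
pipeline_parallel = 4
data_parallel = 128

world_size = tensor_parallel * pipeline_parallel * data_parallel
print(world_size)  # 1024, i.e. 512 MI250X GPUs x 2 GCDs each
```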
Training Hyperparameters
| Property | Details | Comment |
|---|---|---|
| Precision | bfloat16 | |
| Optimizer | AdamW | |
| Learning rate | 3e-4 | 10B tokens warm-up, cosine decay to 3e-5 |
| Weight decay | 1e-1 | |
| Batch size | 1024 | 1024 samples x 4096 tokens = 4,194,304 tokens |
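As a worked illustration of the learning-rate row above, here is a minimal sketch of a warm-up plus cosine-decay schedule; the 2T-token decay horizon and the token-level granularity are assumptions, and the actual training framework's schedule may differ in detail:

```python
import math

PEAK_LR = 3e-4
MIN_LR = 3e-5
WARMUP_TOKENS = 10e9    # 10B-token warm-up, from the table above
TOTAL_TOKENS = 2e12     # planned 2T training tokens (assumed decay horizon)

def learning_rate(tokens_seen: float) -> float:
    """Linear warm-up followed by cosine decay from PEAK_LR to MIN_LR."""
    if tokens_seen < WARMUP_TOKENS:
        return PEAK_LR * tokens_seen / WARMUP_TOKENS
    progress = min((tokens_seen - WARMUP_TOKENS) / (TOTAL_TOKENS - WARMUP_TOKENS), 1.0)
    return MIN_LR + 0.5 * (PEAK_LR - MIN_LR) * (1.0 + math.cos(math.pi * progress))

print(learning_rate(5e9))    # mid warm-up: 1.5e-4
print(learning_rate(2e12))   # end of schedule: 3e-5
```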
Tokenizer
Viking uses a custom 128K Bloom tokenizer trained on the same English, Finnish, Swedish, Danish, Norwegian, Icelandic, and code dataset used to train the model.
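A small sketch of loading and inspecting the tokenizer (the example sentence is illustrative):

```python
import transformers

tokenizer = transformers.AutoTokenizer.from_pretrained("LumiOpen/Viking-13B")

# Vocabulary size, expected to match the 128K figure (131072 entries).
print(len(tokenizer))

# Tokenize a short Finnish sentence; a multilingual vocabulary of this size
# typically keeps common Finnish words as a few tokens rather than many
# byte-level pieces.
print(tokenizer.tokenize("Hyvää huomenta, maailma!"))
```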
Dataset
Viking is being trained on a 2-trillion-token mixed dataset of English, Finnish, Swedish, Danish, Norwegian, Icelandic, and code. More details on the exact dataset will be published soon.
Evaluation Results
Full evaluation results will be published with the final model.
Training checkpoints
Training checkpoints are available as branches in the repository. Checkpoints will be released roughly every 100B tokens, and the main branch will always point to the latest checkpoint. Checkpoint branches are named after the number of training tokens seen (for example, `1700B`).
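The available checkpoint branches can also be enumerated programmatically; a minimal sketch using the `huggingface_hub` client (assumed to be installed alongside `transformers`):

```python
from huggingface_hub import list_repo_refs

# Each training checkpoint is published as a branch of the model repository.
refs = list_repo_refs("LumiOpen/Viking-13B")
for branch_name in sorted(ref.name for ref in refs.branches):
    print(branch_name)
```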
Ethical Considerations and Limitations
⚠️ Important Note
Viking 13B is a release of a partially trained model, and special care should be taken when using any output.
Viking is an advanced language model, primarily optimized for English, Finnish, Swedish, Norwegian, Danish, Icelandic, and code, with no meaningful proficiency in any other languages. As with most AI-driven systems, Viking is a product of the vast data it has been trained on, which may reflect the imperfections, biases, and idiosyncrasies of the wider web. Viking may, at times, produce outputs that can be considered inaccurate, prejudiced, or controversial. Users and developers engaging with Viking should exercise discretion and consider additional evaluation and customization to ensure the model's responses align with their specific needs and ethical standards.
📄 License
Viking is released under the Apache 2.0 license.