Lucie 7B

Developed by OpenLLM-France
Lucie-7B is a multilingual causal language model with 7 billion parameters, built jointly by LINAGORA and OpenLLM-France. It is based on the Llama-3 architecture and pre-trained on 3 trillion tokens of multilingual data.
Downloads: 1,262
Release date: 10/10/2024

Model Overview

Lucie-7B is a pre-trained causal language model with 7 billion parameters, supporting multiple languages and suitable for tasks such as text generation.
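
The snippet below is a minimal usage sketch. It assumes the model is distributed on the Hugging Face Hub under the repository id OpenLLM-France/Lucie-7B and is loaded with the transformers library; the prompt and generation settings are illustrative, not details taken from this page.

    from transformers import AutoModelForCausalLM, AutoTokenizer

    # Repository id assumed from the model name; adjust if the actual id differs.
    model_id = "OpenLLM-France/Lucie-7B"

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    # device_map="auto" requires the accelerate package.
    model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

    # Lucie-7B is a pre-trained (base) model, so it continues text rather than
    # following instructions; prompt it with the beginning of the text you want.
    prompt = "The capital of Spain is"
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=20, do_sample=False)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))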

Model Features

Multilingual support
Supports French, English, Italian, German, Spanish, and other languages.
Long context processing
Handles a context length of 32,000 tokens, making it suitable for long documents.
Efficient inference
Supports 4-bit quantized inference, requiring as little as 6 GB of GPU memory (see the sketch after this list).
Large-scale pre-training
Pre-trained on 3 trillion tokens of multilingual data, covering both natural languages and programming languages.
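
As a sketch of the efficient-inference path above, the following loads the model with 4-bit quantization through bitsandbytes. The repository id and quantization settings are assumptions to adapt to your setup; the 6 GB figure is the minimum reported above, not a measurement.

    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

    model_id = "OpenLLM-France/Lucie-7B"  # assumed repository id

    # NF4 4-bit quantization keeps GPU memory usage low (reported minimum: ~6 GB).
    bnb_config = BitsAndBytesConfig(
        load_in_4bit=True,
        bnb_4bit_quant_type="nf4",
        bnb_4bit_compute_dtype=torch.bfloat16,
    )

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id,
        quantization_config=bnb_config,
        device_map="auto",
    )

    prompt = "Lucie est un modèle de langue qui"
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=30)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))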

Model Capabilities

Text generation
Multilingual text processing
Long context understanding

Use Cases

Question answering system
Geographical knowledge Q&A
Answer geographical knowledge questions such as the capital of a country.
Example: input 'Quelle est la capitale de l'Espagne ?' ("What is the capital of Spain?"), output 'Madrid' (see the sketch after the use cases).
Multilingual applications
Multilingual text generation
Generate text in multiple languages such as French and English.
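
The question-answering use case can be sketched as follows, again assuming the OpenLLM-France/Lucie-7B repository id. Because the base model is not instruction-tuned, the question is framed as a "Question/Answer" prompt, and only the first line of the completion is kept as the answer.

    from transformers import pipeline

    # Text-generation pipeline; repository id assumed as before.
    generator = pipeline("text-generation", model="OpenLLM-France/Lucie-7B", device_map="auto")

    # Frame the geography question as a question/answer prompt, in French and English.
    prompts = [
        "Question : Quelle est la capitale de l'Espagne ?\nRéponse :",
        "Question: What is the capital of Spain?\nAnswer:",
    ]
    for prompt in prompts:
        result = generator(prompt, max_new_tokens=10, do_sample=False)
        # The pipeline returns the prompt plus the completion; strip the prompt.
        completion = result[0]["generated_text"][len(prompt):]
        # Keep only the first line of the completion as the answer.
        print(completion.strip().split("\n")[0])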