🚀 mistral_7b_yo_instruct
mistral_7b_yo_instruct is a text generation model for Yorùbá, designed to address the need for high-quality text generation in the Yorùbá language.
🚀 Quick Start
Basic Usage
```python
import requests

API_URL = "https://i8nykns7vw253vx3.us-east-1.aws.endpoints.huggingface.cloud"
headers = {
    "Authorization": "Bearer hf_xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx",
    "Content-Type": "application/json"
}

def query(payload):
    # POST the JSON payload to the Inference Endpoint and return the parsed response
    response = requests.post(API_URL, headers=headers, json=payload)
    return response.json()

output = query({
    "inputs": "Pẹlẹ o. Bawo ni o se wa?",
})
print(output)
```
✨ Features
- Yorùbá Text Generation: Specifically tailored for generating text in the Yorùbá language.
📚 Documentation
Intended uses & limitations
How to use
The Python code above demonstrates how to interact with the model through an API. Replace `hf_xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx` in the `Authorization` header with your actual Hugging Face API token.
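Rather than hard-coding the token, it can be read from an environment variable. The sketch below, using only the Python standard library, builds the same request as the Quick Start example; the variable name `HF_API_TOKEN` is an arbitrary choice, not something the endpoint requires.

```python
import json
import os
import urllib.request

# Endpoint URL from the Quick Start section above.
API_URL = "https://i8nykns7vw253vx3.us-east-1.aws.endpoints.huggingface.cloud"

def build_request(payload: dict, token: str) -> urllib.request.Request:
    """Build a POST request carrying the JSON payload and auth headers."""
    return urllib.request.Request(
        API_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

# HF_API_TOKEN is a hypothetical variable name; keep the token out of source code.
token = os.environ.get("HF_API_TOKEN", "hf_xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx")
req = build_request({"inputs": "Pẹlẹ o. Bawo ni o se wa?"}, token)
# Send with: urllib.request.urlopen(req)
```

Sending the request with `urllib.request.urlopen(req)` returns a response whose body can be parsed with `json.load`.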
Eval results
Coming soon
Limitations and bias
This model is limited by its training dataset, which consists of entity-annotated news articles from a specific time period. As a result, it may not perform well across all use cases in different domains.
Training data
This model is fine-tuned on over 60k instruction-following demonstrations. These demonstrations are built from an aggregation of datasets, including AfriQA, XLSum, MENYO-20k, and translations of [Alpaca-gpt4](https://huggingface.co/datasets/vicgalle/alpaca-gpt4).
Use and safety
We emphasize that mistral_7b_yo_instruct is intended only for research purposes and is not ready to be deployed for general use, mainly because we have not yet designed adequate safety measures.
📄 License
This model is licensed under the AFL-3.0 license.
| Property | Details |
|----------|---------|
| Model Type | Text generation model in Yorùbá |
| Training Data | Fine-tuned on 60k+ instruction-following demonstrations from AfriQA, XLSum, MENYO-20k, and translations of [Alpaca-gpt4](https://huggingface.co/datasets/vicgalle/alpaca-gpt4) |