marroyo777/flan-t5-base-Q4_K_M-GGUF
This project provides the google/flan-t5-base model converted to the GGUF format, offering text-to-text generation capabilities. It can handle tasks such as translation, question answering, and logical reasoning.
🚀 Quick Start
Install llama.cpp
Install llama.cpp via brew (compatible with Mac and Linux):

```bash
brew install llama.cpp
```

Use llama.cpp
CLI Usage

```bash
llama-cli --hf-repo marroyo777/flan-t5-base-Q4_K_M-GGUF --hf-file flan-t5-base-q4_k_m-imat.gguf -p "The meaning to life and the universe is"
```

Server Usage

```bash
llama-server --hf-repo marroyo777/flan-t5-base-Q4_K_M-GGUF --hf-file flan-t5-base-q4_k_m-imat.gguf -c 2048
```
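Once llama-server is running, it exposes an HTTP API; a completion request can be sent to its `/completion` endpoint. A minimal Python sketch, assuming the server is running locally on its default port 8080 (the prompt text is just an illustration):

```python
import json
from urllib import request

# JSON payload for llama-server's /completion endpoint
# (server assumed to be running locally on its default port 8080).
payload = {
    "prompt": "Translate to German: My name is Arthur",
    "n_predict": 64,  # cap on the number of tokens to generate
}
body = json.dumps(payload).encode("utf-8")

req = request.Request(
    "http://127.0.0.1:8080/completion",
    data=body,
    headers={"Content-Type": "application/json"},
)
# Uncomment once the server is up:
# with request.urlopen(req) as resp:
#     print(json.loads(resp.read())["content"])
```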
Alternative Usage Steps
- Clone llama.cpp from GitHub:

```bash
git clone https://github.com/ggerganov/llama.cpp
```

- Build llama.cpp: move into the llama.cpp directory and build with the LLAMA_CURL=1 flag and any hardware-specific flags (e.g., LLAMA_CUDA=1 for Nvidia GPUs on Linux):

```bash
cd llama.cpp && LLAMA_CURL=1 make
```

- Run inference:

```bash
./llama-cli --hf-repo marroyo777/flan-t5-base-Q4_K_M-GGUF --hf-file flan-t5-base-q4_k_m-imat.gguf -p "The meaning to life and the universe is"
```

or

```bash
./llama-server --hf-repo marroyo777/flan-t5-base-Q4_K_M-GGUF --hf-file flan-t5-base-q4_k_m-imat.gguf -c 2048
```
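The same CLI invocation can be scripted. A minimal Python sketch that assembles the argument list from the steps above (the actual run is commented out, since it requires llama.cpp to be built and the model to be downloadable):

```python
import subprocess

# Argument list mirroring the llama-cli invocation above.
cmd = [
    "./llama-cli",
    "--hf-repo", "marroyo777/flan-t5-base-Q4_K_M-GGUF",
    "--hf-file", "flan-t5-base-q4_k_m-imat.gguf",
    "-p", "The meaning to life and the universe is",
]
# Uncomment to run once llama.cpp has been built:
# result = subprocess.run(cmd, capture_output=True, text=True, check=True)
# print(result.stdout)
```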
✨ Features
- Multilingual Support: supports languages such as English, French, Romanian, and German.
- Diverse Tasks: handles a wide range of tasks, including text-to-text generation, translation, question answering, and logical reasoning.
📦 Installation
The model was converted to the GGUF format from google/flan-t5-base using llama.cpp via ggml.ai's GGUF-my-repo space.
📚 Documentation
Model Details
- Base Model: google/flan-t5-base
- Model Conversion: converted to GGUF format using llama.cpp
- Refer to the original model card for more details on the model.
Supported Datasets
| Property | Details |
|---|---|
| Datasets | svakulenk0/qrecc, taskmaster2, djaym7/wiki_dialog, deepmind/code_contests, lambada, gsm8k, aqua_rat, esnli, quasc, qed |
Widget Examples
The model comes with several widget examples to showcase its capabilities:
| Example Title | Input Text |
|---|---|
| Translation | Translate to German: My name is Arthur |
| Question Answering | Please answer to the following question. Who is going to be the next Ballon d'or? |
| Logical reasoning | Q: Can Geoffrey Hinton have a conversation with George Washington? Give the rationale before answering. |
| Scientific knowledge | Please answer the following question. What is the boiling point of Nitrogen? |
| Yes/no question | Answer the following yes/no question. Can you write a whole Haiku in a single tweet? |
| Reasoning task | Answer the following yes/no question by reasoning step-by-step. Can you write a whole Haiku in a single tweet? |
| Boolean Expressions | Q: ( False or not False or False ) is? A: Let's think step by step |
| Math reasoning | The square root of x is the cube root of y. What is y to the power of 2, if x = 4? |
| Premise and hypothesis | Premise: At my age you will probably have learnt one lesson. Hypothesis: It's not certain how many lessons you'll learn by your thirties. Does the premise entail the hypothesis? |
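The widget inputs above all follow simple prompt templates. A small sketch of how such prompts could be built programmatically; the helper names are illustrative only and are not part of the model card:

```python
# Hypothetical helpers reproducing the prompt patterns from the
# widget examples above; function names are illustrative only.

def translation_prompt(target_lang: str, text: str) -> str:
    """Build a translation prompt in the style of the Translation example."""
    return f"Translate to {target_lang}: {text}"

def yes_no_prompt(question: str, step_by_step: bool = False) -> str:
    """Build a yes/no prompt, optionally asking for step-by-step reasoning."""
    if step_by_step:
        return f"Answer the following yes/no question by reasoning step-by-step. {question}"
    return f"Answer the following yes/no question. {question}"

print(translation_prompt("German", "My name is Arthur"))
# -> Translate to German: My name is Arthur
```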
📄 License
This model is released under the apache-2.0 license.