# 🚀 Uploaded Model

This is a Llama-based text-generation model, trained 2x faster with Unsloth and Hugging Face's TRL library.
## 🚀 Quick Start

### Model Information

| Property | Details |
| --- | --- |
| Base Model | unsloth/Llama-3.3-70B-Instruct-bnb-4bit |
| Tags | text-generation-inference, transformers, unsloth, llama, trl, sft |
| License | apache-2.0 |
| Languages | en, es, la, ar, fr |
| Pipeline Tag | text2text-generation |
| Library Name | transformers |
### Model Development Details

- Developed by: ykarout
- License: apache-2.0
- Fine-tuned from model: unsloth/Llama-3.3-70B-Instruct-bnb-4bit
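If you serve the model through transformers rather than Ollama, the `PARAMETER` sampling values from the Modelfile in the next section can be carried over to `generate()` keyword arguments. The name mapping below (e.g. Ollama's `repeat_penalty` to transformers' `repetition_penalty`) is a sketch based on the two libraries' documented option names, not something prescribed by this model card:

```python
# Sketch: translate the Ollama PARAMETER values from the Modelfile in the
# "Using with Ollama" section into transformers `generate()` kwargs.
# The left-hand names are Ollama's; the right-hand names are the
# (assumed-equivalent) transformers generation arguments.
OLLAMA_TO_TRANSFORMERS = {
    "temperature": "temperature",
    "min_p": "min_p",
    "repeat_penalty": "repetition_penalty",
    "top_p": "top_p",
    "top_k": "top_k",
}

# Values copied verbatim from the Modelfile below.
modelfile_params = {
    "temperature": 1.5,
    "min_p": 0.1,
    "repeat_penalty": 1.15,
    "top_p": 0.9,
    "top_k": 45,
}

generate_kwargs = {OLLAMA_TO_TRANSFORMERS[k]: v for k, v in modelfile_params.items()}
generate_kwargs["do_sample"] = True  # sampling must be on for these knobs to matter

print(generate_kwargs)
```

These kwargs can then be passed as `model.generate(**inputs, **generate_kwargs)` once the model and tokenizer are loaded.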

## Using with Ollama

To use the model with Ollama, create a `Modelfile` from the template below (replace `{__FILE_LOCATION__}` with the path to your GGUF file), then build and run it with `ollama create <model-name> -f Modelfile` followed by `ollama run <model-name>`:

```
FROM {__FILE_LOCATION__}
TEMPLATE """{{ if .Messages }}
{{- if or .System .Tools }}<|start_header_id|>system<|end_header_id|>
{{- if .System }}
{{ .System }}
{{- end }}
{{- if .Tools }}
You are a helpful assistant with tool calling capabilities. When you receive a tool call response, use the output to format an answer to the original user question.
{{- end }}
{{- end }}<|eot_id|>
{{- range $i, $_ := .Messages }}
{{- $last := eq (len (slice $.Messages $i)) 1 }}
{{- if eq .Role "user" }}<|start_header_id|>user<|end_header_id|>
{{- if and $.Tools $last }}
Given the following functions, please respond with a JSON for a function call with its proper arguments that best answers the given prompt.
Respond in the format {"name": function name, "parameters": dictionary of argument name and its value}. Do not use variables.
{{ $.Tools }}
{{- end }}
{{ .Content }}<|eot_id|>{{ if $last }}<|start_header_id|>assistant<|end_header_id|>
{{ end }}
{{- else if eq .Role "assistant" }}<|start_header_id|>assistant<|end_header_id|>
{{- if .ToolCalls }}
{{- range .ToolCalls }}{"name": "{{ .Function.Name }}", "parameters": {{ .Function.Arguments }}}{{ end }}
{{- else }}
{{ .Content }}{{ if not $last }}<|eot_id|>{{ end }}
{{- end }}
{{- else if eq .Role "tool" }}<|start_header_id|>ipython<|end_header_id|>
{{ .Content }}<|eot_id|>{{ if $last }}<|start_header_id|>assistant<|end_header_id|>
{{ end }}
{{- end }}
{{- end }}
{{- else }}
{{- if .System }}<|start_header_id|>system<|end_header_id|>
{{ .System }}<|eot_id|>{{ end }}{{ if .Prompt }}<|start_header_id|>user<|end_header_id|>
{{ .Prompt }}<|eot_id|>{{ end }}<|start_header_id|>assistant<|end_header_id|>
{{ end }}{{ .Response }}{{ if .Response }}<|eot_id|>{{ end }}"""
PARAMETER stop "<|start_header_id|>"
PARAMETER stop "<|end_header_id|>"
PARAMETER stop "<|eot_id|>"
PARAMETER stop "<|eom_id|>"
# The sampling parameters below can be adjusted to experiment with different generations.
PARAMETER temperature 1.5
PARAMETER min_p 0.1
PARAMETER repeat_penalty 1.15
PARAMETER top_p 0.9
PARAMETER top_k 45
```
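For a plain chat turn with no tools, the template above wraps each message in Llama 3 header tokens and ends with an assistant header so the model knows to respond. The Python sketch below mirrors that non-tool path (illustrative only, with whitespace simplified; it is not the exact string Ollama produces):

```python
def render_prompt(messages, add_generation_prompt=True):
    """Approximate the non-tool branch of the Ollama template above.

    Each message becomes:
        <|start_header_id|>{role}<|end_header_id|>\n\n{content}<|eot_id|>
    and a trailing assistant header cues the model to generate a reply.
    """
    out = []
    for m in messages:
        out.append(
            f"<|start_header_id|>{m['role']}<|end_header_id|>\n\n"
            f"{m['content']}<|eot_id|>"
        )
    if add_generation_prompt:
        out.append("<|start_header_id|>assistant<|end_header_id|>\n\n")
    return "".join(out)

prompt = render_prompt([
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Hello!"},
])
print(prompt)
```

Note how the `stop` parameters above (`<|eot_id|>` and the header tokens) line up with the delimiters this rendering produces: generation halts once the model emits the end-of-turn token.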