
Llama-68M

Developed by JackFram
This is a small LLaMA-like language model with 68 million parameters, primarily used for speculative inference research in the SpecInfer paper.
Downloads 573.35k
Release Time: 7/19/2023

Model Overview

A lightweight language model trained on Wikipedia and part of the C4 dataset, developed as the small speculative (draft) model for the SpecInfer paper.

Model Features

Lightweight Design
A compact model with only 68 million parameters, well suited as a small draft model for speculative inference
Multi-source Training
Trained on a combination of datasets including Wikipedia, C4-en, and C4-realnewslike
Research-oriented
Specifically designed for speculative inference and token tree verification research in the SpecInfer paper

Model Capabilities

English text generation
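
A minimal text-generation sketch, assuming the checkpoint is published on the Hugging Face Hub under the repository id JackFram/llama-68m (the exact id and prompt are assumptions for illustration):

```python
# Minimal sketch: load the 68M-parameter model and generate English text.
# Assumes the checkpoint is hosted on the Hugging Face Hub as "JackFram/llama-68m".
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "JackFram/llama-68m"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

inputs = tokenizer("The history of Wikipedia begins with", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```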

Use Cases

Academic Research
Speculative Inference Research
Serves as the small speculative (draft) model in the SpecInfer paper
Used to validate the effectiveness of speculative inference and token tree verification methods (see the draft-model sketch after the use cases below)
Lightweight Applications
Text Generation in Resource-constrained Environments
Provides basic text generation capability under limited computational resources
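
As a rough illustration of the draft-then-verify idea, the sketch below uses the 68M model as the assistant model in Hugging Face Transformers' assisted generation. This is a simple form of speculative decoding, not the SpecInfer token-tree verification system itself; the target model id is an assumption, and any LLaMA checkpoint sharing the same tokenizer should work:

```python
# Speculative-decoding-style sketch: the 68M model drafts tokens and a larger
# target model verifies them via Transformers' assisted generation. This
# approximates the draft/verify idea from SpecInfer but is NOT the paper's
# token-tree verification method. The target model id is an assumption.
from transformers import AutoModelForCausalLM, AutoTokenizer

target_id = "meta-llama/Llama-2-7b-hf"   # assumed larger target model
draft_id = "JackFram/llama-68m"          # small speculative/draft model

tokenizer = AutoTokenizer.from_pretrained(target_id)
target = AutoModelForCausalLM.from_pretrained(target_id)
draft = AutoModelForCausalLM.from_pretrained(draft_id)

inputs = tokenizer("Speculative decoding works by", return_tensors="pt")
outputs = target.generate(**inputs, assistant_model=draft, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```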