Gerbil A 32m
Developed by GerbilLab
Gerbil-A-32m is an A-grade model with 32 million parameters, trained on 640 million tokens, suitable for various natural language processing tasks.
Downloads: 33
Release Date: 3/30/2023
Model Overview
This model is a medium-scale pre-trained language model suitable for tasks such as text generation and text classification.
Model Features
Medium-scale parameters
With 32 million parameters, the model is suitable for running in resource-constrained environments.
Efficient training
Trained on 640 million tokens with a batch size of 262,000 and a final training loss of 4.0487.
Model Capabilities
Text generation
Text classification
Use Cases
Natural Language Processing
Text generation
Generate coherent text content.
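The model can be loaded for generation through the Hugging Face transformers library. The sketch below is illustrative only: the Hub ID GerbilLab/Gerbil-A-32m, the causal-LM interface, and the sampling settings are assumptions not confirmed by this card.

```python
# Minimal text generation sketch. The Hub ID below is an assumption;
# adjust it to the model's actual repository if it differs.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "GerbilLab/Gerbil-A-32m"  # assumed Hub ID
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

inputs = tokenizer("Once upon a time", return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_new_tokens=50,   # keep generations short for a 32M-parameter model
    do_sample=True,      # sampling tends to read better than greedy decoding
    top_p=0.9,
    temperature=0.8,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```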
Text classification
Perform classification tasks on text.
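A small pretrained language model like this is typically adapted to classification by fine-tuning a classification head rather than being used as-is. The sketch below assumes the same hypothetical Hub ID and an illustrative two-label task; the fine-tuning step itself is omitted.

```python
# Hedged classification sketch: the classification head is newly initialized
# and must be fine-tuned on labeled data (not shown) before the predictions
# are meaningful. The Hub ID and label count are illustrative assumptions.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = "GerbilLab/Gerbil-A-32m"  # assumed Hub ID
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(
    model_id,
    num_labels=2,  # e.g. positive/negative sentiment (hypothetical task)
)

# After fine-tuning, inference looks like this:
inputs = tokenizer("A short example sentence.", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print(logits.argmax(dim=-1).item())
```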