
BabyBERTa 2

Developed by phueb
BabyBERTa is a lightweight version of RoBERTa, trained on child-directed input and specifically designed for language acquisition research.
Downloads: 94
Release Time: 3/2/2022

Model Overview

BabyBERTa is a lightweight RoBERTa model trained on 5 million words of American English child-directed input. It is designed for language acquisition research and runs without high-performance computing infrastructure.
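As a sketch of typical usage, the checkpoint can be queried with the Hugging Face transformers fill-mask pipeline. The checkpoint name phueb/BabyBERTa-2 is an assumption based on the developer handle and model title, and add_prefix_space=True follows the RoBERTa-style tokenizer convention BabyBERTa uses; adjust both if the published model card says otherwise.

```python
from transformers import AutoTokenizer, AutoModelForMaskedLM, pipeline

# Checkpoint name assumed from the developer handle and model title.
model_name = "phueb/BabyBERTa-2"

# BabyBERTa's RoBERTa-style tokenizer is meant to be loaded with
# add_prefix_space=True so word-initial tokens are segmented correctly.
tokenizer = AutoTokenizer.from_pretrained(model_name, add_prefix_space=True)
model = AutoModelForMaskedLM.from_pretrained(model_name)

# Fill in a masked token in a child-directed-style sentence.
fill_mask = pipeline("fill-mask", model=model, tokenizer=tokenizer)
for prediction in fill_mask("the dog <mask> on the floor."):
    print(prediction["token_str"], round(prediction["score"], 3))
```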

Model Features

Lightweight design
Can run on a desktop with a single GPU, eliminating the need for high-performance computing infrastructure.
Trained on child-directed input
Trained on 5 million words of American English child-directed input, making it ideal for language acquisition research.
Grammatical knowledge learning
The model is designed to acquire grammatical knowledge from child-directed input and is evaluated with the Zorro test suite of grammatical minimal pairs.

Model Capabilities

Language modeling
Grammatical knowledge learning
Masked language prediction

Use Cases

Language acquisition research
Grammatical knowledge evaluation
Assess the model's grammatical knowledge using the Zorro test suite, as in the sketch after this list.
The best model achieved an overall Zorro accuracy of 80.3, comparable to RoBERTa-base.
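Zorro, like BLiMP, is built from minimal pairs: for each pair, the model is credited when it prefers the grammatical sentence over its ungrammatical counterpart. Below is a minimal sketch of that scoring idea using pseudo-log-likelihood under a masked language model; the checkpoint name and example sentences are illustrative assumptions, and the official Zorro evaluation code may compute scores differently.

```python
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

model_name = "phueb/BabyBERTa-2"  # assumed checkpoint name
tokenizer = AutoTokenizer.from_pretrained(model_name, add_prefix_space=True)
model = AutoModelForMaskedLM.from_pretrained(model_name)
model.eval()

def pseudo_log_likelihood(sentence: str) -> float:
    """Sum log-probabilities of each token, masking one position at a time."""
    ids = tokenizer(sentence, return_tensors="pt")["input_ids"][0]
    total = 0.0
    for i in range(1, len(ids) - 1):  # skip <s> and </s> special tokens
        masked = ids.clone()
        masked[i] = tokenizer.mask_token_id
        with torch.no_grad():
            logits = model(masked.unsqueeze(0)).logits[0, i]
        total += torch.log_softmax(logits, dim=-1)[ids[i]].item()
    return total

# Credit the model when the grammatical sentence scores higher.
good = "the dogs are sleeping ."   # illustrative minimal pair,
bad = "the dogs is sleeping ."     # not taken from Zorro itself
print(pseudo_log_likelihood(good) > pseudo_log_likelihood(bad))
```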