
bert-rand-small

Developed by aajrami
A small BERT language model using random pretraining objectives.
Downloads 14
Release Time: 11/9/2022

Model Overview

This model is a small BERT language model pretrained with a random objective, designed primarily for research into how the choice of pretraining objective affects the linguistic features a model learns.

Model Features

Random Pretraining Objectives
Pretrained with random objectives so that their impact on linguistic feature learning can be studied in isolation.
Compact Model
The model is small, making it practical for research and experimentation.
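To make the idea of a random pretraining objective concrete, here is a minimal sketch of label generation under such an objective: instead of predicting something meaningful about a masked token, each token is assigned a label drawn uniformly at random. The function name, label count, and seeding scheme below are illustrative assumptions, not details taken from this model's actual training setup.

```python
import random


def make_random_objective_labels(tokens, num_labels=5, seed=0):
    """Assign each token a pseudo-random label.

    Hypothetical sketch of a 'random' pretraining objective: the
    label carries no linguistic information about the token, so any
    linguistic features the model still learns must come from the
    input side, not the prediction target.
    """
    rng = random.Random(seed)  # fixed seed keeps labels reproducible
    return [rng.randrange(num_labels) for _ in tokens]


tokens = "the cat sat on the mat".split()
labels = make_random_objective_labels(tokens)
```

Because the labels are independent of the tokens, comparing a model trained this way against one trained with a standard masked-language-modeling objective isolates how much the objective itself contributes to linguistic feature learning.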

Model Capabilities

Language Model Pretraining
Linguistic Feature Learning

Use Cases

Academic Research
Pretraining Objective Research
Used to study how different pretraining objectives affect the linguistic features a language model learns.