Bert Fc Small
Developed by aajrami
A small BERT language model using first character prediction as the pre-training objective
Downloads 14
Release Time: 11/9/2022
Model Overview
This is a small-scale language model based on the BERT architecture. It is pre-trained with a first-character prediction task and is intended to study how the choice of pre-training objective affects what language models learn about linguistic features.
Model Features
First Character Prediction Pre-training
Uses first-character prediction as the pre-training objective, in contrast to the masked language modeling task of standard BERT
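The objective above can be sketched as a labeling scheme: instead of predicting a masked token from the full vocabulary, the model predicts only the first character of each masked token, which collapses the label space to a small character set. The label set and WordPiece handling below are illustrative assumptions, not the authors' exact implementation.

```python
# Sketch of a first-character prediction labeling scheme (assumed details,
# not the authors' exact implementation).
from string import ascii_lowercase

# Hypothetical label set: 26 letters plus buckets for digits and everything else.
LABELS = list(ascii_lowercase) + ["<digit>", "<other>"]
LABEL_TO_ID = {label: i for i, label in enumerate(LABELS)}

def first_char_label(token: str) -> int:
    """Map a token to the class id of its first character."""
    # Strip the WordPiece '##' subword prefix, then look at the first character.
    c = token.lstrip("#").lower()[:1]
    if c.isdigit():
        return LABEL_TO_ID["<digit>"]
    if c in LABEL_TO_ID:
        return LABEL_TO_ID[c]
    return LABEL_TO_ID["<other>"]

# During pre-training, masked positions would be supervised with these
# 28-way labels instead of full-vocabulary token ids.
tokens = ["the", "##ing", "BERT", "2022", "!"]
labels = [first_char_label(t) for t in tokens]
```

Because the label space is tiny compared with a 30k-token vocabulary, this objective carries far less information per masked position, which is what makes it useful for probing how much the pre-training signal matters.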
Compact Design
The model is small in scale, which may make it suitable for resource-limited environments or targeted research experiments
Research-Oriented
Primarily intended for studying how pre-training objectives shape the linguistic features a language model learns
Model Capabilities
Text Feature Extraction
Language Feature Learning Research
Use Cases
Language Model Research
Pre-training Objective Comparative Study
Comparing how different pre-training objectives affect what models learn about linguistic features
The related research findings were published in a paper at the ACL 2022 conference