Bert Ascii Small
Developed by aajrami
A small BERT language model pre-trained with the unique objective of predicting the sum of ASCII code values of characters in masked tokens.
Downloads: 14
Released: 11/9/2022
Model Overview
This is a small BERT language model with a distinctive pre-training objective: instead of predicting the masked tokens themselves, it learns language attributes by predicting the sum of the ASCII code values of the characters in each masked token.
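To make the objective concrete, here is a minimal Python sketch of the prediction target. The function name `ascii_sum_target` is illustrative and not taken from the model's original codebase:

```python
def ascii_sum_target(token: str) -> int:
    """Return the sum of the ASCII code values of the characters in `token`.

    This is the quantity the model is trained to predict for each masked
    token, in place of the usual masked-token identity.
    """
    return sum(ord(ch) for ch in token)

print(ascii_sum_target("cat"))  # 99 + 97 + 116 = 312
```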
Model Features
Unique pre-training objective
Predicts the sum of the ASCII code values of the characters in each masked token, rather than the masked token itself as in traditional masked language modeling.
Small model
A version with fewer parameters than the standard BERT model.
Research-oriented
Primarily used for studying how pre-training objectives affect the learning of language attributes by language models.
Model Capabilities
Language model pre-training
Language attribute learning research
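Since the model is published by aajrami on Hugging Face, the pre-trained encoder can presumably be loaded with the Transformers library for feature extraction or probing studies. A hedged sketch, assuming the model ID is `aajrami/bert-ascii-small` and a standard BERT-style encoder (the exact ID and architecture may differ):

```python
from transformers import AutoModel, AutoTokenizer

# Assumed model ID inferred from the card; verify on the Hugging Face Hub.
model_id = "aajrami/bert-ascii-small"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id)

# Extract contextual representations for a probing experiment.
inputs = tokenizer("Pre-training objectives shape what models learn.",
                   return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (batch, seq_len, hidden_size)
```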
Use Cases
Language model research
Pre-training objective impact study
Investigates how different pre-training objectives influence what language models learn about language attributes.
See the experimental results in the associated paper.