rbt4-h312

Developed by hfl
MiniRBT is a small Chinese pre-trained model built with knowledge distillation and pre-trained with Whole Word Masking to improve training efficiency and effectiveness.
Release Time: 11/14/2022

Model Overview

MiniRBT is a Chinese pre-trained model compressed through knowledge distillation. It aims to provide efficient text understanding for a wide range of natural language processing tasks.
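
A minimal usage sketch with the Hugging Face transformers library is shown below. The Hub ID "hfl/rbt4-h312" and the 312-dimensional hidden size are assumptions inferred from the page title, not confirmed by this page.

```python
# Minimal sketch (not an official example): load rbt4-h312 and extract
# contextual token features. Assumes the checkpoint is published on the
# Hugging Face Hub as "hfl/rbt4-h312".
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("hfl/rbt4-h312")
model = AutoModel.from_pretrained("hfl/rbt4-h312")
model.eval()

# Encode a Chinese sentence and run a forward pass without gradients.
inputs = tokenizer("这是一个测试句子。", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# "h312" suggests a 312-dimensional hidden state per token.
print(outputs.last_hidden_state.shape)  # e.g. torch.Size([1, 9, 312])
```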

Model Features

Knowledge Distillation Technology
Uses the TextBrewer toolkit for knowledge distillation, reducing model size while largely preserving the teacher model's performance (a sketch of the core distillation loss follows this list).
Whole Word Masking Technology
Adopts Whole Word Masking to enhance pre-training effectiveness for Chinese text.
Efficient Inference
Compact design enables faster model inference and lower resource consumption.
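
As a library-agnostic sketch of the idea behind this kind of distillation (not hfl's actual TextBrewer recipe), the student can be trained to match the teacher's temperature-softened output distribution:

```python
# Sketch of the soft-label distillation loss that toolkits such as
# TextBrewer build on. Temperature and tensor shapes are illustrative,
# not hfl's actual training configuration.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, temperature=4.0):
    """KL divergence between temperature-softened teacher and student
    distributions, scaled by T^2 to keep gradient magnitudes stable."""
    t = temperature
    student_log_probs = F.log_softmax(student_logits / t, dim=-1)
    teacher_probs = F.softmax(teacher_logits / t, dim=-1)
    return F.kl_div(student_log_probs, teacher_probs,
                    reduction="batchmean") * (t * t)

# Toy usage: random tensors standing in for teacher/student MLM logits.
teacher_out = torch.randn(8, 21128)                      # 21128 = Chinese BERT vocab size
student_out = torch.randn(8, 21128, requires_grad=True)  # student is trainable
loss = distillation_loss(student_out, teacher_out)
loss.backward()
print("distillation loss:", float(loss))
```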

Model Capabilities

Text Understanding
Text Classification
Named Entity Recognition
Question Answering

Use Cases

Natural Language Processing
Chinese Text Classification
Can be used for text classification tasks such as news categorization and sentiment analysis (see the fine-tuning sketch after this section).
Information Extraction
Suitable for tasks like named entity recognition and relation extraction.
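
As an illustration of the classification use case, the sketch below attaches a classification head and runs a single training step with transformers. The Hub ID, label scheme, and toy batch are placeholders, not a reference implementation.

```python
# Hedged sketch: fine-tune rbt4-h312 for binary Chinese sentiment
# classification. Hub ID, labels, and the two-sentence batch are
# placeholders; a real run would iterate over a full dataset.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = "hfl/rbt4-h312"  # assumed Hub ID
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id, num_labels=2)

texts = ["这部电影非常好看!", "服务太差了,不推荐。"]  # toy examples
labels = torch.tensor([1, 0])  # 1 = positive, 0 = negative

batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)

# One illustrative optimization step on the toy batch.
outputs = model(**batch, labels=labels)
outputs.loss.backward()
optimizer.step()
print("loss:", float(outputs.loss))
```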