MiniRBT-H288

Developed by hfl
MiniRBT is a small Chinese pretrained model built with knowledge distillation and pretrained with Whole Word Masking to improve Chinese text processing.
Downloads: 405
Release Time: 11/14/2022

Model Overview

MiniRBT is a compact Chinese pretrained model compressed via knowledge distillation, offering a good balance of efficiency and performance across a range of natural language processing tasks.
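As a sketch of typical usage, the model can presumably be loaded through Hugging Face Transformers. The model id "hfl/minirbt-h288" is an assumption inferred from the developer name (hfl) and the model name on this page:

```python
# Hypothetical usage sketch: loading MiniRBT via Hugging Face Transformers.
# The model id "hfl/minirbt-h288" is an assumption, not confirmed by this page.
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("hfl/minirbt-h288")
model = AutoModel.from_pretrained("hfl/minirbt-h288")

# Encode a Chinese sentence and obtain contextual representations.
inputs = tokenizer("哈工大讯飞联合实验室发布了小型预训练模型。", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (batch, seq_len, hidden_size)
```

The pooled or mean-averaged hidden states can then feed downstream tasks such as classification or similarity scoring.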

Model Features

Knowledge Distillation Technology
Balances compression and performance through knowledge distillation with the TextBrewer toolkit
Whole Word Masking Technology
Uses Whole Word Masking during pretraining to improve modeling of Chinese text
Lightweight Design
Compact design suitable for deployment in resource-constrained environments
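The distillation approach above typically minimizes the KL divergence between the student's and teacher's temperature-softened output distributions. A minimal pure-Python sketch of that core idea (not TextBrewer's actual API):

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax over a list of logits."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, temperature=4.0):
    """KL divergence between teacher and student soft targets,
    scaled by T^2 as in standard knowledge distillation."""
    p = softmax(teacher_logits, temperature)  # teacher soft targets
    q = softmax(student_logits, temperature)  # student predictions
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return (temperature ** 2) * kl

# Identical logits give zero loss; diverging logits give a positive loss.
print(distillation_loss([2.0, 1.0, 0.1], [2.0, 1.0, 0.1]))  # → 0.0
print(distillation_loss([0.5, 0.5, 0.5], [2.0, 1.0, 0.1]) > 0.0)  # → True
```

A higher temperature softens both distributions, so the student also learns from the teacher's relative rankings of incorrect classes.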
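Whole Word Masking means that when any sub-token of a word is selected for masking, every sub-token of that word is masked together. A simplified illustrative sketch, assuming a word segmentation is already available (the function and inputs are hypothetical):

```python
import random

def whole_word_mask(tokens, word_ids, mask_rate=0.15, seed=0):
    """Mask entire words: if a word is chosen, every one of its
    sub-tokens is replaced with [MASK].

    `word_ids[i]` is the index of the word that token i belongs to,
    standing in for the Chinese word segmentation used in WWM."""
    rng = random.Random(seed)
    words = sorted(set(word_ids))
    chosen = {w for w in words if rng.random() < mask_rate}
    return [("[MASK]" if w in chosen else tok) for tok, w in zip(tokens, word_ids)]

# "模型" is one word spanning two characters (word id 1), so its two
# tokens are always masked together or kept together.
tokens   = ["预", "训", "练", "模", "型"]
word_ids = [0,    0,    0,    1,    1]
print(whole_word_mask(tokens, word_ids, mask_rate=0.5, seed=1))
```

Character-level masking can leak a word's identity through its unmasked characters; masking whole words forces the model to predict from surrounding context instead.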

Model Capabilities

Chinese Text Understanding
Text Classification
Named Entity Recognition
Question Answering System
Text Similarity Calculation
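For the text similarity capability, a common recipe is cosine similarity between pooled sentence embeddings. A minimal sketch, assuming embedding vectors have already been produced (e.g., by mean-pooling the model's last hidden states):

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Toy 3-dim "embeddings" standing in for pooled model outputs.
print(cosine_similarity([1.0, 0.0, 1.0], [1.0, 0.0, 1.0]))  # ≈ 1.0 (identical)
print(cosine_similarity([1.0, 0.0, 0.0], [0.0, 1.0, 0.0]))  # 0.0 (orthogonal)
```

Scores near 1 indicate semantically similar sentences; scores near 0 indicate unrelated ones.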

Use Cases

Natural Language Processing
Intelligent Customer Service
Used for building semantic understanding modules in Chinese intelligent customer service systems
Text Classification
Applied in scenarios such as news classification and sentiment analysis
© 2025 AIbase