Zero Shot Implicit Binary Bert
Developed by claritylab
A BERT model designed for zero-shot text classification. It acquires zero-shot capability through implicit training within a binary classification framework.
Downloads: 22
Release Time: 5/15/2023
Model Overview
Proposed by Christopher Clarke et al. in their ACL'23 paper, this model is trained on the aspect-normalized UTCD dataset and can classify text without relying on task-specific labeled data.
Model Features
Zero-shot learning capability
Capable of classification without task-specific labeled data
Implicit training framework
Trains in a binary classification framework, where each (text, candidate label) pair is scored as match or no-match, improving generalization to unseen labels
Aspect-based normalization
Injects the classification aspect (e.g. intent, topic, sentiment) into the input through specific delimiters to improve classification accuracy
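The aspect-based normalization described above can be sketched as a simple input-formatting step: the aspect, the candidate label, and the text are joined into one sequence for the binary classifier to score. The delimiter token and the exact ordering below are illustrative assumptions, not taken from the released model.

```python
# Hypothetical sketch of the aspect-normalized binary-pair input format.
# ASPECT_SEP is an assumed delimiter, not the model's actual special token.

ASPECT_SEP = "<|ASPECT|>"

def make_binary_pair(text: str, label: str, aspect: str) -> str:
    """Join aspect, candidate label, and input text into one sequence
    that a binary classifier scores as match / no-match."""
    return f"{aspect} {ASPECT_SEP} {label} {ASPECT_SEP} {text}"

pair = make_binary_pair(
    text="play some jazz for me",
    label="play music",
    aspect="intent",
)
print(pair)
```

In the implicit setup, one such pair is built for every candidate label, so the model never needs a fixed label vocabulary at training time.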
Model Capabilities
Zero-shot text classification
Intent recognition
Multi-label classification
Use Cases
Natural Language Processing
Intent recognition
Identify the intent behind user text, such as playing music or booking a restaurant
Result: high-accuracy intent classification
Content classification
Classify texts against labels never seen during training
Result: classification without label-specific training
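The inference loop implied by the use cases above can be sketched as follows: score every (text, candidate label) pair with a binary "does this label apply?" model and keep the highest-scoring label. A trivial token-overlap scorer stands in here for the actual BERT binary head, which would instead be loaded from the claritylab checkpoint; the scorer and function names are illustrative assumptions.

```python
# Sketch of zero-shot classification via per-label binary scoring.
# overlap_score is a toy stand-in for the BERT match probability.

def overlap_score(text: str, label: str) -> float:
    """Stand-in for the binary model's match score: fraction of
    label tokens that also appear in the text."""
    text_tokens = set(text.lower().split())
    label_tokens = set(label.lower().split())
    return len(text_tokens & label_tokens) / max(len(label_tokens), 1)

def classify(text: str, candidate_labels: list[str]) -> str:
    """Score each candidate label independently and return the best one."""
    scores = {lbl: overlap_score(text, lbl) for lbl in candidate_labels}
    return max(scores, key=scores.get)

labels = ["play music", "book restaurant", "get weather"]
print(classify("can you play some music", labels))  # "play music"
```

Because each label is scored independently, the same loop also supports multi-label classification by thresholding the scores instead of taking the argmax.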