
Zero-Shot Vanilla Binary BERT

Developed by claritylab
This is a BERT-based model for zero-shot text classification, trained on the aspect-normalized UTCD dataset.
Downloads: 26
Release date: 5/13/2023

Model Overview

Proposed by Christopher Clarke et al. in an ACL'23 paper, this model performs zero-shot text classification. It serves as a baseline that frames the task as binary classification, making it suitable for classification problems with no domain-specific training data.

Model Features

Zero-shot learning: classifies text without any domain-specific training data.
Binary classification framework: frames zero-shot classification as a series of independent binary (text, label) decisions.
UTCD training data: trained on the aspect-normalized UTCD dataset.
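The binary framing above can be sketched as follows: each candidate label is paired with the input text and scored independently, and every label whose score clears a threshold is kept, which makes multi-label output natural. This is a minimal illustration, not the model's actual API; `toy_scorer` is a hypothetical stand-in for the fine-tuned binary BERT, which would score each (text, label) pair instead.

```python
from typing import Callable, List


def classify_zero_shot(
    text: str,
    labels: List[str],
    score_pair: Callable[[str, str], float],
    threshold: float = 0.5,
) -> List[str]:
    """Binary zero-shot classification: score each (text, label) pair
    independently and keep the labels that clear the threshold.
    Independent per-label decisions allow zero or many labels per text."""
    return [label for label in labels if score_pair(text, label) >= threshold]


# Hypothetical stub scorer standing in for the fine-tuned binary BERT.
def toy_scorer(text: str, label: str) -> float:
    return 1.0 if any(word in text.lower() for word in label.lower().split()) else 0.0


labels = ["add to playlist", "book restaurant", "play music"]
print(classify_zero_shot("Add this song to my running playlist", labels, toy_scorer))
```

Because each label is scored in isolation, unseen label sets can be supplied at inference time without retraining, which is the core of the zero-shot setup.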

Model Capabilities

Zero-shot text classification
Multi-label classification
Intent recognition

Use Cases

Natural language processing
Intent classification: identifies the intent behind user input, such as adding a song to a playlist or booking a restaurant; the model card's example shows strong recognition of music-related intents.
Multi-label classification: assigns multiple labels to a text without domain-specific training data.