Zero Shot Explicit Bi Encoder
Developed by claritylab
A BERT-based zero-shot text classification model trained on the UTCD dataset using explicit training methods
Downloads: 31
Release time: 5/15/2023
Model Overview
A dual-encoder model designed for zero-shot text classification, optimized through label-agnostic pre-training and aspect-normalized processing
Model Features
Zero-shot learning capability
Classifies text into new categories without domain-specific training data
Explicit training framework
Optimizes classification performance through label-agnostic pre-training methods
Dual-encoder architecture
Independent encoding of text and labels enables efficient similarity computation (see the sketch below)
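As a rough illustration of the bi-encoder workflow, the sketch below encodes a query and a set of candidate labels independently and ranks the labels by cosine similarity. It assumes the checkpoint is hosted on Hugging Face under the hypothetical ID claritylab/zero-shot-explicit-bi-encoder and can be loaded with the standard transformers API; the mean-pooling step is an assumption about how sentence embeddings are produced, not a confirmed detail of this model.

```python
import torch
from transformers import AutoModel, AutoTokenizer

# Hypothetical model ID; substitute the actual checkpoint path if it differs.
MODEL_NAME = "claritylab/zero-shot-explicit-bi-encoder"

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModel.from_pretrained(MODEL_NAME)
model.eval()

def embed(texts):
    """Mean-pool the final hidden states into one vector per input text."""
    batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**batch).last_hidden_state      # (B, T, H)
    mask = batch["attention_mask"].unsqueeze(-1)       # (B, T, 1)
    summed = (hidden * mask).sum(dim=1)
    counts = mask.sum(dim=1).clamp(min=1)
    return summed / counts                             # (B, H)

query = "Put this song on my workout playlist"
labels = ["play music", "add to playlist", "get weather", "book restaurant"]

# Encode the text and the candidate labels independently (bi-encoder),
# then rank the labels by cosine similarity to the query embedding.
q_emb = embed([query])
l_emb = embed(labels)
scores = torch.nn.functional.cosine_similarity(q_emb, l_emb)
print(labels[scores.argmax().item()])  # expected: "add to playlist"
```

Because the label embeddings do not depend on the input text, they can be computed once and reused across many queries, which is the main efficiency advantage of the dual-encoder design over cross-encoders.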
Model Capabilities
Zero-shot text classification
Semantic similarity calculation
Multi-label classification
Use Cases
Intelligent assistants
User intent recognition
Identifies potential intent categories in user queries
Accurately distinguishes similar intents such as "play music" and "add to playlist"
Content classification
Dynamic labeling system
Automatically assigns labels to previously unseen content (see the multi-label sketch below)
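For the dynamic labeling use case, multi-label assignment can be sketched as a simple threshold over the similarity scores produced by the bi-encoder above. The 0.5 threshold and the example scores are illustrative assumptions, not values recommended by the model authors.

```python
import torch

def assign_labels(scores, labels, threshold=0.5):
    """Keep every label whose similarity score clears the threshold (multi-label)."""
    return [lab for lab, s in zip(labels, scores.tolist()) if s >= threshold]

# Placeholder scores; in practice these are cosine similarities between the
# document embedding and each label embedding from the bi-encoder sketch above.
scores = torch.tensor([0.71, 0.64, 0.12])
labels = ["sports", "health", "politics"]
print(assign_labels(scores, labels))  # ['sports', 'health']
```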