roberta-large-semeval2012-v2-mask-prompt-a-nce
RelBERT is a semantic relation understanding model fine-tuned from RoBERTa-large, designed for lexical relation classification and analogy question tasks.
Downloads: 16
Release Time: 8/17/2022
Model Overview
This model fine-tunes the RoBERTa-large architecture to capture semantic relations between word pairs, and performs strongly on relation classification and analogy tasks.
Model Features
High-performance Relation Understanding
Achieves high accuracy across semantic relation benchmarks, e.g., an F1 score of 0.926 on the BLESS dataset.
Multi-task Adaptation
Handles multiple tasks, including relation classification, analogy question answering, and relation mapping.
Prompt-based Fine-tuning
Fine-tuned with a specific masked-prompt template (template "a" with an NCE objective, per the model name) to strengthen its relation representations.
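The "mask prompt a" in the model name refers to one of RelBERT's prompt templates, which turns a word pair into a masked sentence for the encoder. A minimal sketch, assuming this template wording (the exact string is an assumption based on the RelBERT project, not stated in this card):

```python
def fill_template_a(head: str, tail: str) -> str:
    """Fill a RelBERT-style masked prompt for a word pair.

    The template wording here is an assumption; the model's actual
    template may differ slightly.
    """
    return (
        f"Today, I finally discovered the relation between {head} and {tail} : "
        f"{head} is the <mask> of {tail}"
    )

print(fill_template_a("Paris", "France"))
```

The hidden state at the `<mask>` position (or a pooling over the prompt) then serves as the relation embedding for the pair.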
Model Capabilities
Lexical Relation Classification
Analogy Question Answering
Semantic Relation Mapping
Relation Embedding Generation
Use Cases
Educational Technology
SAT Analogy Question Answering
Answers vocabulary analogy questions of the kind found in standardized tests.
Achieves 71.8% accuracy on the SAT analogy dataset.
Knowledge Graph
Lexical Relation Classification
Classifies lexical relations between entities when constructing knowledge graphs.
F1 score of 0.959 on the K&H+N dataset
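One simple way to turn pair embeddings into a relation classifier is nearest-centroid assignment: average the embeddings of each labeled relation, then give an unseen pair the label of the closest centroid. This is a common baseline, not necessarily the classifier used in the reported evaluations; the vectors below are toy values:

```python
import math

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    return dot / (math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v)))

def centroid(vectors):
    """Component-wise mean of a list of equal-length vectors."""
    return [sum(xs) / len(xs) for xs in zip(*vectors)]

# Toy labeled pair embeddings per relation type (illustrative values).
train = {
    "hypernym": [[0.9, 0.1], [0.8, 0.2]],
    "meronym":  [[0.1, 0.9], [0.2, 0.8]],
}
centroids = {label: centroid(vs) for label, vs in train.items()}

def classify(vec):
    """Assign the label of the nearest centroid by cosine similarity."""
    return max(centroids, key=lambda label: cosine(vec, centroids[label]))

print(classify([0.85, 0.15]))
```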
© 2025 AIbase