
RoBERTa-large ConceptNet Mask Prompt E NCE

Developed by: research-backup
A relation embedding model fine-tuned from RoBERTa-large, specializing in lexical relation understanding and analogical reasoning tasks
Downloads: 17
Release Time: 8/7/2022

Model Overview

This model fine-tunes RoBERTa-large on high-confidence ConceptNet data and is designed specifically for understanding lexical relations and performing analogical reasoning. It supports a variety of relation classification and relation mapping tasks.

Model Features

High-Precision Relation Classification
Achieves F1 scores above 0.9 on multiple relation classification datasets
Multi-Task Support
Supports relation classification, analogical reasoning, and relation mapping within a single model
Prompt-Based Fine-Tuning
Trained with masked prompt templates to strengthen relation understanding; a sketch of extracting such a mask-based relation embedding follows below
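
As a concrete illustration of the mask-prompt idea, here is a minimal sketch of extracting a relation embedding with the Hugging Face transformers library. It assumes the checkpoint is hosted on the Hub under research-backup/roberta-large-conceptnet-mask-prompt-e-nce, and the prompt wording is illustrative; the exact template used during fine-tuning may differ.

```python
# Minimal sketch: extract a relation embedding from a masked prompt.
# Assumptions (not confirmed by this card): the Hub checkpoint ID and
# the prompt wording below are illustrative.
import torch
from transformers import AutoModel, AutoTokenizer

MODEL_ID = "research-backup/roberta-large-conceptnet-mask-prompt-e-nce"  # assumed Hub ID
tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModel.from_pretrained(MODEL_ID)
model.eval()

def relation_embedding(head: str, tail: str) -> torch.Tensor:
    """Return the hidden state of the mask token as the relation vector."""
    # Illustrative prompt; the training template may differ.
    prompt = (
        f"Today, I finally discovered the relation between "
        f"{head} and {tail}: {tokenizer.mask_token}"
    )
    inputs = tokenizer(prompt, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state  # (1, seq_len, dim)
    mask_pos = (inputs["input_ids"][0] == tokenizer.mask_token_id).nonzero(as_tuple=True)[0]
    return hidden[0, mask_pos].mean(dim=0)  # (dim,)
```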

Model Capabilities

Lexical Relation Classification
Analogical Reasoning
Relation Embedding Generation
Relation Mapping

Use Cases

Natural Language Processing
Lexical Relation Classification
Identify semantic relations between word pairs (e.g., synonymy, antonymy, hypernymy); a classifier sketch appears after this section
Achieves a 93% F1 score on the BLESS dataset
Analogical Reasoning
Solve analogy problems such as 'Tokyo is to Japan as Paris is to ?' (see the sketch below)
Achieves 89.8% accuracy on the Google Analogy dataset
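
Building on the relation_embedding helper sketched earlier, an analogy question can be answered by choosing the candidate pair whose relation vector is most similar to the query pair's. The candidate list below is illustrative, not taken from the Google Analogy dataset.

```python
# Sketch: solve 'Tokyo is to Japan as Paris is to ?' by comparing
# relation vectors with cosine similarity. Reuses relation_embedding()
# from the sketch above; the candidate pairs are illustrative.
import torch
import torch.nn.functional as F

def solve_analogy(query, candidates):
    q = relation_embedding(*query)
    scores = torch.stack(
        [F.cosine_similarity(q, relation_embedding(*c), dim=0) for c in candidates]
    )
    return candidates[int(scores.argmax())]

best = solve_analogy(
    ("Tokyo", "Japan"),
    [("Paris", "France"), ("Paris", "London"), ("Paris", "Europe")],
)
print(best)  # expected: ("Paris", "France")
```
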
Knowledge Graph
Relation Mapping
Align and map relations across different knowledge bases
Achieves 93.25% accuracy in relation mapping
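
For the lexical relation classification use case above, a common recipe is to train a lightweight classifier on top of the relation embeddings. The sketch below uses scikit-learn with a toy label set; it is not the evaluation setup behind the BLESS score reported above.

```python
# Sketch: lexical relation classification on top of relation embeddings.
# The tiny training set and labels are toy examples, not the BLESS data.
# Reuses relation_embedding() from the first sketch.
import numpy as np
from sklearn.linear_model import LogisticRegression

train_pairs = [("car", "vehicle"), ("dog", "animal"), ("hot", "cold"), ("big", "small")]
train_labels = ["hypernymy", "hypernymy", "antonymy", "antonymy"]

X = np.stack([relation_embedding(h, t).numpy() for h, t in train_pairs])
clf = LogisticRegression(max_iter=1000).fit(X, train_labels)

test = relation_embedding("cat", "animal").numpy().reshape(1, -1)
print(clf.predict(test))  # expected: ['hypernymy']
```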