LUKE Base
LUKE (Language Understanding with Knowledge-based Embeddings) is a Transformer-based pre-trained model for both words and entities, producing deep contextualized representations through an entity-aware self-attention mechanism.
Downloads: 2,358
Release Time: 3/2/2022
Model Overview
LUKE treats words and entities in a text as independent tokens and outputs a context-dependent representation for each. Its entity-aware self-attention mechanism extends the standard Transformer self-attention by taking the type of each token (word or entity) into account when computing attention scores.
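The mechanism above can be sketched in a few lines of NumPy. This is a toy, single-head illustration with random weights, not LUKE's actual implementation: following the paper, keys and values are shared, but a separate query matrix is used for each (query type, key type) pair — word-to-word, word-to-entity, entity-to-word, and entity-to-entity.

```python
# Toy sketch of entity-aware self-attention (single head, random weights).
import numpy as np

rng = np.random.default_rng(0)
d = 8                        # hidden size
n_words, n_entities = 5, 2   # token counts per type

# 0 = word token, 1 = entity token
token_types = np.array([0] * n_words + [1] * n_entities)
x = rng.normal(size=(n_words + n_entities, d))  # input representations

# One key/value projection shared by all tokens ...
W_k = rng.normal(size=(d, d))
W_v = rng.normal(size=(d, d))
# ... but a separate query projection per (query type, key type) pair.
W_q = {(qt, kt): rng.normal(size=(d, d)) for qt in (0, 1) for kt in (0, 1)}

K = x @ W_k
V = x @ W_v

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

# Score each (query i, key j) pair with the query matrix chosen by the
# two tokens' types -- this choice is the "entity-aware" part.
scores = np.empty((len(x), len(x)))
for i, qt in enumerate(token_types):
    for j, kt in enumerate(token_types):
        q_ij = x[i] @ W_q[(qt, kt)]
        scores[i, j] = q_ij @ K[j] / np.sqrt(d)

out = softmax(scores) @ V  # context-dependent representations
print(out.shape)  # → (7, 8)
```

In a plain Transformer all four query matrices would be the same; letting them differ lets the model learn, for example, that entity tokens should attend to their surrounding words differently than words attend to each other.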
Model Features
Entity-aware Self-attention Mechanism
Extends the standard Transformer self-attention by considering the types of the attending and attended tokens (word or entity) when computing attention scores.
Deep Contextual Representations
Provides context-dependent representations for words and entities, suitable for various NLP tasks.
Multi-task Support
Achieves strong results on multiple tasks, including named entity recognition, entity typing, relation classification, and extractive question answering.
Model Capabilities
Named Entity Recognition
Entity Typing
Relation Classification
Question Answering
Use Cases
Natural Language Processing
Named Entity Recognition
Identify named entities in text (e.g., person names, locations, organizations).
Achieved an F1 score of 94.3 on the CoNLL-2003 dataset, surpassing the previous best result of 93.5.
Relation Classification
Identify relationships between entities.
Achieved an F1 score of 72.7 on the TACRED dataset, surpassing the previous best result of 72.0.
Question Answering
Answer questions about a given text passage (extractive question answering).
Achieved EM/F1 scores of 90.2/95.4 on the SQuAD v1.1 dataset, surpassing the previous best results of 89.9/95.1.
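The EM (exact match) and F1 numbers quoted above can be sketched as follows. This is a simplified version of the SQuAD-style metrics using plain whitespace tokenization; the official evaluation script additionally strips punctuation and articles before comparing.

```python
# Simplified SQuAD-style answer metrics: exact match and token-level F1.
from collections import Counter

def exact_match(pred: str, gold: str) -> float:
    """1.0 if the (case-insensitive) answers are identical, else 0.0."""
    return float(pred.strip().lower() == gold.strip().lower())

def token_f1(pred: str, gold: str) -> float:
    """Harmonic mean of token precision and recall between the answers."""
    p, g = pred.lower().split(), gold.lower().split()
    overlap = sum((Counter(p) & Counter(g)).values())  # multiset intersection
    if overlap == 0:
        return 0.0
    precision, recall = overlap / len(p), overlap / len(g)
    return 2 * precision * recall / (precision + recall)

print(exact_match("Barack Obama", "barack obama"))        # → 1.0
print(token_f1("President Barack Obama", "Barack Obama")) # → 0.8
```

F1 gives partial credit when a predicted span overlaps the gold answer without matching it exactly, which is why the F1 score on a benchmark is always at least as high as EM.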