AMRBART Large V2

Developed by xfbai
AMRBART is a pre-trained semantic parser capable of converting sentences into Abstract Meaning Representation (AMR) graphs. The latest version, v2, is simpler, faster, and more powerful.
Release Time: 4/25/2025

Model Overview

AMRBART is a pre-trained model based on the BART architecture, specifically designed for Abstract Meaning Representation (AMR) parsing and generation tasks. It can convert natural language sentences into AMR graphs or generate natural language text from AMR graphs.
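For illustration (this example is not from the model card), AMR graphs are conventionally written in Penman notation. The snippet below shows the standard textbook AMR for "The boy wants to go" and a minimal, hypothetical reader that extracts its concept and role triples; real pipelines would use a dedicated library such as `penman` rather than this sketch.

```python
import re

# Textbook AMR (Penman notation) for "The boy wants to go".
AMR = "(w / want-01 :ARG0 (b / boy) :ARG1 (g / go-02 :ARG0 b))"

def penman_triples(s):
    """Minimal Penman reader: returns (source, role, target) triples.

    Illustrative sketch only -- it ignores string attributes, quoting,
    and malformed input, which real AMR tooling must handle.
    """
    tokens = re.findall(r"\(|\)|/|:[A-Za-z0-9-]+|[^\s()/]+", s)
    triples, stack = [], []
    i = 0
    while i < len(tokens):
        tok = tokens[i]
        if tok == "(":
            var = tokens[i + 1]          # variable, e.g. "w"
            concept = tokens[i + 3]      # concept after "/", e.g. "want-01"
            if stack:                    # attach to parent via pending role
                src, role = stack[-1]
                triples.append((src, role, var))
            triples.append((var, "instance", concept))
            stack.append((var, None))
            i += 4
        elif tok == ")":
            stack.pop()
            i += 1
        elif tok.startswith(":"):
            nxt = tokens[i + 1]
            if nxt == "(":               # role introduces a new node
                stack[-1] = (stack[-1][0], tok)
                i += 1
            else:                        # re-entrant variable, e.g. ":ARG0 b"
                triples.append((stack[-1][0], tok, nxt))
                i += 2
        else:
            i += 1
    return triples
```

Note how the variable `b` is reused for the re-entrant edge (`go-02 :ARG0 b`): AMR graphs are DAGs, not trees, which is part of what makes parsing them from text nontrivial.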

Model Features

Efficient AMR parsing
Capable of efficiently and accurately converting natural language sentences into Abstract Meaning Representation (AMR) graphs.
Bidirectional conversion capability
Supports both directions of conversion: text-to-AMR parsing and AMR-to-text generation.
Improved version
Version v2 is simpler, faster, and more powerful than its predecessor.
Pre-training advantage
Enhances AMR parsing and generation performance through graph pre-training methods.
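Graph pre-training for a seq2seq model hinges on turning a graph into a flat token sequence. The sketch below shows one way to linearize an AMR graph depth-first, emitting re-entrant nodes as bare variables as Penman notation does; the data-structure choices and names are illustrative assumptions, not AMRBART's actual tokenization scheme.

```python
def linearize(var, concepts, edges, visited=None):
    """Depth-first linearization of an AMR graph into a token list.

    `concepts` maps variable -> concept; `edges` maps variable -> list
    of (role, child_variable). A hypothetical sketch for illustration.
    """
    if visited is None:
        visited = set()
    if var in visited:
        return [var]             # re-entrancy: refer back by variable only
    visited.add(var)
    tokens = ["(", var, "/", concepts[var]]
    for role, child in edges.get(var, []):
        tokens += [role] + linearize(child, concepts, edges, visited)
    tokens.append(")")
    return tokens

# Graph for "The boy wants to go"
concepts = {"w": "want-01", "b": "boy", "g": "go-02"}
edges = {"w": [(":ARG0", "b"), (":ARG1", "g")], "g": [(":ARG0", "b")]}
seq = linearize("w", concepts, edges)
```

A sequence like `seq` can then be consumed or produced token-by-token by a BART-style encoder-decoder, which is the general idea behind treating AMR parsing and generation as seq2seq tasks.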

Model Capabilities

Text-to-AMR parsing
AMR-to-text generation
Semantic representation conversion

Use Cases

Natural Language Processing
Semantic parsing
Convert natural language sentences into formal AMR representations
Achieves state-of-the-art performance on the LDC2017T10 (AMR 2.0) and LDC2020T02 (AMR 3.0) datasets
Text generation
Generate natural language text from AMR graphs
Delivers strong performance on AMR-to-text generation tasks