
AMRs Assemble! Learning to Ensemble with Autoregressive Models for AMR Parsing

Abelardo Carlos Martínez Lorenzo, Pere-Lluís Huguet Cabot, Roberto Navigli

Abstract

In this paper, we examine the current state-of-the-art in AMR parsing, which relies on ensemble strategies that merge multiple graph predictions. Our analysis reveals that the present models often violate AMR structural constraints. To address this issue, we develop a validation method and show how ensemble models can exploit weaknesses of the SMATCH metric to obtain higher scores, while sometimes producing corrupted graphs. Additionally, we highlight the high computational cost of computing the SMATCH score among all possible predictions. To overcome these challenges, we propose two novel ensemble strategies based on Transformer models, improving robustness to structural constraints while also reducing computational time. Our methods provide new insights for enhancing AMR parsers and metrics. Our code is available at https://www.github.com/babelscape/AMRs-Assemble.
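The abstract notes the cost of computing SMATCH between all pairs of candidate predictions. As a rough illustration only, the sketch below scores two AMR graphs represented as sets of triples with a precision/recall F-score; this is a toy simplification that assumes variable names are already aligned, whereas the real SMATCH metric searches over variable alignments via hill climbing.

```python
# Simplified SMATCH-style F-score over AMR triples (illustrative only).
# Assumption: variables in the two graphs are already aligned; the actual
# SMATCH metric optimizes this alignment, which is the expensive step.

def triple_f_score(pred, gold):
    """F1 between two sets of (source, relation, target) triples."""
    pred, gold = set(pred), set(gold)
    if not pred or not gold:
        return 0.0
    matched = len(pred & gold)
    precision = matched / len(pred)
    recall = matched / len(gold)
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# Toy graphs for "the boy wants ..." with one relation disagreeing.
gold = {("w", "instance", "want-01"), ("b", "instance", "boy"), ("w", "ARG0", "b")}
pred = {("w", "instance", "want-01"), ("b", "instance", "boy"), ("w", "ARG1", "b")}
print(round(triple_f_score(pred, gold), 3))  # 2 of 3 triples match on each side -> 0.667
```

Ensembling by graph selection requires scores for every pair of the n candidate graphs, i.e. O(n^2) such comparisons, which is the bottleneck the paper's Transformer-based strategies aim to avoid.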

July 2023, Association for Computational Linguistics
