
Cross-lingual AMR Aligner: Paying Attention to Cross-Attention

Abelardo Carlos Martínez Lorenzo, Pere-Lluís Huguet Cabot, Roberto Navigli

Abstract

This paper introduces a novel aligner for Abstract Meaning Representation (AMR) graphs that can scale cross-lingually, and is thus capable of aligning units and spans in sentences of different languages. Our approach leverages modern Transformer-based parsers, which inherently encode alignment information in their cross-attention weights, allowing us to extract this information during parsing. This eliminates the need for English-specific rules or the Expectation Maximization (EM) algorithm that have been used in previous approaches. In addition, we propose a guided supervised method using alignment to further enhance the performance of our aligner. We achieve state-of-the-art results in the benchmarks for AMR alignment and demonstrate our aligner’s ability to obtain them across multiple languages. Our code will be available at https://www.github.com/babelscape/AMR-alignment.
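The core idea described above, that a Transformer-based parser's cross-attention weights already encode which source tokens each generated graph node attends to, can be illustrated with a minimal sketch. This is not the authors' implementation: the attention matrix, the mapping from graph nodes to decoder steps, and the simple argmax selection rule are all illustrative assumptions.

```python
# Illustrative sketch (not the paper's code): aligning AMR graph nodes to
# source tokens by reading a decoder's cross-attention weights.
# Assumes a cross-attention matrix of shape
# (num_decoder_steps, num_source_tokens), e.g. already averaged over
# attention heads and layers.

def align_from_cross_attention(cross_attn, node_steps, source_tokens):
    """For each decoder step that emitted a graph node, pick the source
    token receiving the highest attention mass as its alignment."""
    alignment = {}
    for node, step in node_steps.items():
        weights = cross_attn[step]
        best = max(range(len(weights)), key=lambda j: weights[j])
        alignment[node] = source_tokens[best]
    return alignment

# Toy example: 3 decoder steps over a 4-token sentence.
cross_attn = [
    [0.70, 0.10, 0.10, 0.10],  # step 0 attends mostly to "The"
    [0.05, 0.80, 0.10, 0.05],  # step 1 attends mostly to "cat"
    [0.05, 0.10, 0.75, 0.10],  # step 2 attends mostly to "sleeps"
]
node_steps = {"c / cat": 1, "s / sleep-01": 2}  # hypothetical node-to-step map
source_tokens = ["The", "cat", "sleeps", "."]

print(align_from_cross_attention(cross_attn, node_steps, source_tokens))
# → {'c / cat': 'cat', 's / sleep-01': 'sleeps'}
```

Because the attention weights come for free during parsing, a rule like this needs no language-specific heuristics, which is what makes the approach applicable across languages.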

July 2023, Association for Computational Linguistics
