Preprint Article · Version 1 · Preserved in Portico · This version is not peer-reviewed

Addressing Long-Distance Dependencies in AMR Parsing with Hierarchical Clause Annotation

Version 1 : Received: 3 August 2023 / Approved: 3 August 2023 / Online: 4 August 2023 (11:02:29 CEST)

A peer-reviewed version of this preprint also exists:

Fan, Y.; Li, B.; Sataer, Y.; Gao, M.; Shi, C.; Gao, Z. Addressing Long-Distance Dependencies in AMR Parsing with Hierarchical Clause Annotation. Electronics 2023, 12, 3908.

Abstract

Most natural language processing (NLP) tasks treat an input sentence as a flat sequence with token-level embeddings and features, ignoring its clausal structure. Taking Abstract Meaning Representation (AMR) parsing as an example, recent parsers are empowered by Transformers and pre-trained language models, but the long-distance dependencies (LDDs) introduced by long sequences remain an open problem. We argue that LDDs are not merely a consequence of sequence length but are essentially related to the internal clause hierarchy: typically, non-verb words in a clause do not depend on words outside it, while verbs in different but related clauses exhibit much longer dependencies than words within the same clause. With this intuition, we introduce a type of clausal feature, hierarchical clause annotation (HCA), into AMR parsing and propose two HCA-based approaches, HCA-based self-attention (HCA-SA) and HCA-based curriculum learning (HCA-CL), which integrate the HCA trees of complex sentences to address LDDs. We conduct extensive experiments on two in-distribution (ID) AMR datasets (AMR 2.0 and AMR 3.0) and three out-of-distribution (OOD) datasets (TLP, New3, and Bio). Experimental results show that our HCA-based approaches achieve significant and explainable improvements over the baseline model and outperform the state-of-the-art (SOTA) model on sentences with complex clausal structures, which account for most LDD cases.
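The clause-hierarchy intuition behind HCA-SA can be illustrated with a toy attention mask. The sketch below is a hypothetical illustration, not the authors' implementation: the token-to-clause assignment (`clause_ids`) and the clause-head verb positions (`head_tokens`) are invented for the example. Non-verb tokens are restricted to attend within their own clause, while clause-head verbs may also attend to the heads of other clauses, modeling inter-clause long-distance dependencies.

```python
# Hypothetical sketch of a clause-constrained self-attention mask
# in the spirit of HCA-SA; indices below are invented for illustration.
clause_ids = [0, 0, 0, 1, 1, 1, 1, 2, 2]   # clause index of each token
head_tokens = {1, 4, 8}                     # main-verb position per clause

n = len(clause_ids)
mask = [[False] * n for _ in range(n)]
for i in range(n):
    for j in range(n):
        same_clause = clause_ids[i] == clause_ids[j]
        # Non-verb tokens attend only within their clause; clause-head
        # verbs additionally attend to other heads, capturing the
        # inter-clause dependencies that dominate LDD cases.
        cross_heads = i in head_tokens and j in head_tokens
        mask[i][j] = same_clause or cross_heads
```

In a real Transformer, such a boolean mask would be applied to the attention logits (e.g., by setting disallowed positions to negative infinity before the softmax).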

Keywords

hierarchical clause annotation; long-distance dependencies; AMR parsing; self-attention; curriculum learning

Subject

Computer Science and Mathematics, Artificial Intelligence and Machine Learning

