Hierarchical Curriculum Learning for AMR Parsing

10/15/2021
by Peiyi Wang, et al.

Abstract Meaning Representation (AMR) parsing translates sentences into semantic representations with a hierarchical structure, a task recently empowered by pretrained encoder-decoder models. However, the flat sentence-to-AMR training paradigm impedes the representation learning of concepts and relations in deeper AMR sub-graphs. To help sequence-to-sequence models better adapt to the inherent AMR structure, we propose Hierarchical Curriculum Learning (HCL), which consists of (1) a structure-level curriculum (SC) and (2) an instance-level curriculum (IC). SC progresses from shallow to deep AMR sub-graphs, while IC transitions from easy to hard AMR instances during training. Extensive experiments show that BART trained with HCL achieves state-of-the-art performance on the AMR-2.0 and AMR-3.0 benchmarks and significantly outperforms baselines on structure-dependent evaluation metrics and hard instances.
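The abstract does not include an implementation; below is a minimal Python sketch of how the two curricula could be scheduled. All names are hypothetical: `subgraphs_by_depth` assumes each training instance provides linearized target sub-graphs truncated at increasing depths, and `difficulty` assumes some per-instance hardness score, neither of which is specified here. The linear depth schedule is likewise an assumption, not the paper's method.

```python
from dataclasses import dataclass
from typing import List


@dataclass
class AMRInstance:
    # Hypothetical fields, not specified by the paper's abstract.
    sentence: str
    subgraphs_by_depth: List[str]  # linearized sub-graphs; index 0 = shallowest
    difficulty: float              # assumed per-instance hardness score


def structure_level_targets(instances: List[AMRInstance], progress: float) -> List[str]:
    """Structure-level curriculum (SC): expose deeper sub-graphs as training progresses.

    `progress` in [0, 1] is the fraction of the SC phase completed; here the
    depth cap simply grows linearly with it (an assumed schedule).
    """
    targets = []
    for inst in instances:
        max_depth = len(inst.subgraphs_by_depth)
        depth_cap = max(1, round(progress * max_depth))  # always at least depth 1
        targets.append(inst.subgraphs_by_depth[depth_cap - 1])
    return targets


def instance_level_order(instances: List[AMRInstance]) -> List[AMRInstance]:
    """Instance-level curriculum (IC): present easy instances before hard ones."""
    return sorted(instances, key=lambda inst: inst.difficulty)


if __name__ == "__main__":
    demo = AMRInstance(
        sentence="The boy wants to go.",
        subgraphs_by_depth=[
            "(w / want-01)",
            "(w / want-01 :ARG0 (b / boy) :ARG1 (g / go-02 :ARG0 b))",
        ],
        difficulty=0.3,
    )
    # Early in training (progress=0.2) only the shallow sub-graph is the target;
    # by progress=1.0 the full-depth graph is used.
    print(structure_level_targets([demo], progress=0.2))
    print(structure_level_targets([demo], progress=1.0))
```

In this sketch SC and IC are independent ordering steps; how the two curricula are actually combined or switched during training would follow the full paper rather than this abstract.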
