Latent Tree Learning with Differentiable Parsers: Shift-Reduce Parsing and Chart Parsing

06/03/2018
by Jean Maillard, et al.

Latent tree learning models represent sentences by composing their word representations according to an induced parse tree, with the parser trained only on a downstream task. These models often outperform baselines which use (externally provided) syntax trees to drive the composition order. This work contributes (a) a new latent tree learning model based on shift-reduce parsing, which achieves competitive downstream performance and induces non-trivial trees, and (b) an analysis of the trees learned by our shift-reduce model and by a chart-based model.
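To make the shift-reduce idea concrete, here is a minimal sketch of how such a model might compose a sentence: at each step a scoring function chooses whether to SHIFT the next word representation onto a stack or REDUCE the top two stack items with a learned composition function, yielding both a sentence vector and an induced binary bracketing. This is not the authors' implementation; it uses greedy discrete decisions rather than the differentiable parser described in the paper, and all names (embed, compose, score_action) are hypothetical placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)
DIM = 8

def embed(word):
    # Placeholder word embedding: a deterministic random vector per word.
    seed = abs(hash(word)) % (2**32)
    return np.random.default_rng(seed).standard_normal(DIM)

def compose(left, right, W=rng.standard_normal((DIM, 2 * DIM))):
    # Hypothetical composition function (stand-in for e.g. a TreeLSTM cell).
    return np.tanh(W @ np.concatenate([left, right]))

def score_action(stack, buffer, w=rng.standard_normal(2 * DIM)):
    # Hypothetical scorer: positive score favours REDUCE, otherwise SHIFT.
    top = stack[-1][0] if stack else np.zeros(DIM)
    nxt = buffer[0][0] if buffer else np.zeros(DIM)
    return float(w @ np.concatenate([top, nxt]))

def shift_reduce_parse(words):
    buffer = [(embed(w), w) for w in words]
    stack = []
    while buffer or len(stack) > 1:
        can_shift = bool(buffer)
        can_reduce = len(stack) >= 2
        if can_shift and (not can_reduce or score_action(stack, buffer) <= 0):
            stack.append(buffer.pop(0))                    # SHIFT
        else:
            (rv, rt), (lv, lt) = stack.pop(), stack.pop()  # REDUCE top two
            stack.append((compose(lv, rv), (lt, rt)))
    vec, tree = stack[0]
    return vec, tree

vec, tree = shift_reduce_parse("the cat sat on the mat".split())
print(tree)       # induced binary bracketing as nested tuples
print(vec.shape)  # sentence representation, shape (DIM,)
```

In the latent tree learning setting, the parameters of the composition and scoring functions would be trained end-to-end on the downstream objective, so the bracketing is induced rather than supervised by treebank annotations.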
