Mimic and Conquer: Heterogeneous Tree Structure Distillation for Syntactic NLP

09/16/2020
by Hao Fei, et al.

Syntax has been shown to be useful for various NLP tasks, but existing work mostly encodes a single syntactic tree with one hierarchical neural network. In this paper, we investigate a simple and effective method, knowledge distillation, for integrating heterogeneous structure knowledge into a unified sequential LSTM encoder. Experimental results on four typical syntax-dependent tasks show that our method outperforms tree encoders by effectively integrating rich heterogeneous structural syntax while reducing error propagation, and also outperforms ensemble methods in both efficiency and accuracy.
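The page does not show the paper's exact training objective, but the general recipe it names, distilling several tree-structured teacher encoders into one sequential LSTM student, follows the standard knowledge-distillation pattern. Below is a minimal PyTorch sketch of that pattern under stated assumptions: the `LSTMStudent` class, the `distillation_loss` helper, and all hyperparameters (`temperature`, `alpha`) are illustrative names, not the authors' implementation, and the teacher logits are assumed to come from separately trained constituency/dependency tree encoders.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class LSTMStudent(nn.Module):
    """Unified sequential BiLSTM encoder acting as the distillation student
    (hypothetical stand-in for the paper's student architecture)."""
    def __init__(self, vocab_size, embed_dim, hidden_dim, num_labels):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim,
                            batch_first=True, bidirectional=True)
        self.classifier = nn.Linear(2 * hidden_dim, num_labels)

    def forward(self, token_ids):
        hidden, _ = self.lstm(self.embed(token_ids))
        return self.classifier(hidden)  # per-token label logits


def distillation_loss(student_logits, teacher_logits_list, gold_labels,
                      temperature=2.0, alpha=0.5):
    """Standard distillation objective: hard-label cross-entropy plus a
    soft-target KL term averaged over heterogeneous tree-encoder teachers,
    so structure knowledge from each teacher flows into one student."""
    num_labels = student_logits.size(-1)
    # Supervised loss against gold annotations.
    ce = F.cross_entropy(student_logits.view(-1, num_labels),
                         gold_labels.view(-1))
    # Soft-label loss: match the temperature-smoothed distribution of each
    # teacher, then average across teachers.
    log_p_student = F.log_softmax(student_logits / temperature, dim=-1)
    kl = sum(F.kl_div(log_p_student,
                      F.softmax(t / temperature, dim=-1),
                      reduction="batchmean")
             for t in teacher_logits_list) / len(teacher_logits_list)
    # temperature**2 rescaling keeps gradient magnitudes comparable.
    return alpha * ce + (1 - alpha) * (temperature ** 2) * kl
```

In this sketch the teachers are run (or their logits precomputed) on the same sentences as the student; only the student is updated, so at inference time a single fast LSTM replaces the slower tree encoders and any ensemble of them.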
