Do RNN and LSTM have Long Memory?

06/06/2020
by Jingyu Zhao, et al.

The LSTM network was proposed to overcome the difficulty of learning long-term dependencies, and it has achieved significant advances in applications. With its successes and drawbacks in mind, this paper raises the question: do RNN and LSTM have long memory? We answer it partially by proving that RNN and LSTM do not have long memory from a statistical perspective. A new definition of long memory networks is further introduced, which requires the network's gradient to decay at a hyperbolic rate. To verify our theory, we convert RNN and LSTM into long memory networks via a minimal modification, and their superiority in modeling long-term dependencies is illustrated on various datasets.
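The distinction between short and long memory hinges on how fast past contributions fade. The sketch below is purely illustrative, not the authors' construction: it contrasts a geometrically (exponentially) decaying influence, the shape typically associated with RNN/LSTM gradients, with a hyperbolically (polynomially) decaying one, the rate the paper's long memory definition requires. The decay rate `rho` and the memory parameter `d` are assumed values chosen for illustration.

```python
import numpy as np

def exponential_decay(lags, rho=0.9):
    # Influence of a lag-k input shrinks geometrically, ~ rho**k (short memory).
    return rho ** lags

def hyperbolic_decay(lags, d=0.4):
    # Influence shrinks polynomially, ~ (k + 1)**(-d) (long memory).
    return (lags + 1.0) ** (-d)

lags = np.arange(0, 200)
exp_w = exponential_decay(lags)
hyp_w = hyperbolic_decay(lags)

# Mass retained beyond lag 50: the hyperbolic curve keeps far more weight on
# the distant past, which is what "long memory" means statistically.
print("exponential tail mass:", exp_w[50:].sum())
print("hyperbolic  tail mass:", hyp_w[50:].sum())
```

Running this shows the exponential tail mass is essentially zero while the hyperbolic tail remains substantial, which is the intuition behind requiring hyperbolic gradient decay in a long memory network.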
