k-FFNN: A priori knowledge infused Feed-forward Neural Networks

04/24/2017
by   Sri Harsha Dumpala, et al.

Recurrent neural networks (RNN) are used extensively over feed-forward neural networks (FFNN) because of their inherent capability to capture temporal relationships in sequential data such as speech. This aspect of RNNs is advantageous especially when there is no a priori knowledge about the temporal correlations within the data. However, RNNs require large amounts of data to learn these temporal correlations, limiting their advantage in low-resource scenarios. It is not immediately clear (a) how a priori temporal knowledge can be used in an FFNN architecture and (b) how an FFNN performs when provided with this knowledge about temporal correlations (assuming it is available) during training. The objective of this paper is to explore k-FFNN, namely an FFNN architecture that can incorporate a priori knowledge of the temporal relationships within the data sequence during training, and to compare the performance of k-FFNN with RNN in a low-resource scenario. We evaluate the performance of k-FFNN and RNN through extensive experimentation on MediaEval 2016 audio data (the "Emotional Impact of Movies" task). Experimental results show that the performance of k-FFNN is comparable to RNN, and in some scenarios k-FFNN performs better than RNN when temporal knowledge is injected into the FFNN architecture. The main contributions of this paper are (a) fusing a priori knowledge into an FFNN architecture to construct a k-FFNN and (b) analyzing the performance of k-FFNN with respect to RNN for different sizes of training data.
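The abstract does not specify how the a priori temporal knowledge is fused into the FFNN. As a minimal sketch of the general idea, one plausible mechanism (hypothetical; the function name, the weighting scheme, and the use of a normalized weight vector are assumptions, not the paper's actual method) is to collapse a variable-length frame sequence into a fixed-size FFNN input, weighting each frame by its assumed temporal relevance:

```python
import numpy as np

def infuse_temporal_knowledge(frames, temporal_weights):
    """Collapse a variable-length sequence of frame-level features into a
    fixed-size vector suitable as FFNN input.

    `temporal_weights` encodes the a priori knowledge about which time
    steps matter most (hypothetical fusion scheme, for illustration only).
    """
    frames = np.asarray(frames, dtype=float)          # shape: (T, D)
    w = np.asarray(temporal_weights, dtype=float)     # shape: (T,)
    w = w / w.sum()                                   # normalise to a distribution
    # Weighted temporal pooling: each frame contributes according to
    # its a priori importance, yielding a single D-dim input vector.
    return (frames * w[:, None]).sum(axis=0)

# Toy example: 4 frames of 3-dim features; later frames weighted higher,
# e.g. if prior knowledge says emotion cues concentrate near the end.
frames = np.arange(12, dtype=float).reshape(4, 3)
weights = [1.0, 1.0, 2.0, 4.0]
x = infuse_temporal_knowledge(frames, weights)
print(x.shape)  # (3,)
```

A plain FFNN baseline would instead pool the frames uniformly (all weights equal), so the difference between the two inputs isolates the effect of the injected temporal knowledge.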


