From complex to simple: hierarchical free-energy landscape renormalized in deep neural networks

10/22/2019
by Hajime Yoshino, et al.

We develop a statistical mechanical approach based on the replica method to study the solution space of deep neural networks. Specifically, we analyze the configuration space of the synaptic weights in a simple feed-forward perceptron network, within a Gaussian approximation, for two scenarios: a setting with random inputs/outputs and a teacher-student setting. As the strength of the constraints is increased, i.e., as the number of imposed patterns grows, successive second-order glass transitions (random inputs/outputs) or second-order crystalline transitions (teacher-student setting) take place layer by layer, starting next to the input/output boundaries and moving deeper into the bulk. For a deep enough network, the central part remains in the liquid phase. We argue that in systems of finite width, a weak bias field remains in the central part and plays the role of a symmetry-breaking field that connects the opposite sides of the system. In the setting with random inputs/outputs, the successive glass transitions bring about a hierarchical free-energy landscape with ultrametricity, which evolves in space: it is most complex close to the boundaries but becomes renormalized into a progressively simpler one in deeper layers. These observations provide clues for understanding why deep neural networks operate efficiently. Finally, we present the results of a set of numerical simulations that examine the theoretical predictions.
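As a rough illustration of the setup the abstract describes, the following minimal NumPy sketch builds a deep feed-forward perceptron in the teacher-student setting and tracks the layer-resolved teacher-student overlap, the natural order parameter for the layer-by-layer transitions. The concrete choices here (sign activations, Gaussian weights, the chosen depth and width, and the `layer_forward`/`make_network` helpers) are illustrative assumptions, not the paper's exact model.

```python
import numpy as np

rng = np.random.default_rng(0)


def layer_forward(x, W):
    # One perceptron layer: sign activation of the fan-in-normalized
    # preactivation. Sign activations are an illustrative choice here.
    return np.sign(W @ x / np.sqrt(W.shape[1]))


def make_network(depth, width):
    # Independent Gaussian weight matrices, one per layer.
    return [rng.standard_normal((width, width)) for _ in range(depth)]


def forward(net, x):
    for W in net:
        x = layer_forward(x, W)
    return x


depth, width, n_patterns = 10, 32, 200

# Teacher-student setting: a fixed teacher network generates the
# input/output pairs that would constrain the student's weights.
teacher = make_network(depth, width)
inputs = np.sign(rng.standard_normal((n_patterns, width)))
outputs = np.array([forward(teacher, x) for x in inputs])

# Layer-resolved teacher-student overlap. A properly constrained
# student would be sampled from the solution space (e.g. by Monte
# Carlo); the untrained random student used here simply decorrelates
# with depth, giving the liquid-phase baseline.
student = make_network(depth, width)
x_t = x_s = inputs[0]
for ell, (Wt, Ws) in enumerate(zip(teacher, student), start=1):
    x_t, x_s = layer_forward(x_t, Wt), layer_forward(x_s, Ws)
    print(f"layer {ell:2d}: overlap = {np.mean(x_t * x_s):+.3f}")
```

In a fuller experiment one would sample the student from the space of networks that reproduce the teacher's outputs on all patterns; as the number of patterns grows, the layer-resolved overlap should then rise first in the layers nearest the boundaries, which is the layer-by-layer crystallization scenario described above.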
