Bayesian Optimisation for a Biologically Inspired Population Neural Network

04/13/2021
by Mahak Kothari, et al.

We used Bayesian Optimisation (BO) to find hyper-parameters in an existing biologically plausible population neural network. The optimal 8-dimensional hyper-parameter combination should be such that the network dynamics simulate the resting-state alpha rhythm (8-13 Hz rhythms in brain signals). Each combination of these eight hyper-parameters constitutes a 'datapoint' in the parameter space. The best combination of these parameters leads to the neural network's output power spectral peak being constrained within the alpha band. Further, constraints were introduced into the BO algorithm based on qualitative observation of the network output time series, so that high-amplitude pseudo-periodic oscillations are removed. After successful implementation for the alpha band, we further optimised the network to oscillate within the theta (4-8 Hz) and beta (13-30 Hz) bands. The changing rhythms in the model can now be studied using the identified optimal hyper-parameters for the respective frequency bands. We had previously tuned parameters in this network by trial and error; however, due to time and computational constraints, we could not vary more than three parameters at once. The approach detailed here allows an automatic hyper-parameter search, producing reliable parameter sets for the network.
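The abstract does not give the objective function, but the idea of scoring a candidate hyper-parameter set by whether the network output's spectral peak falls inside a target band can be sketched as follows. This is a minimal, hypothetical illustration: the naive DFT-based peak finder and the `alpha_band_cost` name are assumptions, not the authors' code, and a real pipeline would estimate the power spectrum from the simulated network time series (e.g. via Welch's method) and hand this cost to a BO library.

```python
import math

def dominant_frequency(signal, fs):
    """Frequency (Hz) of the largest DFT magnitude.

    Naive O(n^2) DFT using only the standard library; fine for a sketch,
    too slow for long recordings (use an FFT in practice).
    """
    n = len(signal)
    best_k, best_mag = 0, 0.0
    for k in range(1, n // 2):
        re = sum(signal[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
        im = -sum(signal[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
        mag = re * re + im * im
        if mag > best_mag:
            best_k, best_mag = k, mag
    return best_k * fs / n

def alpha_band_cost(signal, fs, band=(8.0, 13.0)):
    """Cost for the optimiser: distance of the spectral peak from the
    target band (0 when the peak already lies inside 8-13 Hz).
    The same function with band=(4, 8) or (13, 30) would target theta/beta."""
    f = dominant_frequency(signal, fs)
    lo, hi = band
    return max(lo - f, 0.0, f - hi)

# Usage: a pure 10 Hz test signal (one second at fs = 128) sits inside
# the alpha band, so its cost is zero.
fs = 128
sig = [math.sin(2 * math.pi * 10 * t / fs) for t in range(fs)]
print(alpha_band_cost(sig, fs))  # 0.0
```

A BO loop would then propose an 8-dimensional hyper-parameter vector, simulate the network, evaluate this cost, and update its surrogate model; the paper's additional qualitative constraint (rejecting high-amplitude pseudo-periodic traces) could be expressed as a penalty added to the same cost.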

