
Insect cyborgs: Biological feature generators improve machine learning accuracy on limited data

by Charles B. Delahunt, et al.
University of Washington

Despite many successes, machine learning (ML) methods such as neural nets often struggle to learn from small training sets. In contrast, biological neural nets (BNNs) excel at fast learning. We can thus look to BNNs for tools to improve the performance of ML methods in this low-data regime. The insect olfactory network, though simple, can learn new odors very rapidly. Its two key structures are a layer with competitive inhibition (the Antennal Lobe, AL), followed by a high-dimensional, sparse, plastic layer (the Mushroom Body, MB). This AL-MB network can rapidly learn not only odors but also handwritten digits, better in fact than standard ML methods in the few-shot regime. In this work, we deploy the AL-MB network as an automatic feature generator, using its Readout Neurons as additional features for standard ML classifiers. We hypothesize that the AL-MB structure has a strong intrinsic clustering ability, and that its Readout Neurons, used as input features, will boost the performance of ML methods. We find that these "insect cyborgs", i.e. classifiers that are part-moth and part-ML method, deliver significantly better performance than baseline ML methods alone on a generic (non-spatial) 85-feature, 10-class task derived from the MNIST dataset. Accuracy improves given as few as 6 training samples per class, and the moth-generated features increase ML accuracy even when the ML method's baseline accuracy already exceeds the AL-MB's own limited capacity. The two structures in the AL-MB, a competitive inhibition layer and a high-dimensional sparse layer with Hebbian plasticity, are novel in the context of artificial NNs but endemic in BNNs. We believe they can be deployed either prepended as feature generators or inserted as layers into deep NNs, to potentially improve ML performance.
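The pipeline the abstract describes (competitive-inhibition layer, then a sparse high-dimensional projection with Hebbian readouts, whose outputs are appended to the raw features for a standard classifier) can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the class name `MothFeatureGenerator`, the softmax-style inhibition, the top-k sparsification, and all parameter values are assumptions chosen for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)

def competitive_inhibition(x):
    # Softmax-style divisive normalization: each unit suppresses the
    # others, a crude stand-in for the Antennal Lobe's inhibition.
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

class MothFeatureGenerator:
    """Hypothetical AL-MB-style feature generator (illustrative only).

    Input -> competitive-inhibition layer (AL analogue)
          -> high-dimensional sparse layer (MB analogue)
          -> readout neurons trained with a simple Hebbian rule.
    """
    def __init__(self, n_in, n_mb=2000, n_read=10, sparsity=0.05, seed=0):
        g = np.random.default_rng(seed)
        self.proj = g.standard_normal((n_in, n_mb)) / np.sqrt(n_in)
        self.read = np.zeros((n_mb, n_read))
        self.sparsity = sparsity

    def _mb(self, X):
        a = competitive_inhibition(X) @ self.proj
        # Keep only the top-k most active MB units (sparse coding).
        k = max(1, int(self.sparsity * a.shape[1]))
        thresh = np.partition(a, -k, axis=1)[:, -k][:, None]
        return np.where(a >= thresh, a, 0.0)

    def fit(self, X, y):
        S = self._mb(X)
        # Hebbian update: strengthen weights between active MB units
        # and the readout neuron of the sample's true class.
        for c in range(self.read.shape[1]):
            self.read[:, c] += S[y == c].sum(axis=0)
        self.read /= np.linalg.norm(self.read, axis=0, keepdims=True) + 1e-9
        return self

    def transform(self, X):
        # Readout-neuron responses, to be appended to the raw features.
        return self._mb(X) @ self.read
```

A "cyborg" classifier would then train any standard ML method on the concatenation `np.hstack([X, gen.transform(X)])`, matching the paper's prepended-feature-generator deployment.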

