The Next Big Thing(s) in Unsupervised Machine Learning: Five Lessons from Infant Learning

by Lorijn Zaadnoordijk, et al.

After a surge in the popularity of supervised Deep Learning, the desire to reduce dependence on curated, labelled data sets and to leverage the vast quantities of unlabelled data available has recently triggered renewed interest in unsupervised learning algorithms. Despite significantly improved performance due to approaches such as the identification of disentangled latent representations, contrastive learning, and clustering optimisations, the performance of unsupervised machine learning still falls short of its hypothesised potential. Machine learning has previously taken inspiration from neuroscience and cognitive science with great success. However, this inspiration has mostly been drawn from adult learners with access to labels and a vast amount of prior knowledge. In order to push unsupervised machine learning forward, we argue that the developmental science of infant cognition might hold the key to unlocking the next generation of unsupervised learning approaches. Conceptually, human infant learning is the closest biological parallel to artificial unsupervised learning, as infants too must learn useful representations from unlabelled data. In contrast to machine learning, these new representations are learned rapidly and from relatively few examples. Moreover, infants learn robust representations that can be used flexibly and efficiently in a number of different tasks and contexts. We identify five crucial factors enabling infants' quality and speed of learning, assess the extent to which these have already been exploited in machine learning, and propose how further adoption of these factors can give rise to previously unseen performance levels in unsupervised learning.
