Convolutional Neural Networks Trained to Identify Words Provide a Good Account of Visual Form Priming Effects

02/08/2023
by   Dong Yin, et al.

A wide variety of orthographic coding schemes and models of visual word identification have been developed to account for masked priming data that provide a measure of orthographic similarity between letter strings. These models tend to include hand-coded orthographic representations with single-unit coding for specific forms of knowledge (e.g., units coding for a letter in a given position or a letter sequence). Here we assess how well a range of these coding schemes and models account for the pattern of form priming effects taken from the Form Priming Project, and compare these findings to results observed with 11 standard deep neural network models (DNNs) developed in computer science. We find that deep convolutional networks perform as well as or better than the coding schemes and word recognition models, whereas transformer networks performed less well. The success of convolutional networks is remarkable because their architectures were not developed to support word recognition (they were designed to perform well on object recognition) and they classify pixel images of words (rather than artificial encodings of letter strings). The findings add to the recent work of Hannagan et al. (2021) suggesting that convolutional networks may capture key aspects of visual word identification.


