Efficient Eye Typing with 9-direction Gaze Estimation

07/03/2017
by Chi Zhang, et al.

Vision-based text entry systems aim to help disabled people communicate through text using eye movement. Most previous methods employ an existing eye tracker to predict gaze direction and design an input method on top of it. With these methods, however, eye-tracking quality is easily affected by environmental factors, and calibration is time-consuming. Our paper presents a novel and efficient gaze-based text input method that is low-cost and robust. Users type words by looking at an on-screen keyboard and blinking. Rather than estimating gaze angles directly to track the eyes, we introduce a method that divides human gaze into nine directions, which effectively improves the accuracy of making selections by gaze and blinks. We build a Convolutional Neural Network (CNN) model for 9-direction gaze estimation. On top of the 9-direction gaze estimate, we use the nine-key T9 input method, which was widely used on candy-bar phones; these phones were popular worldwide for decades and have cultivated strong user habits and mature language models. To train a robust gaze estimator, we created a large-scale dataset of eye images collected from 25 people. Our experimental results show that the CNN model accurately estimates different people's gaze under various lighting conditions and across different capture devices. With disabled users' needs in mind, we removed the complex calibration process. The input method can run in on-screen mode and in a portable off-screen mode. The datasets used in our experiments are made available to the community to allow further experimentation.
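The pairing of 9-direction gaze estimation with T9 can be illustrated with a small decoding sketch: each of the nine gaze directions selects one key of a T9-style keypad, words are encoded as digit sequences, and a dictionary lookup resolves candidates. The direction names, the gaze-to-key mapping, and the tiny vocabulary below are illustrative assumptions, not the paper's actual configuration.

```python
# Sketch of T9-style decoding driven by 9-direction gaze estimates.
# Direction names and the gaze-to-key layout are assumed for illustration.

# Map each of the nine gaze directions to a key on a T9-style keypad.
GAZE_TO_KEY = {
    "up-left": "1",   "up": "2",     "up-right": "3",
    "left": "4",      "center": "5", "right": "6",
    "down-left": "7", "down": "8",   "down-right": "9",
}

# Standard T9 letter groups (keys 2-9 carry letters).
KEY_LETTERS = {
    "2": "abc", "3": "def", "4": "ghi", "5": "jkl",
    "6": "mno", "7": "pqrs", "8": "tuv", "9": "wxyz",
}

def word_to_keys(word):
    """Encode a word as its T9 digit sequence, e.g. 'cat' -> '228'."""
    letter_to_key = {ch: k for k, letters in KEY_LETTERS.items() for ch in letters}
    return "".join(letter_to_key[ch] for ch in word.lower())

def decode(gaze_sequence, vocabulary):
    """Return vocabulary words whose T9 code matches the gazed key sequence."""
    keys = "".join(GAZE_TO_KEY[d] for d in gaze_sequence)
    return [w for w in vocabulary if word_to_keys(w) == keys]

vocab = ["cat", "act", "bat", "dog"]
print(decode(["up", "up", "down"], vocab))  # keys "228" -> ['cat', 'act', 'bat']
```

The ambiguous result for "228" shows why a language model matters: several words share one key sequence, and the strong T9 language models inherited from bar phones are what rank the candidates.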


Related research

04/20/2019 · A Differential Approach for Gaze Estimation
Non-invasive gaze estimation methods usually regress gaze directions dir...

05/02/2020 · An Intelligent and Low-cost Eye-tracking System for Motorized Wheelchair Control
In the 34 developed and 156 developing countries, there are about 132 mi...

04/12/2002 · Fast Hands-free Writing by Gaze Direction
We describe a method for text entry based on inverse arithmetic coding t...

07/16/2021 · A New Robust Multivariate Mode Estimator for Eye-tracking Calibration
We propose in this work a new method for estimating the main mode of mul...

11/05/2022 · HREyes: Design, Development, and Evaluation of a Novel Method for AUVs to Communicate Information and Gaze Direction
We present the design, development, and evaluation of HREyes: biomimetic...

10/25/2019 · Prediction of gaze direction using Convolutional Neural Networks for Autism diagnosis
Autism is a developmental disorder that affects social interaction and c...

12/30/2017 · H4-Writer: A Text Entry Method Designed For Gaze Controlled Environment with Low KSPC and Spatial Footprint
This paper presents a new text entry technique, namely H4-Writer, design...
