Attention Monitoring and Hazard Assessment with Bio-Sensing and Vision: Empirical Analysis Utilizing CNNs on the KITTI Dataset

05/01/2019
by Siddharth et al.

Assessing the driver's attention and detecting hazardous and non-hazardous events during a drive are critical for driver safety. Attention monitoring in driving scenarios has mostly been carried out using the vision (camera-based) modality, by tracking the driver's gaze and facial expressions. Only recently have bio-sensing modalities such as the electroencephalogram (EEG) been explored. Another open problem that has not yet been studied sufficiently in this paradigm is the detection of specific events during driving, hazardous and non-hazardous, that affect the driver's mental and physiological states. A further challenge in evaluating multi-modal sensory applications is the absence of very large-scale EEG data, owing to the practical limitations of using EEG in the real world. In this paper, we use both of the above sensor modalities and compare them on two tasks: assessing the driver's attention and distinguishing hazardous from non-hazardous driving events. We collect data from twelve subjects and show how, in the absence of very large-scale datasets, pre-trained deep convolutional networks can still be used to extract meaningful features from both modalities. We evaluate our platform on the publicly available KITTI dataset, which allows comparison with previous studies. Finally, we show that the results presented in this paper surpass the previous benchmarks for the above driver-awareness applications.
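To make the feature-extraction idea concrete, the sketch below shows how an ImageNet pre-trained CNN can serve as a fixed feature extractor for image-like inputs (driving-scene frames or EEG-derived spectrogram images), with a lightweight classifier on top for hazardous vs. non-hazardous events. The VGG-16 backbone, linear SVM, preprocessing, and file names are illustrative assumptions, not the paper's exact pipeline.

```python
# Sketch: pre-trained CNN features + a small classifier when labeled data is scarce.
# Assumptions (not from the paper): VGG-16 backbone, 4096-d features, linear SVM.

import torch
import torchvision.models as models
import torchvision.transforms as T
from PIL import Image
from sklearn.svm import LinearSVC

# ImageNet pre-trained VGG-16; drop the final classification layer so the
# network outputs 4096-dimensional penultimate-layer features.
backbone = models.vgg16(weights=models.VGG16_Weights.IMAGENET1K_V1)
backbone.classifier = torch.nn.Sequential(*list(backbone.classifier.children())[:-1])
backbone.eval()

preprocess = T.Compose([
    T.Resize((224, 224)),
    T.ToTensor(),
    T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

@torch.no_grad()
def extract_features(image_paths):
    """Return one 4096-d feature vector per image (scene frame or EEG spectrogram)."""
    feats = []
    for path in image_paths:
        img = preprocess(Image.open(path).convert("RGB")).unsqueeze(0)
        feats.append(backbone(img).squeeze(0).numpy())
    return feats

# Hypothetical placeholder files and labels (1 = hazardous, 0 = non-hazardous);
# in practice these would be KITTI frames or spectrogram images with event labels.
train_paths = ["event_hazard.png", "event_normal.png"]
train_labels = [1, 0]

clf = LinearSVC()
clf.fit(extract_features(train_paths), train_labels)
```

Freezing the backbone and training only a shallow classifier is a standard transfer-learning recipe when only a small labeled set, such as recordings from twelve subjects, is available.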


Related research

09/30/2019
On Assessing Driver Awareness of Situational Criticalities: Multi-modal Bio-sensing and Vision-based Analysis, Evaluations and Insights
Automobiles for our roadways are increasingly utilizing advanced driver ...

08/27/2020
DMD: A Large-Scale Multi-Modal Driver Monitoring Dataset for Attention and Alertness Analysis
Vision is the richest and most cost-effective technology for Driver Moni...

07/19/2019
Detection of Real-world Driving-induced Affective State Using Physiological Signals and Multi-view Multi-task Machine Learning
Affective states have a critical role in driving performance and safety....

04/25/2018
Multi-modal Approach for Affective Computing
Throughout the past decade, many studies have classified human emotions ...

05/04/2023
Neuromorphic Sensing for Yawn Detection in Driver Drowsiness
Driver monitoring systems (DMS) are a key component of vehicular safety ...

05/16/2019
Utilizing Deep Learning Towards Multi-modal Bio-sensing and Vision-based Affective Computing
In recent years, the use of bio-sensing signals such as electroencephalo...

03/02/2022
Robust Seatbelt Detection and Usage Recognition for Driver Monitoring Systems
Wearing a seatbelt appropriately while driving can reduce serious crash-...
