Detecting Events of Daily Living Using Multimodal Data
Events are fundamental for understanding how people experience their lives. Automatically recording all the events of daily life, however, is challenging. Interpreting multimedia signals makes it possible to recognize events of daily living and extract their attributes with minimal manual effort. In this paper, we consider the problem of recognizing daily events from commonly available multimedia data obtained from a smartphone and a wearable device. We develop an unobtrusive approach that extracts latent semantic information from these data, and build on it an approach to daily event recognition based on semantic context enrichment. We represent the enrichment process as an event knowledge graph that turns a low-level daily activity into a semantically enriched daily event. As a concrete example of this enrichment, we conduct an experiment on eating activity, arguably one of the most complex daily events, using 14 months of data from three users. To unobtrusively compensate for missing semantic information, we propose a new food recognition/classification method that relies only on the body's physical response to food consumption. Experimental results indicate that our approach can automatically abstract life experience. These daily events can then be used to build a personal model that captures how a person reacts to different stimuli under specific conditions.
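To make the enrichment idea concrete, the following is a minimal sketch (not the authors' implementation) of how a low-level activity label could be turned into a semantically enriched event node in a small knowledge graph. All class names, relation names, and example values here are hypothetical illustrations.

```python
from dataclasses import dataclass, field


@dataclass
class EventNode:
    """A node in an event knowledge graph: a low-level activity
    plus named semantic-context relations."""
    activity: str                       # low-level label, e.g. from a wearable
    context: dict = field(default_factory=dict)

    def enrich(self, relation: str, value: str) -> None:
        """Attach one piece of semantic context via a named relation."""
        self.context[relation] = value


def build_eating_event(sensor_label: str, place: str, meal: str) -> EventNode:
    # Start from the low-level activity recognized from multimodal signals,
    # then enrich it with attributes inferred from other data sources
    # (location, time of day, recognized food class, etc.).
    event = EventNode(activity=sensor_label)
    event.enrich("occursAt", place)
    event.enrich("hasMealType", meal)
    return event


if __name__ == "__main__":
    # Example: an "eating" activity detected from wearable data is
    # enriched into a richer daily-event description.
    event = build_eating_event("eating", "home/kitchen", "lunch")
    print(event.activity, event.context)
```

In the paper's terms, the enriched node corresponds to a daily event, while the bare `activity` label corresponds to the low-level daily activity it was derived from.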