Depressive, Drug Abusive, or Informative: Knowledge-aware Study of News Exposure during COVID-19 Outbreak

07/30/2020
by   Amanuel Alambo, et al.

The COVID-19 pandemic is having a serious adverse impact on the lives of people across the world. COVID-19 has exacerbated community-wide depression and has led to increased drug abuse brought about by the isolation of individuals under lockdown. Further, apart from providing informative content to the public, the incessant media coverage of the COVID-19 crisis through news broadcasts, published articles, and information shared on social media has had an undesired snowballing effect on stress levels (further elevating depression and drug use) due to an uncertain future. In this position paper, we propose a novel framework for assessing the spatio-temporal-thematic progression of depression, drug abuse, and informativeness in the underlying news content across the different states in the United States. Our framework employs an attention-based transfer learning technique to apply knowledge learned on a social media domain to a target domain of media exposure. To extract news articles related to COVID-19 communications from streaming news content on the web, we use neural semantic parsing and background knowledge bases in a sequence of steps we call semantic filtering. We achieve promising preliminary results with three variations of the Bidirectional Encoder Representations from Transformers (BERT) model. We compare our findings against a report from Mental Health America, and the results show that our fine-tuned BERT models perform better than vanilla BERT. Our study can benefit epidemiologists by offering actionable insights on COVID-19 and its regional impact. Further, our solution can be integrated into end-user applications to tailor news for users based on their emotional tone, measured on the scale of depressiveness, drug abusiveness, and informativeness.
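To make the semantic-filtering step concrete, the following is a minimal sketch of the idea. The paper's actual pipeline uses neural semantic parsing and background knowledge bases; here a small hand-made COVID-19 concept lexicon (an assumption, not the authors' knowledge base) stands in for those components, and `semantic_filter` is a hypothetical helper name.

```python
# Hedged sketch: keep only news articles that mention COVID-19-related
# concepts. The real system performs neural semantic parsing against
# background knowledge bases; this toy lexicon merely illustrates the
# filtering behavior.

COVID_LEXICON = {
    "covid-19", "coronavirus", "sars-cov-2",
    "pandemic", "lockdown", "quarantine",
}

def semantic_filter(articles, lexicon=COVID_LEXICON, min_hits=1):
    """Keep articles whose text matches at least `min_hits` lexicon concepts."""
    kept = []
    for text in articles:
        # Crude normalization: lowercase and strip common punctuation.
        tokens = {tok.strip(".,;:!?()").lower() for tok in text.split()}
        if len(tokens & lexicon) >= min_hits:
            kept.append(text)
    return kept

articles = [
    "State extends lockdown as coronavirus cases rise.",
    "Local team wins championship after dramatic final.",
]
print(semantic_filter(articles))  # keeps only the first article
```

In the full framework, the lexicon lookup would be replaced by semantic parses scored against knowledge-base concepts, so that articles are matched on meaning rather than surface keywords.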
