Human Interaction with Recommendation Systems: On Bias and Exploration

03/01/2017
by Sven Schmit, et al.

Recommendation systems rely on historical user data to provide suggestions. We propose an explicit and simple model for the interaction between users and the recommendations provided by a platform, and relate this model to the multi-armed bandit literature. First, we show that this interaction leads to a bias in naive estimators due to selection effects, and that this bias leads to suboptimal outcomes, which we quantify in terms of linear regret. We end the first part by discussing ways to obtain unbiased estimates. The second part of this work considers exploration of alternatives. We show that although agents are myopic, their heterogeneous preferences ensure that recommendation systems 'learn' about all alternatives without explicitly incentivizing exploration. This work provides new and practical insights relevant to a wide range of systems designed to help users make better decisions.
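The selection effect described above can be illustrated with a minimal simulation. This is a hedged sketch, not the paper's exact model: item qualities, user preferences, and the greedy recommendation rule below are illustrative assumptions. Each user picks the item maximizing the platform's current estimate plus a private preference; because the platform observes rewards only for chosen items, and items are chosen precisely when the private preference is high, the naive empirical-mean estimator is biased upward.

```python
import numpy as np

rng = np.random.default_rng(0)
K, T = 5, 20000                       # items and users (illustrative sizes)
q = rng.normal(0.0, 1.0, K)           # true, unknown item qualities
sums = np.zeros(K)                    # running sums of observed rewards
counts = np.zeros(K)                  # number of times each item was chosen

for t in range(T):
    # naive estimator: empirical mean of observed rewards (0 if unseen)
    qhat = np.where(counts > 0, sums / np.maximum(counts, 1), 0.0)
    theta = rng.normal(0.0, 1.0, K)   # this user's private preferences
    # myopic user: combine platform estimate with private preference
    k = int(np.argmax(qhat + theta))
    # platform observes the realized value, which includes the preference
    r = q[k] + theta[k]
    sums[k] += r
    counts[k] += 1

chosen = counts > 0
bias = sums[chosen] / counts[chosen] - q[chosen]
print("mean bias of naive estimator:", bias.mean())  # positive: overestimation
```

Note that the same heterogeneity in `theta` that causes the bias also drives free exploration: even items with low current estimates get chosen by users whose private preference for them is large, echoing the second part of the abstract.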
