Quantifying Voter Biases in Online Platforms: An Instrumental Variable Approach

by Himel Dev et al.

In content-based online platforms, aggregate user feedback (e.g., the sum of votes) is commonly used as the "gold standard" for measuring content quality. Use of vote aggregates, however, is at odds with the existing empirical literature, which suggests that voters are susceptible to different biases – reputation (e.g., of the poster), social influence (e.g., votes thus far), and position (e.g., answer position). Our goal is to quantify, in an observational setting, the degree of these biases in online platforms. Specifically, what are the causal effects of different impression signals – such as the reputation of the contributing user, the aggregate vote thus far, and the position of content – on a participant's vote on content? We adopt an instrumental variable (IV) framework to answer this question. We identify a set of candidate instruments, carefully analyze their validity, and then use the valid instruments to reveal the effects of the impression signals on votes. Our empirical study using log data from Stack Exchange websites shows that the bias estimates from our IV approach differ from the bias estimates from the ordinary least squares (OLS) method. In particular, OLS underestimates reputation bias (1.6–2.2x for gold badges) and position bias (up to 1.9x for the initial position) and overestimates social influence bias (1.8–2.3x for initial votes). The implications of our work include: redesigning user interfaces to avoid voter biases; changing platform policies to mitigate voter biases; and detecting other forms of biases in online platforms.
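The divergence between OLS and IV estimates arises because impression signals are endogenous: an unobserved confounder (e.g., latent content quality) drives both the signal and the vote, so regressing votes on signals directly conflates bias with quality. A minimal sketch on simulated data illustrates the idea; the variables, the data-generating process, and the single-instrument Wald-ratio estimator below are illustrative assumptions, not the paper's actual Stack Exchange setup:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
beta = 0.5  # true causal effect of the impression signal on the vote (assumed)

# Hypothetical data-generating process:
# u — unobserved confounder (e.g., latent content quality),
#     affecting both the impression signal x and the vote outcome y
# z — instrument: correlated with x, but affects y only through x
u = rng.normal(size=n)
z = rng.normal(size=n)
x = z + u + rng.normal(size=n)          # endogenous impression signal
y = beta * x + u + rng.normal(size=n)   # vote outcome

# OLS slope cov(x, y) / var(x): biased upward, since x and the error share u.
ols = np.cov(x, y)[0, 1] / np.var(x)

# IV estimate (Wald ratio, equivalent to 2SLS with one instrument):
# cov(z, y) / cov(z, x) recovers beta because z is independent of u.
iv = np.cov(z, y)[0, 1] / np.cov(z, x)[0, 1]

print(f"true effect: {beta}, OLS: {ols:.3f}, IV: {iv:.3f}")
```

With this setup the OLS slope lands near 0.83 while the IV estimate stays near the true 0.5, mirroring (in stylized form) how naive regression can over- or understate a voter bias.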




