2013 Uncertainty in Online Experiments with Dependent Data: An Evaluation of Bootstrap Methods
- (Bakshy & Eckles, 2013) ⇒ Eytan Bakshy and Dean Eckles. (2013). “Uncertainty in Online Experiments with Dependent Data: An Evaluation of Bootstrap Methods.” In: Proceedings of the 19th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining. ISBN: 978-1-4503-2174-7. doi:10.1145/2487575.2488218
Subject Headings:
Notes
Cited By
- http://scholar.google.com/scholar?q=%222013%22+Uncertainty+in+Online+Experiments+with+Dependent+Data%3A+An+Evaluation+of+Bootstrap+Methods
- http://dl.acm.org/citation.cfm?id=2487575.2488218&preflayout=flat#citedby
Quotes
Author Keywords
- A/A tests; A/B testing; bootstrapping; field experiments; random effects; statistical computing; statistical inference; user-item data
Abstract
Many online experiments exhibit dependence between users and items. For example, in online advertising, observations that have a user or an ad in common are likely to be associated. Because of this, even in experiments involving millions of subjects, the difference in mean outcomes between control and treatment conditions can have substantial variance. Previous theoretical and simulation results demonstrate that not accounting for this kind of dependence structure can result in confidence intervals that are too narrow, leading to inaccurate hypothesis tests.
We develop a framework for understanding how dependence affects uncertainty in user-item experiments and evaluate how bootstrap methods that account for differing levels of dependence perform in practice. We use three real datasets describing user behaviors on Facebook - user responses to ads, search results, and News Feed stories - to generate data for synthetic experiments in which there is no effect of the treatment on average by design. We then estimate empirical Type I error rates for each bootstrap method. Accounting for dependence within a single type of unit (i.e., within-user dependence) is often sufficient to get reasonable error rates. But when experiments have effects, as one might expect in the field, accounting for multiple units with a multiway bootstrap can be necessary to get close to the advertised Type I error rates. This work provides guidance to practitioners evaluating large-scale experiments, and highlights the importance of analysis of inferential methods for complex dependence structures common to online experiments.
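The multiway bootstrap referred to in the abstract can be sketched briefly. The snippet below is an illustrative reconstruction, not the authors' code: it assumes a simple user-and-item random-effects model to generate a synthetic A/A experiment, forms percentile intervals for the treatment-control difference in means by independently reweighting users and items with Poisson(1) draws (a common approximation to resampling each unit type), and estimates the empirical Type I error rate as the fraction of null experiments whose interval excludes zero. All function names, parameters, and distributional choices are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_aa(n_users=500, n_items=200, n_obs=10_000,
                user_sd=1.0, item_sd=1.0, noise_sd=1.0):
    """Synthetic A/A user-item experiment: user and item random effects,
    user-level randomization, and (by design) no treatment effect."""
    users = rng.integers(0, n_users, n_obs)
    items = rng.integers(0, n_items, n_obs)
    condition = rng.integers(0, 2, n_users)            # assign whole users to arms
    y = (rng.normal(0.0, user_sd, n_users)[users] +    # within-user dependence
         rng.normal(0.0, item_sd, n_items)[items] +    # within-item dependence
         rng.normal(0.0, noise_sd, n_obs))
    return y, condition[users], users, items

def multiway_bootstrap_ci(y, d, users, items, n_boot=200, alpha=0.05):
    """Percentile CI for the treatment-control difference in means, reweighting
    users and items independently with Poisson(1) draws on each replicate."""
    n_users, n_items = users.max() + 1, items.max() + 1
    diffs = np.empty(n_boot)
    for b in range(n_boot):
        # Each observation's weight is the product of its user's and its item's counts.
        w = rng.poisson(1.0, n_users)[users] * rng.poisson(1.0, n_items)[items]
        diffs[b] = (np.average(y[d == 1], weights=w[d == 1]) -
                    np.average(y[d == 0], weights=w[d == 0]))
    return np.quantile(diffs, [alpha / 2, 1 - alpha / 2])

# Empirical Type I error: the fraction of null experiments whose CI excludes zero
# should be close to alpha if the interval's coverage is accurate.
n_reps, rejections = 100, 0
for _ in range(n_reps):
    y, d, users, items = simulate_aa()
    lo, hi = multiway_bootstrap_ci(y, d, users, items)
    rejections += (lo > 0) or (hi < 0)
print("empirical Type I error:", rejections / n_reps)
```

Setting the item weights to 1 in this sketch recovers a user-only bootstrap, which, per the abstract, often yields reasonable error rates under the null but can fall short of advertised coverage when item-level dependence matters.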
References
| | Author | volume | Date Value | title | type | journal | titleUrl | doi | note | year |
|---|---|---|---|---|---|---|---|---|---|---|
| 2013 UncertaintyinOnlineExperimentsw | Eytan Bakshy, Dean Eckles | | | Uncertainty in Online Experiments with Dependent Data: An Evaluation of Bootstrap Methods | | | | 10.1145/2487575.2488218 | | 2013 |