2018 ExploreExploitandExplainPersona
- (McInerney et al., 2018) ⇒ James McInerney, Benjamin Lacker, Samantha Hansen, Karl Higley, Hugues Bouchard, Alois Gruson, and Rishabh Mehrotra. (2018). “Explore, Exploit, and Explain: Personalizing Explainable Recommendations with Bandits.” In: Proceedings of the 12th ACM Conference on Recommender Systems. ISBN: 978-1-4503-5901-6. doi: 10.1145/3240323.3240354
Subject Headings:
Notes
Cited By
- http://scholar.google.com/scholar?q=%222018%22+Explore%2C+Exploit%2C+and+Explain%3A+Personalizing+Explainable+Recommendations+with+Bandits
- http://dl.acm.org/citation.cfm?id=3240323.3240354&preflayout=flat#citedby
Quotes
Abstract
The multi-armed bandit is an important framework for balancing exploration with exploitation in recommendation. Exploitation recommends content (e.g., products, movies, music playlists) with the highest predicted user engagement and has traditionally been the focus of recommender systems. Exploration recommends content with uncertain predicted user engagement for the purpose of gathering more information. The importance of exploration has been recognized in recent years, particularly in settings with new users, new items, non-stationary preferences and attributes. In parallel, explaining recommendations ("recsplanations") is crucial if users are to understand their recommendations. Existing work has looked at bandits and explanations independently. We provide the first method that combines both in a principled manner. In particular, our method is able to jointly (1) learn which explanations each user responds to; (2) learn the best content to recommend for each user; and (3) balance exploration with exploitation to deal with uncertainty. Experiments with historical log data and tests with live production traffic in a large-scale music recommendation service show a significant improvement in user engagement.
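The joint explore/exploit/explain idea in the abstract can be pictured as a bandit whose arms are (item, explanation) pairs. The sketch below is a minimal illustration only, not the authors' actual algorithm or production system: it uses epsilon-greedy selection with per-user click/impression counts as the engagement estimate, and all names (`JointRecsplanationBandit`, the example playlists and explanations) are hypothetical.

```python
# Minimal sketch: treat each (item, explanation) pair as a bandit arm,
# keep a per-user engagement estimate per arm, and use epsilon-greedy
# selection to balance exploration with exploitation.
# The count-based reward estimate is an illustrative assumption.
import random
from collections import defaultdict


class JointRecsplanationBandit:
    def __init__(self, items, explanations, epsilon=0.1):
        # Joint action space over content and its explanation.
        self.arms = [(i, e) for i in items for e in explanations]
        self.epsilon = epsilon
        # Per-user, per-arm engagement counts.
        self.clicks = defaultdict(lambda: defaultdict(int))
        self.impressions = defaultdict(lambda: defaultdict(int))

    def recommend(self, user):
        """Pick an (item, explanation) pair for `user`."""
        if random.random() < self.epsilon:
            # Explore: show an arm with uncertain engagement to gather data.
            return random.choice(self.arms)

        # Exploit: show the arm with the highest estimated engagement so far.
        def estimate(arm):
            n = self.impressions[user][arm]
            return self.clicks[user][arm] / n if n else 0.0

        return max(self.arms, key=estimate)

    def update(self, user, arm, engaged):
        """Record whether the user engaged with the recommendation."""
        self.impressions[user][arm] += 1
        self.clicks[user][arm] += int(engaged)


# Usage: recommend, observe engagement, feed the outcome back.
bandit = JointRecsplanationBandit(
    items=["playlist_a", "playlist_b"],
    explanations=["Because you listened to jazz", "Popular near you"],
)
item, explanation = bandit.recommend("user_1")
bandit.update("user_1", (item, explanation), engaged=True)
```

The sketch keeps independent counts per user and per arm purely for readability; a practical system along the lines described in the abstract would instead learn a shared model of how users respond to items and explanations, so that information generalizes across users and arms.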
References
| | Author | volume | Date Value | title | type | journal | titleUrl | doi | note | year |
|---|---|---|---|---|---|---|---|---|---|---|
| 2018 ExploreExploitandExplainPersona | James McInerney; Benjamin Lacker; Samantha Hansen; Karl Higley; Hugues Bouchard; Alois Gruson; Rishabh Mehrotra | | | Explore, Exploit, and Explain: Personalizing Explainable Recommendations with Bandits | | | | 10.1145/3240323.3240354 | | 2018 |