User Embedding
A User Embedding is an item embedding that represents a system user (typically a dense vector capturing the user's preferences or behavior).
- Example(s):
- User Embedding from Behavioral Events (based on user behavior events).
- …
- See: Online User Embedding.
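As a minimal illustration of the concept above, the following sketch (with hypothetical sizes and randomly initialized tables) stores one dense vector per user ID and scores items against a user by dot product:

```python
import numpy as np

rng = np.random.default_rng(0)

n_users, n_items, dim = 100, 500, 16  # illustrative sizes

# A user embedding table: one dense vector per user ID.
user_embeddings = rng.normal(scale=0.1, size=(n_users, dim))
item_embeddings = rng.normal(scale=0.1, size=(n_items, dim))

def score_items(user_id: int) -> np.ndarray:
    """Score every item for a user via dot products with the user embedding."""
    p_u = user_embeddings[user_id]   # look up the user's vector
    return item_embeddings @ p_u     # one score per item

scores = score_items(user_id=7)
top5 = np.argsort(scores)[::-1][:5]  # indices of the 5 highest-scored items
```

In a trained system these tables would be learned from interaction data rather than sampled at random; the lookup-and-score pattern stays the same.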
References
2019
- (Wang et al., 2019) ⇒ Xiang Wang, Xiangnan He, Meng Wang, Fuli Feng, and Tat-Seng Chua. (2019). “Neural Graph Collaborative Filtering.” In: Proceedings of the 42nd International ACM SIGIR Conference on Research and Development in Information Retrieval.
- QUOTE: Learning vector representations (aka. embeddings) of users and items lies at the core of modern recommender systems. Ranging from early matrix factorization to recently emerged deep learning based methods, existing efforts typically obtain a user's (or an item's) embedding by mapping from pre-existing features that describe the user (or the item), such as ID and attributes. ...
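The matrix-factorization end of the spectrum mentioned in the quote can be sketched as follows; this is a toy SGD fit (hypothetical data, learning rate, and sizes) that learns a user embedding and an item embedding per ID from observed ratings:

```python
import numpy as np

rng = np.random.default_rng(1)
n_users, n_items, dim = 50, 80, 8
P = rng.normal(scale=0.1, size=(n_users, dim))  # user embeddings
Q = rng.normal(scale=0.1, size=(n_items, dim))  # item embeddings

# Toy observed interactions: (user, item, rating).
interactions = [(0, 3, 5.0), (0, 7, 3.0), (4, 3, 4.0), (9, 12, 2.0)]

lr, reg = 0.05, 0.01
for _ in range(200):                    # SGD epochs
    for u, i, r in interactions:
        p_u, q_i = P[u].copy(), Q[i].copy()
        err = r - p_u @ q_i             # prediction error for this pair
        P[u] += lr * (err * q_i - reg * p_u)
        Q[i] += lr * (err * p_u - reg * q_i)

# After training, the dot product approximates the observed rating.
pred = P[0] @ Q[3]
```

Deep-learning-based methods replace the plain dot product with a learned interaction function, but the per-ID embedding lookup is the same.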
2018a
- (Wang, He, et al., 2018) ⇒ Xiang Wang, Xiangnan He, Fuli Feng, Liqiang Nie, and Tat-Seng Chua. (2018). “TEM: Tree-enhanced Embedding Model for Explainable Recommendation.” In: Proceedings of the 2018 World Wide Web Conference.
- QUOTE: ... Embedding. Given the cross feature vector $\mathbf{q}$ generated by GBDT, we project each cross feature $j$ into an embedding vector $\mathbf{v}_j \in \mathbb{R}^k$, where $k$ is the embedding size. After the operation, we obtain a set of embedding vectors $\mathcal{V} = \{q_1\mathbf{v}_1, \cdots, q_L\mathbf{v}_L\}$. Since $\mathbf{q}$ is a sparse vector with only a few nonzero elements, we only need to include the embeddings of nonzero features for a prediction, i.e., $\mathcal{V} = \{\mathbf{v}_l\}$ where $q_l \neq 0$. We use $\mathbf{p}_u$ and $\mathbf{q}_i$ to denote the user embedding and item embedding, respectively. ...
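The sparse-lookup step described in the quote can be sketched as follows (hypothetical sizes; the cross-feature vector here is hand-built rather than produced by an actual GBDT stage): only embeddings of nonzero cross features enter the prediction.

```python
import numpy as np

rng = np.random.default_rng(2)
L, k = 20, 4                            # number of cross features, embedding size
V = rng.normal(scale=0.1, size=(L, k))  # one embedding vector per cross feature

# q: sparse cross-feature vector (stand-in for a GBDT-generated one);
# only a few entries are nonzero.
q = np.zeros(L)
q[[2, 5, 11]] = 1.0

# Include only embeddings of nonzero features, i.e. {v_l : q_l != 0}.
nonzero = np.flatnonzero(q)
embeddings = {l: q[l] * V[l] for l in nonzero}
pooled = sum(embeddings.values())       # e.g. sum pooling over the active set
```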
2018b
- (Tang & Wang, 2018) ⇒ Jiaxi Tang, and Ke Wang. (2018). “Personalized Top-n Sequential Recommendation via Convolutional Sequence Embedding.” In: Proceedings of the Eleventh ACM International Conference on Web Search and Data Mining, pp. 565-573.
- QUOTE: ... As explained in Section 3.4, the value $y^{(u,t)}_i$ in the output layer is associated with the probability of how likely user $u$ will interact with item $i$ at time step $t$. $z$ intends to capture short term sequential patterns, whereas the user embedding $P_u$ captures user’s long-term general preferences. Here we put the user embedding $P_u$ in the last hidden layer for several reasons: ...
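The output-layer combination described in the quote can be sketched as follows (hypothetical sizes and randomly initialized weights; the convolutional features $z$ are a stand-in vector here): the short-term features are concatenated with the long-term user embedding before scoring items.

```python
import numpy as np

rng = np.random.default_rng(3)
n_users, n_items, d = 30, 40, 8
P = rng.normal(scale=0.1, size=(n_users, d))      # long-term user embeddings
W = rng.normal(scale=0.1, size=(n_items, 2 * d))  # output-layer weights
b = np.zeros(n_items)

def predict(u: int, z: np.ndarray) -> np.ndarray:
    """Concatenate short-term features z with user embedding P_u,
    then score each item; a sigmoid gives interaction probabilities."""
    h = np.concatenate([z, P[u]])         # last hidden layer input
    logits = W @ h + b
    return 1.0 / (1.0 + np.exp(-logits))  # one probability per item

z = rng.normal(size=d)  # stand-in for the convolutional sequence features
y = predict(u=5, z=z)
```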
2018c
- (Grbovic & Cheng, 2018) ⇒ Mihajlo Grbovic, and Haibin Cheng. (2018). “Real-time Personalization Using Embeddings for Search Ranking at Airbnb.” In: Proceedings of the 24th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining.
- QUOTE: ... In this paper we describe Listing and User Embedding techniques we developed and deployed for purposes of Real-time Personalization in Search Ranking and Similar Listing Recommendations, two channels that drive 99% of conversions. ...
... User Type Embeddings - Previous work on training user embeddings to capture their long-term interest (Djuric et al., 2014; Weston et al., 2013) trains a separate embedding for each user. When the target signal is sparse, there is not enough data to train a good embedding representation for each user, and storing embeddings for each user to perform online calculations would require a lot of memory. For that reason we propose to train embeddings at the level of user type, where groups of users with the same type will have the same embedding. ...
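The user-type idea described above can be sketched as follows (the bucketing rule, profile fields, and sizes are all hypothetical illustrations, not Airbnb's actual mapping): users are bucketed into coarse types, and every user of the same type shares one embedding.

```python
import numpy as np

rng = np.random.default_rng(4)
dim = 8

def user_type(profile: dict) -> str:
    """Map a user profile to a coarse type (hypothetical rule)."""
    band = "frequent" if profile["bookings"] >= 5 else "casual"
    return f'{profile["market"]}_{band}'

type_embeddings: dict[str, np.ndarray] = {}

def embed_user(profile: dict) -> np.ndarray:
    """Return the shared embedding for this user's type."""
    t = user_type(profile)
    if t not in type_embeddings:          # lazily create one vector per type
        type_embeddings[t] = rng.normal(scale=0.1, size=dim)
    return type_embeddings[t]

a = embed_user({"market": "SF", "bookings": 7})
b = embed_user({"market": "SF", "bookings": 9})  # same type -> same embedding
```

Because the number of types is far smaller than the number of users, the table stays small enough for online use and each type's embedding sees much more training signal.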
2016
- (Covington et al., 2016) ⇒ Paul Covington, Jay Adams, and Emre Sargin. (2016). “Deep Neural Networks for Youtube Recommendations.” In: Proceedings of the 10th ACM Conference on Recommender Systems, pp. 191-198.
2014
- (Djuric et al., 2014) ⇒ Nemanja Djuric, Vladan Radosavljevic, Mihajlo Grbovic, and Narayan Bhamidipati. (2014). “Hidden Conditional Random Fields with Deep User Embeddings for Ad Targeting.” In: Proceedings of the 2014 IEEE International Conference on Data Mining, pp. 779-784. IEEE.
2013
- (Weston et al., 2013) ⇒ Jason Weston, Ron J. Weiss, and Hector Yee. (2013). “Nonlinear Latent Factorization by Embedding Multiple User Interests.” In: Proceedings of the 7th ACM Conference on Recommender Systems, pp. 65-68.