Factorization Machines (FM) Algorithm
A Factorization Machines (FM) Algorithm is a supervised prediction algorithm that models feature interactions between input variables using factorized (low-rank) parameters.
- Context:
- It can be represented by [math]\displaystyle{ \phi(\mathbf{w},\mathbf{x}) = \sum_{j_1, j_2 \in C_2} \langle \mathbf{w}_{j_1}, \mathbf{w}_{j_2} \rangle x_{j_1} x_{j_2} }[/math] (plus global bias and linear terms), where $\mathbf{w}_{j_1}$ and $\mathbf{w}_{j_2}$ are two vectors of (user-defined) length $k$ and $C_2$ is the set of feature-index pairs (a minimal code sketch is given after the See list below).
- It can be implemented by a Factorization Machine-based System.
- It can combine the advantages of Support Vector Machines (SVM) with Factorization Models.
- It can learn/model Feature Interactions from high-dimensional, sparse data.
- …
- Counter-Example(s):
- See: Sparse High-Dimensional Data, libFM, fastFM.
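The model equation above can be sketched in a few lines of code. The following is a minimal, illustrative NumPy sketch (not the libFM or fastFM API; names such as fm_predict are made up here), using the linear-time reformulation of the pairwise term from Rendle (2010):

```python
import numpy as np

def fm_predict(x, w0, w, V):
    """Degree-2 FM model equation (after Rendle, 2010).

    x  : (n,)   feature vector
    w0 : scalar global bias
    w  : (n,)   per-feature linear weights
    V  : (n, k) factorized interaction parameters, one length-k row per feature
    """
    linear = w0 + w @ x
    # Pairwise interactions in O(n*k) via the identity
    # sum_{j1<j2} <V[j1], V[j2]> x_{j1} x_{j2}
    #   = 0.5 * sum_f [ (sum_j V[j,f] x_j)^2 - sum_j V[j,f]^2 x_j^2 ]
    xv = x @ V                                    # (k,)
    pairwise = 0.5 * np.sum(xv ** 2 - (x ** 2) @ (V ** 2))
    return linear + pairwise

# Tiny usage example with arbitrary numbers
rng = np.random.default_rng(0)
n, k = 6, 3
x = rng.random(n)
print(fm_predict(x, 0.1, rng.normal(size=n), rng.normal(size=(n, k))))
```

Evaluating the pairwise term this way costs O(kn) rather than O(kn²).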
References
2017
- (Prillo, 2017) ⇒ Sebastian Prillo. (2017). “An Elementary View on Factorization Machines.” In: Proceedings of the Eleventh ACM Conference on Recommender Systems. ISBN:978-1-4503-4652-8 doi:10.1145/3109859.3109892
- QUOTE: Factorization Machines (FMs) are a model class capable of learning pairwise (and in general higher order) feature interactions from high dimensional, sparse data. In this paper we adopt an elementary view on FMs. Specifically, we view FMs as a sum of simple surfaces - a hyperplane plus several squared hyperplanes - in the original feature space. This elementary view, although equivalent to that of low rank matrix factorization, is geometrically more intuitive and points to some interesting generalizations.
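The "sum of simple surfaces" view can be made concrete with the standard identity (also used by Rendle (2010) to show linear-time evaluation of the model equation):
[math]\displaystyle{ \sum_{j_1=1}^{n} \sum_{j_2=j_1+1}^{n} \langle \mathbf{w}_{j_1}, \mathbf{w}_{j_2} \rangle x_{j_1} x_{j_2} = \frac{1}{2} \sum_{f=1}^{k} \left[ \Big( \sum_{j=1}^{n} w_{j,f}\, x_j \Big)^{2} - \sum_{j=1}^{n} w_{j,f}^{2}\, x_j^{2} \right] }[/math]
Each [math]\displaystyle{ \big( \sum_{j} w_{j,f}\, x_j \big)^{2} }[/math] term is a squared hyperplane in the original feature space; adding the bias and linear terms gives the "hyperplane plus several squared hyperplanes" picture described in the quote, and the right-hand side is also what allows [math]\displaystyle{ O(kn) }[/math] evaluation.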
2016
- (Guo et al., 2016) ⇒ Weiyu Guo, Shu Wu, Liang Wang, and Tieniu Tan. (2016). “Personalized Ranking with Pairwise Factorization Machines.” In: Neurocomputing Journal, 214(C). doi:10.1016/j.neucom.2016.05.074
- QUOTE: … In addition, the cold start problem often perplexes pairwise learning methods, since most of traditional methods in personalized ranking only take explicit ratings or implicit feedbacks into consideration. For dealing with the above issues, this work proposes a novel personalized ranking model which incorporates implicit feedback with content information by making use of Factorization Machines. …
… Factorization Machines (FM) [2] is a generic method, which can mimic most of factorization models just using feature engineering [2]. In FM, all kinds of contents are concatenated into a design matrix [math]\displaystyle{ X = [x_1, x_2, ..., x_n] }[/math], where [math]\displaystyle{ x_i }[/math] is the feature vector of the i-th sample, and n is the number of training samples. …
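As a concrete illustration of "mimicking a factorization model just by feature engineering", the sketch below (toy, assumed user/item IDs; not code from the paper) builds a sparse design matrix whose rows concatenate a one-hot user block and a one-hot item block, so that the FM pairwise term reduces to a user-item inner product as in classic matrix factorization:

```python
import numpy as np
from scipy.sparse import csr_matrix

# Illustrative encoding of (user, item) interactions as FM feature vectors:
# a one-hot block for the user followed by a one-hot block for the item.
# With only these two blocks, the FM pairwise term reduces to <v_user, v_item>.
n_users, n_items = 4, 3
interactions = [(0, 2), (1, 0), (3, 1)]          # hypothetical (user_id, item_id) pairs

rows, cols, vals = [], [], []
for row, (u, i) in enumerate(interactions):
    rows += [row, row]
    cols += [u, n_users + i]                     # user block, then item block
    vals += [1.0, 1.0]

X = csr_matrix((vals, (rows, cols)), shape=(len(interactions), n_users + n_items))
print(X.toarray())
```

Additional content (e.g. item attributes or context features) is appended as further column blocks of the same design matrix.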
- (Juan et al., 2016) ⇒ Yuchin Juan, Yong Zhuang, Wei-Sheng Chin, and Chih-Jen Lin. (2016). “Field-aware Factorization Machines for CTR Prediction.” In: Proceedings of the 10th ACM Conference on Recommender Systems. ISBN:978-1-4503-4035-9 doi:10.1145/2959100.2959134
- QUOTE: Click-through rate (CTR) prediction plays an important role in computational advertising. Models based on degree-2 polynomial mappings and factorization machines (FMs) are widely used for this task. Recently, a variant of FMs, field-aware factorization machines (FFMs), outperforms existing models in some world-wide CTR-prediction competitions.
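In the notation of Juan et al. (2016), the difference between the two model classes lies in the pairwise term: an FM learns one latent vector per feature, while an FFM learns one latent vector per (feature, field) pair, roughly
[math]\displaystyle{ \phi_{\text{FM}}(\mathbf{w},\mathbf{x}) = \sum_{(j_1, j_2) \in C_2} \langle \mathbf{w}_{j_1}, \mathbf{w}_{j_2} \rangle x_{j_1} x_{j_2}, \qquad \phi_{\text{FFM}}(\mathbf{w},\mathbf{x}) = \sum_{(j_1, j_2) \in C_2} \langle \mathbf{w}_{j_1, f_2}, \mathbf{w}_{j_2, f_1} \rangle x_{j_1} x_{j_2} }[/math]
where [math]\displaystyle{ C_2 }[/math] is the set of feature-index pairs and [math]\displaystyle{ f_1, f_2 }[/math] are the fields to which features [math]\displaystyle{ j_1 }[/math] and [math]\displaystyle{ j_2 }[/math] belong.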
2015
- http://libfm.org/
- QUOTE: Factorization machines (FM) are a generic approach that allows to mimic most factorization models by feature engineering. This way, factorization machines combine the generality of feature engineering with the superiority of factorization models in estimating interactions between categorical variables of large domain. libFM is a software implementation for factorization machines that features stochastic gradient descent (SGD) and alternating least squares (ALS) optimization as well as Bayesian inference using Markov Chain Monte Carlo (MCMC).
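As a rough illustration of the SGD option mentioned above, here is a minimal NumPy sketch of one update under squared loss (the function name, signature, and hyperparameters are assumptions for illustration, not the libFM interface):

```python
import numpy as np

def fm_sgd_step(x, y, w0, w, V, lr=0.01, reg=0.001):
    """One SGD update for a degree-2 FM under squared loss.

    A sketch of the kind of update an SGD solver performs; libFM itself
    also offers ALS and MCMC and supports other tasks/losses.
    """
    xv = x @ V                                   # (k,) per-factor sums, reused below
    y_hat = w0 + w @ x + 0.5 * np.sum(xv ** 2 - (x ** 2) @ (V ** 2))
    err = y_hat - y                              # d(loss)/d(y_hat) for 0.5*(y_hat - y)^2

    w0 = w0 - lr * err                           # d(y_hat)/d(w0) = 1
    w = w - lr * (err * x + reg * w)             # d(y_hat)/d(w_j) = x_j
    # d(y_hat)/d(V[j,f]) = x_j * xv[f] - V[j,f] * x_j^2
    grad_V = np.outer(x, xv) - V * (x ** 2)[:, None]
    V = V - lr * (err * grad_V + reg * V)
    return w0, w, V
```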
2010
- (Rendle, 2010) ⇒ Steffen Rendle. (2010). “Factorization Machines.” In: Proceedings of the 2010 IEEE International Conference on Data Mining. ISBN:978-0-7695-4256-0 doi:10.1109/ICDM.2010.127
- ABSTRACT: In this paper, we introduce Factorization Machines (FM) which are a new model class that combines the advantages of Support Vector Machines (SVM) with factorization models. Like SVMs, FMs are a general predictor working with any real valued feature vector. In contrast to SVMs, FMs model all interactions between variables using factorized parameters. Thus they are able to estimate interactions even in problems with huge sparsity (like recommender systems) where SVMs fail. We show that the model equation of FMs can be calculated in linear time and thus FMs can be optimized directly. So unlike nonlinear SVMs, a transformation in the dual form is not necessary and the model parameters can be estimated directly without the need of any support vector in the solution. We show the relationship to SVMs and the advantages of FMs for parameter estimation in sparse settings. On the other hand there are many different factorization models like matrix factorization, parallel factor analysis or specialized models like SVD + +, PITF or FPMC. The drawback of these models is that they are not applicable for general prediction tasks but work only with special input data. Furthermore their model equations and optimization algorithms are derived individually for each task. We show that FMs can mimic these models just by specifying the input data (i.e. the feature vectors). This makes FMs easily applicable even for users without expert knowledge in factorization models.
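For reference, the degree-2 FM model equation the abstract refers to is, in the paper's notation,
[math]\displaystyle{ \hat{y}(\mathbf{x}) := w_0 + \sum_{i=1}^{n} w_i x_i + \sum_{i=1}^{n} \sum_{j=i+1}^{n} \langle \mathbf{v}_i, \mathbf{v}_j \rangle x_i x_j }[/math]
with global bias [math]\displaystyle{ w_0 }[/math], linear weights [math]\displaystyle{ w_i }[/math], and factor vectors [math]\displaystyle{ \mathbf{v}_i \in \mathbb{R}^k }[/math]; the pairwise sum is the part that can be rewritten for linear-time evaluation.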