Matrix Factorization-based Item Recommendation System
A Matrix Factorization-based Item Recommendation System is an item recommendation system that applies a matrix factorization-based recommendation algorithm to solve a matrix factorization-based recommendation task.
- Context:
- …
- Example(s):
- …
- Counter-Example(s):
- See: Domain Specific Recommendation Task, Algorithm-Specific System, ALS System.
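At a high level, such a system approximates a sparse user-item rating matrix R by the product of two low-rank factor matrices. A minimal formulation sketch (the notation here is illustrative rather than taken from a specific source):

R \approx \hat{R} = P Q^{\top}, \qquad \hat{r}_{ij} = p_i^{\top} q_j = \sum_{k=1}^{K} p_{ik} q_{kj}

where P is a |U| \times K user-factor matrix, Q is a |D| \times K item-factor matrix, and K is the number of latent features. Items are then recommended to user i by ranking the predicted ratings \hat{r}_{ij} over the items that user has not yet rated.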
References
2016
- (Poole, 2016m) ⇒ David Poole. (2016). “Recommendation System using Matrix Factorization.” In: “CPSC-522: Artificial Intelligence 2 - Acting in Uncertain Environments.”
- QUOTE: Once we understand the mathematical derivation of matrix factorization, it is not difficult to understand a Matrix Factorization-based Item Recommendation System implemented in a high-level language: [1]
import numpy

def matrix_factorization(R, P, Q, K, steps=5000, alpha=0.0002, beta=0.02):
    # R: user-item rating matrix (0 marks an unknown rating)
    # P: |U| x K user-factor matrix, Q: |D| x K item-factor matrix
    # K: number of latent features, alpha: learning rate, beta: regularization weight
    Q = Q.T
    for step in range(steps):
        # One gradient-descent pass over every observed rating
        for i in range(len(R)):
            for j in range(len(R[i])):
                if R[i][j] > 0:
                    eij = R[i][j] - numpy.dot(P[i, :], Q[:, j])
                    for k in range(K):
                        P[i][k] = P[i][k] + alpha * (2 * eij * Q[k][j] - beta * P[i][k])
                        Q[k][j] = Q[k][j] + alpha * (2 * eij * P[i][k] - beta * Q[k][j])
        # Total regularized squared error over the observed ratings
        e = 0
        for i in range(len(R)):
            for j in range(len(R[i])):
                if R[i][j] > 0:
                    e = e + pow(R[i][j] - numpy.dot(P[i, :], Q[:, j]), 2)
                    for k in range(K):
                        e = e + (beta / 2) * (pow(P[i][k], 2) + pow(Q[k][j], 2))
        if e < 0.001:
            break
    return P, Q.T
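The inner-loop updates in the code above follow the standard regularized squared-error derivation (restated here as a sketch, using the same symbols alpha, beta, and K; the equations are not part of the quoted source). For each observed rating r_{ij}:

e_{ij} = r_{ij} - \sum_{k=1}^{K} p_{ik} q_{kj}

p_{ik} \leftarrow p_{ik} + \alpha \left( 2 e_{ij} q_{kj} - \beta p_{ik} \right), \qquad q_{kj} \leftarrow q_{kj} + \alpha \left( 2 e_{ij} p_{ik} - \beta q_{kj} \right)

Training stops after steps passes, or earlier once the total regularized error \sum_{r_{ij} > 0} \left[ e_{ij}^2 + \frac{\beta}{2} \sum_{k=1}^{K} \left( p_{ik}^2 + q_{kj}^2 \right) \right] falls below 0.001.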
Let's use some real input data and see the actual result:
# A 5-user x 4-item rating matrix; 0 marks an unknown rating to be predicted
R = [
    [5, 3, 0, 1],
    [4, 0, 0, 1],
    [1, 1, 0, 5],
    [1, 0, 0, 4],
    [0, 1, 5, 4],
]
R = numpy.array(R)

N = len(R)     # number of users
M = len(R[0])  # number of items
K = 2          # number of latent features

P = numpy.random.rand(N, K)
Q = numpy.random.rand(M, K)

nP, nQ = matrix_factorization(R, P, Q, K)
nR = numpy.dot(nP, nQ.T)  # reconstructed (fully filled-in) rating matrix
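To compare the reconstruction against the original table, one can, for instance, print each row of the approximated matrix rounded to two decimals (a small inspection snippet added here for illustration; it is not part of the quoted source):

# Print the reconstructed ratings rounded to two decimals, one row per user
for user_index, predicted_row in enumerate(numpy.round(nR, 2)):
    print("U%d:" % (user_index + 1), predicted_row)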
The actual result, compared with the rating table we started from, looks like this; as you can see, gradient descent has produced a value for every cell, including the ones that were originally unknown:
|    | D1   | D2   | D3   | D4   |
|----|------|------|------|------|
| U1 | 4.97 | 2.98 | 2.18 | 0.98 |
| U2 | 3.97 | 2.40 | 1.97 | 0.99 |
| U3 | 1.02 | 0.93 | 5.32 | 4.93 |
| U4 | 1.00 | 0.85 | 4.59 | 3.93 |
| U5 | 1.36 | 1.07 | 4.89 | 4.12 |
The reconstructed values are very close to the observed ratings in the original table, and the entries that were previously unknown are now filled in with predictions, as intended. On this small example, matrix factorization clearly produces accurate rating predictions.
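One simple way to quantify how close the reconstruction is would be to measure the error only over the originally observed entries, for example with a root-mean-squared error (an illustrative check, not part of the quoted source; the exact number depends on the random initialization of P and Q):

# RMSE over the entries that were observed in R (the non-zero cells)
observed = R > 0
rmse = numpy.sqrt(numpy.mean((R[observed] - nR[observed]) ** 2))
print("RMSE on observed ratings:", rmse)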