Large-scale Matrix Factorization and Its Extensions
Date Issued
2016
Author(s)
Chin, Wei-Sheng
Abstract
Matrix factorization (MF) is a popular technique in many applications, including online recommendation and social network analysis. Our work addresses several issues that must be resolved to make MF practically useful at large scale. The first issue is the learning rate of stochastic gradient (SG) methods for matrix factorization. SG methods are currently among the most important training methods for MF, but how to effectively adjust the learning rate in SG remains a challenging issue. We propose a scheme for adjusting the learning rate that improves the convergence of SG methods for MF. Second, MF users do not benefit from recent advances in shared-memory systems with multi-core CPUs because most existing packages do not support parallel training. Based on our recently developed parallel SG algorithms, we create a new MF library, LIBMF, for public use; LIBMF can solve several MF problems in a unified way. In the third part of this thesis, we investigate an extension of MF called the field-aware factorization machine (FFM), which is useful for classification problems with highly sparse data.
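To make the learning-rate issue concrete, below is a minimal sketch of SG training for MF with a per-coordinate, AdaGrad-style step-size adjustment. It only illustrates the kind of adaptive scheme the abstract refers to; the function name, its parameters (k, lam, eta0, epochs), and the exact update rule are assumptions made for this example, not the thesis's algorithm or LIBMF's API.

```python
import numpy as np

def sgd_mf(ratings, num_users, num_items, k=8, lam=0.05,
           eta0=0.1, epochs=20, seed=0):
    """Plain SGD for R ~ P Q^T with an AdaGrad-style per-coordinate
    learning rate (illustrative sketch, not the thesis's scheme)."""
    rng = np.random.default_rng(seed)
    P = 0.1 * rng.standard_normal((num_users, k))
    Q = 0.1 * rng.standard_normal((num_items, k))
    # Accumulated squared gradients used to scale the step size.
    GP = np.full((num_users, k), 1e-8)
    GQ = np.full((num_items, k), 1e-8)

    for _ in range(epochs):
        rng.shuffle(ratings)
        for u, v, r in ratings:
            e = r - P[u] @ Q[v]                  # prediction error
            gp = -e * Q[v] + lam * P[u]          # gradient w.r.t. P[u]
            gq = -e * P[u] + lam * Q[v]          # gradient w.r.t. Q[v]
            GP[u] += gp ** 2
            GQ[v] += gq ** 2
            P[u] -= eta0 / np.sqrt(GP[u]) * gp   # per-coordinate step
            Q[v] -= eta0 / np.sqrt(GQ[v]) * gq
    return P, Q

# Toy usage: three users, three items, a few observed ratings.
data = [(0, 0, 5.0), (0, 1, 3.0), (1, 1, 4.0), (2, 2, 1.0)]
P, Q = sgd_mf(data, num_users=3, num_items=3)
print(P @ Q.T)
```

With a fixed step size eta0, SG for MF is sensitive to that single hyperparameter; scaling each coordinate's step by the accumulated gradient magnitude is one common way to reduce that sensitivity, which is the general problem the first part of the thesis targets.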
Subjects
matrix factorization
stochastic gradient methods
parallel computation
factorization machine
field-aware factorization machine
Type
thesis
File(s)
Name
ntu-105-D01944006-1.pdf
Size
23.32 KB
Format
Adobe PDF
Checksum (MD5)
9af3530cd385a4b68fae31dd7387263a