Efficient Optimization Methods for Extreme Similarity Learning with Nonlinear Embeddings
Journal
Proceedings of the ACM SIGKDD International Conference on Knowledge Discovery and Data Mining
Pages
2093-2103
Date Issued
2021
Author(s)
Abstract
We study the problem of learning similarity by using nonlinear embedding models (e.g., neural networks) from all possible pairs. This problem is well known for the difficulty of training over the extreme number of pairs. For the special case of linear embeddings, many studies have addressed this issue by considering certain loss functions and developing efficient optimization algorithms. This paper aims to extend those results to general nonlinear embeddings. First, we complete detailed derivations and provide clean formulations for efficiently calculating some building blocks of optimization algorithms, such as function evaluation, gradient evaluation, and the Hessian-vector product. These results enable the use of many optimization methods for extreme similarity learning with nonlinear embeddings. Second, we study some optimization methods in detail. Due to the use of nonlinear embeddings, implementation issues different from the linear case are addressed. In the end, some methods are shown to be highly efficient for extreme similarity learning with nonlinear embeddings. © 2021 ACM.
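The abstract's building blocks include the Hessian-vector product, which second-order methods (e.g., Newton methods) use without ever forming the full Hessian. As a generic illustration of this idea (not the paper's specific formulation), a Hessian-vector product can be approximated by a central finite difference of analytic gradients; the loss below is a hypothetical separable quadratic chosen so the result can be checked exactly.

```python
def grad(w, a, b):
    """Analytic gradient of the toy loss f(w) = 0.5*sum(a_i*w_i^2) + sum(b_i*w_i).

    The gradient is a_i*w_i + b_i, so the Hessian is diag(a).
    """
    return [ai * wi + bi for ai, wi, bi in zip(a, w, b)]

def hvp(w, v, a, b, eps=1e-5):
    """Hessian-vector product without forming the Hessian:

        H v ~= (grad(w + eps*v) - grad(w - eps*v)) / (2*eps)

    For a quadratic loss this central difference is exact up to
    floating-point error; for general nonlinear embeddings it is an
    O(eps^2) approximation.
    """
    g_plus = grad([wi + eps * vi for wi, vi in zip(w, v)], a, b)
    g_minus = grad([wi - eps * vi for wi, vi in zip(w, v)], a, b)
    return [(p - m) / (2 * eps) for p, m in zip(g_plus, g_minus)]
```

Since the Hessian here is diag(a), `hvp` should return `[a_i * v_i]` componentwise, which makes the approximation easy to verify. Autodiff frameworks compute the same quantity exactly via a double-backward pass, at roughly the cost of one extra gradient evaluation.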
Subjects
neural networks
Newton methods
non-convex optimization
representation learning
similarity learning
Embeddings
Learning systems
Optimization
Building blocks
Hessian-vector products
Learning similarity
Loss functions
Nonlinear embedding
Optimization algorithms
Optimization method
Similarity learning
Data mining
Type
conference paper
