Title: Efficient Optimization Methods for Extreme Similarity Learning with Nonlinear Embeddings
Authors: Yuan, B.; Li, Y.-S.; Quan, P.; Lin, Chih-Jen
Type: conference paper
Year: 2021
DOI: 10.1145/3447548.3467363
Scopus EID: 2-s2.0-85114949554
Date added: 2022-04-25
Scopus: https://www.scopus.com/inward/record.uri?eid=2-s2.0-85114949554&doi=10.1145%2f3447548.3467363&partnerID=40&md5=8fd57bcb7b148b8851e39d5ec8da7a1f
Repository: https://scholars.lib.ntu.edu.tw/handle/123456789/607390

Abstract: We study the problem of learning similarity with nonlinear embedding models (e.g., neural networks) from all possible pairs. This problem is well known for the difficulty of training with an extreme number of pairs. For the special case of linear embeddings, many studies have addressed the issue of handling all pairs by considering certain loss functions and developing efficient optimization algorithms. This paper extends those results to general nonlinear embeddings. First, we give detailed derivations and clean formulations for efficiently calculating the building blocks of optimization algorithms: function evaluation, gradient evaluation, and the Hessian-vector product. These results enable the use of many optimization methods for extreme similarity learning with nonlinear embeddings. Second, we study several optimization methods in detail and address implementation issues that differ from the linear case. In the end, some methods are shown to be highly efficient for extreme similarity learning with nonlinear embeddings. © 2021 ACM.

Keywords: neural networks; Newton methods; non-convex optimization; representation learning; similarity learning; embeddings; learning systems; optimization; building blocks; Hessian-vector products; loss functions; nonlinear embedding; optimization algorithms; data mining
SDGs: SDG 9
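The abstract notes that, for linear embeddings, prior work handles all possible pairs efficiently through suitable loss functions. As background only (this is the standard Gram-matrix identity, not the paper's algorithm, and all names below are illustrative), a minimal NumPy sketch of how a sum over all m·n pairwise squared similarities collapses to a trace of k×k Gram matrices, avoiding any loop over pairs:

```python
import numpy as np

# Toy setup: m "left" items and n "right" items with k-dimensional
# linear embeddings (rows of P and Q). All names are illustrative.
rng = np.random.default_rng(0)
m, n, k = 50, 60, 8
P = rng.normal(size=(m, k))  # left-item embeddings
Q = rng.normal(size=(n, k))  # right-item embeddings

# Naive sum of squared similarities over all m*n pairs: O(m*n*k) work.
naive = sum((P[i] @ Q[j]) ** 2 for i in range(m) for j in range(n))

# Gram-matrix identity: sum_{i,j} (p_i^T q_j)^2 = trace((P^T P)(Q^T Q)),
# computed in O((m + n) k^2 + k^3) -- the pair dimension never appears.
fast = float(np.trace((P.T @ P) @ (Q.T @ Q)))

assert np.isclose(naive, fast)
```

The paper's contribution is extending this kind of all-pairs efficiency from linear embeddings to nonlinear ones, where the same building blocks (function value, gradient, Hessian-vector product) must be computed without materializing every pair.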