Title: Cost-Sensitive Deep Learning with Layer-Wise Cost Estimation
Authors: Chung, Y.-A.; Yang, S.-W.; Lin, Hsuan-Tien
Type: conference paper
Year: 2020
Date available: 2021-09-02
DOI: 10.1109/TAAI51410.2020.00028
Scopus ID: 2-s2.0-85103812376
URL: https://www.scopus.com/inward/record.uri?eid=2-s2.0-85103812376&doi=10.1109%2fTAAI51410.2020.00028&partnerID=40&md5=bb2a43d3b4440ef3b50027e9fb62c3f0
URL: https://scholars.lib.ntu.edu.tw/handle/123456789/581364

Abstract: While deep neural networks have succeeded in several applications, such as image classification, object detection, and speech recognition, by reaching very high classification accuracies, many real-world applications demand varying costs for different types of misclassification errors, thus requiring cost-sensitive classification algorithms. Current deep neural network models for cost-sensitive classification are restricted to specific network structures and limited depth. In this paper, we propose a novel framework that can be applied to deep neural networks of any structure to facilitate their learning of meaningful representations for cost-sensitive classification problems. Furthermore, the framework allows end-to-end training of deeper networks directly. The framework is designed by augmenting auxiliary neurons to the output of each hidden layer for layer-wise cost estimation, and by including the total estimation loss within the optimization objective. Experimental results on public benchmark data sets with two cost-information settings demonstrate that the proposed framework outperforms state-of-the-art cost-sensitive deep learning models. © 2020 IEEE.

Keywords: Cost estimating; Deep neural networks; Neural networks; Object detection; Speech recognition; Classification accuracy; Cost estimations; Cost information; Cost-sensitive classifications; Learning models; Misclassification error; Network structures; State of the art; Deep learning
SDGs: SDG3
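
The abstract's core idea — an auxiliary cost-estimation output attached to each hidden layer, with the summed estimation loss added to the training objective — can be sketched as follows. This is a minimal NumPy illustration of one plausible reading of that objective, not the authors' implementation; the function name `layerwise_cost_loss`, the squared-error form of the estimation loss, and the weight `alpha` are all assumptions for illustration.

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax over the class axis.
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def layerwise_cost_loss(logits, layer_cost_estimates, targets, cost_matrix, alpha=0.1):
    """Sketch of a combined objective (assumed form, not the paper's exact loss):
    cross-entropy on the final logits, plus a squared-error cost-estimation
    loss for each hidden layer's auxiliary output, summed and weighted by alpha.

    logits:               (n, K) final-layer scores
    layer_cost_estimates: list of (n, K) per-class cost predictions,
                          one array per hidden layer's auxiliary neurons
    targets:              (n,) true class indices
    cost_matrix:          (K, K) entry [y, k] = cost of predicting k when truth is y
    """
    n = logits.shape[0]
    p = softmax(logits)
    ce = -np.log(p[np.arange(n), targets]).mean()   # main classification loss
    true_costs = cost_matrix[targets]               # (n, K) cost row for each example
    aux = sum(((est - true_costs) ** 2).mean()      # layer-wise estimation losses
              for est in layer_cost_estimates)
    return ce + alpha * aux
```

In this reading, every hidden layer is pushed to carry enough information to predict the misclassification costs, which is one way the representation could become cost-aware end to end.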