https://scholars.lib.ntu.edu.tw/handle/123456789/558998
Title: | On Binary Statistical Classification from Mismatched Empirically Observed Statistics | Authors: | Hsu, H.-W.; I-HSIANG WANG |
Date of Publication: | 2020 | Volume: | 2020-June | Pages: | 2533-2538 | Source Publication: | IEEE International Symposium on Information Theory - Proceedings | Abstract: | In this paper, we analyze the fundamental limit of statistical classification with mismatched empirically observed statistics. Unlike classical hypothesis testing, where we have access to the distributions of the data, here we only have two training sequences sampled i.i.d. from two unknown distributions P0 and P1, respectively. The goal is to classify a testing sequence sampled i.i.d. from one of two candidate distributions, each of which deviates slightly from P0 and P1, respectively. In other words, there is a mismatch between how the training and testing sequences are generated. The amount of mismatch is measured by the norm of the deviation in the Euclidean space. Assuming the norm of the deviation is no greater than δ, we derive an asymptotically optimal test in Chernoff's regime, and analyze its error exponents in both Stein's regime and Chernoff's regime. We also give both upper and lower bounds on the decrease of the error exponents due to (i) unknown distributions and (ii) mismatch between the training and testing distributions. When δ is small, we show that the decrease in the error exponents is linear in δ and characterize its first-order term. © 2020 IEEE. |
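To make the setting concrete, the sketch below implements a simple type-based classifier for the problem described in the abstract: two training sequences drawn from unknown P0 and P1, and a testing sequence to be classified. This is only an illustrative nearest-type rule (compare the KL divergence of the test sequence's empirical distribution to each training type), not the asymptotically optimal test derived in the paper; all function names (`empirical_dist`, `classify`) are hypothetical.

```python
import math
from collections import Counter

def empirical_dist(seq, alphabet):
    """Empirical distribution (type) of a sequence over a finite alphabet."""
    counts = Counter(seq)
    n = len(seq)
    return [counts[a] / n for a in alphabet]

def kl(p, q, eps=1e-12):
    """KL divergence D(p || q); eps guards against zero probabilities."""
    return sum(pi * math.log((pi + eps) / (qi + eps)) for pi, qi in zip(p, q))

def classify(train0, train1, test, alphabet):
    """Nearest-type rule: pick the hypothesis whose training type is
    closest to the test type in KL divergence (illustrative only)."""
    p0 = empirical_dist(train0, alphabet)
    p1 = empirical_dist(train1, alphabet)
    q = empirical_dist(test, alphabet)
    return 0 if kl(q, p0) <= kl(q, p1) else 1

# Example: training sequences concentrated on 'a' vs. 'b'.
alphabet = ["a", "b"]
label = classify(list("aaaaaaab"), list("abbbbbbb"), list("aaaaaabb"), alphabet)
print(label)  # → 0
```

A robust version of such a rule would also have to account for the δ-ball of possible testing distributions around each training distribution, which is what drives the loss in error exponents quantified in the paper.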
URI: | https://www.scopus.com/inward/record.url?eid=2-s2.0-85090401909&partnerID=40&md5=855ad861d307ec8140cc075a9fbe8693 | https://scholars.lib.ntu.edu.tw/handle/123456789/558998 |
DOI: | 10.1109/ISIT44484.2020.9174520 | SDG/Keywords: | Errors; Asymptotically optimal; Euclidean spaces; Hypothesis testing; Statistical classification; Testing sequences; Training and testing; Training sequences; Upper and lower bounds; Information theory |
Appears in Collections: | Department of Electrical Engineering
Items in the IR system are protected by copyright, with all rights reserved, unless otherwise indicated in their copyright terms.