Title: Interrupted and Cascaded Permutation Invariant Training for Speech Separation
Authors: Yang, G.-P.; Wu, S.-L.; Mao, Y.-W.; Hung-yi Lee; Lin-shan Lee
Keywords: Cocktail Party Problem; Label Ambiguity Problem; Permutation Invariant Training; Speech Separation
Date of Publication: 2020
Volume: 2020-May
Pages: 6369-6373
Source: ICASSP, IEEE International Conference on Acoustics, Speech and Signal Processing - Proceedings
Abstract: Permutation Invariant Training (PIT) has long been a stepping-stone method for training speech separation models, handling the label ambiguity problem. Because PIT dynamically selects the minimum-cost label assignment, very few studies have treated the separation problem as jointly optimizing both the model parameters and the label assignments; most have focused instead on searching for good model architectures and parameters. In this paper, for a given model architecture, we instead investigate various flexible label assignment strategies for training the model, rather than directly using PIT. Surprisingly, we discover that a significant performance boost over PIT is possible if the model is trained with fixed label assignments and a good set of labels is chosen. With fixed-label training cascaded between two sections of PIT, we achieved state-of-the-art performance on WSJ0-2mix without changing the model architecture at all. © 2020 IEEE.
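The abstract's core operation can be sketched briefly: under PIT, every permutation of the reference speakers is scored against the model's output streams for each utterance, and the minimum-cost assignment serves as the training label. A minimal NumPy illustration of that criterion follows (the function name, array shapes, and MSE cost are illustrative assumptions, not the paper's exact setup):

```python
import itertools
import numpy as np

def pit_loss(est, ref):
    """Permutation Invariant Training criterion for one utterance.

    est, ref: arrays of shape (speakers, time). Scores every
    speaker-to-reference assignment and returns the cheapest one.
    """
    n_spk = est.shape[0]
    best_loss, best_perm = float("inf"), None
    for perm in itertools.permutations(range(n_spk)):
        # mean-squared error under this label assignment (MSE is
        # an assumed cost; the paper's model may use another objective)
        loss = float(np.mean((est - ref[list(perm)]) ** 2))
        if loss < best_loss:
            best_loss, best_perm = loss, perm
    return best_loss, best_perm
```

The fixed-label training the abstract proposes would then amount to freezing `best_perm` per utterance (from a chosen label set) instead of re-selecting it every step.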
URI: https://www.scopus.com/inward/record.url?eid=2-s2.0-85089221730&partnerID=40&md5=f0dc6b81044851b9ee6ad5cf5ca4c8dc ; https://scholars.lib.ntu.edu.tw/handle/123456789/558976
DOI: 10.1109/ICASSP40776.2020.9053697
Appears in Collections: Department of Electrical Engineering
Items in this institutional repository are protected by copyright, with all rights reserved, unless otherwise indicated by their individual copyright terms.