https://scholars.lib.ntu.edu.tw/handle/123456789/631819
Title: Viewing Bias Matters in 360° Videos Visual Saliency Prediction
Authors: Chen, Peng-Wen; Yang, Tsung-Shan; Huang, Gi-Luen; Huang, Chia-Wen; Chao, Yu-Chieh; Lu, Chien-Hung; Wu, Pei-Yuan
Keywords: 360° videos; Convolutional neural networks; Decoding; Deep learning; Feature extraction; Predictive models; Three-dimensional displays; Videos; Viewing bias; Visual saliency prediction; Visualization
Date Issued: 1-Jan-2023
Volume: 11
Source Publication: IEEE Access
Abstract: 360° video has been applied in many areas, such as immersive content, virtual tours, and surveillance systems. Compared with field-of-view prediction on planar videos, the explosive amount of information contained in the omnidirectional view over the entire sphere poses an additional challenge in predicting highly salient regions in 360° videos. In this work, we propose a visual saliency prediction model that directly takes 360° video in the equirectangular format. Unlike previous works, which often adopted a recurrent neural network (RNN) architecture for the saliency detection task, we apply 3D convolutions in a spatial-temporal encoder and generalize SphereNet kernels to construct a spatial-temporal decoder. We further study the statistical properties of the viewing biases present in 360° datasets across various video types, which provides insights into the design of a fusing mechanism that incorporates the predicted saliency map with the viewing bias in an adaptive manner. The proposed model yields state-of-the-art performance, as evidenced by empirical results on renowned 360° visual saliency datasets such as Salient360!, PVS, and Sport360.
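The fusing mechanism described in the abstract combines a predicted saliency map with a dataset-level viewing-bias prior. A minimal NumPy sketch of that general idea is below; the fixed scalar weight `alpha`, the equator-centered Gaussian bias, and the function name `fuse_saliency` are illustrative assumptions for this sketch, not the paper's actual adaptive mechanism.

```python
import numpy as np

def fuse_saliency(pred, bias, alpha):
    """Blend a predicted saliency map with a viewing-bias prior.

    pred, bias: (H, W) non-negative maps on the equirectangular grid.
    alpha: fusion weight in [0, 1]. In the paper this weighting is
    adaptive; a fixed scalar is used here purely for illustration.
    """
    fused = alpha * pred + (1.0 - alpha) * bias
    # Renormalize so the fused map sums to 1, i.e. a valid distribution.
    total = fused.sum()
    return fused / total if total > 0 else fused

# Illustrative equator-centered bias: 360° viewers tend to look near
# the equator, so weight rows by a Gaussian in latitude.
H, W = 4, 8
lat = np.linspace(-np.pi / 2, np.pi / 2, H)           # latitude per row
bias = np.exp(-(lat ** 2) / 0.5)[:, None] * np.ones((1, W))
pred = np.random.default_rng(0).random((H, W))        # stand-in prediction
out = fuse_saliency(pred, bias, alpha=0.7)
print(out.shape, round(float(out.sum()), 6))          # (4, 8) 1.0
```

The renormalization step keeps the fused output comparable across different `alpha` values, which matters when evaluating with distribution-based saliency metrics.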
URI: https://scholars.lib.ntu.edu.tw/handle/123456789/631819
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2023.3269564
Appears in Collections: Department of Electrical Engineering
Items in this institutional repository are protected by copyright, with all rights reserved, unless otherwise indicated in their copyright terms.