https://scholars.lib.ntu.edu.tw/handle/123456789/634360
Title: An End-to-end Learning-based Approach to 3D Novel View Style Transfer
Authors: Chang, Kai Cheng; Yi-Ping Hung; Chu-Song Chen
Keywords: 3D Reconstruction; Novel View Synthesis; Stereo Video Generation; Style Transfer; Virtual Reality
Date Issued: 1-Jan-2022
Source: Proceedings - 2022 IEEE International Conference on Artificial Intelligence and Virtual Reality, AIVR 2022
Abstract: 3D novel view style transfer is a rising research topic. Recently developed methods aim to build globally optimized scene representations and stylize them directly on the scene. However, these methods are time-consuming because they require globally consistent optimization or rendering-field reconstruction. In this paper, we introduce an end-to-end learning framework for stylized novel view synthesis that speeds up 3D style transfer by applying learning-based structure-from-motion (SfM) approaches. Experimental results show that our method achieves visual effects comparable to the original style transfer module with higher efficiency.
URI: https://scholars.lib.ntu.edu.tw/handle/123456789/634360
ISBN: 9781665457255
DOI: 10.1109/AIVR56993.2022.00009
Appears in Collections: Department of Electrical Engineering
Items in the IR are protected by copyright, with all rights reserved, unless otherwise indicated.