https://scholars.lib.ntu.edu.tw/handle/123456789/607483
Title: S3: Learnable sparse signal superdensity for guided depth estimation
Authors: Huang Y.-K.; Liu Y.-C.; Wu T.-H.; Su H.-T.; Chang Y.-C.; Tsou T.-L.; Wang Y.-A.; Winston Hsu
Keywords: Optical radar; 3D reconstruction; Dense depth estimation; Depth estimation; Depth value; End to end; Estimation approaches; Lower density; Multiple applications; Sparse signals; Sparse sources; Augmented reality
Publication Date: 2021
Pages: 16701-16711
Source Publication: Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition
Abstract: Dense depth estimation plays a key role in multiple applications such as robotics, 3D reconstruction, and augmented reality. While sparse signals, e.g., LiDAR and Radar, have been leveraged as guidance for enhancing dense depth estimation, the improvement is limited by their low density and imbalanced distribution. To maximize the utility of the sparse source, we propose the Sparse Signal Superdensity (S3) technique, which expands the depth values from sparse cues while estimating the confidence of the expanded region. The proposed S3 can be applied to various guided depth estimation approaches and trained end-to-end at different stages, including input, cost volume, and output. Extensive experiments demonstrate the effectiveness, robustness, and flexibility of the S3 technique on LiDAR and Radar signals. © 2021 IEEE
URI: https://www.scopus.com/inward/record.uri?eid=2-s2.0-85124219826&doi=10.1109%2fCVPR46437.2021.01643&partnerID=40&md5=1dd925f8d4ec9d1ff3dba5fb7ec49a4e
     https://scholars.lib.ntu.edu.tw/handle/123456789/607483
ISSN: 1063-6919
DOI: 10.1109/CVPR46437.2021.01643
Appears in Collections: Department of Computer Science and Information Engineering
Documents in this repository are protected by copyright, with all rights reserved, unless their copyright terms state otherwise.
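The abstract describes expanding depth values from sparse cues while estimating a confidence for each expanded pixel. A minimal, non-learned sketch of that idea is shown below; the function name, neighborhood radius, and distance-based decay are illustrative assumptions only, not the paper's learned end-to-end S3 module:

```python
import numpy as np

def expand_sparse_depth(depth, radius=2, decay=0.5):
    """Expand each sparse depth cue to a small neighborhood, attaching a
    confidence that decays with distance from the original measurement.
    Hypothetical illustration of the idea in the abstract, not the
    paper's learned S3 module (which is trained end-to-end).
    depth: 2D array where zeros mark pixels with no measurement."""
    h, w = depth.shape
    dense = np.zeros((h, w), dtype=float)
    conf = np.zeros((h, w), dtype=float)
    ys, xs = np.nonzero(depth)  # locations of the sparse cues
    for y0, x0 in zip(ys, xs):
        for dy in range(-radius, radius + 1):
            for dx in range(-radius, radius + 1):
                y, x = y0 + dy, x0 + dx
                if 0 <= y < h and 0 <= x < w:
                    # confidence falls off with Chebyshev distance from the cue
                    c = decay ** max(abs(dy), abs(dx))
                    if c > conf[y, x]:  # keep the highest-confidence cue per pixel
                        conf[y, x] = c
                        dense[y, x] = depth[y0, x0]
    return dense, conf
```

A downstream depth estimator could then consume the densified map together with the confidence map, weighting each expanded value by its confidence; in the paper this expansion and weighting are learned rather than fixed as here.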