Attention-based view selection networks for light-field disparity estimation
Journal
AAAI 2020 - 34th AAAI Conference on Artificial Intelligence
Pages
12095-12103
Date Issued
2020
Author(s)
Abstract
This paper introduces a novel deep network for estimating depth maps from a light field image. To utilize the views more effectively and reduce redundancy among views, we propose a view selection module that generates an attention map indicating the importance of each view and its potential to contribute to accurate depth estimation. By exploiting the symmetric property of light field views, we enforce symmetry in the attention map and further improve accuracy. With the attention map, our architecture utilizes all views more effectively and efficiently. Experiments show that the proposed method achieves state-of-the-art accuracy and ranks first on a popular benchmark for disparity estimation from light field images. Copyright 2020, Association for the Advancement of Artificial Intelligence (www.aaai.org). All rights reserved.
Subjects
Benchmarking; Attention-based views; Depth Estimation; Depth Map; Disparity estimations; Light fields; State-of-the-art performance; View selection; Artificial intelligence
Type
conference paper
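The abstract's core idea, weighting light-field views with an attention map and enforcing symmetry across mirrored views, can be illustrated with a minimal sketch. Everything here is an assumption for illustration (the per-view scores, the pairwise averaging used to symmetrize, and the 1-D view layout); it is not the paper's actual network.

```python
import numpy as np

def view_attention(scores, symmetric_pairs):
    """Hypothetical view-selection step: turn per-view importance
    scores into a symmetric attention map over views."""
    s = scores.astype(float).copy()
    # Enforce symmetry: views mirrored about the center view share
    # one averaged score (an assumed symmetrization scheme).
    for i, j in symmetric_pairs:
        avg = 0.5 * (s[i] + s[j])
        s[i] = s[j] = avg
    # Softmax over views yields the attention map (weights sum to 1).
    e = np.exp(s - s.max())
    return e / e.sum()

# Toy 1-D example: 5 views along one angular axis, center view at index 2.
scores = np.array([0.2, 1.0, 3.0, 0.6, 0.1])  # assumed raw scores
pairs = [(0, 4), (1, 3)]                       # views mirrored about center
w = view_attention(scores, pairs)
```

In this toy setup the mirrored views receive identical weights and the central view, which typically aligns best with the target depth map, keeps the largest weight; a depth estimator would then combine per-view features weighted by `w`.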