Abstract
Image distortion is a major challenge for vision tasks on panoramas. In this work, we propose a Distortion-Aware Monocular Omnidirectional (DAMO) network to estimate dense depth maps from indoor panoramas. First, we introduce a distortion-aware module to extract semantic features from omnidirectional images. Specifically, we exploit deformable convolution to adjust its sampling grids to the geometric distortions of panoramas, and we utilize a strip pooling module to sample against the horizontal distortion introduced by inverse gnomonic projection. Second, we introduce a plug-and-play spherical-aware weight matrix for our loss function to handle the uneven distribution of areas projected from a sphere. Experiments on the 360D dataset show that the proposed method effectively extracts semantic features from distorted panoramas and alleviates the supervision bias caused by distortion. It achieves state-of-the-art performance on the 360D dataset with high efficiency.
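The spherical-aware weight matrix addresses the fact that, in an equirectangular panorama, rows near the poles cover far less area on the sphere than rows near the equator. A minimal sketch of the idea, assuming weights proportional to the cosine of latitude (the paper's exact formulation may differ; `spherical_weights` and `weighted_l1_loss` are illustrative names, not the authors' code):

```python
import numpy as np

def spherical_weights(height, width):
    # Each row of an equirectangular panorama maps to one latitude.
    # Pixels near the poles cover less spherical area, so a common
    # choice is to weight each row by cos(latitude).
    latitudes = (np.arange(height) + 0.5) / height * np.pi - np.pi / 2
    row_weights = np.cos(latitudes)
    return np.tile(row_weights[:, None], (1, width))

def weighted_l1_loss(pred, target, weights):
    # Spherically weighted mean absolute depth error: pole pixels
    # contribute less, counteracting the projection's oversampling.
    return np.sum(weights * np.abs(pred - target)) / np.sum(weights)
```

Because the weights depend only on the image height, the matrix can be precomputed once and multiplied into any per-pixel loss, which is what makes such a term plug-and-play.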
| Original language | English |
| --- | --- |
| Article number | 9319252 |
| Pages (from-to) | 334-338 |
| Number of pages | 5 |
| Journal | IEEE Signal Processing Letters |
| Volume | 28 |
| Publication status | Published - 2021 |