Lightweight monocular depth with a novel neural architecture search method
Huynh, Lam; Nguyen, Phong; Matas, Jiri; Rahtu, Esa; Heikkilä, Janne (2022-02-15)
L. Huynh, P. Nguyen, J. Matas, E. Rahtu and J. Heikkilä, "Lightweight Monocular Depth with a Novel Neural Architecture Search Method," 2022 IEEE/CVF Winter Conference on Applications of Computer Vision (WACV), 2022, pp. 326-336, doi: 10.1109/WACV51458.2022.00040
© 2021 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.
https://rightsstatements.org/vocab/InC/1.0/
https://urn.fi/URN:NBN:fi-fe2022060743942
Abstract
This paper presents a novel neural architecture search method, called LiDNAS, for generating lightweight monocular depth estimation models. Unlike previous neural architecture search (NAS) approaches, where finding optimized networks is computationally demanding, the introduced Assisted Tabu Search enables efficient architecture exploration. Moreover, we construct the search space on a pre-defined backbone network to balance layer diversity against search space size. The LiDNAS method outperforms the state-of-the-art NAS approach proposed for disparity and depth estimation in terms of search efficiency and output model performance. The LiDNAS-optimized models achieve results superior to the state-of-the-art in compact depth estimation on NYU-Depth-v2, KITTI, and ScanNet, while being 7%-500% more compact in size, i.e., in the number of model parameters.
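The abstract gives only a high-level description of the search procedure. For orientation, the following is a minimal, hypothetical Python sketch of a tabu-search-style exploration over a discrete architecture encoding; the operation set, block count, and `evaluate` objective are placeholders for illustration and do not reproduce the Assisted Tabu Search or search space of LiDNAS.

```python
# Illustrative tabu-search skeleton for architecture exploration.
# Hypothetical: the per-block operation choices and the evaluate()
# score are placeholders, not the LiDNAS implementation.
import random
from collections import deque

OPS = ["conv3x3", "conv5x5", "dwconv3x3", "skip"]  # candidate ops per block
NUM_BLOCKS = 6

def evaluate(arch):
    # Placeholder objective: in practice this would train/score the model
    # and trade accuracy off against parameter count.
    return -sum(OPS.index(op) for op in arch) + random.random()

def neighbors(arch):
    # All architectures differing from `arch` in exactly one block.
    for i in range(len(arch)):
        for op in OPS:
            if op != arch[i]:
                yield arch[:i] + (op,) + arch[i + 1:]

def tabu_search(iterations=50, tabu_size=10):
    current = tuple(random.choice(OPS) for _ in range(NUM_BLOCKS))
    best, best_score = current, evaluate(current)
    tabu = deque([current], maxlen=tabu_size)
    for _ in range(iterations):
        # Move to the best neighbor that is not on the tabu list.
        candidates = [(evaluate(n), n) for n in neighbors(current) if n not in tabu]
        if not candidates:
            break
        score, current = max(candidates)
        tabu.append(current)
        if score > best_score:
            best, best_score = current, score
    return best, best_score

if __name__ == "__main__":
    arch, score = tabu_search()
    print(arch, score)
```

The tabu list keeps recently visited candidates off-limits so the search escapes local optima instead of cycling; the paper's "assisted" variant additionally guides this exploration, which is not modeled here.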
Collections
- Open access [34589]