RT Journal Article
T1 Efficient semantic place categorization by a robot through active line-of-sight selection
A1 Matez-Bandera, Jose Luis
A1 Monroy, Javier
A1 Gonzalez-Jimenez, Javier
K1 Semantic knowledge
K1 Mobile robots
K1 Attention mechanism
K1 Place categorization
K1 Markov decision processes
K1 Indoor environments
K1 Integration
K1 Vision
K1 System
AB In this paper, we present an attention mechanism for mobile robots to address the problem of place categorization. Our approach, based on active perception, aims to capture images with characteristic or distinctive details of the environment that can be exploited to improve the efficiency (quickness and accuracy) of place categorization. To do so, at each instant, our proposal selects the most informative view by controlling the line-of-sight of the robot's camera through a pan-only unit. We root our proposal in an information maximization scheme, formalized as a next-best-view problem through a Markov Decision Process (MDP) model. The latter exploits the short-term estimated navigation path of the robot to anticipate the robot's next movements and make consistent decisions. We demonstrate over two datasets, with simulated and real data, that our proposal generalizes well to the two main paradigms of place categorization (object-based and image-based), outperforming typical camera configurations (fixed and continuously rotating) and a purely exploratory approach, both in quickness and accuracy.
PB Elsevier
SN 0950-7051
YR 2021
FD 2021-12-18
LK http://hdl.handle.net/10668/22348
UL http://hdl.handle.net/10668/22348
LA en
NO Matez-Bandera, J. L., Monroy, J., & Gonzalez-Jimenez, J. (2022). Efficient semantic place categorization by a robot through active line-of-sight selection. Knowledge-Based Systems, 240, 108022.
NO This work was supported by the research projects WISER (DPI2017-84827-R) and ARPEGGIO (PID2020-117057), as well as by the Spanish grant program FPU19/00704. Funding for open access charge: Universidad de Málaga / CBUA.
DS RISalud
RD May 11, 2025