Novel Solution of Topological Recognition of Indoor Objects Based on Optical Flow and Planar Attributes
Keywords: hierarchical segmentation, optical flow-based recognition, two-plane-built objects
An approach to qualitative optical-flow processing for indoor object recognition based on planar attributes is presented. The qualitative processing is performed through hierarchical segmentation of optical-flow vectors. The proposed solution recognizes indoor objects by identifying the planar and atilt (tilted) properties of optical-flow images. The main advantage of the proposed solution is that it uses much simpler arithmetic while recovering richer 3D detail about indoor objects.
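The pipeline described above — hierarchically segmenting a dense optical-flow field and then reading planar/tilt attributes from each segment — can be illustrated with a minimal sketch. This is a hypothetical simplification, not the paper's algorithm: it assumes segmentation by flow-magnitude bands and models a tilted planar surface as producing flow whose magnitude varies approximately linearly across the image, so a least-squares linear fit over a segment estimates its tilt direction.

```python
import numpy as np

def hierarchical_flow_segments(flow, levels=3):
    """Split a dense optical-flow field (H x W x 2) into coarse
    magnitude bands -- a hypothetical stand-in for the paper's
    hierarchical segmentation of optical-flow vectors."""
    mag = np.linalg.norm(flow, axis=-1)
    thresholds = np.linspace(mag.min(), mag.max(), levels + 1)
    labels = np.digitize(mag, thresholds[1:-1])  # labels in 0..levels-1
    return labels, mag

def fit_planar_attribute(mag, mask):
    """Least-squares fit mag ~ a*x + b*y + c over the masked pixels.
    For flow induced by a planar surface, the gradient (a, b) of the
    fitted plane indicates the surface's tilt direction."""
    ys, xs = np.nonzero(mask)
    A = np.column_stack([xs, ys, np.ones_like(xs)])
    coeffs, *_ = np.linalg.lstsq(A, mag[mask], rcond=None)
    return coeffs  # (a, b, c)

# Synthetic example: horizontal flow growing left-to-right,
# as an atilt (tilted) plane would produce.
h, w = 64, 64
flow = np.zeros((h, w, 2))
flow[..., 0] = np.arange(w, dtype=float) / w
labels, mag = hierarchical_flow_segments(flow, levels=3)
a, b, c = fit_planar_attribute(mag, labels == 2)
print(round(a, 4))  # positive slope along x -> plane tilted along x
```

In a real system the flow field would come from an estimator such as OpenCV's Farnebäck dense optical flow rather than being synthesized, and the segmentation would refine recursively per level; the qualitative step stays cheap because each segment needs only a three-parameter linear fit.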
Copyright (c) 2022 Advances in Military Technology