Collaborative SLAM based on WiFi fingerprint similarity and motion information
Liu, Ran; Marakkalage, Sumudu Hasala; Padmal, Madhushanka; Shaganan, Thiruketheeswaran; Yuen, Chau; Guan, Yong Liang (2019-12-03)
R. Liu et al., "Collaborative SLAM Based on WiFi Fingerprint Similarity and Motion Information," in IEEE Internet of Things Journal, vol. 7, no. 3, pp. 1826-1840, March 2020, doi: 10.1109/JIOT.2019.2957293
© 2019 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.
https://rightsstatements.org/vocab/InC/1.0/
https://urn.fi/URN:NBN:fi-fe2020060440594
Abstract
Simultaneous localization and mapping (SLAM) has been extensively researched in recent years, particularly with regard to range-based or visual-based sensors. Instead of deploying dedicated devices that use visual features, it is more pragmatic to exploit radio features to achieve this task, due to their ubiquitous nature and the widespread deployment of Wi-Fi wireless networks. This article presents a novel approach for collaborative simultaneous localization and radio fingerprint mapping (C-SLAM-RF) in large unknown indoor environments. The proposed system uses received signal strengths (RSS) from Wi-Fi access points (APs) in the existing infrastructure and pedestrian dead reckoning (PDR) from a smartphone, without prior knowledge of the map or the distribution of APs in the environment. We claim a loop closure based on the similarity of two radio fingerprints. To further improve the performance, we incorporate the turning motion and assign a small uncertainty value to a loop closure if a matched turning is identified. The experiment was done in an area of 130 m by 70 m, and the results show that our proposed system is capable of estimating the tracks of four users with an accuracy of 0.6 m with Tango-based PDR and 4.76 m with a step-counter-based PDR.
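The loop-closure idea in the abstract can be sketched in a few lines. The abstract does not specify the similarity metric, the threshold, or the uncertainty values used by the authors; the cosine similarity over shared APs, the `sim_threshold`, and the `sigma_*` values below are illustrative assumptions, not the paper's actual parameters.

```python
# Hedged sketch of fingerprint-similarity loop-closure detection, assuming
# a cosine similarity over the RSS values (in dBm) of APs seen in both
# fingerprints. Threshold and uncertainty values are illustrative only.
import math

def fingerprint_similarity(fp_a, fp_b):
    """Cosine similarity over RSS of APs common to both fingerprints.

    fp_a, fp_b: dict mapping AP identifier (e.g. MAC) -> RSS in dBm.
    Returns 0.0 when the fingerprints share no APs.
    """
    common = fp_a.keys() & fp_b.keys()
    if not common:
        return 0.0
    dot = sum(fp_a[m] * fp_b[m] for m in common)
    norm_a = math.sqrt(sum(fp_a[m] ** 2 for m in common))
    norm_b = math.sqrt(sum(fp_b[m] ** 2 for m in common))
    return dot / (norm_a * norm_b)

def loop_closure(fp_a, fp_b, turning_matched,
                 sim_threshold=0.95, sigma_default=3.0, sigma_turn=0.5):
    """Return (is_closure, uncertainty_in_metres).

    A closure is claimed when the fingerprints are similar enough; if a
    matched turning motion accompanies it, a smaller uncertainty is assigned,
    mirroring the motion-information refinement described in the abstract.
    """
    sim = fingerprint_similarity(fp_a, fp_b)
    if sim < sim_threshold:
        return False, None
    return True, (sigma_turn if turning_matched else sigma_default)
```

Each accepted closure would then be added as a constraint (with the chosen uncertainty) to a pose-graph optimizer together with the PDR odometry edges.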