Concept for real-time localization based on smartphone camera and IMU
This paper introduces a concept for a real-time localization system based on sensors available on smartphones. The proposed method relies on inertial measurement units (IMUs), on one or more cameras, and on a given floor plan. Each sensor produces a position and orientation (pose) estimate; these estimates are fused with a Kalman filter to obtain the best possible result. 3D point clouds generated from camera images are used to derive partial floor plans; by comparing these to the given map, scale and position corrections are applied. Different platforms, such as standard smartphones with a single camera or smartphones with multiple cameras, can be used to realize this concept. Initial experiments were conducted with two different approaches (inertial and visual navigation) to assess the achievable accuracies. The concept will be implemented and tested on a Samsung Galaxy S8.
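The Kalman-filter fusion of per-sensor pose estimates mentioned above can be illustrated with a minimal sketch. This is not the paper's implementation; it assumes each sensor reports a scalar state with an associated variance (the function name `fuse` and all values are hypothetical), and shows the standard inverse-variance-weighted Kalman update that combines two independent estimates of the same quantity.

```python
# Minimal sketch (hypothetical names and values): fusing two independent
# estimates of the same scalar pose component, e.g. one from the IMU and
# one from the camera. Each estimate is a (mean, variance) pair; the
# Kalman gain weights the correction by the relative measurement variances.

def fuse(est_a, est_b):
    """Fuse two (mean, variance) estimates of the same scalar state."""
    mean_a, var_a = est_a
    mean_b, var_b = est_b
    k = var_a / (var_a + var_b)            # Kalman gain
    mean = mean_a + k * (mean_b - mean_a)  # corrected state
    var = (1.0 - k) * var_a                # reduced uncertainty
    return mean, var

# Example: IMU gives x = 2.0 m (variance 0.5), camera gives x = 2.4 m
# (variance 0.1); the fused estimate lies closer to the more certain sensor.
fused_mean, fused_var = fuse((2.0, 0.5), (2.4, 0.1))
```

A full system would extend this to a state vector (position and orientation) with covariance matrices and a motion model for the prediction step, but the weighting principle is the same: the less uncertain sensor dominates the fused pose.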