Augmented Reality Indoor Navigation System
Abstract
An augmented-reality-based indoor navigation system running on a smartphone is proposed for in-building navigation. The system uses the built-in camera to capture an image of the surroundings, detects a natural marker in the image, and estimates the pose of the camera with respect to that marker. Because each marker must be pre-registered in the system together with its pose on the indoor map, the position and orientation of the camera (and hence of the smartphone itself) on the map can then be determined. Once the user specifies a destination, the shortest path to it is calculated, and an arrow pointing along that path is augmented onto the scene. An information message explaining the route is also annotated on the screen and read aloud to guide the user toward the destination. In addition, the system can display a top-view map of the building that shows the user's current position and facing direction and draws the route to the destination; this top-view mode gives the user a clearer overall picture of the route.

The accuracy of marker detection in the proposed system depends on the distance to the marker, the viewing angle, the type of camera, and the characteristics of the marker itself. Experimental results in real environments show that detection accuracy above 70% is achieved when the marker is highly detailed and unique, regardless of camera type. The viewing angle, on the other hand, has less impact on detection accuracy, except when many irrelevant scene components appear in the view. Detection errors are mostly of the 'no matches found' kind rather than mismatches; a slight movement of the camera normally lets the system recognize the place correctly. The calculation of the shortest path to the destination, the display of the route and arrow, and the voice guidance all work without error.
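The localization step described above, combining the camera's pose relative to a detected marker with that marker's pre-registered pose on the indoor map, amounts to composing two rigid transforms. The following is a minimal 2-D sketch of that composition; the coordinates and function name are hypothetical, and the actual system works with a full 3-D camera pose rather than this simplified planar case.

```python
import math

def camera_pose_on_map(marker_x, marker_y, marker_heading_deg,
                       cam_dx, cam_dy, cam_yaw_deg):
    """Compose the marker's registered map pose with the camera's pose
    measured relative to the marker (2-D, for illustration only).

    (cam_dx, cam_dy) is the camera position in the marker's frame;
    cam_yaw_deg is the camera heading relative to the marker's normal.
    """
    t = math.radians(marker_heading_deg)
    # Rotate the marker-relative offset into map coordinates, then translate
    # by the marker's registered map position.
    map_x = marker_x + cam_dx * math.cos(t) - cam_dy * math.sin(t)
    map_y = marker_y + cam_dx * math.sin(t) + cam_dy * math.cos(t)
    map_heading = (marker_heading_deg + cam_yaw_deg) % 360.0
    return map_x, map_y, map_heading

# Example: a marker registered at (10, 5) facing 90 degrees, with the
# camera 2 m in front of it and looking back at it.
print(camera_pose_on_map(10.0, 5.0, 90.0, 2.0, 0.0, 180.0))
```

Once the camera's map pose is known, the same heading is what the top-view mode uses to draw the user's facing direction.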
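The shortest-path step can be sketched with Dijkstra's algorithm over a graph of corridor waypoints. The waypoint names and edge weights below are invented for illustration; the abstract does not specify the system's actual graph representation.

```python
import heapq

def shortest_path(graph, start, goal):
    """Dijkstra's algorithm; graph maps node -> {neighbor: distance}."""
    dist = {start: 0.0}
    prev = {}
    pq = [(0.0, start)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == goal:
            break
        if d > dist.get(u, float("inf")):
            continue  # stale queue entry, already settled with a shorter path
        for v, w in graph.get(u, {}).items():
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                prev[v] = u
                heapq.heappush(pq, (nd, v))
    if goal not in dist:
        return None, float("inf")
    # Reconstruct the route by walking predecessors back from the goal.
    path, node = [goal], goal
    while node != start:
        node = prev[node]
        path.append(node)
    return path[::-1], dist[goal]

# Hypothetical corridor graph; edge weights are distances in metres.
graph = {
    "entrance": {"hall": 5.0, "stairs": 9.0},
    "hall": {"stairs": 3.0, "room101": 4.0},
    "stairs": {"room101": 6.0},
    "room101": {},
}
print(shortest_path(graph, "entrance", "room101"))
# → (['entrance', 'hall', 'room101'], 9.0)
```

Each consecutive pair of waypoints on the returned path then gives the direction for the augmented arrow and the text of the spoken guidance.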
Article Details
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.