Automated Mobility and Orientation System for Blind or Partially Sighted People



International Journal on Smart Sensing and Intelligent Systems

Professor Subhas Chandra Mukhopadhyay

Exeley Inc. (New York)

Subject: Computational Science & Engineering, Engineering, Electrical & Electronic


eISSN: 1178-5608



VOLUME 6, ISSUE 2 (April 2013)


Abdel Ilah Nour Alshbatat *

Keywords : Assistive Technology, Global System for Mobile communication (GSM), Microcontroller, Short Message Service (SMS), AT Commands.

Citation Information : International Journal on Smart Sensing and Intelligent Systems. Volume 6, Issue 2, Pages 568-582, DOI:

License : (CC BY-NC-ND 4.0)

Received Date : 22-December-2012 / Accepted: 20-March-2013 / Published Online: 10-April-2013



Currently, blind people use a traditional cane as a tool for directing them when they move from one place to another. Although the traditional cane is the most widespread aid used by visually impaired people today, it cannot help them detect dangers from obstacles at all levels. In this context, we propose a new intelligent system for guiding individuals who are blind or partially sighted. The system enables blind people to move with the same ease and confidence as sighted people. It is linked with a GSM-GPS module to pinpoint the location of the blind person and to establish a two-way wireless communication path. Moreover, it provides direction information as well as obstacle-avoidance information based on ultrasonic sensors. A beeper, an accelerometer sensor, and a vibrator are also added to the system. The whole system is designed to be small and light, and is used in conjunction with the white cane. The results have shown that blind people who used this system could move independently and safely.
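The abstract combines ultrasonic obstacle ranging with a GSM module driven by AT commands. As an illustrative sketch only (not the paper's firmware), the core logic might look like the following: converting a round-trip echo time to a distance, mapping that distance to a tactile or audible alert (the 0.5 m and 1.5 m thresholds are assumptions, not values from the paper), and building the standard text-mode SMS command sequence used to report the GPS position.

```python
# Illustrative sketch of the system's core logic. Thresholds, the message
# text, and function names are assumptions for demonstration purposes.

SPEED_OF_SOUND_M_S = 343.0  # approximate speed of sound at ~20 degrees C


def echo_to_distance_m(echo_time_s: float) -> float:
    """Convert a round-trip ultrasonic echo time (seconds) to metres."""
    # The pulse travels to the obstacle and back, hence the division by 2.
    return echo_time_s * SPEED_OF_SOUND_M_S / 2.0


def feedback_for(distance_m: float) -> str:
    """Map obstacle distance to a feedback action (thresholds assumed)."""
    if distance_m < 0.5:
        return "vibrate"  # imminent obstacle: tactile alert
    if distance_m < 1.5:
        return "beep"     # nearby obstacle: audible alert
    return "none"


def sms_at_commands(phone: str, lat: float, lon: float) -> list:
    """Build the standard text-mode SMS AT command sequence (GSM 07.05)."""
    return [
        "AT+CMGF=1",                                 # select SMS text mode
        'AT+CMGS="%s"' % phone,                      # start message to recipient
        "Help needed at %.5f,%.5f\x1a" % (lat, lon), # body; Ctrl-Z terminates
    ]
```

On real hardware these strings would be written to the GSM module over a serial (UART) link, and the echo time would come from timing the sensor's echo pin; both are omitted here to keep the sketch self-contained.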



