Design and Implementation of Smart Glass with Voice Detection Capability to Help Visually Impaired People

Authors

  • Azizuddin Khan
  • Gyan Prakash

DOI:

https://doi.org/10.20894/IJMSR.117.009.003.008

Keywords:

Wearable, Smart Camera Device, Object Recognition, Obstacle Recognition, Bluetooth

Abstract

This work presents a wearable device solution to ease the difficulties faced by visually impaired people. Student-GLASS is a wearable smart camera device built around a powerful microcontroller. It sees what a sighted person would see, understands the user's voice requests, and delivers the relevant information as auditory feedback through an earphone. The device aims to improve the quality of life of blind and visually impaired people by helping them understand their surroundings almost as clearly as a sighted person, at an affordable cost. The vision module is based on an OV7670 camera sensor, which can capture images at up to VGA resolution. For voice recognition, the device relies on the services of a smartphone: a dedicated voice recognition app installed on the phone communicates with the device over a Bluetooth connection. The device supports a memory card of up to 4 GB for storing the necessary audio and image files, and it generates high-quality MP3 audio using the VS1011e, a DSP-based audio codec chip. The user can also interact with the device through four built-in capacitive touch buttons, which is useful when the user does not have a smartphone or when its data connectivity is lost.
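To make the interaction flow described above concrete, the sketch below outlines one plausible firmware main loop for such a device: accept a request from the phone's voice recognition app over Bluetooth or from the touch buttons, capture a frame from the OV7670, and play an MP3 prompt from the memory card through the VS1011e. It is a hypothetical illustration only; every hal_* function, buffer size, and file name is an assumption, not the authors' implementation.

```c
/*
 * Minimal, hypothetical sketch of a Student-GLASS-style main loop.
 * All hal_* hooks are assumed placeholders for board-support code
 * (OV7670 capture, Bluetooth link to the phone app, SD card access,
 * VS1011e MP3 playback, capacitive touch input).
 */
#include <stdint.h>
#include <stddef.h>
#include <stdbool.h>
#include <stdio.h>

/* Assumed hardware-abstraction hooks, to be provided by the real firmware. */
bool   hal_touch_pressed(uint8_t *button_id);           /* 4 capacitive buttons    */
bool   hal_bt_voice_request(char *cmd, size_t len);      /* command from phone app  */
size_t hal_ov7670_capture(uint8_t *frame, size_t max);   /* up-to-VGA frame         */
bool   hal_sd_load_mp3(const char *name, uint8_t *buf, size_t max, size_t *out);
void   hal_vs1011e_play(const uint8_t *mp3, size_t len); /* audio out to earphone   */

static uint8_t frame[64 * 1024];   /* capture buffer (size is an assumption) */
static uint8_t mp3[32 * 1024];     /* audio buffer                           */

int main(void)
{
    char cmd[32];
    uint8_t button;

    for (;;) {
        /* A request can arrive from the phone's voice recognition app over
           Bluetooth, or from the on-device touch buttons when no phone or
           data connection is available. */
        bool have_request = hal_bt_voice_request(cmd, sizeof cmd) ||
                            (hal_touch_pressed(&button) &&
                             snprintf(cmd, sizeof cmd, "button%u", button) > 0);
        if (!have_request)
            continue;

        /* Capture the scene in front of the user. */
        size_t n = hal_ov7670_capture(frame, sizeof frame);
        if (n == 0)
            continue;

        /* Recognition itself is out of scope for this sketch; here a
           pre-recorded MP3 prompt is loaded from the memory card and
           played back through the VS1011e codec as auditory feedback. */
        size_t mp3_len = 0;
        if (hal_sd_load_mp3("feedback.mp3", mp3, sizeof mp3, &mp3_len))
            hal_vs1011e_play(mp3, mp3_len);
    }
    return 0;
}
```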


Author Biographies

Azizuddin Khan

Associate Professor, Psychophysiology Laboratory, Department of Humanities and
Social Sciences, Indian Institute of Technology Bombay, Mumbai, India

Gyan Prakash

Software Engineer, R&D Department, Vee Eee Technologies Solutions Pvt. Ltd., Chennai, Tamil Nadu, India



Published

2017-10-23

Section

Articles