
International Journal on Smart Sensing and Intelligent Systems

Editor-in-Chief: Professor Subhas Chandra Mukhopadhyay

Publisher: Exeley Inc. (New York)

Subject: Computational Science & Engineering, Engineering, Electrical & Electronic

eISSN: 1178-5608

VOLUME 1, ISSUE 1 (March 2008)

AUDIOVISUAL SENSING OF HUMAN MOVEMENTS FOR HOME-CARE AND SECURITY IN A SMART ENVIRONMENT

Liyanage C De Silva

Keywords: video sensor based detection, audio sensor based detection, human movements, home-care, security, smart environments, background segmentation.

Citation Information: International Journal on Smart Sensing and Intelligent Systems. Volume 1, Issue 1, Pages 220-245, DOI: https://doi.org/10.21307/ijssis-2017-288

License : (CC BY-NC-ND 4.0)

Published Online: 13-December-2017

ABSTRACT

This paper presents the necessity and possibility of smart sensing with multiple sensors, such as audio and visual sensors, for detecting human movements in home-care and home-security applications within a smart environment. We define an event- and spatial-relationship-based approach to the problem. The use of multisensory information for event detection is proposed, and a prototype implementation that detects events such as falling, walking, standing, and shouting is presented. Combining audio sensor based and video sensor based event detection increased the number of different types of actions that can be detected within a smart environment. The video sensors detected actions with 94.44% accuracy, while the audio sensors detected actions with 83.35% accuracy. In addition, the video-based fall detection accuracy was 93.3%. We are currently working on real-life data capture and analysis using multi-sensor integration.
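Background segmentation, one of the paper's keywords, is the usual first step in video sensor based detection of movements such as walking, standing, and falling: the moving person is separated from a learned background model before any event classification. The sketch below is a minimal illustration of that idea only, not the authors' implementation; it assumes OpenCV 4.x, and the function name, thresholds, and the bounding-box aspect-ratio fall heuristic are assumptions introduced here for illustration.

```python
# Illustrative sketch (not the paper's method): background segmentation with a
# simple bounding-box heuristic to flag a possible fall. Assumes OpenCV 4.x.
import cv2

def detect_events(source=0, min_area=5000, fall_ratio=1.2):
    cap = cv2.VideoCapture(source)
    # Per-pixel Gaussian-mixture background model (MOG2)
    bg = cv2.createBackgroundSubtractorMOG2(history=500, varThreshold=16)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        mask = bg.apply(frame)          # foreground (moving pixels) mask
        mask = cv2.medianBlur(mask, 5)  # suppress speckle noise
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        for c in contours:
            if cv2.contourArea(c) < min_area:  # ignore small blobs
                continue
            x, y, w, h = cv2.boundingRect(c)
            # Heuristic: a person lying down yields a wide, short bounding box
            event = "possible fall" if w > fall_ratio * h else "person moving"
            print(event, (x, y, w, h))
    cap.release()

if __name__ == "__main__":
    detect_events()
```

In a complete system, such a video cue would be fused with audio cues (for example, a shout or an impact sound) before raising an alarm, in line with the multi-sensor integration direction described in the abstract.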
