Fusion of Body Sensors’ Data and Video Images in Assistive Technology
Original version: 10.1109/SAS.2019.8706053

Abstract
Modern innovations in sensor design and the convergence of computing, cognition, and communications have opened many new possibilities for incorporating AI techniques into Assistive Technology (AT) for elderly people. Combining wearable (body) sensors with the sensing and computing capabilities of smartphones, a set of experiments was performed to test various AI algorithms for the detection of critical events such as accidental falls, prolonged stationary states, and going astray from the residence of the "Elderly Living Independently At Home" (ELIAH). Selected results from studies of both critical and trivial events are used to test different AI models (threshold-based, Artificial Neural Networks (ANN), Support Vector Machines (SVM), and k-Nearest Neighbors (kNN)). The AI models are versatile enough to clearly distinguish fall events from non-fall events. After selecting suitable features based on sensor data fusion, an AI model using only wrist-based sensors achieved flawless detection of fall-related events. A proposed system architecture implementing these detection models in application software for a smartwatch and smartphone can serve to raise alerts for accidental falls as well as for an ELIAH going astray. Data fusion with video images is also discussed.
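As an illustration of the kind of pipeline the abstract describes, the sketch below contrasts a simple threshold rule on accelerometer magnitude with SVM and kNN classifiers trained on hand-crafted features. This is a minimal sketch, not the paper's method: the synthetic signals, the feature set (peak, mean, standard deviation, post-impact variance), the 2.5 g threshold, and the 50 Hz sampling rate are all illustrative assumptions rather than the study's actual data or parameters.

```python
# Hypothetical sketch of wrist-sensor fall detection.
# Signals, features, and thresholds are illustrative assumptions,
# not the dataset or parameters used in the paper.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
FS = 50  # assumed sampling rate, Hz

def synth_window(is_fall: bool, n=2 * FS):
    """Generate a 2 s accelerometer-magnitude window (in g)."""
    mag = 1.0 + 0.05 * rng.standard_normal(n)  # gravity plus sensor noise
    if is_fall:
        i = int(rng.integers(n // 4, n // 2))
        mag[i] += rng.uniform(2.0, 4.0)         # impact spike
        mag[i + 5:] = 1.0 + 0.01 * rng.standard_normal(n - i - 5)  # stillness
    else:
        mag += 0.3 * np.abs(np.sin(np.linspace(0, 6 * np.pi, n)))  # walking
    return mag

def features(mag):
    """Hand-crafted features: peak, mean, std, post-peak variance."""
    i = int(np.argmax(mag))
    tail = mag[i + 5:] if i + 5 < len(mag) else mag[-5:]
    return [mag.max(), mag.mean(), mag.std(), tail.var()]

def threshold_rule(mag, peak_g=2.5):
    """Baseline rule: flag a fall if magnitude exceeds an assumed 2.5 g."""
    return mag.max() > peak_g

# Build a small labelled set of fall / non-fall windows.
windows = [(synth_window(bool(y)), int(y)) for y in rng.integers(0, 2, 400)]
X = np.array([features(m) for m, _ in windows])
y = np.array([lbl for _, lbl in windows])
Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.25, random_state=0)

for name, clf in [("SVM", SVC(kernel="rbf")), ("kNN", KNeighborsClassifier(5))]:
    clf.fit(Xtr, ytr)
    print(f"{name} accuracy: {clf.score(Xte, yte):.2f}")

thr_acc = np.mean([threshold_rule(m) == lbl for m, lbl in windows])
print(f"threshold accuracy: {thr_acc:.2f}")
```

On data like these, the learned classifiers typically match or beat the fixed threshold because the post-impact stillness feature separates falls from high-peak non-fall motion, which is consistent with the abstract's point that feature selection over fused sensor data improves detection.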