A smartphone sensors-based personalized human activity recognition system for sustainable smart cities

Abdul Rehman Javed, Raza Faheem, Muhammad Asim, Thar Baker, Mirza Omer Beg

Research output: Contribution to journal › Article › peer-review

Abstract

According to the World Health Organization's Sustainable Development Agenda 2030, maintaining physical activity has multiple societal benefits for healthier cities and societies. The amalgamation of the Internet of Things (IoT) and pervasive smartphones has become of paramount importance in producing significant breakthroughs across smart-city domains, including healthcare, fitness, skill assessment, and personal assistants that support independent living. IoT-supported devices, embedded with sensors, enable numerous context-aware applications to recognize physical activities. Several activity recognition applications exist; however, they are still deficient in recognizing activities accurately. In this paper, a novel framework for human activity recognition (HAR) is proposed using raw readings from a combination of fused smartphone sensors: accelerometer, gyroscope, magnetometer, and the Google Fit activity tracking module. The proposed framework applies a deep recurrent neural network (DRNN) to an extensive training dataset consisting of five activity classes collected from a group of 12 individuals. A purpose-built Android application, running in the background, collects data from the smartphone's embedded sensors fused with the Google Fit API to validate the proposed framework. The framework shows promising results in recognizing human activities compared to other similar studies, achieving an accuracy of 99.43% for activity recognition using the DRNN.
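The abstract describes classifying windows of fused sensor readings into five activity classes with a recurrent network. As a minimal illustrative sketch only (the paper's actual DRNN architecture, layer sizes, activity labels, window length, and preprocessing are not given in this abstract; all names and values below are assumptions), a tanh RNN forward pass over one sensor window might look like:

```python
import numpy as np

# Hypothetical setup: 5 activity classes (labels assumed, not from the paper)
# and 9 features per timestep (3-axis accelerometer + gyroscope + magnetometer).
ACTIVITIES = ["walking", "running", "sitting", "standing", "stairs"]
N_FEATURES = 9
HIDDEN = 16     # assumed hidden size
WINDOW = 50     # assumed timesteps per window

# Randomly initialised weights stand in for a trained model.
rng = np.random.default_rng(0)
Wx = rng.normal(scale=0.1, size=(N_FEATURES, HIDDEN))
Wh = rng.normal(scale=0.1, size=(HIDDEN, HIDDEN))
bh = np.zeros(HIDDEN)
Wo = rng.normal(scale=0.1, size=(HIDDEN, len(ACTIVITIES)))
bo = np.zeros(len(ACTIVITIES))

def classify_window(window: np.ndarray) -> str:
    """Run a simple tanh RNN over one (WINDOW, N_FEATURES) sensor window
    and return the most probable activity label."""
    h = np.zeros(HIDDEN)
    for x_t in window:                       # one fused sensor reading per timestep
        h = np.tanh(x_t @ Wx + h @ Wh + bh)  # recurrent state update
    logits = h @ Wo + bo                     # final state -> class scores
    probs = np.exp(logits - logits.max())    # stable softmax
    probs /= probs.sum()
    return ACTIVITIES[int(np.argmax(probs))]

# Stand-in for a real window of fused sensor data.
sample = rng.normal(size=(WINDOW, N_FEATURES))
print(classify_window(sample))
```

In practice the weights would be learned from the labelled dataset the abstract mentions, and a deep variant would stack several such recurrent layers before the output projection.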
Original language: English
Article number: 102970
Journal: Sustainable Cities and Society
Volume: 71
Publication status: Published - 26 Apr 2021
