Survey Paper
Websites: Google Scholar, Microsoft Academic, ScienceDirect, PubMed, IEEE, RefSeek, Web of Science
Smartphone-based human activity monitoring and recognition using ML and DL: a comprehensive survey
=> My goal is to cover the following:
Generations (1st, 2nd, 3rd, etc.): description, pictures, how communication is done, pros and cons, diagram of users; top algorithms used for improvements; most useful sensors and their benefits; top inventors; how much machine learning and data analysis improve results;
...
future improvement plans from Google and other big companies; conclusion; references from articles and Google searches.
Abstract
POINTS
* Human activity monitoring and recognition (HAMR) based on smartphone sensor data is a field that has attracted a lot of attention in the current era, due to the notable demand for it in various Ambient Intelligence applications such as healthcare, sports, surveillance, and remote health monitoring.
* In this context, many research works have reported impressive results using various smartphone sensors such as the accelerometer, gyroscope, magnetometer, digital compass, microphone, GPS and camera.
* The motion-sensor waveform differs considerably across smartphone placements, even for the identical activity, which makes it challenging to recognize many completely different activities with high precision. Due to differences in behavioral habits, gender and age, the movement patterns of individuals also vary greatly, which compounds the problem of drawing boundaries between activities.
* In HAMR, the main computational tasks are the quantitative analysis of human motion and its automatic classification.
* This has motivated the use of Machine Learning (ML) and Deep Learning (DL) techniques to automatically recognize various human activity signals collected using smartphone sensors.
* This paper presents a comprehensive survey of smartphone-sensor-based human activity monitoring and recognition using various ML and DL techniques to address the above-mentioned challenges. This study identifies the research gaps in the field of HAMR in order to provide future research directions.
* Keywords: Human activity monitoring and recognition · Healthcare · Machine learning · Smartphone · Sensors
* Automatic activity recognition systems aim to capture the state of the user and their environment by exploiting heterogeneous sensors attached to the subject's body, and permit continuous monitoring of numerous physiological signals. This can be immensely useful in healthcare applications, for automatic and intelligent daily activity monitoring of elderly people.
* In this paper, we present a novel data analytic scheme for intelligent Human Activity Recognition (HAR) using smartphone inertial sensors, based on an information-theoretic feature ranking algorithm and classifiers built on random forests, ensemble learning and lazy learning (see the illustrative sketch after these points).
* Extensive experiments on a publicly available database [1] of human activity recorded with smartphone inertial sensors show that the proposed approach can indeed lead to the development of intelligent, automatic, real-time human activity monitoring for eHealth application scenarios for the elderly, the disabled and people with special needs.
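To make the scheme above concrete, here is a minimal, illustrative Python sketch of such a pipeline in scikit-learn. It is not the exact pipeline from the surveyed work: the synthetic data, the choice of 30 retained features, and the specific classifier settings are assumptions for illustration only; mutual information stands in for the information-theoretic ranking, and k-NN stands in for lazy learning.

```python
# Illustrative sketch only (assumed settings, synthetic data): rank features by
# mutual information with the activity label, keep the top k, then compare a
# random forest, a k-NN (lazy learning) model and a soft-voting ensemble.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in for per-window features (rows: windows, columns: features)
# with six activity classes, mimicking the shape of typical HAR feature sets.
X, y = make_classification(n_samples=600, n_features=60, n_informative=20,
                           n_classes=6, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

classifiers = {
    "random forest": RandomForestClassifier(n_estimators=200, random_state=0),
    "k-NN (lazy learning)": KNeighborsClassifier(n_neighbors=5),
    "voting ensemble": VotingClassifier(
        [("rf", RandomForestClassifier(n_estimators=200, random_state=0)),
         ("knn", KNeighborsClassifier(n_neighbors=5)),
         ("lr", LogisticRegression(max_iter=1000))],
        voting="soft"),
}

for name, clf in classifiers.items():
    # Information-theoretic ranking: keep the 30 features with the highest
    # mutual information with the label, scale them, then classify.
    model = make_pipeline(SelectKBest(mutual_info_classif, k=30),
                          StandardScaler(), clf)
    model.fit(X_tr, y_tr)
    print(f"{name}: test accuracy = {model.score(X_te, y_te):.3f}")
```

On a real feature set such as the publicly available smartphone HAR data mentioned above, only the data-loading step would change; the ranking and classification stages apply unchanged.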
points
* Commercial mobile telephony began in 1979 with the launch of the first cellular network, and since then there has been unprecedented growth in the adoption of mobile phone technology, reaching more than 4.3 billion smartphone users worldwide by 2023 [1], up from about 2.5 billion in 2016.
* Lately, smartphones, which are a new generation of mobile phones, are equipped with many powerful features including multitasking and a variety of sensors, in addition to the basic telephony.
The integration of these mobile devices in our daily life is growing rapidly, and it is envisaged that such devices can seamlessly monitor and keep track of our activities, learn from them and assist us in making decisions. Such assistive technologies can be of immense use for remote health care for the elderly, the disabled and those with special needs, if they are autonomous and intelligent.
* However, although such smart devices currently offer good capacity for collecting data, they offer limited capability for automatic decision support and for making sense of this large data repository. There is an urgent need for new data mining and machine learning techniques to be developed to this end.
* In this paper, we propose a new scheme for human activity recognition using smartphone data, with potential applications in automatic assisted living technologies.
* Activity recognition systems aim to identify the actions carried out by a human from the data collected from the sensors and the surrounding environment. Current smartphones have motion, acceleration or inertial sensors, and by exploiting the information retrieved from these sensors, activities and events can be recognized.
* Automatic recognition of activities and events is possible by processing this sensor data with appropriate machine learning and data mining approaches (a minimal windowing and feature-extraction sketch is given after the link below).
* The rest of the paper is organized as follows. The details of the publicly available activity recognition data set used in this work are described in Section 2. Section 3 discusses the relevant background work done in this area, and the proposed automatic activity recognition approach is discussed in Section 4. The experimental validation of the proposed approach is described in Section 5, and the paper concludes with the outcomes achieved from this work and the plan for future research.
link:
1. https://www.statista.com/statistics/330695/number-of-smartphone-users-worldwide/
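As a concrete illustration of the sensor-data processing mentioned in the points above, below is a minimal Python sketch that segments a raw tri-axial accelerometer stream into sliding windows and extracts simple time-domain features, which is the usual first step before the ML stage. The 50 Hz rate, 128-sample (2.56 s) windows, 50% overlap and the particular features are common HAR defaults assumed here for illustration, not details taken from the cited works; the random signal merely stands in for a real recording.

```python
# Illustrative sketch only: sliding-window segmentation of a raw tri-axial
# accelerometer stream plus a few simple time-domain features per window.
import numpy as np

def sliding_windows(signal, window=128, overlap=0.5):
    """Split an (n_samples, n_axes) signal into overlapping fixed-length windows."""
    step = int(window * (1 - overlap))
    starts = range(0, len(signal) - window + 1, step)
    return np.stack([signal[s:s + window] for s in starts])

def time_domain_features(windows):
    """Per-axis mean, standard deviation, min, max and mean absolute value."""
    feats = [windows.mean(axis=1), windows.std(axis=1),
             windows.min(axis=1), windows.max(axis=1),
             np.abs(windows).mean(axis=1)]
    return np.concatenate(feats, axis=1)   # shape: (n_windows, 5 * n_axes)

# Random data standing in for one minute of 50 Hz accelerometer (x, y, z) samples.
rng = np.random.default_rng(0)
acc = rng.normal(size=(50 * 60, 3))

windows = sliding_windows(acc)             # 128-sample (2.56 s) windows, 50% overlap
features = time_domain_features(windows)
print(windows.shape, features.shape)       # (n_windows, 128, 3) and (n_windows, 15)
```

The resulting feature matrix (one row per window) is what a feature-ranking and classification stage, such as the one sketched earlier, would consume.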
***********************************************
1. How many sensors are there in a smartphone?
=> Android supports three types of sensors:
1) Motion Sensors
These are used to measure acceleration forces and rotational forces along three axes.
2) Position Sensors
These are used to measure the physical position of a device.
3) Environmental Sensors
These are used to measure environmental changes such as temperature, humidity, etc.
link: https://www.javatpoint.com/android-sensor-tutorial
2. First handheld mobile phone launched?
=>
The first handheld mobile phone was demonstrated by John F. Mitchell and Martin Cooper of Motorola in 1973, using a handset weighing c. 2 kilograms (4.4 lbs). In 1979, Nippon Telegraph and Telephone (NTT) launched the world's first cellular network in Japan.
link: https://en.wikipedia.org/wiki/Mobile_phone
=> India is celebrating 25 years of mobility: the first mobile call in India was made on July 31, 1995, by the then West Bengal Chief Minister Jyoti Basu to Union Communications Minister Sukh Ram.
link: https://tech.hindustantimes.com/tech/news/25-years-of-mobility-in-india-the-first-mobile-phone-call-was-made-on-this-day-71596188645410.html