A
SEMINAR REPORT
ON
“Smartphone-Based Remote Monitoring Tool for
e-Learning”
Submitted By
Mr./Ms. Thorat Pratik Ravindra
Roll No: 70
Exam No: . . . . . . . . . . . .
Class: TE Comp
UNDER THE GUIDANCE OF
Prof. A.S. Dumbre
DEPARTMENT OF COMPUTER ENGINEERING
Jaihind College of Engineering, Kuran
A/p-Kuran, Tal-Junnar, Dist-Pune-410511, State Maharashtra,India
2022-2023
DEPARTMENT OF COMPUTER ENGINEERING
Jaihind College of Engineering, Kuran
A/p-Kuran, Tal-Junnar, Dist-Pune-410511, State Maharashtra,India
CERTIFICATE
This is to certify that SEMINAR report entitled
“Smartphone-Based Remote Monitoring Tool for e-Learning
”
Submitted By
Mr./Ms. Thorat Pratik Ravindra. Exam No:- . . . . . . . . . . . .
is a bonafide work carried out by him under the supervision of Prof. A. S. Dumbre,
and it is submitted towards the partial fulfillment of the requirement of Savitribai
Phule Pune University, Pune for the award of the degree of TE (Computer Engineering).
Prof. A.S. Dumbre Prof.S. Y. Mandlik
Seminar Guide Seminar Coordinator
JCOE, Kuran. JCOE, Kuran.
Dr.A. A. Khatri Dr.D. J. Garkal
HOD Principal
JCOE, Kuran. JCOE, Kuran.
Certificate By Guide
This is to certify that Mr./Ms. Thorat Pratik Ravindra has completed
the Seminar work under my guidance and supervision, and that I have verified the
work for its originality in documentation and problem statement in the SEMINAR. Any
reproduction of others' work is used with prior permission, is given due credit, and is
included in the references.
Place: Kuran
Date: . . ./. . . /2022 ( Prof. A.S. Dumbre)
ACKNOWLEDGEMENT
An attempt at any level cannot be satisfactorily completed without the support and
guidance of learned people. I would like to take this opportunity to extend my deeply
felt gratitude to all the people who have supported me at every step.
First and foremost, I would like to express my immense gratitude to my Seminar
guide Prof. A. S. Dumbre and our HOD Dr. A. A. Khatri for their constant
support and motivation that has encouraged me to come up with this seminar. I
would also like to thank our seminar coordinator Prof. S. Y. Mandlik for con-
stantly motivating me and for giving me a chance to give a seminar on a creative
work.
I am extremely grateful to our Principal Dr. D. J. Garkal for providing state-of-the-art
facilities. I take this opportunity to thank all the professors of the department for
providing useful guidance and timely encouragement, which helped me to complete
this seminar more confidently.
I am also very thankful to my family, friends and classmates who have rendered their
wholehearted support at all times for the successful completion of this seminar.
Mr./Ms. Thorat Pratik Ravindra.
JCOE, Kuran.
ABSTRACT
In this paper, a smartphone-based learning monitoring system is presented. During
pandemics, most parents are not used to dealing simultaneously with their home-office
activities and the monitoring of their children's home-schooling activities. Therefore,
a system allowing a parent, teacher or tutor to assign a task and its corresponding
execution time to children could be helpful in this situation. In this work, a mobile
application to assign academic tasks to a child, measure execution time, and monitor
the child's attention is proposed. The children are the users of
a mobile application, hosted on a smartphone or tablet device, that displays an as-
signed task and keeps track of the time consumed by the child to perform this task.
Time measurement is performed using face recognition, so it is possible to infer the
attention of the child based on the presence or absence of a face. The app also mea-
sures the time that the application was in the foreground, as well as the time that the
application was sent to the background, to measure boredom. The parent or teacher
assigns a task using a desktop application specifically designed for this purpose. At
the end of the time set by the user, the application sends to the parent or teacher
statistics about the execution time of the task and the degree of attention of the child.
INDEX
Acknowledgement i
Abstract ii
Index iii
List of Figures v
List of Tables vi
1 INTRODUCTION 1
1.1 Introduction to Smartphone-Based Remote Monitoring Tool for e-Learning 1
1.2 All About Basics . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2
1.3 Motivation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3
2 LITERATURE SURVEY 4
2.1 Smartphone-Based Remote Monitoring Tool for e-Learning : . . . . . 4
2.2 Visual Attention Localization for Mobile Service Computing . . . . . 5
2.3 Automatic emotion and attention analysis of young children at home: 5
3 SYSTEM FRAMEWORK 7
3.1 DaLeMO MAIN MODULES . . . . . . . . . . . . . . . . . . . . . . 7
3.2 MaLeMO MAIN MODULES . . . . . . . . . . . . . . . . . . . . . 8
4 Algorithm 10
5 APPLICATIONS 11
5.1 Smartphone-Based Remote Monitoring Tool for e-Learning . . . . . 11
5.1.1 ATTENTION MEASUREMENT METRICS . . . . . . . . . 11
5.1.2 Time lapses measured by the MaLeMO . . . . . . . . . . . . . 11
6 CONCLUSION 12
7 FUTURE SCOPE 13
BIBLIOGRAPHY 14
List of Figures
List of Tables
Chapter 1
INTRODUCTION
1.1 Introduction to Smartphone-Based Remote Monitoring Tool for e-Learning
In this paper, a tool to monitor the learning activities of children whose parents
are working from home is proposed. The developed tool monitors the attention
levels of children solving assigned tasks that cannot be supervised by a present adult.
The monitoring information is useful for both parents and teachers, who can use it
to make decisions about the effectiveness of remote learning methods. The proposed
tools are focused on handwritten tasks, such as solving arithmetic or algebraic
operations, writing some paragraphs or drawing, where children do not directly interact
with the smartphone/tablet and use it only to read the task description and to report
image-based evidence of the work carried out, which will later be reviewed by a teacher
or a parent. Several approaches have been introduced in order to develop software
tools for different gaze estimation purposes in e-learning. In 2018, Steil et al. [1]
presented work on the task of predicting users' gaze behaviour (overt visual
attention) in the near future. Kangas et al. [2] described a study of combining gaze
gestures with vibrotactile feedback. In this study, gaze gestures were used as input
for a mobile device and vibrotactile feedback as a new alternative way to give con-
firmation of interaction events. Results show that vibrotactile feedback significantly
improved the use of gaze. Tonsen et al. [3] proposed an eye tracking tool that
uses millimeter-size RGB cameras that can be fully embedded into normal glasses.
To compensate for the cameras’ low image resolution of only a few pixels, the pro-
posed approach uses multiple cameras to capture different views of the eye, as well as
learning-based gaze estimation to regress from eye images to gaze directions directly.
Mesquita and Lopes [4] introduced an image processing system to provide a valuable
aid to kindergarten teachers, helping them in the task of registering observation, by
automatically detecting and measuring head posture, including time records.
In 2014, Teyeb et al. [5] presented a drowsiness detection system based on a video
approach to analyze two of the driver's visual signs: eye blinking and head movement.
In 2019, Hutanu and Bertea [6] described eye tracking as one of the most accurate
methods to record user behavior. Eye tracking is a method that records real, natural
and objective user behavior. Eye movements are fast and subconscious, and may
reveal information that is not accessible even to the respondent. The human gaze can
tell exactly what has been seen, in what order and for how long and, on the other
hand, what has been missed. Therefore, eye tracking data gives
valuable insights for improving the learning process. Chong et al. in 2018 [7] proposed
automated methods for detecting and quantifying visual attention from images and
video of gaze following. Gaze following can be applied to identify and categorize quick
glances at objects, and even to identify when someone is not paying attention.
Copeland [8] stated that using eye gaze to control the adaptation of e-learning
environments provides the ability to go beyond the student's surface answering behavior
or preferences and adapt to the student’s implicit behavior. Thus, eye tracking has
been shown to be a powerful tool for investigating how humans interact with com-
puter interfaces. In 2018, Waleed et al. [9] introduced an accurate, reliable and
low-cost system based on a laptop webcam for eye-gaze estimation. Ricciardelli and
Driver [10] described the impact of different sources of information by examining the
role of head orientation in gaze-direction judgments, based on past studies, where
head orientation may be combined with perceived gaze direction in judging where
someone else is attending. Fathi et al. [11] presented a probabilistic generative model
for simultaneously recognizing daily actions and predicting gaze locations in videos
recorded from an egocentric camera.
1.2 All About Basics
• In 2014, Teyeb et al. [5] presented a drowsiness detection system based on a video
approach to analyze two of the driver's visual signs: eye blinking and head movement.
In 2019, Hutanu and Bertea [6] described eye tracking as one of the most accurate
methods to record user behavior. Eye tracking is a method that records real, natural
and objective user behavior. Eye movements are fast and subconscious, and may
reveal information that is not accessible even to the respondent. The human gaze
can tell exactly what has been seen, in what order and for how long and, on the
other hand, what has been missed. Therefore, eye tracking data gives valuable
insights for improving the learning process. In 2018, Chong et al. [7] proposed
automated methods for detecting and quantifying visual attention from images
and video of gaze following. Gaze following can be applied to identify and
categorize quick glances at objects, and even to identify when someone is not
paying attention.
1.3 Motivation
A model exists to estimate the behavior and concentration of children while studying.
A platform can help in connecting children, parents and teachers.
Chapter 2
LITERATURE SURVEY
2.1 Smartphone-Based Remote Monitoring Tool for e-Learning
In this paper, a tool to monitor the learning activities of children whose parents
are working from home is proposed. The developed tool monitors the attention
levels of children solving assigned tasks that cannot be supervised by a present adult.
The monitoring information is useful for both parents and teachers, who can use it
to make decisions about the effectiveness of remote learning methods. The proposed
tools are focused on handwritten tasks, such as solving arithmetic or algebraic oper-
ations, writing some paragraphs or drawing, where children do not directly interact
with the smartphone/tablet, and use it only to read the task description and to report
image-based evidence of the work carried out, which will later be reviewed by a teacher
or a parent. Several approaches have been introduced in order to develop software
tools for different gaze estimation purposes in e-learning. In 2018, Steil et al. [1]
presented work on the task of predicting users' gaze behaviour (overt visual
attention) in the near future. Kangas et al. [2] described a study of combining gaze
gestures with vibrotactile feedback. In this study, gaze gestures were used as input
for a mobile device and vibrotactile feedback as a new alternative way to give con-
firmation of interaction events. Results show that vibrotactile feedback significantly
improved the use of gaze. Tonsen et al. [3] proposed an eye tracking tool that
uses millimeter-size RGB cameras that can be fully embedded into normal glasses.
To compensate for the cameras’ low image resolution of only a few pixels, the pro-
posed approach uses multiple cameras to capture different views of the eye, as well as
learning-based gaze estimation to regress from eye images to gaze directions directly.
Mesquita and Lopes [4] introduced an image processing system to provide a valuable
aid to kindergarten teachers, helping them in the task of registering observation, by
automatically detecting and measuring head posture, including time records.
2.2 Visual Attention Localization for Mobile Service Computing
Identifying and localizing the user’s visual attention can enable various intelli-
gent service computing paradigms in a mobile environment. However, existing solu-
tions can only compute the gaze direction, but not the distance to the intended
target. In addition, most of them rely on an eye tracker or similar infrastructure support.
This paper explores the possibility of using portable mobile devices, e.g., smartphones,
to detect the visual attention of a user. i-VALS only requires the user to do one sim-
ple action to localize the intended object: gazing at the intended object and holding
up the smartphone so that the object and the user’s face can be simultaneously cap-
tured by the front and rear cameras. We develop efficient algorithms to obtain
the distance between the camera and the user, the user's gaze direction, and the object's
direction from the camera. The object's location can then be computed by solving
a trigonometric problem. i-VALS has been prototyped on commercial off-the-shelf
(COTS) devices. The extensive experiment results show that i-VALS achieves high
accuracy and small latency, effectively supporting a large variety of applications in
smart environments.
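The exact algorithm of i-VALS is not given in this summary, but the triangle formed by the camera, the user and the object makes the computation easy to illustrate. The sketch below is a simplified 2-D example assuming the camera-to-user distance and the two bearing angles have already been estimated; the function name, angle conventions and values are illustrative assumptions, not i-VALS itself.

```python
import math

def localize_object(cam_to_user_dist, angle_at_camera, angle_at_user):
    """Estimate the camera-to-object distance in a 2-D camera-user-object triangle.

    cam_to_user_dist : distance between the camera and the user's face (meters).
    angle_at_camera  : angle at the camera vertex, between the direction to the
                       user and the direction to the object (from the rear camera).
    angle_at_user    : angle at the user vertex, between the line back to the
                       camera and the user's gaze direction.
    Angles are in radians; the geometry and names are assumptions for illustration.
    """
    # The three interior angles of the triangle sum to pi.
    angle_at_object = math.pi - angle_at_camera - angle_at_user
    # Law of sines: the side opposite the user's angle is the camera-object distance.
    return cam_to_user_dist * math.sin(angle_at_user) / math.sin(angle_at_object)

# Example: user 0.4 m away, object seen 30 deg off the user's direction,
# user gazing 100 deg away from the line back to the camera.
dist = localize_object(0.4, math.radians(30), math.radians(100))
print(f"camera-to-object distance: {dist:.2f} m")
```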
2.3 Automatic emotion and attention analysis of young children at home:
Current tools for objectively measuring young children’s observed behaviors
are expensive, time-consuming, and require extensive training and professional ad-
ministration. The lack of scalable, reliable, and validated tools impacts access to
evidence-based knowledge and limits our capacity to collect population-level data in
non-clinical settings. To address this gap, we developed mobile technology to collect
videos of young children while they watched movies designed to elicit autism-related
behaviors and then used automatic behavioral coding of these videos to quantify chil-
dren's emotions and behaviors. We present results from our iPhone study, Autism and
Beyond, built on the open-source ResearchKit platform. The entire study, from an
e-Consent process to stimuli presentation and data collection, was conducted within
an iPhone-based app available in the App Store. Over 1 year, 1756 families with
children aged 12–72 months old participated in the study, completing 5618 caregiver-
reported surveys and uploading 4441 videos recorded in the child’s natural settings.
Usable data were collected on 87.6% of the uploaded videos. Automatic coding identified significant differences
in emotion and attention by age, sex, and autism risk status. This study demonstrates
the acceptability of an app-based tool to caregivers, their willingness to upload videos
of their children, the feasibility of caregiver-collected data in the home, and the appli-
cation of automatic behavioral encoding to quantify emotions and attention variables
that are clinically meaningful and may be refined to screen children for autism and
developmental disorders outside of clinical settings. This technology has the potential
to transform how we screen and monitor children's development.
Chapter 3
SYSTEM FRAMEWORK
3.1 DaLeMO MAIN MODULES
The main modules of the DaLeMO are depicted in figure 2. The details of each
module are described below (a sketch of how these modules might interact with the
WSLeMO follows the list):
• Internet Connection Module (ICM). This module checks if an internet connection
is available and activates the user interface components when the connection is
ready. It also performs the WSLeMO requests, gathers the results of these requests,
and updates the UI with the requested information.
• Teacher Profile Management Module (TPMM). This module captures the teacher's
data and sends them to the WSLeMO through the ICM. When a response is obtained,
the app is notified and the process to create new groups is enabled.
• Group Management Module (GMM). This process is disabled until the teacher
successfully registers. Once enabled, the teacher can add as many courses as desired.
For each created group, the WSLeMO sends a confirmation of its creation and a
unique ID is displayed.
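The report does not document the WSLeMO interface, so the following is only a minimal sketch of how the ICM, TPMM and GMM described above might cooperate; the base URL, endpoint paths, field names and response format are all hypothetical assumptions.

```python
import requests

BASE_URL = "https://example.org/wslemo"   # hypothetical web-service address

def internet_available(timeout=3):
    """ICM-style check: return True if the WSLeMO can be reached."""
    try:
        requests.head(BASE_URL, timeout=timeout)
        return True
    except requests.RequestException:
        return False

def register_teacher(name, email):
    """TPMM-style request: send the teacher's data and return a teacher id."""
    resp = requests.post(f"{BASE_URL}/teachers", json={"name": name, "email": email})
    resp.raise_for_status()
    return resp.json()["teacher_id"]        # field name is an assumption

def create_group(teacher_id, group_name):
    """GMM-style request: create a group and return its unique ID."""
    resp = requests.post(f"{BASE_URL}/groups",
                         json={"teacher_id": teacher_id, "name": group_name})
    resp.raise_for_status()
    return resp.json()["group_id"]          # displayed to the teacher in the UI

if internet_available():
    tid = register_teacher("A. Teacher", "teacher@example.org")
    print("New group ID:", create_group(tid, "TE Comp - Maths"))
```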
3.2 MaLeMO MAIN MODULES
• Internet Connection Module (ICM). This module performs the same tasks as its
corresponding module in the DaLeMO.
• Activities Acquisition Module (AAM). This module interacts with the ICM to obtain
the activities to be performed by a student from the WSLeMO. The list of activities
is stored in a data structure array, including images and configuration settings.
• Activities Decoding Module (ADM). The input of this module is the data structure
produced by the AAM, and it configures the UIM to visualize the activity correctly
on the screen of the device.
• User Interface Module (UIM). This module receives data from the ADM and
configures the user interface controls to allow the user to interact with the MaLeMO.
It enables the user interface elements (buttons) and controls the information to be
displayed.
• Rear Camera Image Acquisition Module (RCIAM). This module captures one image
from the rear camera of the host device when the student wants to finish the current
activity, and the captured image is attached as evidence of the activity's completion.
• Evidence Integration Module (EIM). This module takes the captured evidence and
packs it into a single structure along with the learning monitoring statistics (LMS).
• Front Camera Image Acquisition Module (FCIAM). This module captures images
from the front camera at a set time interval and stores them in memory for later
analysis.
• Gaze Detection and Analysis Module (GDAM). This module receives the image
captured from the camera, detects the face and eyes (if possible), and attaches the
face width, face length and eye localization information to the face localization data
of the LMS. A minimal sketch of this face and eye detection step is given after this
list.
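The report does not describe how the GDAM detects the face and eyes, so the sketch below uses OpenCV's bundled Haar cascades as one plausible implementation; the function name and the field names of the returned LMS-style record are assumptions, not the report's actual design.

```python
import cv2

# Haar cascades shipped with OpenCV; the real GDAM implementation is unspecified,
# so this is only an illustrative sketch of face and eye localization.
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
eye_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_eye.xml")

def analyze_frame(frame_bgr):
    """Detect a face and eyes in one front-camera frame and return an
    LMS-style record (field names are assumptions)."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return {"face_present": False}
    x, y, w, h = faces[0]                    # use the first detected face
    roi = gray[y:y + h, x:x + w]
    eyes = eye_cascade.detectMultiScale(roi)
    return {
        "face_present": True,
        "face_width": int(w),
        "face_length": int(h),
        "eye_locations": [(int(x + ex + ew // 2), int(y + ey + eh // 2))
                          for ex, ey, ew, eh in eyes],
    }
```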
Test bed. In the early stages of IDS development, standard datasets were not
available [13]. The KDD initiative started the process and designed a standard
intrusion dataset. Evaluation of the model requires standardized data; in this regard,
three different datasets obtained for network intrusion detection are used. The
following datasets are used for evaluation of the model:
1. KDDCUP 99 (simulated data)
2. HTTP CSIC (web traffic)
3. UNB ISCX (real-time captured packets)
These datasets are prepared specifically to test intrusion detection systems;
network-based intrusion detection systems (NIDS) are tested using them. These
datasets were prepared from different sources and contain different numbers and
varieties of features. Each dataset consists of millions of records, with training and
testing data collections. The KDDCUP99 dataset [5] was prepared at the Lincoln
Laboratory by generating simulated data at a defense laboratory, and it includes
derived features. This dataset was presented at the KDD conference in 1999 and
captures many of the intrusion types used in the challenge. The UNB ISCX dataset
was prepared from real traffic in the year 2012.
Chapter 4
Algorithm
Chapter 5
APPLICATIONS
5.1 Smartphone-Based Remote Monitoring Tool for e-Learning
5.1.1 ATTENTION MEASUREMENT METRICS
In this work, four types of activities were designed to measure attention using
the proposed system. These activities must be done by hand on a sheet of paper
and are described below (a sketch of how such an activity might be represented
as a data record follows the list):
1) Writing. In this activity, the child writes a short story, and an image of the
handwritten work should be reported as evidence.
2) Reading. In this activity, the child must analyze a short text, and some
questions about the text must be answered. The handwritten answer sheet
should be reported as evidence.
3) Math. In this activity, the child performs some simple mathematical operations,
and an image of the procedure and results should be reported as evidence.
4) Drawing. In this activity, the child uses his/her imagination to draw some
situations. An image of the resulting drawing should be reported as evidence.
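As one way to picture how an assigned activity of the four types above might be stored, the following sketch defines a small activity record; the class name, fields and the TTC value (described in the next subsection) are illustrative assumptions, not the report's actual data structure.

```python
from dataclasses import dataclass
from enum import Enum

class ActivityType(Enum):
    WRITING = "writing"
    READING = "reading"
    MATH = "math"
    DRAWING = "drawing"

@dataclass
class Activity:
    activity_type: ActivityType
    description: str          # instructions shown to the child in the MaLeMO
    ttc_minutes: int          # time to completion, set by the teacher in the DaLeMO
    evidence_image: str = ""  # path of the photo reported as evidence, if any

task = Activity(ActivityType.MATH,
                "Solve the ten addition exercises on the worksheet.",
                ttc_minutes=20)
```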
5.1.2 Time lapses measured by the MaLeMO
The MaLeMO records the following time lapses (a sketch of how they might be
computed appears after this list):
• Time to completion (TTC). This is the estimated time required for a particular
activity to be completed. It is mainly based on the experience of the teacher and
is manually set when the activity is created in the DaLeMO.
• Net time to completion (NTTC). This is the time computed by the MaLeMO to
finish a given activity. It is the sum of all active time intervals.
• Gross time to completion (GTTC). This is the time computed by the MaLeMO,
including lapses of interruption. It is the sum of both active and inactive time
intervals.
• Average active time (AAT). This is the average of the active time lapses.
• Number of Transitions between Foreground and Background States (NTFBS). This
is a counter that is increased each time the MaLeMO alternates between foreground
and background.
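To make the relationship between these time lapses concrete, the sketch below computes NTTC, GTTC, AAT and NTFBS from a log of foreground/background intervals; the log format and function name are assumptions, since the report does not describe the MaLeMO's internal bookkeeping, and treating foreground time as active time is a simplification (the report also discounts moments when no face is visible).

```python
def summarize_intervals(intervals):
    """Compute the metrics above from a list of (state, seconds) entries,
    where state is 'foreground' or 'background' (an assumed log format)."""
    fg = [sec for state, sec in intervals if state == "foreground"]
    bg = [sec for state, sec in intervals if state == "background"]

    nttc = sum(fg)                         # net time to completion (active time)
    gttc = nttc + sum(bg)                  # gross time, including interruptions
    aat = nttc / len(fg) if fg else 0.0    # average active time per lapse
    states = [state for state, _ in intervals]
    ntfbs = sum(1 for a, b in zip(states, states[1:]) if a != b)  # transitions

    return {"NTTC": nttc, "GTTC": gttc, "AAT": aat, "NTFBS": ntfbs}

log = [("foreground", 300), ("background", 60), ("foreground", 240),
       ("background", 30), ("foreground", 120)]
print(summarize_intervals(log))   # NTFBS is 4 for this example log
```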
Chapter 6
CONCLUSION
In this work, a Remote Learning Monitoring System (RLMS) has been proposed.
The proposed system allows the parent or teacher to use the DaLeMO to
assign some learning activities to be carried out by children. Children can use the
MaLeMO to read the instructions of these activities, and remotely send image-based
evidence of the performed work. The MaLeMO computes some statistics of the
children's attention and reports them to the WSLeMO. The teacher/parent can obtain
the stored statistics by retrieving them from the WSLeMO and analyze them to make
better decisions about the learning exercises and techniques to be employed.
Chapter 7
FUTURE SCOPE
The proposed system improves existing developments by avoiding the use of per-
sonal computers and external sensors to remotely monitor learning. In addition, this
system allows the professor to easily customize the activities to be performed by the
student, and to get practically instantaneous feedback through the statistics compiled
by the MaLeMO. Finally, the results show that the attention monitoring fails in some
isolated cases. This could be addressed by implementing gaze tracking
and attention detection using deep-learning techniques. Although there are advanced
techniques to measure attention from gaze tracking, the focus of the proposed work
was to measure attention based on the localization of the face and eyes and the time
they are visible in front of the application, complemented with timers designed to
measure the interruptions of such activity as a result of sending the MaLeMo to the
background by opening other apps in the device. To improve the attention measure-
ment, better gaze tracking methods must be added to the proposed RLMS.
Bibliography
[1] H. L. Egger, G. Dawson, J. Hashemi, K. L. H. Carpenter, S. Espinosa, K. Camp-
bell, S. Brotkin, J. Schaich-Borg, Q. Qiu, M. Tepper, J. P. Baker, R. A. Bloom-
field, and G. Sapiro, “Automatic emotion and attention analysis of young chil-
dren at home: A ResearchKit autism feasibility study,” NPJ Digit. Med., vol. 1,
no. 1, p. 20, Dec. 2018.
[2] P. Smutny and P. Schreiberova, “Chatbots for learning: A review of educational
chatbots for the Facebook messenger,” Comput. Educ., vol. 151, Jul. 2020, Art.
no. 103862.
[3] O. A. H. Jones, M. Spichkova, and M. J. S. Spencer, “Chirality-2: Develop-
ment of a multilevel mobile gaming app to support the teaching of introduc-
tory undergraduate-level organic chemistry,” J. Chem. Educ., vol. 95, no. 7, pp.
1216–1220, Jul. 2018.
[4] M. S. Hornbæk, J. Hellevik, C. Schaarup, M. D. Johansen, and O. K. Hejlesen,
“Usability of eye tracking for studying the benefits of e-learning tutorials on safe
moving and handling techniques,” in Proc. Linköping Electron. Conf., 2018, pp.
56–61.
14

More Related Content

Similar to Seminar report pratik70.pdf

DEVELOPMENT OF INTERACTIVE INSTRUCTIONAL MODEL USING AUGMENTED REALITY BASED ...
DEVELOPMENT OF INTERACTIVE INSTRUCTIONAL MODEL USING AUGMENTED REALITY BASED ...DEVELOPMENT OF INTERACTIVE INSTRUCTIONAL MODEL USING AUGMENTED REALITY BASED ...
DEVELOPMENT OF INTERACTIVE INSTRUCTIONAL MODEL USING AUGMENTED REALITY BASED ...
IJITE
 
IRJET- Survey on Various Techniques of Attendance marking and Attention D...
IRJET-  	  Survey on Various Techniques of Attendance marking and Attention D...IRJET-  	  Survey on Various Techniques of Attendance marking and Attention D...
IRJET- Survey on Various Techniques of Attendance marking and Attention D...
IRJET Journal
 
Development of interactive instructional model using augmented reality based ...
Development of interactive instructional model using augmented reality based ...Development of interactive instructional model using augmented reality based ...
Development of interactive instructional model using augmented reality based ...
IJITE
 
Ch9visualtech
Ch9visualtechCh9visualtech
Ch9visualtech
dawklein
 
DIGITAL ATTENDANCE USING IBEACON AND FINGERPRINT
DIGITAL ATTENDANCE USING IBEACON AND FINGERPRINTDIGITAL ATTENDANCE USING IBEACON AND FINGERPRINT
DIGITAL ATTENDANCE USING IBEACON AND FINGERPRINT
IRJET Journal
 
Authoring of educational mobile apps for the mathematics-learning analysis
Authoring of educational mobile apps for the mathematics-learning analysisAuthoring of educational mobile apps for the mathematics-learning analysis
Authoring of educational mobile apps for the mathematics-learning analysis
Technological Ecosystems for Enhancing Multiculturality
 
Optimized Active Learning for User’s Behavior Modelling based on Non-Intrusiv...
Optimized Active Learning for User’s Behavior Modelling based on Non-Intrusiv...Optimized Active Learning for User’s Behavior Modelling based on Non-Intrusiv...
Optimized Active Learning for User’s Behavior Modelling based on Non-Intrusiv...
IJECEIAES
 
Google glass IEEE Seminar report
Google glass  IEEE Seminar reportGoogle glass  IEEE Seminar report
Google glass IEEE Seminar report
Samana Rao
 
Primary education system in India
Primary education system in IndiaPrimary education system in India
Primary education system in India
National Management Olympiad
 
USABILITY ENGINEERING OF GAMES: A COMPARATIVE ANALYSIS OF MEASURING EXCITEMEN...
USABILITY ENGINEERING OF GAMES: A COMPARATIVE ANALYSIS OF MEASURING EXCITEMEN...USABILITY ENGINEERING OF GAMES: A COMPARATIVE ANALYSIS OF MEASURING EXCITEMEN...
USABILITY ENGINEERING OF GAMES: A COMPARATIVE ANALYSIS OF MEASURING EXCITEMEN...
ijujournal
 
Usability engineering of games a comparative analysis of measuring excitement...
Usability engineering of games a comparative analysis of measuring excitement...Usability engineering of games a comparative analysis of measuring excitement...
Usability engineering of games a comparative analysis of measuring excitement...
ijujournal
 
Usability engineering of games a comparative analysis of measuring excitement...
Usability engineering of games a comparative analysis of measuring excitement...Usability engineering of games a comparative analysis of measuring excitement...
Usability engineering of games a comparative analysis of measuring excitement...
ijujournal
 
Automated ICT Literacy Skill Assessment Using RateSkill System
Automated ICT Literacy Skill Assessment Using RateSkill SystemAutomated ICT Literacy Skill Assessment Using RateSkill System
Automated ICT Literacy Skill Assessment Using RateSkill System
International Journal of Science and Research (IJSR)
 
UI/UX integrated holistic monitoring of PAUD using the TCSD method
UI/UX integrated holistic monitoring of PAUD using the TCSD methodUI/UX integrated holistic monitoring of PAUD using the TCSD method
UI/UX integrated holistic monitoring of PAUD using the TCSD method
journalBEEI
 
adrianorenzi_duxu2014
adrianorenzi_duxu2014adrianorenzi_duxu2014
adrianorenzi_duxu2014
Adriano Renzi
 
STUDENT ENGAGEMENT MONITORING IN ONLINE LEARNING ENVIRONMENT USING FACE DETEC...
STUDENT ENGAGEMENT MONITORING IN ONLINE LEARNING ENVIRONMENT USING FACE DETEC...STUDENT ENGAGEMENT MONITORING IN ONLINE LEARNING ENVIRONMENT USING FACE DETEC...
STUDENT ENGAGEMENT MONITORING IN ONLINE LEARNING ENVIRONMENT USING FACE DETEC...
IRJET Journal
 
City i-Tick: The android based mobile application for students’ attendance at...
City i-Tick: The android based mobile application for students’ attendance at...City i-Tick: The android based mobile application for students’ attendance at...
City i-Tick: The android based mobile application for students’ attendance at...
journalBEEI
 
Face recognition smart cane using haar-like features and eigenfaces
Face recognition smart cane using haar-like features and eigenfacesFace recognition smart cane using haar-like features and eigenfaces
Face recognition smart cane using haar-like features and eigenfaces
TELKOMNIKA JOURNAL
 
Literature Review .docx
Literature Review                                                 .docxLiterature Review                                                 .docx
Literature Review .docx
SHIVA101531
 
A-MOBILE-GRADING-APP.pptx
A-MOBILE-GRADING-APP.pptxA-MOBILE-GRADING-APP.pptx
A-MOBILE-GRADING-APP.pptx
LucioParcutela1
 

Similar to Seminar report pratik70.pdf (20)

DEVELOPMENT OF INTERACTIVE INSTRUCTIONAL MODEL USING AUGMENTED REALITY BASED ...
DEVELOPMENT OF INTERACTIVE INSTRUCTIONAL MODEL USING AUGMENTED REALITY BASED ...DEVELOPMENT OF INTERACTIVE INSTRUCTIONAL MODEL USING AUGMENTED REALITY BASED ...
DEVELOPMENT OF INTERACTIVE INSTRUCTIONAL MODEL USING AUGMENTED REALITY BASED ...
 
IRJET- Survey on Various Techniques of Attendance marking and Attention D...
IRJET-  	  Survey on Various Techniques of Attendance marking and Attention D...IRJET-  	  Survey on Various Techniques of Attendance marking and Attention D...
IRJET- Survey on Various Techniques of Attendance marking and Attention D...
 
Development of interactive instructional model using augmented reality based ...
Development of interactive instructional model using augmented reality based ...Development of interactive instructional model using augmented reality based ...
Development of interactive instructional model using augmented reality based ...
 
Ch9visualtech
Ch9visualtechCh9visualtech
Ch9visualtech
 
DIGITAL ATTENDANCE USING IBEACON AND FINGERPRINT
DIGITAL ATTENDANCE USING IBEACON AND FINGERPRINTDIGITAL ATTENDANCE USING IBEACON AND FINGERPRINT
DIGITAL ATTENDANCE USING IBEACON AND FINGERPRINT
 
Authoring of educational mobile apps for the mathematics-learning analysis
Authoring of educational mobile apps for the mathematics-learning analysisAuthoring of educational mobile apps for the mathematics-learning analysis
Authoring of educational mobile apps for the mathematics-learning analysis
 
Optimized Active Learning for User’s Behavior Modelling based on Non-Intrusiv...
Optimized Active Learning for User’s Behavior Modelling based on Non-Intrusiv...Optimized Active Learning for User’s Behavior Modelling based on Non-Intrusiv...
Optimized Active Learning for User’s Behavior Modelling based on Non-Intrusiv...
 
Google glass IEEE Seminar report
Google glass  IEEE Seminar reportGoogle glass  IEEE Seminar report
Google glass IEEE Seminar report
 
Primary education system in India
Primary education system in IndiaPrimary education system in India
Primary education system in India
 
USABILITY ENGINEERING OF GAMES: A COMPARATIVE ANALYSIS OF MEASURING EXCITEMEN...
USABILITY ENGINEERING OF GAMES: A COMPARATIVE ANALYSIS OF MEASURING EXCITEMEN...USABILITY ENGINEERING OF GAMES: A COMPARATIVE ANALYSIS OF MEASURING EXCITEMEN...
USABILITY ENGINEERING OF GAMES: A COMPARATIVE ANALYSIS OF MEASURING EXCITEMEN...
 
Usability engineering of games a comparative analysis of measuring excitement...
Usability engineering of games a comparative analysis of measuring excitement...Usability engineering of games a comparative analysis of measuring excitement...
Usability engineering of games a comparative analysis of measuring excitement...
 
Usability engineering of games a comparative analysis of measuring excitement...
Usability engineering of games a comparative analysis of measuring excitement...Usability engineering of games a comparative analysis of measuring excitement...
Usability engineering of games a comparative analysis of measuring excitement...
 
Automated ICT Literacy Skill Assessment Using RateSkill System
Automated ICT Literacy Skill Assessment Using RateSkill SystemAutomated ICT Literacy Skill Assessment Using RateSkill System
Automated ICT Literacy Skill Assessment Using RateSkill System
 
UI/UX integrated holistic monitoring of PAUD using the TCSD method
UI/UX integrated holistic monitoring of PAUD using the TCSD methodUI/UX integrated holistic monitoring of PAUD using the TCSD method
UI/UX integrated holistic monitoring of PAUD using the TCSD method
 
adrianorenzi_duxu2014
adrianorenzi_duxu2014adrianorenzi_duxu2014
adrianorenzi_duxu2014
 
STUDENT ENGAGEMENT MONITORING IN ONLINE LEARNING ENVIRONMENT USING FACE DETEC...
STUDENT ENGAGEMENT MONITORING IN ONLINE LEARNING ENVIRONMENT USING FACE DETEC...STUDENT ENGAGEMENT MONITORING IN ONLINE LEARNING ENVIRONMENT USING FACE DETEC...
STUDENT ENGAGEMENT MONITORING IN ONLINE LEARNING ENVIRONMENT USING FACE DETEC...
 
City i-Tick: The android based mobile application for students’ attendance at...
City i-Tick: The android based mobile application for students’ attendance at...City i-Tick: The android based mobile application for students’ attendance at...
City i-Tick: The android based mobile application for students’ attendance at...
 
Face recognition smart cane using haar-like features and eigenfaces
Face recognition smart cane using haar-like features and eigenfacesFace recognition smart cane using haar-like features and eigenfaces
Face recognition smart cane using haar-like features and eigenfaces
 
Literature Review .docx
Literature Review                                                 .docxLiterature Review                                                 .docx
Literature Review .docx
 
A-MOBILE-GRADING-APP.pptx
A-MOBILE-GRADING-APP.pptxA-MOBILE-GRADING-APP.pptx
A-MOBILE-GRADING-APP.pptx
 

Recently uploaded

Understanding Inductive Bias in Machine Learning
Understanding Inductive Bias in Machine LearningUnderstanding Inductive Bias in Machine Learning
Understanding Inductive Bias in Machine Learning
SUTEJAS
 
basic-wireline-operations-course-mahmoud-f-radwan.pdf
basic-wireline-operations-course-mahmoud-f-radwan.pdfbasic-wireline-operations-course-mahmoud-f-radwan.pdf
basic-wireline-operations-course-mahmoud-f-radwan.pdf
NidhalKahouli2
 
22CYT12-Unit-V-E Waste and its Management.ppt
22CYT12-Unit-V-E Waste and its Management.ppt22CYT12-Unit-V-E Waste and its Management.ppt
22CYT12-Unit-V-E Waste and its Management.ppt
KrishnaveniKrishnara1
 
Engineering Drawings Lecture Detail Drawings 2014.pdf
Engineering Drawings Lecture Detail Drawings 2014.pdfEngineering Drawings Lecture Detail Drawings 2014.pdf
Engineering Drawings Lecture Detail Drawings 2014.pdf
abbyasa1014
 
Redefining brain tumor segmentation: a cutting-edge convolutional neural netw...
Redefining brain tumor segmentation: a cutting-edge convolutional neural netw...Redefining brain tumor segmentation: a cutting-edge convolutional neural netw...
Redefining brain tumor segmentation: a cutting-edge convolutional neural netw...
IJECEIAES
 
Harnessing WebAssembly for Real-time Stateless Streaming Pipelines
Harnessing WebAssembly for Real-time Stateless Streaming PipelinesHarnessing WebAssembly for Real-time Stateless Streaming Pipelines
Harnessing WebAssembly for Real-time Stateless Streaming Pipelines
Christina Lin
 
Recycled Concrete Aggregate in Construction Part II
Recycled Concrete Aggregate in Construction Part IIRecycled Concrete Aggregate in Construction Part II
Recycled Concrete Aggregate in Construction Part II
Aditya Rajan Patra
 
Advanced control scheme of doubly fed induction generator for wind turbine us...
Advanced control scheme of doubly fed induction generator for wind turbine us...Advanced control scheme of doubly fed induction generator for wind turbine us...
Advanced control scheme of doubly fed induction generator for wind turbine us...
IJECEIAES
 
哪里办理(csu毕业证书)查尔斯特大学毕业证硕士学历原版一模一样
哪里办理(csu毕业证书)查尔斯特大学毕业证硕士学历原版一模一样哪里办理(csu毕业证书)查尔斯特大学毕业证硕士学历原版一模一样
哪里办理(csu毕业证书)查尔斯特大学毕业证硕士学历原版一模一样
insn4465
 
Optimizing Gradle Builds - Gradle DPE Tour Berlin 2024
Optimizing Gradle Builds - Gradle DPE Tour Berlin 2024Optimizing Gradle Builds - Gradle DPE Tour Berlin 2024
Optimizing Gradle Builds - Gradle DPE Tour Berlin 2024
Sinan KOZAK
 
132/33KV substation case study Presentation
132/33KV substation case study Presentation132/33KV substation case study Presentation
132/33KV substation case study Presentation
kandramariana6
 
Electric vehicle and photovoltaic advanced roles in enhancing the financial p...
Electric vehicle and photovoltaic advanced roles in enhancing the financial p...Electric vehicle and photovoltaic advanced roles in enhancing the financial p...
Electric vehicle and photovoltaic advanced roles in enhancing the financial p...
IJECEIAES
 
学校原版美国波士顿大学毕业证学历学位证书原版一模一样
学校原版美国波士顿大学毕业证学历学位证书原版一模一样学校原版美国波士顿大学毕业证学历学位证书原版一模一样
学校原版美国波士顿大学毕业证学历学位证书原版一模一样
171ticu
 
Iron and Steel Technology Roadmap - Towards more sustainable steelmaking.pdf
Iron and Steel Technology Roadmap - Towards more sustainable steelmaking.pdfIron and Steel Technology Roadmap - Towards more sustainable steelmaking.pdf
Iron and Steel Technology Roadmap - Towards more sustainable steelmaking.pdf
RadiNasr
 
ISPM 15 Heat Treated Wood Stamps and why your shipping must have one
ISPM 15 Heat Treated Wood Stamps and why your shipping must have oneISPM 15 Heat Treated Wood Stamps and why your shipping must have one
ISPM 15 Heat Treated Wood Stamps and why your shipping must have one
Las Vegas Warehouse
 
IEEE Aerospace and Electronic Systems Society as a Graduate Student Member
IEEE Aerospace and Electronic Systems Society as a Graduate Student MemberIEEE Aerospace and Electronic Systems Society as a Graduate Student Member
IEEE Aerospace and Electronic Systems Society as a Graduate Student Member
VICTOR MAESTRE RAMIREZ
 
CHINA’S GEO-ECONOMIC OUTREACH IN CENTRAL ASIAN COUNTRIES AND FUTURE PROSPECT
CHINA’S GEO-ECONOMIC OUTREACH IN CENTRAL ASIAN COUNTRIES AND FUTURE PROSPECTCHINA’S GEO-ECONOMIC OUTREACH IN CENTRAL ASIAN COUNTRIES AND FUTURE PROSPECT
CHINA’S GEO-ECONOMIC OUTREACH IN CENTRAL ASIAN COUNTRIES AND FUTURE PROSPECT
jpsjournal1
 
TIME DIVISION MULTIPLEXING TECHNIQUE FOR COMMUNICATION SYSTEM
TIME DIVISION MULTIPLEXING TECHNIQUE FOR COMMUNICATION SYSTEMTIME DIVISION MULTIPLEXING TECHNIQUE FOR COMMUNICATION SYSTEM
TIME DIVISION MULTIPLEXING TECHNIQUE FOR COMMUNICATION SYSTEM
HODECEDSIET
 
Casting-Defect-inSlab continuous casting.pdf
Casting-Defect-inSlab continuous casting.pdfCasting-Defect-inSlab continuous casting.pdf
Casting-Defect-inSlab continuous casting.pdf
zubairahmad848137
 
Properties Railway Sleepers and Test.pptx
Properties Railway Sleepers and Test.pptxProperties Railway Sleepers and Test.pptx
Properties Railway Sleepers and Test.pptx
MDSABBIROJJAMANPAYEL
 

Recently uploaded (20)

Understanding Inductive Bias in Machine Learning
Understanding Inductive Bias in Machine LearningUnderstanding Inductive Bias in Machine Learning
Understanding Inductive Bias in Machine Learning
 
basic-wireline-operations-course-mahmoud-f-radwan.pdf
basic-wireline-operations-course-mahmoud-f-radwan.pdfbasic-wireline-operations-course-mahmoud-f-radwan.pdf
basic-wireline-operations-course-mahmoud-f-radwan.pdf
 
22CYT12-Unit-V-E Waste and its Management.ppt
22CYT12-Unit-V-E Waste and its Management.ppt22CYT12-Unit-V-E Waste and its Management.ppt
22CYT12-Unit-V-E Waste and its Management.ppt
 
Engineering Drawings Lecture Detail Drawings 2014.pdf
Engineering Drawings Lecture Detail Drawings 2014.pdfEngineering Drawings Lecture Detail Drawings 2014.pdf
Engineering Drawings Lecture Detail Drawings 2014.pdf
 
Redefining brain tumor segmentation: a cutting-edge convolutional neural netw...
Redefining brain tumor segmentation: a cutting-edge convolutional neural netw...Redefining brain tumor segmentation: a cutting-edge convolutional neural netw...
Redefining brain tumor segmentation: a cutting-edge convolutional neural netw...
 
Harnessing WebAssembly for Real-time Stateless Streaming Pipelines
Harnessing WebAssembly for Real-time Stateless Streaming PipelinesHarnessing WebAssembly for Real-time Stateless Streaming Pipelines
Harnessing WebAssembly for Real-time Stateless Streaming Pipelines
 
Recycled Concrete Aggregate in Construction Part II
Recycled Concrete Aggregate in Construction Part IIRecycled Concrete Aggregate in Construction Part II
Recycled Concrete Aggregate in Construction Part II
 
Advanced control scheme of doubly fed induction generator for wind turbine us...
Advanced control scheme of doubly fed induction generator for wind turbine us...Advanced control scheme of doubly fed induction generator for wind turbine us...
Advanced control scheme of doubly fed induction generator for wind turbine us...
 
哪里办理(csu毕业证书)查尔斯特大学毕业证硕士学历原版一模一样
哪里办理(csu毕业证书)查尔斯特大学毕业证硕士学历原版一模一样哪里办理(csu毕业证书)查尔斯特大学毕业证硕士学历原版一模一样
哪里办理(csu毕业证书)查尔斯特大学毕业证硕士学历原版一模一样
 
Optimizing Gradle Builds - Gradle DPE Tour Berlin 2024
Optimizing Gradle Builds - Gradle DPE Tour Berlin 2024Optimizing Gradle Builds - Gradle DPE Tour Berlin 2024
Optimizing Gradle Builds - Gradle DPE Tour Berlin 2024
 
132/33KV substation case study Presentation
132/33KV substation case study Presentation132/33KV substation case study Presentation
132/33KV substation case study Presentation
 
Electric vehicle and photovoltaic advanced roles in enhancing the financial p...
Electric vehicle and photovoltaic advanced roles in enhancing the financial p...Electric vehicle and photovoltaic advanced roles in enhancing the financial p...
Electric vehicle and photovoltaic advanced roles in enhancing the financial p...
 
学校原版美国波士顿大学毕业证学历学位证书原版一模一样
学校原版美国波士顿大学毕业证学历学位证书原版一模一样学校原版美国波士顿大学毕业证学历学位证书原版一模一样
学校原版美国波士顿大学毕业证学历学位证书原版一模一样
 
Iron and Steel Technology Roadmap - Towards more sustainable steelmaking.pdf
Iron and Steel Technology Roadmap - Towards more sustainable steelmaking.pdfIron and Steel Technology Roadmap - Towards more sustainable steelmaking.pdf
Iron and Steel Technology Roadmap - Towards more sustainable steelmaking.pdf
 
ISPM 15 Heat Treated Wood Stamps and why your shipping must have one
ISPM 15 Heat Treated Wood Stamps and why your shipping must have oneISPM 15 Heat Treated Wood Stamps and why your shipping must have one
ISPM 15 Heat Treated Wood Stamps and why your shipping must have one
 
IEEE Aerospace and Electronic Systems Society as a Graduate Student Member
IEEE Aerospace and Electronic Systems Society as a Graduate Student MemberIEEE Aerospace and Electronic Systems Society as a Graduate Student Member
IEEE Aerospace and Electronic Systems Society as a Graduate Student Member
 
CHINA’S GEO-ECONOMIC OUTREACH IN CENTRAL ASIAN COUNTRIES AND FUTURE PROSPECT
CHINA’S GEO-ECONOMIC OUTREACH IN CENTRAL ASIAN COUNTRIES AND FUTURE PROSPECTCHINA’S GEO-ECONOMIC OUTREACH IN CENTRAL ASIAN COUNTRIES AND FUTURE PROSPECT
CHINA’S GEO-ECONOMIC OUTREACH IN CENTRAL ASIAN COUNTRIES AND FUTURE PROSPECT
 
TIME DIVISION MULTIPLEXING TECHNIQUE FOR COMMUNICATION SYSTEM
TIME DIVISION MULTIPLEXING TECHNIQUE FOR COMMUNICATION SYSTEMTIME DIVISION MULTIPLEXING TECHNIQUE FOR COMMUNICATION SYSTEM
TIME DIVISION MULTIPLEXING TECHNIQUE FOR COMMUNICATION SYSTEM
 
Casting-Defect-inSlab continuous casting.pdf
Casting-Defect-inSlab continuous casting.pdfCasting-Defect-inSlab continuous casting.pdf
Casting-Defect-inSlab continuous casting.pdf
 
Properties Railway Sleepers and Test.pptx
Properties Railway Sleepers and Test.pptxProperties Railway Sleepers and Test.pptx
Properties Railway Sleepers and Test.pptx
 

Seminar report pratik70.pdf

  • 1. A SEMINAR REPORT ON “Smartphone-Based Remote Monitoring Tool for e-Learning” Submitted By Mr./ ms. Thorat Pratik Ravindra Roll No: 70 Exam No: . . . . . . . . . . . . Class: TE Comp UNDER THE GUIDANCE OF Prof. A.S. Dumbre DEPARTMENT OF COMPUTER ENGINEERING Jaihind College of Engineering, Kuran A/p-Kuran, Tal-Junnar, Dist-Pune-410511, State Maharashtra,India 2022-2023
  • 2. DEPARTMENT OF COMPUTER ENGINEERING Jaihind College of Engineering, Kuran A/p-Kuran, Tal-Junnar, Dist-Pune-410511, State Maharashtra,India CERTIFICATE This is to certify that SEMINAR report entitled “Smartphone-Based Remote Monitoring Tool for e-Learning ” Submitted By Mr./ms. Thorat Pratik Ravindra. Exam No:- . . . . . . . . . . . . Is a bonafide work carried out by her under the supervision of Prof. A. S. Dumbre and it is submitted towards the partial fulfillment of the requirement of Savitribai Phule Pune University,Pune for the award of the degree of TE (Computer Engineer- ing). Prof. A.S. Dumbre Prof.S. Y. Mandlik Seminar Guide Seminar Coordinator JCOE, Kuran. JCOE, Kuran. Dr.A. A. Khatri Dr.D. J. Garkal HOD Principal JCOE, Kuran. JCOE, Kuran.
  • 3. Certificate By Guide This is to certify that Mr./ Ms. Thorat Pratik Ravindra. has completed the Seminar work under my guidance and supervision and that, I have verified the work for its originality in documentation, problem statement in the SEMINAR. Any reproduction of other necessary work is with the prior permission and has given due ownership and included in the references. Place: Kuran Date: . . ./. . . /2022 ( Prof. A.S. Dumbre)
  • 4. ACKNOWLEDGEMENT Any attempt at any level can’t be satisfactorily completed without support and guidance of learned people. I would like to take this opportunity to extend my deep felt gratitude to all people who have been there at every step for my support. First and foremost, I would like to express my immense gratitude to my Seminar guide Prof. A. S. Dumbre and our HOD Dr. A. A. Khatri for their constant support and motivation that has encouraged me to come up with this seminar. I would also like to thank our seminar coordinator Prof. S. Y. Mandlik for con- stantly motivating me and for giving me a chance to give a seminar on a creative work. I am extremely grateful to our principle Dr. D. J. Garkal for providing state of the art facilities I take this opportunity to thank all professors of department for providing the useful guidance and timely encouragement which helped me to com- plete this seminar more confidently. I am also very thankful to family, friend and mates who have rendered their whole hearted support at all times for the successful completion of our seminar. Mr./ms. Thorat Pratik Ravindra. JCOE, Kuran. i
  • 5. ABSTRACT In this paper, a smartphone-based learning monitoring system is presented. Dur- ing pandemics, most of the parents are not used to simultaneously deal with their home office activities and the monitoring of the home school activities of their chil- dren. Therefore, a system allowing a parent, teacher or tutor to assign a task and its corresponding execution time to children, could be helpful in this situation. In this work, a mobile application to assign academic tasks to a child, measure execution time, and monitor the child’s attention, is proposed. The children are the users of a mobile application, hosted on a smartphone or tablet device, that displays an as- signed task and keeps track of the time consumed by the child to perform this task. Time measurement is performed using face recognition, so it is possible to infer the attention of the child based on the presence or absence of a face. The app also mea- sures the time that the application was in the foreground, as well as the time that the application was sent to the background, to measure boredom. The parent or teacher assigns a task using a desktop application specifically designed for this purpose. At the end of the time set by the user, the application sends to the parent or teacher statistics about the execution time of the task and the degree of attention of the child. ii
  • 6. INDEX Acknowledgement i Abstract ii Index iii List of Figures v List of Tables vi 1 INTRODUCTION 1 1.1 Introduction to Smartphone-Based Remote Monitoring Tool for e-Learning 1 1.2 All About Basics . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2 1.3 Motivation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3 2 LITERATURE SURVEY 4 2.1 Smartphone-Based Remote Monitoring Tool for e-Learning : . . . . . 4 2.2 Visual Attention Localization for Mobile Service Computing . . . . . 5 2.3 Automatic emotion and attention analysis of young children at home: 5 3 SYSTEM FRAMEWORK 7 3.1 . DALeMO MAIN MODULES . . . . . . . . . . . . . . . . . . . . . . 7 3.2 . MALeMO MAIN MODULES . . . . . . . . . . . . . . . . . . . . . 8 4 Algorithm 10 5 APPLICATIONS 11 5.1 Smartphone-Based Remote Monitoring Tool for e-Learning . . . . . 11 5.1.1 ATTENTION MEASUREMENT METRICS . . . . . . . . . 11 5.1.2 Time lapses measured by the MaLeMO . . . . . . . . . . . . . 11 6 CONCLUSION 12 iii
  • 7. 7 FUTURE SCOPE 13 BIBLIOGRAPHY 14 iv
  • 10. Chapter 1 INTRODUCTION 1.1 Introduction to Smartphone-Based Remote Monitoring Tool for e- Learning In this paper, a tool to monitor the learning activities of children whose parents are working in the home is proposed. The developed tool monitors the attention levels of children solving assigned tasks that cannot be supervised by a present adult. The monitoring information is useful for both, parents and teachers, who can use it to make decisions about the effectiveness of remote learning methods. The proposed tools are focused on handwritten tasks, such as solving arithmetic or algebraic oper- ations, writing some paragraphs or drawing, where children do not directly interact with the smartphone/tablet, and use it only to read the task description and to report image-based evidence of the carried out work, which will be later revised by a teacher or a parent Several approaches have been introduced in order to develop software tools for different gaze estimation purposes in e-learning. In 2018 Steil et al. [1], presented a work related to the task of predicting users’ gaze behaviour (overt visual attention) in the near future. Kangas et al. [2] described a study of combining gaze gestures with vibrotactile feedback. In this study, gaze gestures were used as input for a mobile device and vibrotactile feedback as a new alternative way to give con- firmation of interaction events. Results show that vibrotactile feedback significantly improved the use of gaze. Tonsen et al. [3] proposed an an eye tracking tool that uses milimeter-size RGB cameras that can be fully embedded into normal glasses. To compensate for the cameras’ low image resolution of only a few pixels, the pro- posed approach uses multiple cameras to capture different views of the eye, as well as learning-based gaze estimation to regress from eye images to gaze directions directly. Mesquita and Lopes [4] introduced an image processing system to provide a valuable aid to kindergarten teachers, helping them in the task of registering observation, by automatically detecting and measuring head posture, including time records. In 2014 Teyeb et al. [5] presented a drowsiness detection system based on video 1
  • 11. Smartphone-Based Remote Monitoring Tool for e-Learning approach to analyze two visual driver’s signs: eyes blinking and head movement. Hutanu and Bertea in 2019 [6], described that one of the most accurate methods to record user behavior is through eye tracking technology. Eye tracking is a method that records real, natural and objective user behavior. Eye movements are fast and subconscious, which may reveal information that is not accessible even to the respon- der. The human gaze can tell exactly what has been seen, in what order and for how long and, on the other side, what has been missed. Therefore, eye tracking data gives valuable insights for improving the learning process. Chong et al. in 2018 [7] proposed automated methods for detecting and quantifying visual attention from images and video of gaze following. Gaze following can be applied to identify, categorize quick glances to objects, and even identify when someone is not paying attention. Copeland [8] stated that using eye gaze to control adaption of eye learning envi- ronments provides the ability to go beyond the student’s surface answering behavior or preferences and adapt to the student’s implicit behavior. Thus, eye tracking has been shown to be a powerful tool for investigating how humans interact with com- puter interfaces. In 2018 Waleed et al. [9], introduced an accurate, reliable and a low-cost system based on laptop webcam for eye-gaze estimation. Ricciardelli and Driver [10] described the impact of different sources of information by examining the role of head orientation in gaze-direction judgments, based on past studies, where head orientation may be combined with perceived gaze direction in judging where someone else is attending. Fathi et al. [11] presented a probabilistic generative model for simultaneously recognizing daily actions and predicting gaze locations in videos recorded from an egocentric camer 1.2 All About Basics • detection system based on video approach to analyze two visual driver’s signs: eyes blinking and head movement. Hutanu and Bertea in 2019 [6], described that one of the most accurate methods to record user behavior is through eye tracking technology. Eye tracking is a method that records real, natural and objective user behavior. Eye movements are fast and subconscious, which may reveal information that is not accessible even to the responder. The human gaze can tell exactly what has been seen, in what order and for how long and, on the other side, what has been missed. Therefore, eye tracking data gives valuable insights for improving the learning process. Chong et al. in 2018 [7] proposed automated methods for detecting and quantifying visual attention from images 2 Jaihind College Of Engineering, Department of Computer Engineering - 2022-23
  • 12. Smartphone-Based Remote Monitoring Tool for e-Learning and video of gaze following. Gaze following can be applied to identify, cate- gorize quick glances to objects, and even identify when someone is not paying attention. 1.3 Motivation A model exists to estimate behavior and concentration of children while studying. A platform can help in making connection of children, parents and teachers. 3 Jaihind College Of Engineering, Department of Computer Engineering - 2022-23
  • 13. Chapter 2 LITERATURE SURVEY 2.1 Smartphone-Based Remote Monitoring Tool for e-Learning : In this paper, a tool to monitor the learning activities of children whose par- ents are working in the home is proposed. The developed tool monitors the attention levels of children solving assigned tasks that cannot be supervised by a present adult. The monitoring information is useful for both, parents and teachers, who can use it to make decisions about the effectiveness of remote learning methods. The proposed tools are focused on handwritten tasks, such as solving arithmetic or algebraic oper- ations, writing some paragraphs or drawing, where children do not directly interact with the smartphone/tablet, and use it only to read the task description and to report image-based evidence of the carried out work, which will be later revised by a teacher or a parent Several approaches have been introduced in order to develop software tools for different gaze estimation purposes in e-learning. In 2018 Steil et al. [1], presented a work related to the task of predicting users’ gaze behaviour (overt visual attention) in the near future. Kangas et al. [2] described a study of combining gaze gestures with vibrotactile feedback. In this study, gaze gestures were used as input for a mobile device and vibrotactile feedback as a new alternative way to give con- firmation of interaction events. Results show that vibrotactile feedback significantly improved the use of gaze. Tonsen et al. [3] proposed an an eye tracking tool that uses milimeter-size RGB cameras that can be fully embedded into normal glasses. To compensate for the cameras’ low image resolution of only a few pixels, the pro- posed approach uses multiple cameras to capture different views of the eye, as well as learning-based gaze estimation to regress from eye images to gaze directions directly. Mesquita and Lopes [4] introduced an image processing system to provide a valuable aid to kindergarten teachers, helping them in the task of registering observation, by automatically detecting and measuring head posture, including time records. 4
2.2 Visual Attention Localization for Mobile Service Computing

Identifying and localizing the user's visual attention can enable various intelligent service computing paradigms in a mobile environment. However, existing solutions can only compute the gaze direction, without the distance to the intended target. In addition, most of them rely on an eye tracker or similar infrastructure support. This paper explores the possibility of using portable mobile devices, e.g., smartphones, to detect the visual attention of a user. i-VALS only requires the user to perform one simple action to localize the intended object: gazing at the intended object and holding up the smartphone so that the object and the user's face can be simultaneously captured by the front and rear cameras. The authors develop efficient algorithms to obtain the distance between the camera and the user, the user's gaze direction, and the object's direction from the camera. The object's location can then be computed by solving a trigonometric problem. i-VALS has been prototyped on commercial off-the-shelf (COTS) devices. Extensive experiment results show that i-VALS achieves high accuracy and small latency, effectively supporting a large variety of applications in smart environments.
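The trigonometric step can be illustrated with a small sketch. The paper does not give its exact formulation here, so the following Python fragment only shows the general idea under a simplified flat (two-dimensional) geometry: given the phone-to-face distance estimated from the front camera, the gaze angle of the user, and the direction of the object seen by the rear camera, the phone-to-object distance follows from the law of sines. The function name and the example values are illustrative assumptions, not the authors' implementation.

import math

def localize_object(face_distance_m, gaze_angle_deg, object_angle_deg):
    """Estimate the phone-to-object distance (simplified 2-D sketch).

    Triangle: phone (P), user's eyes (U), object (O).
    face_distance_m  -- side P-U, estimated from the face size in the front camera.
    gaze_angle_deg   -- angle at U between the direction U->P and the gaze U->O.
    object_angle_deg -- angle at P between the direction P->U and the object
                        direction P->O; an object straight in front of the rear
                        camera gives an angle close to 180 degrees.
    """
    angle_u = math.radians(gaze_angle_deg)
    angle_p = math.radians(object_angle_deg)
    angle_o = math.pi - angle_u - angle_p        # angles of a triangle sum to pi
    if angle_o <= 0:
        raise ValueError("inconsistent angles: no valid triangle")
    # Law of sines: PO / sin(U) = PU / sin(O)
    return face_distance_m * math.sin(angle_u) / math.sin(angle_o)

# Example: user 0.35 m from the phone, gaze 25 degrees away from the phone,
# object seen 140 degrees from the phone-to-user direction.
print(round(localize_object(0.35, 25.0, 140.0), 2), "m")   # -> about 0.57 m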
2.3 Automatic emotion and attention analysis of young children at home

Current tools for objectively measuring young children's observed behaviors are expensive, time-consuming, and require extensive training and professional administration. The lack of scalable, reliable, and validated tools impacts access to evidence-based knowledge and limits our capacity to collect population-level data in non-clinical settings. To address this gap, the authors developed mobile technology to collect videos of young children while they watched movies designed to elicit autism-related behaviors, and then used automatic behavioral coding of these videos to quantify children's emotions and behaviors. They present results from their iPhone study Autism and Beyond, built on ResearchKit's open-source platform. The entire study, from an e-Consent process to stimuli presentation and data collection, was conducted within an iPhone-based app available in the Apple App Store. Over one year, 1756 families with children aged 12 to 72 months participated in the study, completing 5618 caregiver-reported surveys and uploading 4441 videos recorded in the child's natural settings. Usable data were collected from 87.6% of the uploaded videos. Automatic coding identified significant differences in emotion and attention by age, sex, and autism risk status. This study demonstrates the acceptability of an app-based tool to caregivers, their willingness to upload videos of their children, the feasibility of caregiver-collected data in the home, and the application of automatic behavioral coding to quantify emotion and attention variables that are clinically meaningful and may be refined to screen children for autism and developmental disorders outside of clinical settings. This technology has the potential to transform how we screen and monitor children's development.
Chapter 3 SYSTEM FRAMEWORK

3.1 DALeMO MAIN MODULES

The main modules of the DaLeMO are depicted in Figure 2. The details of each module are described below:

• Internet Connection Module (ICM). This module checks if an internet connection is available, and activates the user interface components when the connection is ready. This module also performs the WSLeMO requests, gathers the results of these requests, and updates the UI with the requested information.

• Teacher Profile Management Module (TPMM). This module captures the teacher's data and sends it to the WSLeMO through the ICM. When a response is obtained, the app is notified and the process to create new groups is enabled.

• Group Management Module (GMM). This process is disabled until the teacher successfully registers. Once enabled, the teacher can add as many groups as desired. For each created group, the WSLeMO sends a confirmation and a unique group ID is displayed.
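Since these modules are described mainly in terms of requests to the WSLeMO, the following sketch illustrates that flow (connectivity check, teacher registration, group creation) in Python. The base URL, the endpoint names /ping, /teacher and /group, and the response fields are hypothetical placeholders introduced for illustration only; they are not the actual WSLeMO interface.

import requests

BASE_URL = "https://example.org/wslemo"   # hypothetical WSLeMO address

def connection_available():
    """ICM: check whether the web service is reachable."""
    try:
        return requests.get(BASE_URL + "/ping", timeout=3).ok
    except requests.RequestException:
        return False

def register_teacher(name, email):
    """TPMM: send the teacher's data and return the teacher id."""
    resp = requests.post(BASE_URL + "/teacher",
                         json={"name": name, "email": email}, timeout=5)
    resp.raise_for_status()
    return resp.json()["teacher_id"]          # hypothetical response field

def create_group(teacher_id, group_name):
    """GMM: create a group and return the unique group ID shown to the teacher."""
    resp = requests.post(BASE_URL + "/group",
                         json={"teacher_id": teacher_id, "name": group_name},
                         timeout=5)
    resp.raise_for_status()
    return resp.json()["group_id"]            # hypothetical response field

if __name__ == "__main__":
    if connection_available():
        tid = register_teacher("A. Teacher", "teacher@example.org")
        print("Group ID:", create_group(tid, "TE Comp - Maths"))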
3.2 MALeMO MAIN MODULES

• Internet Connection Module (ICM). This module performs the same tasks as its corresponding module in the DaLeMO.

• Activities Acquisition Module (AAM). This module interacts with the ICM to obtain, from the WSLeMO, the activities to be performed by a student. The list of activities is stored in a data structure array, including images and configuration settings.

• Activities Decoding Module (ADM). The input of this module is the data structure produced by the AAM; it configures the UIM to visualize the activity correctly on the screen of the device.

• User Interface Module (UIM). This module receives data from the ADM and configures the user interface controls to allow the user to interact with the MaLeMO. It enables the user interface elements (buttons) and controls the information to be displayed.

• Rear Camera Image Acquisition Module (RCIAM). This module captures one image from the rear camera of the host device when the student requests to finish the current activity; the captured image is attached as evidence of the activity's completion.

• Evidence Integration Module (EIM). This module takes the captured evidence and packs it into a single structure along with the learning monitoring statistics (LMS).

• Front Camera Image Acquisition Module (FCIAM). This module captures images from the front camera at regular time intervals and stores them in memory for later analysis.

• Gaze Detection and Analysis Module (GDAM). This module receives the image captured from the camera, detects the face and eyes (if possible), and attaches face width, face length and eye localization information to the face-localization data of the LMS (a minimal detection sketch is given after this list).
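The report does not state which detector the MaLeMO uses, so the fragment below simply shows one common way the GDAM information (face width, face height and eye positions) could be obtained from a single front-camera frame, using OpenCV's bundled Haar-cascade classifiers; this is an illustrative assumption, and on an actual smartphone the same step would normally be done with an on-device face-detection library.

import cv2

# Haar cascades shipped with OpenCV (illustrative choice, not the paper's method)
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
eye_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_eye.xml")

def analyze_frame(frame):
    """Return the face box and eye boxes for one front-camera frame,
    or None when no face is visible (counted as an inattentive sample)."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return None
    x, y, w, h = faces[0]                     # first detected face
    eyes = eye_cascade.detectMultiScale(gray[y:y + h, x:x + w])
    return {"face_width": int(w), "face_height": int(h),
            "eyes": [(int(ex), int(ey), int(ew), int(eh))
                     for ex, ey, ew, eh in eyes]}

if __name__ == "__main__":
    cap = cv2.VideoCapture(0)                 # front/web camera
    ok, frame = cap.read()
    cap.release()
    if ok:
        print(analyze_frame(frame))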
Test bed

In the initial stages of IDS research, standard datasets were not available [13]. KDD initiated the process and designed a standard intrusion dataset. Evaluation of the model requires standardized data; in this regard, three different datasets for network intrusion detection are used. The following datasets are used to evaluate the model:

1. KDDCUP 99 (simulated data)
2. HTTP CSIC (web traffic)
3. UNB ISCX (real-time captured packets)

These datasets are prepared specifically to test intrusion detection systems; network-based intrusion detection systems (NIDS) are tested using them. The datasets come from different sources and contain different numbers and varieties of features, and each consists of millions of records with training and testing collections. The KDDCUP99 [5] dataset was prepared at Lincoln Laboratory and consists of 41 features, including derived features; it was generated by simulating traffic on a defense laboratory network. The dataset was presented at the KDD Cup challenge in 1999 and captures many types of intrusions. The UNB ISCX dataset was prepared from real traffic captured in 2012.
Chapter 5 APPLICATIONS

5.1 Smartphone-Based Remote Monitoring Tool for e-Learning

5.1.1 ATTENTION MEASUREMENT METRICS

In this work, four types of activities were designed to measure attention using the proposed system. These activities must be done by hand on a sheet of paper and are described below:

1) Writing. In this activity, the child writes a short story, and an image of the handwritten work should be reported as evidence.

2) Reading. In this activity, the child must analyze a short text and answer some questions about it. The handwritten answer sheet should be reported as evidence.

3) Math. In this activity, the child performs some simple mathematical operations, and an image of the procedure and results should be reported as evidence.

4) Drawing. In this activity, the child uses his/her imagination to draw some situations. An image of the resulting drawing should be reported as evidence.

5.1.2 Time lapses measured by the MaLeMO

• Time to completion (TTC). This is the estimated time required for a particular activity to be completed. It is mainly based on the experience of the teacher and is set manually when the activity is created in the DaLeMO.

• Net time to completion (NTTC). This is the time computed by the MaLeMO to finish a given activity. It is the sum of all active time intervals.

• Gross time to completion (GTTC). This is the time computed by the MaLeMO including lapses of interruption. It is the sum of both active and inactive time intervals.

• Average active time (AAT). This is the average of the active time lapses.

• Number of Transitions between Foreground and Background States (NTFBS). This is a counter that is incremented whenever the MaLeMO alternates between foreground and background.
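The relationship between these time lapses can be made concrete with a small sketch. Assuming the MaLeMO records each foreground period as an active interval and each background period as an inactive one (the Interval structure below is an assumption made for illustration, not the app's internal format), NTTC, GTTC, AAT and NTFBS can be derived as follows; TTC is not computed here because it is set by the teacher.

from dataclasses import dataclass
from typing import List

@dataclass
class Interval:
    start: float      # seconds since the activity was opened
    end: float
    active: bool      # True = app in foreground, False = in background

def attention_metrics(intervals: List[Interval]) -> dict:
    """Compute NTTC, GTTC, AAT and NTFBS from ordered time intervals."""
    active = [iv.end - iv.start for iv in intervals if iv.active]
    nttc = sum(active)                                  # net time to completion
    gttc = sum(iv.end - iv.start for iv in intervals)   # gross time, incl. interruptions
    aat = nttc / len(active) if active else 0.0         # average active time
    # Count foreground/background transitions between consecutive intervals.
    ntfbs = sum(1 for a, b in zip(intervals, intervals[1:]) if a.active != b.active)
    return {"NTTC": nttc, "GTTC": gttc, "AAT": aat, "NTFBS": ntfbs}

# Example: the child worked 5 min, switched apps for 2 min, then worked 4 min.
timeline = [Interval(0, 300, True), Interval(300, 420, False), Interval(420, 660, True)]
print(attention_metrics(timeline))
# {'NTTC': 540, 'GTTC': 660, 'AAT': 270.0, 'NTFBS': 2}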
Chapter 6 CONCLUSION

In this work, a Remote Learning Monitoring System (RLMS) has been proposed. The proposed system allows a parent or teacher to use the DaLeMO to assign learning activities to be carried out by children. Children can use the MaLeMO to read the instructions of these activities and remotely send image-based evidence of the performed work. The MaLeMO computes some statistics of the children's attention and reports them to the WSLeMO. The teacher/parent can obtain the stored statistics by retrieving them from the WSLeMO and analyze them to make better decisions about the learning exercises and techniques to be employed.
Chapter 7 FUTURE SCOPE

The proposed system improves on existing developments by avoiding the use of personal computers and external sensors to remotely monitor learning. In addition, this system allows the professor to easily customize the activities to be performed by the student, and to get practically instantaneous feedback through the statistics compiled by the MaLeMO. Finally, the results show that the attention monitoring fails in some isolated cases. This could be enhanced by implementing gaze tracking and attention detection using deep-learning techniques. Although there are advanced techniques to measure attention from gaze tracking, the focus of the proposed work was to measure attention based on the localization of the face and eyes and the time they are visible in front of the application, complemented with timers designed to measure interruptions of the activity caused by sending the MaLeMO to the background when other apps are opened on the device. To improve the attention measurement, better gaze tracking methods must be added to the proposed RLMS.
Bibliography

[1] H. L. Egger, G. Dawson, J. Hashemi, K. L. H. Carpenter, S. Espinosa, K. Campbell, S. Brotkin, J. Schaich-Borg, Q. Qiu, M. Tepper, J. P. Baker, R. A. Bloomfield, and G. Sapiro, "Automatic emotion and attention analysis of young children at home: A ResearchKit autism feasibility study," NPJ Digit. Med., vol. 1, no. 1, p. 20, Dec. 2018.

[2] P. Smutny and P. Schreiberova, "Chatbots for learning: A review of educational chatbots for the Facebook Messenger," Comput. Educ., vol. 151, Jul. 2020, Art. no. 103862.

[3] O. A. H. Jones, M. Spichkova, and M. J. S. Spencer, "Chirality-2: Development of a multilevel mobile gaming app to support the teaching of introductory undergraduate-level organic chemistry," J. Chem. Educ., vol. 95, no. 7, pp. 1216–1220, Jul. 2018.

[4] M. S. Hornbæk, J. Hellevik, C. Schaarup, M. D. Johansen, and O. K. Hejlesen, "Usability of eye tracking for studying the benefits of e-learning tutorials on safe moving and handling techniques," in Proc. Linköping Electron. Conf., 2018, pp. 56–61.