INTELLECTUAL ANALYSIS OF STUDENT'S OBSERVATION
USING DATA MINING AND PATTERN MATCHING
A PROJECT REPORT
Submitted by
PRADHEEBA.B (422711104060)
SASIREKA.R (422711104082)
VANMATHI.S (422711104100)
In partial fulfilment for the award of the degree
of
BACHELOR OF ENGINEERING
in
COMPUTER SCIENCE AND ENGINEERING
V.R.S. COLLEGE OF ENGINEERING AND TECHNOLOGY
ARASUR
ANNA UNIVERSITY: CHENNAI 600 025
APRIL 2015
ANNA UNIVERSITY: CHENNAI 600 025
BONAFIDE CERTIFICATE
Certified that this project report “INTELLECTUAL ANALYSIS OF
STUDENT'S OBSERVATION USING DATA MINING AND PATTERN
MATCHING” is the bonafide work of R. SASIREKA
[422711104082], who carried out the project work under my supervision.
SIGNATURE
Mr. A. PARTHASARATHY, M.E., MBA., (Ph.D.)
HEAD OF THE DEPARTMENT
Associate Professor,
Department of Computer Science and Engineering,
V.R.S. College of Engineering and Technology,
Arasur - 607107.

SIGNATURE
Prof. J. K. JOTHIKALPANA, M.Tech., (Ph.D.)
SUPERVISOR
Professor,
Department of Computer Science and Engineering,
V.R.S. College of Engineering and Technology,
Arasur - 607107.
Submitted for the University Examination held on 8.4.2015
INTERNAL EXAMINER EXTERNAL EXAMINER
ACKNOWLEDGEMENT
We express our sincere and deep sense of gratitude to our project guide,
Prof. J. K. JothiKalpana, M.Tech., (Ph.D.), Associate Professor, Department
of Computer Science and Engineering, for her proficient and meticulous
guidance in completing our project.

We extend our grateful and sincere thanks to our project coordinator
and Head of the Department, Mr. A. Parthasarathy, M.E., M.B.A., (Ph.D.),
Department of Computer Science and Engineering, for his valuable
suggestions and help.

We are profoundly indebted to our Principal, Dr. N. Anbazhaghan,
M.E., Ph.D., for his constant motivation towards achieving our goal.

We extend our deep sense of thanks to our Chief Executive Officer,
Er. M. Saravanan, M.E., (Ph.D.), for providing us the opportunity and
valuable advice to develop the technical knowledge needed to plan our
project confidently.

We express our extreme gratitude and heartfelt thanks to our
Chairperson, Tmt. Vijaya Muthuvanan, our Secretary and Correspondent,
Rtn. S. R. Ramanathan, and our Director, Board of Governors,
Thiru. N. Muthuvanan, for providing the facilities to complete our project
successfully.
We wish to express our sincere thanks to all those who helped us in
making this project successful.
ABSTRACT
This project presents a fatigue detection technique based on
computer vision. Fatigue is detected from the face and facial features of a
student. A hybrid method is used for face and facial feature detection,
which not only increases the accuracy of the system but also decreases the
processing time. Skin-colour pixel detection and the Viola-Jones method are
used for face detection, and a knowledge-based division method is used to
increase the accuracy of facial feature detection. A dynamic
threshold value is used for yawning and eye-status detection.

The project also addresses two issues for mitigating student
distraction and inattention using novel video analysis techniques: (a) inside
the classroom, student inattention is monitored by first tracking the
student's face and eye regions using particle filters, followed by recognition of
dynamic eye states using a computer vision system. The frequencies of eye
blinking and eye closure are used as indications of sleepiness, and a warning
sign is then generated as a recommendation; (b) the surroundings of the
classroom are analysed: neighbouring classrooms (in both directions) are
tracked, and their states are analysed by sensors. These pieces of information
are provided for mitigating student inattention. The main novelties of the
proposed scheme include facial-geometry-based eye region detection for
eye closure identification, and combined tracking and detection of class
rooms.
TABLE OF CONTENTS

CHAPTER NO.  TITLE                                              PAGE NO.

             ABSTRACT                                           i
             LIST OF FIGURES                                    ii
             LIST OF TABLES                                     iii
             LIST OF ABBREVIATIONS                              iv
1            INTRODUCTION                                       1
             1.1 Objectives                                     1
2            LITERATURE REVIEW                                  2
             2.1 EXISTING SYSTEM                                2
                 2.1.1 Demerits of Existing System              2
             2.2 PROPOSED SYSTEM                                2
                 2.2.1 Merits of Proposed System                3
             2.3 APPLICATIONS                                   3
             2.4 BLOCK DIAGRAM                                  4
3            SYSTEM REQUIREMENTS                                5
             3.1 HARDWARE REQUIREMENT                           5
             3.2 SOFTWARE REQUIREMENT                           5
4            SOFTWARE PROFILE                                   6
             4.1 INTRODUCTION                                   6
             4.2 DEFINITION OF MATLAB                           7
                 4.2.1 Starting MATLAB                          7
                 4.2.2 Setting your initial current directory   8
                 4.2.3 Setting up MATLAB environment options    8
                 4.2.4 Configuring certain products             8
                 4.2.5 Excel Link versions                      9
                 4.2.6 Finding information about MATLAB         10
                 4.2.7 Syntax                                   11
                 4.2.8 Variables                                11
                 4.2.9 Vectors/Matrices                         12
                 4.2.10 Graphical User Interface Programming    15
                 4.2.11 Object-Oriented Programming             17
                 4.2.12 Interfacing with other Languages        17
                 4.2.13 License                                 18
                 4.2.14 Alternatives                            19
5            COMPUTER VISION TOOLBOX                            20
             5.1 COMPUTER VISION SYSTEM TOOLBOX                 20
             5.2 IMAGE PROCESSING TOOLBOX                       21
             5.3 REGION OF INTEREST                             22
6            MODULE DESCRIPTION                                 23
             6.1 DETECTED FACIAL FEATURES IN FACE REGION        23
             6.2 DETECTION OF OPEN EYES                         24
             6.3 DETECTION OF CLOSED EYES                       26
             6.4 FACE AND MOUTH DETECTION                       28
7            CONCLUSION                                         34
8            FUTURE ENHANCEMENT                                 35
             APPENDIX I                                         36
             APPENDIX II                                        39
             REFERENCES                                         43
LIST OF FIGURES

FIGURE NO.  TITLE                                               PAGE NO.
2.4         Block Diagram                                       04
4.1         Three dimensional graphics                          15
4.2         Normalized sinc function                            16
4.3         Unnormalized sinc function                          16
6.1         Detected Facial Feature in Face Region              24
6.2         Detection of Open Eyes                              25
6.3         Detection of Closed Eyes                            28
6.4         Face detection using the Viola-Jones method
            on the skin colour region                           31
6.5         Face and facial feature detection                   32
LIST OF TABLES

TABLE NO.   TITLE                                               PAGE NO.
4.1         Products and Commands                               9
4.2         Excel files used with Excel version                 9
4.3         Finding information about MATLAB                    10
6.3         Yawning state                                       27
6.6         Detection of yawning state of test users            33
6.7         Closed eyes detection of test users                 34
LIST OF ABBREVIATIONS
MATLAB MATrix LABoratory
GPU Graphics Processing Unit
DCT Discrete Cosine Transform
FFT Fast Fourier Transform
LINPACK LINear equations PACKage
EISPACK EIgen System PACKage
LAPACK Linear Algebra PACKage
IDL Interactive Data Language
APL A Programming Language
IL ILlinois
GUIDE GUI Development Environment
CHAPTER 1
INTRODUCTION
Nowadays, students' mentality changes based on their interest in
studies. In the classroom, the staff members deliver the lecture and
observe how well the students understand the concepts. For students, the class
should be interesting and more interactive when the staff delivers the lecture.
From this we can analyse the average student's face and eyes; finally, we
conclude with both qualitative and quantitative results.
When a face image acts as an input to the system, face detection is first
performed to locate the rough face region. The second step is to locate two
rough regions of the eyes in the face. There are two default eye states: open and
closed. In our project we also consider intermediate eye stages such as
partially opened, fully opened, and fully closed. If a person can see something, it
is considered that his eyes are open. Our criterion is that if the iris and the white
of the eye are visible, the eye is open; otherwise, the eye is closed. The student
data is compared with the student's eye state. The CGPA (Cumulative Grade
Point Average) attribute in the data set contains a large number of continuous
values. For example, we grouped all GPAs into five categorical segments:
Excellent, Very good, Good, Average, and Poor.
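The grouping of continuous CGPA values into five categorical segments can be sketched as follows. This is an illustrative Python sketch only: the report does not state the cut-off boundaries or the grading scale, so the thresholds below (a 10-point scale) are assumptions.

```python
def cgpa_category(cgpa):
    """Map a continuous CGPA (10-point scale assumed) to one of the five
    categorical segments named in the report. Cut-off points are
    illustrative assumptions, not the report's actual boundaries."""
    if cgpa >= 9.0:
        return "Excellent"
    elif cgpa >= 8.0:
        return "Very good"
    elif cgpa >= 7.0:
        return "Good"
    elif cgpa >= 5.0:
        return "Average"
    else:
        return "Poor"
```

Discretising the attribute this way turns a numeric column into five categories that are easier to compare against the observed eye states.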
1.1 OBJECTIVE
The main objective of our project is to analyse student performance
in a classroom. To do this, we track the face, detect the eye region to determine
whether the eyes are open or closed, and detect the mouth region to determine
whether the mouth is yawning.
CHAPTER 2
LITERATURE REVIEW
2.1 EXISTING SYSTEM
In the existing system, the Riemannian-manifold-based approach cannot
produce the desired output and takes more time for calculation. For
non-differentiable manifolds, it is impossible to define the derivatives of the
curves on the manifold.
2.1.1 Demerits of Existing System
• The desired output cannot be obtained
• Calculations take more time
• The approach cannot be extended to other objects
2.2 PROPOSED SYSTEM
In the proposed system we use the student's face for the analysis.
The student's face is tracked and the eyes are detected for further
processing. We use the Viola-Jones algorithm for the face tracking
system.

The fatigue detection system consists of six levels, which can be classified
as:
• Image acquisition
• Face detection
• Facial features detection
• Eyes status detection
• Yawning status detection
• Drowsiness detection
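The six levels above can be organized as a simple sequential pipeline. The skeleton below is a hypothetical Python sketch; the stage names are placeholders matching the list, and the detector callables are assumed to be supplied elsewhere.

```python
def run_fatigue_pipeline(frame, detectors):
    """Run the six detection levels in order on a single video frame.
    `detectors` maps each stage name to a callable; every name here is
    an illustrative placeholder, not the report's actual code."""
    stages = [
        "image_acquisition",
        "face_detection",
        "facial_features_detection",
        "eyes_status_detection",
        "yawning_status_detection",
        "drowsiness_detection",
    ]
    result = frame
    outputs = {}
    for stage in stages:
        result = detectors[stage](result)  # each stage feeds the next
        outputs[stage] = result
    return outputs
```

Structuring the system this way makes each level independently replaceable, e.g. swapping the face detector without touching the yawning logic.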
2.2.1 Merits of Proposed System
• Extremely fast feature computation.
• Efficient feature selection.
• Scale and location invariant detector.
• Instead of scaling the image itself (e.g. pyramid-filters), we scale
the features.
2.3 APPLICATIONS
• It can be used in schools, colleges and other institutions.
• It is used in railways, airways and roadways.
• It is used for analysis of face.
2.4 BLOCK DIAGRAM
Fig. 2.4 Block Diagram
CHAPTER 3
REQUIREMENTS SPECIFICATION
3.1 HARDWARE REQUIREMENTS
The hardware requirements may serve as the basis for a contract for the
implementation of the system and should therefore be a complete and consistent
specification of the whole system. They are used by software engineers as the
starting point for the system design. They should state what the system should
do and how it should be implemented.
• Processor : Core i3
• RAM : 5.2 GB
• Hard Disk : 40 GB and above
• Monitor : 15" colour
• Web Camera : 9 megapixels (MP)
3.2 SOFTWARE REQUIREMENTS
The software requirements document is the specification of the system. It
should include both a definition and a specification of requirements, and it
forms the basis for creating the software. It is useful in estimating cost,
planning team activities, performing tasks, and tracking the team's progress
throughout the development activity.
• MATLAB Version 8.1.0 (R2013a)
• Windows 7
CHAPTER 4
SOFTWARE PROFILE
4.1 INTRODUCTION
Cleve Moler, the chairman of the computer science department at the
University of New Mexico, started developing MATLAB in the late 1970s. He
designed it to give his students access to LINPACK and EISPACK without
them having to learn Fortran. It soon spread to other universities and found a
strong audience within the applied mathematics community. Jack Little, an
engineer, was exposed to it during a visit Moler made to Stanford University in
1983. Recognizing its commercial potential, he joined with Moler and Steve
Bangert. They rewrote MATLAB in C and founded MathWorks in 1984 to
continue its development. These rewritten libraries were known as JACKPAC.
In 2000, MATLAB was rewritten to use a newer set of libraries for matrix
manipulation, LAPACK.
MATLAB was first adopted by researchers and practitioners in control
engineering, Little's specialty, but quickly spread to many other domains. It is
now also used in education, in particular the teaching of linear algebra,
numerical analysis, and is popular amongst scientists involved in image
processing.
4.2 DEFINITION OF MATLAB
MATLAB (MATrix LABoratory) is a programming language for
technical computing from MathWorks, Natick, MA. It is used for a wide
variety of scientific and engineering calculations, especially for automatic
control and signal processing. MATLAB runs on Windows, Mac and a variety
of Unix-based systems. Developed by Cleve Moler in the late 1970s and based
on the original LINPACK and EISPACK Fortran libraries, it was initially
used for factoring matrices and solving linear equations. Moler commercialized
the product with two colleagues in 1984. MATLAB is also noted for its
extensive graphics capabilities.
4.2.1 Starting MATLAB
To start MATLAB, you can use any of these methods:
• Double-click on the MATLAB icon (called a "shortcut") that the installer
creates on your desktop.
• Click on the Start button, select Programs, and click on the MATLAB 6.5
entry; select MATLAB 6.5 from this menu.
• Using Windows Explorer, open your top-level MATLAB installation
directory and double-click on the shortcut to the MATLAB executable,
MATLAB 6.5.
4.2.2 Setting your initial current directory
By default, when you start MATLAB using the shortcut the installer puts
on your desktop, the initial current directory is the $MATLAB\work directory,
where $MATLAB represents the name of your installation directory. The work
directory is a good place to store the M-files you modify and create because the
MATLAB uninstaller does not delete the work directory if it contains files.
You can, however, use any directory as your MATLAB initial current directory.
To specify another directory as your initial current directory, right-click on the
MATLAB shortcut that the installer creates on your desktop and select the
Properties option. Specify the name of the directory in the Start in field.
4.2.3 Setting up MATLAB environment options
To include welcome messages, default definitions, or any MATLAB
expressions that you want executed every time MATLAB is invoked, create a
file named startup.m in the $MATLAB\toolbox\local directory. MATLAB
executes this file each time it is invoked.
4.2.4 Configuring certain products
Certain products require additional configuration. The following table
lists these products and the commands used to configure them. If you installed
any of these products, see the documentation for that product for detailed
configuration information.
PRODUCT                     COMMAND
MATLAB Notebook             notebook -setup
MATLAB Runtime Server       rtsetup
Real-Time Windows Target    rtwintgt -setup
Table 4.1 Products and the Commands
4.2.5 Excel link versions
By default, Excel Link (a separately orderable product) supports two
versions of Excel. This table lists which Excel Link files to use with each Excel
version.
Use the appropriate files for your version of Excel. You can find these
files in the $MATLAB\toolbox\exlink subdirectory.
EXCEL VERSION        EXCEL LINK FILES
Excel 97 (default)   excllink.xla, ExliSamp.xls
Excel 7              excllink95.xla, ExliSamp95.xls
Table 4.2 Excel files used with Excel version
4.2.6 Finding information about MATLAB
After successfully installing MATLAB, you are probably eager to get
started using it. This list provides pointers to sources of information and other
features you may find helpful in getting started with MATLAB.
TASK: To get an overview of MATLAB and its capabilities.
DESCRIPTION: Read the Release Notes documentation.

TASK: To find out what's new in this release.
DESCRIPTION: Read the Release Notes documentation.

TASK: To start a product or run one of the demonstration programs.
DESCRIPTION: Use the Launch Pad in the MATLAB desktop.

TASK: To get information about a specific MATLAB feature.
DESCRIPTION: Choose the Help item in the MATLAB menu bar to view reference
and tutorial information in hyperlinked HTML form.

TASK: To get help with specific questions you can't find answered in the
documentation.
DESCRIPTION: Go to the MathWorks Web site (www.mathworks.com), click on
Support, and use the Technical Support solution search area to find more
information.

Table 4.3 Finding information about MATLAB
4.2.7 Syntax
The MATLAB application is built around the MATLAB language, and
most use of MATLAB involves typing MATLAB code into the Command
Window (as an interactive mathematical shell), or executing text files
containing MATLAB code, including scripts and/or functions.
4.2.8 Variables
Variables are defined using the assignment operator, =. MATLAB is a
weakly typed programming language because types are implicitly converted. It
is an inferred-typed language because variables can be assigned without
declaring their type (except if they are to be treated as symbolic objects), and
their type can change. Values can come from constants, from computation
involving values of other variables, or from the output of a function. For
example:
>> x = 17
x = 17
>> x = 'hat'
x = hat
>> y = x + 0
y = 104 97 116
>> x = [3*4, pi/2]
x = 12.0000 1.5708
>> y = 3*sin(x)
y = -1.6097 3.0000
4.2.9 Vectors/Matrices
A simple array is defined using the colon syntax:
init:increment:terminator. For instance:
>> array = 1:2:9
array = 1 3 5 7 9
defines a variable named array (or assigns a new value to an existing variable
with the name array) which is an array consisting of the values 1, 3, 5, 7, and 9.
That is, the array starts at 1 (the init value), increments from the
previous value by 2 (the increment value), and stops once it reaches (without
exceeding) 9 (the terminator value).
>> array = 1:3:9
array = 1 4 7
The increment value can be left out of this syntax (along with one of the
colons) to use a default value of 1.
>> ari = 1:5
ari = 1 2 3 4 5
assigns to the variable named ari an array with the values 1, 2, 3, 4, and 5,
since the default value of 1 is used as the incrementer.
Indexing is one-based, which is the usual convention for matrices in
mathematics, although not for some programming languages such as C, C++,
and Java.
Matrices can be defined by separating the elements of a row with blank
space or comma and using a semicolon to terminate each row. The list of
elements should be surrounded by square brackets: []. Parentheses: () are used
to access elements and subarrays (they are also used to denote a function
argument list).
>> A = [16 3 2 13; 5 10 11 8; 9 6 7 12; 4 15 14 1]
A = 16 3 2 13
5 10 11 8
9 6 7 12
4 15 14 1
>> A(2,3)
ans = 11
Sets of indices can be specified by expressions such as "2:4", which
evaluates to [2, 3, 4]. For example, a submatrix taken from rows 2 through 4 and
columns 3 through 4 can be written as:
>> A(2:4,3:4)
ans = 11 8
7 12
14 1
A square identity matrix of size n can be generated using the function eye,
and matrices of any size with zeros or ones can be generated with the functions
zeros and ones, respectively.
>> eye(3,3)
ans = 1 0 0
0 1 0
0 0 1
>> zeros(2,3)
ans = 0 0 0
0 0 0
>> ones(2,3)
ans = 1 1 1
1 1 1
Most MATLAB functions can accept matrices and will apply themselves
to each element. For example, mod(2*J,n) will multiply every element in "J"
by 2, and then reduce each element modulo "n". MATLAB does include
standard "for" and "while" loops, but (as in other similar applications such as
R), using the vectorized notation often produces code that is faster to execute.
This code, excerpted from the function magic.m, creates a magic square
M for odd values of n (MATLAB function meshgrid is used here to generate
square matrices I and J containing 1:n).
[J,I] = meshgrid(1:n);
A = mod(I + J - (n + 3) / 2, n);
B = mod(I + 2 * J - 2, n);
M = n * A + B + 1;
Method call behavior is different between value and reference classes. For
example, a call to a method
object.method();
can alter any member of object only if object is an instance of a reference class.
4.2.10 Graphics and graphical user interface programming
MATLAB supports developing applications with graphical user interface
features. MATLAB includes GUIDE (GUI development environment) for
graphically designing GUIs. It also has tightly integrated graph-plotting
features. For example the function plot can be used to produce a graph from two
vectors x and y. The code:
x = 0:pi/100:2*pi;
y = sin(x);
plot(x,y)
Fig.4.1 Three Dimensional Graphics
A MATLAB program can produce three-dimensional graphics using the
functions surf, plot3 or mesh.
[X,Y] = meshgrid(-10:0.25:10, -10:0.25:10);
f = sinc(sqrt((X/pi).^2+(Y/pi).^2));
mesh(X,Y,f);
axis([-10 10 -10 10 -0.3 1])
xlabel('{\bf x}')
ylabel('{\bf y}')
zlabel('{\bf sinc} ({\bf R})')
hidden off

[X,Y] = meshgrid(-10:0.25:10, -10:0.25:10);
f = sinc(sqrt((X/pi).^2+(Y/pi).^2));
surf(X,Y,f);
axis([-10 10 -10 10 -0.3 1])
xlabel('{\bf x}')
ylabel('{\bf y}')
zlabel('{\bf sinc} ({\bf R})')
Fig.4.2 Normalized sinc function        Fig.4.3 Unnormalized sinc function
In MATLAB, graphical user interfaces can be programmed with the GUI design
environment (GUIDE) tool.
4.2.11 Object-oriented programming
MATLAB's support for object-oriented programming includes classes,
inheritance, virtual dispatch, packages, pass-by-value semantics, and pass-by-
reference semantics.
classdef hello
methods
function greet(this)
disp('Hello!')
end
end
end
When put into a file named hello.m, this can be executed with the following
commands:
>> x = hello;
>> x.greet();
Hello!
4.2.12 Interfacing with other languages
MATLAB can call functions and subroutines written in the C
programming language or Fortran. A wrapper function is created allowing
MATLAB data types to be passed and returned. The dynamically loadable
object files created by compiling such functions are termed "MEX-files" (for
MATLAB executable).
Libraries written in Perl, Java, ActiveX or .NET can be directly called
from MATLAB, and many MATLAB libraries (for example XML or SQL
support) are implemented as wrappers around Java or ActiveX libraries. Calling
MATLAB from Java is more complicated, but can be done with a MATLAB
toolbox which is sold separately by MathWorks, or using an undocumented
mechanism called JMI (Java-to-MATLAB Interface), which should not be
confused with the unrelated Java Metadata Interface that is also called JMI. As
alternatives to the MuPAD-based Symbolic Math Toolbox available from
MathWorks, MATLAB can be connected to Maple or Mathematica.
4.2.13 License
MATLAB is a proprietary product of MathWorks, so users are subject to
vendor lock-in. Although MATLAB Builder products can deploy MATLAB
functions as library files which can be used with .NET or Java application-
building environments, future development will still be tied to the MATLAB
language. Libraries also exist to import and export MathML.
Each toolbox is purchased separately. If an evaluation license is
requested, the MathWorks sales department requires detailed information about
the project for which MATLAB is to be evaluated. If granted (which it often is),
the evaluation license is valid for two to four weeks. A student version of
MATLAB is available, as is a home-use license for MATLAB, Simulink, and
a subset of MathWorks' toolboxes at substantially reduced prices. It has been
reported that EU competition regulators are investigating whether MathWorks
refused to sell licenses to a competitor.
4.2.14 Alternatives
See also: list of numerical analysis software and comparison of numerical
analysis software.
MATLAB has a number of competitors. Commercial competitors include
Mathematica, TK Solver, Maple, and IDL. There are also free open-source
alternatives to MATLAB, in particular GNU Octave, Scilab, FreeMat, Julia, and
Sage, which are intended to be mostly compatible with the MATLAB language.
Among other languages that treat arrays as basic entities (array
programming languages) are APL, Fortran 90 and higher, S-Lang, as well as the
statistical languages R and S. There are also libraries to add similar functionality
to existing languages, such as IT++ for C++, Perl Data Language for Perl,
ILNumerics for .NET, NumPy/SciPy for Python, and Numeric.js for JavaScript.
GNU Octave stands out as it treats incompatibility with MATLAB as a
bug; it therefore aims to provide a software clone.
CHAPTER 5
COMPUTER VISION TOOLBOX
Using images and video to detect, classify, and track objects or events in
order to “understand” a real-world scene.
Image Processing
• Remove noise
• Adjust contrast
• Measure
Computer Vision
• Detect
• Identify
• Classify
• Recognize
• Track
5.1 COMPUTER VISION SYSTEM TOOLBOX
Design and simulate computer vision and video processing systems:
• Feature detection
• Feature extraction and matching
• Feature-based registration
• Motion estimation and tracking
• Stereo vision
• Video processing
• Video file I/O, display, and graphics
5.2 IMAGE PROCESSING TOOLBOX
Image Processing Toolbox™ provides a comprehensive set of reference-
standard algorithms, functions, and apps for image processing, analysis,
visualization, and algorithm development. You can perform image analysis,
image segmentation, image enhancement, noise reduction, geometric
transformations, and image registration. Many toolbox functions support
multicore processors, GPUs, and C-code generation.
Image Processing Toolbox supports a diverse set of image types,
including high dynamic range, gigapixel resolution, embedded ICC profile, and
tomographic. Visualization functions and apps let you explore images and
videos, examine a region of pixels, adjust color and contrast, create contours or
histograms, and manipulate regions of interest (ROIs). The toolbox supports
workflows for processing, displaying, and navigating large images.
Key Features
• Image analysis, including segmentation, morphology, statistics, and
measurements.
• Image enhancement, filtering, and deblurring.
• Geometric transformations and intensity-based image registration
methods.
• Image transforms, including FFT, DCT, Radon, and fan-beam
projection.
• Large image workflows, including block processing, tiling, and
multiresolution display.
• Visualization apps, including Image Viewer and Video Viewer.
• Multicore- and GPU-enabled functions, and C-code generation support.
5.3 REGION OF INTEREST (ROI)
A region of interest (ROI) is a portion of an image that you want to filter
or perform some other operation on. You define an ROI by creating a binary
mask, which is a binary image that is the same size as the image you want to
process with pixels that define the ROI set to 1 and all other pixels set to 0.
You can define more than one ROI in an image. The regions can be
geographic in nature, such as polygons that encompass contiguous pixels, or
they can be defined by a range of intensities. In the latter case, the pixels are not
necessarily contiguous.
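The binary-mask mechanism described above can be sketched in plain Python. This is only an illustration of the concept (a rectangular ROI on a list-of-lists image); the actual workflow in the report uses MATLAB's Image Processing Toolbox.

```python
def make_rect_roi_mask(height, width, top, left, bottom, right):
    """Build a binary mask the same size as the image: pixels inside the
    rectangle [top:bottom, left:right] are 1 (the ROI), all others 0."""
    return [
        [1 if top <= r < bottom and left <= c < right else 0
         for c in range(width)]
        for r in range(height)
    ]

def apply_roi(image, mask):
    """Zero out every pixel outside the ROI, leaving ROI pixels intact."""
    return [
        [pix if m else 0 for pix, m in zip(img_row, mask_row)]
        for img_row, mask_row in zip(image, mask)
    ]
```

Because the mask is just a same-sized binary image, several masks can be combined (e.g. with logical OR) to define more than one ROI, as the text notes.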
CHAPTER 6
MODULE DESCRIPTION
6.1 DETECTED FACIAL FEATURES IN FACE REGION
Once the face is detected, the second step is to detect the eyes and mouth in the
face area. The detected face area is extracted from the input image. The system
divides the extracted face image into three sub-portions: upper left half, upper
right half and lower half. These divisions are made on the basis of the physical
approximation of the eye and mouth locations. After division of the image into
three parts, the Viola-Jones method is applied to the upper half of the image for
eye detection and to the lower half for mouth detection. The advantage of
dividing the image into three parts is that when we apply the Viola-Jones
method for one eye, the detector searches only the upper left part of the image
for the left eye instead of searching the whole image. Similarly, for the second
eye the detector searches the upper right part of the image, and for the mouth it
searches only the lower part of the image. This not only increases the accuracy
of facial feature detection but also decreases the processing time.
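The three-way division of the detected face region can be sketched as follows. The bounding-box convention (top, left, bottom, right) is an assumption for illustration; the report's MATLAB code is not reproduced here.

```python
def split_face_region(top, left, bottom, right):
    """Divide a face bounding box into the upper-left half (left-eye
    search area), upper-right half (right-eye search area) and lower
    half (mouth search area), mirroring the knowledge-based division
    described above. Boxes are (top, left, bottom, right) tuples."""
    mid_row = (top + bottom) // 2
    mid_col = (left + right) // 2
    return {
        "upper_left": (top, left, mid_row, mid_col),
        "upper_right": (top, mid_col, mid_row, right),
        "lower": (mid_row, left, bottom, right),
    }
```

Each detector then scans only its own sub-box, which is why the division both speeds up and sharpens the feature search.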
The detection of the facial features is shown in Fig. 6.1. After detection of
the facial features, the next step is to check whether the eyes are open or closed.
Since every person has his own facial features, it is difficult to determine the
eye status using general data. To do so, the system collects eye information
from the first fifty frames of every incoming one thousand frames. The detected
eye region is extracted and converted into a binary image. Then the numbers of
black and white pixels in that region are calculated and stored. In the case of
open eyes, the ratio of black pixels in the region is greater than in the case of
closed eyes. Since blinking is negligible, the data of the first fifty frames can
provide the approximate ratio of black and white pixels for open eyes.
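The binarisation and pixel-counting step can be sketched as below. The fixed binarisation cut-off of 128 is an illustrative assumption; the report does not state the value it uses.

```python
def black_white_ratio(gray_region, threshold=128):
    """Binarise a greyscale eye region (list of rows of 0-255 values)
    and return the ratio of black to white pixels. Pixels below the
    cut-off are treated as black; 128 is an assumed value."""
    black = sum(1 for row in gray_region for p in row if p < threshold)
    white = sum(1 for row in gray_region for p in row if p >= threshold)
    return black / white if white else float("inf")
```

This per-frame ratio is what gets averaged over the calibration frames to build the person-specific eye threshold described next.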
Fig.6.1 Detected Facial Feature in Face Region
6.2 DETECTION OF OPEN EYES
This approximate ratio is used to calculate a threshold value which can
be used in subsequent frames to determine the state of the eyes. The
mathematical expression to calculate the threshold value is:

Threshold(eyes) = (black/white)avg of 30 frames − ((black/white)avg of 30 frames) / 3    (1)

Once the threshold value is calculated, the ratio of black and white pixels
in each incoming frame is compared to the threshold value. If the ratio of black
and white pixels in the incoming frame is less than the threshold value, the eyes
are considered to be closed, and if the ratio is greater than the threshold value,
the eyes are open. The mathematical representation of closed-eye detection is
given in equation (2). Fig. 6.3 shows the detection of closed eyes after
comparing the black and white pixels of the eye region with the threshold value.

Eye state = Closed, if (black/white)eye region < Threshold(eyes); Open, otherwise    (2)
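The thresholding in equations (1) and (2) translates directly into Python. This is an illustrative sketch, not the report's actual code; the calibration ratios are the per-frame black/white ratios collected from the initial frames.

```python
def eye_threshold(calibration_ratios):
    """Equation (1): threshold = avg - avg/3, i.e. two thirds of the
    average black/white ratio over the calibration frames."""
    avg = sum(calibration_ratios) / len(calibration_ratios)
    return avg - avg / 3

def eye_state(current_ratio, threshold):
    """Equation (2): eyes are closed when the current black/white
    ratio falls below the threshold, open otherwise."""
    return "closed" if current_ratio < threshold else "open"
```

Deriving the threshold from each person's own calibration frames is what makes the detector adapt to individual eye appearance.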
Fig.6.2 Detection of Open Eyes
6.3 DETECTION OF CLOSED EYES
The next step is to analyse the yawning condition of the mouth. The yawning
condition can be identified by calculating the height of the mouth. When a person is
in a yawning state, the height of his/her mouth increases by a specific amount.
Since every person has a different mouth size, the height of the mouth in the
yawning condition will be different for every person; that is why we need a dynamic
threshold value for each person separately, which can be compared to the normal
mouth size of that particular person. To calculate the threshold value
dynamically, the system stores the information of the first fifty frames of the video.
The threshold value is then calculated from these frames as shown in
equation (3).

Threshold(mouth) = (mouth height)avg of 30 frames + ((mouth height)avg of 30 frames) / 3    (3)
Once the threshold value is calculated, the system compares the height of
the mouth in each incoming frame with the threshold value. If the height of the
mouth in the incoming frame is greater than the threshold value, the mouth is in
a yawning state, and if it is less than the threshold value, the mouth is in a
normal condition. This can be expressed in mathematical form as:

Mouth state = Yawning, if current height > Threshold(mouth); Not yawning, otherwise    (4)
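Equations (3) and (4) can likewise be sketched in Python; again this is illustrative, with the calibration heights taken from the initial frames described above.

```python
def mouth_threshold(calibration_heights):
    """Equation (3): threshold = avg + avg/3, i.e. four thirds of the
    average mouth height over the calibration frames."""
    avg = sum(calibration_heights) / len(calibration_heights)
    return avg + avg / 3

def mouth_state(current_height, threshold):
    """Equation (4): yawning when the current mouth height exceeds
    the threshold, normal otherwise."""
    return "yawning" if current_height > threshold else "not yawning"
```

Note the symmetry with the eye case: the eye threshold sits below the calibration average (closed eyes shrink the ratio), while the mouth threshold sits above it (yawning enlarges the height).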
Table 6.3 shows that in the case of a closed mouth the height of the mouth
is 26 pixels, and in the case of an open mouth it is 39 pixels. The system
compares the value of the height of the mouth with the threshold value (which
is calculated from the first fifty frames): if the height of the mouth is less than
the threshold, it is detected as the normal condition; otherwise it is classified as
the yawning state.
Table 6.3 Yawning state
The system continuously monitors the status of the eyes and mouth. If the
eyes are detected as closed for more than 1.5 seconds, or if a yawning condition
is detected during the lesson, the system reports drowsiness and gives an alarm.
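The 1.5-second eye-closure rule can be sketched as a consecutive-frame counter. The camera frame rate is an assumption (the report does not state one); at 30 fps the limit works out to 45 consecutive closed frames.

```python
class DrowsinessMonitor:
    """Flag drowsiness when eyes stay closed for more than `max_seconds`.
    The frame rate is an illustrative assumption."""

    def __init__(self, fps=30, max_seconds=1.5):
        self.limit = int(fps * max_seconds)  # 45 frames at 30 fps
        self.closed_frames = 0

    def update(self, eye_state):
        """Feed one frame's eye state ('open' or 'closed'); return True
        once the closed streak exceeds the limit."""
        if eye_state == "closed":
            self.closed_frames += 1
        else:
            self.closed_frames = 0  # any open frame resets the streak
        return self.closed_frames > self.limit
```

Resetting the counter on every open frame is what distinguishes ordinary blinking from sustained eye closure.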
Fig.6.3 Detection of Closed Eyes
Algorithm for Segmentation
1. Define the neighbourhood of each feature (random variable in MRF terms).
Generally this includes 1st-order or 2nd-order neighbours.
2. Set the initial probability P(f_i) for each feature as 0 or 1, where f_i is the
set of features extracted for pixel i, and define an initial set of clusters.
3. Using the training data, compute the mean (μ_li) and variance (σ_li) for each
label. This is termed the class statistics.
4. Compute the marginal distribution P(f_i | l_i) for the given labelling scheme
using Bayes' theorem and the class statistics calculated earlier. A Gaussian
model is used for the marginal distribution.
5. Calculate the probability of each class label given the neighbourhood defined
previously. Clique potentials are used to model the social impact in labelling.
6. Iterate over new prior probabilities and redefine clusters such that these
probabilities are maximized, using standard optimization algorithms.
7. Stop when the probability is maximized and the labelling scheme does not
change.
The calculations can be implemented in log-likelihood terms as well.
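Steps 3 and 4 of the algorithm (class statistics plus a Gaussian likelihood) can be sketched as below. This is a simplified illustration that omits the neighbourhood/clique terms of step 5 and the iterative optimization of step 6.

```python
import math

def class_statistics(training_values):
    """Step 3: per-label mean and variance from training data.
    `training_values` maps label -> list of feature values."""
    stats = {}
    for label, values in training_values.items():
        mean = sum(values) / len(values)
        var = sum((v - mean) ** 2 for v in values) / len(values)
        stats[label] = (mean, var)
    return stats

def gaussian_likelihood(value, mean, var):
    """Step 4: Gaussian model for the marginal distribution of a
    feature value under a given label."""
    return math.exp(-((value - mean) ** 2) / (2 * var)) / math.sqrt(2 * math.pi * var)

def most_likely_label(value, stats):
    """Assign the label whose Gaussian gives the highest likelihood
    (a maximum-likelihood sketch without clique potentials)."""
    return max(stats, key=lambda lbl: gaussian_likelihood(value, *stats[lbl]))
```

In the full algorithm these per-pixel likelihoods are combined with clique potentials over the neighbourhood before the iterative relabelling of steps 5-7.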
6.4 FACE AND MOUTH DETECTION
A cascade of boosted classifiers working with Haar-like features is used
to detect a user’s face in images captured by a web camera. It is a very efficient
and effective algorithm for visual object detection.
Each classifier in the cascade consists of a set of weak classifiers, each
based on one image feature. The features used for face detection are grey-level
differences between sums of pixel values in different rectangular regions of an
image window. The window slides over the image and changes its scale. Image
features may be computed rapidly at any scale and location in a video frame
using integral images.
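The integral-image trick can be sketched in a few lines of Python (an illustrative sketch, independent of the trained detectors used in this project): each entry stores the sum of all pixels above and to the left, so any rectangle sum needs only four lookups.

```python
def integral_image(img):
    """ii[y][x] = sum of img over all rows < y and cols < x (zero-padded)."""
    h, w = len(img), len(img[0])
    ii = [[0] * (w + 1) for _ in range(h + 1)]
    for y in range(h):
        row = 0  # running sum of the current row
        for x in range(w):
            row += img[y][x]
            ii[y + 1][x + 1] = ii[y][x + 1] + row
    return ii

def rect_sum(ii, top, left, height, width):
    """Sum of any rectangle in constant time from four corner lookups."""
    return (ii[top + height][left + width] - ii[top][left + width]
            - ii[top + height][left] + ii[top][left])
```

A Haar-like feature is then just a difference of two or more such `rect_sum` calls, which is why it costs the same at every scale and location.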
For each window, a decision is made whether the window contains a
face; all classifiers in the cascade must detect a face for the classification
result to be positive. If any classifier fails to detect a face, the
classification process is halted and the final result is negative. Classifiers
in the cascade are trained with the AdaBoost algorithm, tuned to minimize the
false-negative error rate.
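The early-rejection logic of the cascade can be sketched as follows. This is a hedged Python illustration; `stages`, `weak_clfs` and `feature` are hypothetical names, not the API of any detector used in the project:

```python
def cascade_decide(window, stages):
    """Each stage is (weak_clfs, threshold), where weak_clfs is a list of
    (feature, weight) pairs. A window must pass every stage to be declared
    a face; any failure halts immediately, so background windows are
    discarded after only a few cheap tests."""
    for weak_clfs, threshold in stages:
        score = sum(weight for feature, weight in weak_clfs if feature(window))
        if score < threshold:
            return False  # early rejection: stop evaluating further stages
    return True
```

With toy stages such as `[([(lambda w: w > 5, 1.0)], 0.5)]`, a "window" failing the first cheap test is rejected without ever reaching the later, more expensive stages.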
Classifiers in the cascade are combined in order of increasing
complexity; the initial classifiers are based on only a few features. This
makes it possible for the algorithm to work in real time, because background
regions of the image can be discarded quickly while more computation is spent
on promising regions. The face detection algorithm finds the locations of all
faces in every video frame. It is assumed that only one person is present in
the camera's field of view, therefore only the first face location is used for
further processing.
In order to increase the speed of face detection and to make sure that the
face is large enough to recognize lip gestures, the minimal width of a face was
set to half of the image frame width. Sample results of face detection and
mouth-region finding are pictured in the figure. The mouth region is localized
arbitrarily in the lower part of the detected face region. It is defined by a
half-ellipse horizontally centered in the lower half of the face region. The
width and the height of the half-ellipse are equal to half of the width and
half of the height of the face region, respectively.
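The mouth-region geometry described above can be sketched as a small helper. This is illustrative only, assuming the ROI spans half the face width and half the face height and is horizontally centered in the lower half of the detected face box; the function name and box convention `(x, y, w, h)` are this sketch's own:

```python
def mouth_region(face_x, face_y, face_w, face_h):
    """Bounding box of the half-ellipse mouth ROI inside a face box."""
    roi_w = face_w // 2                     # half the face width
    roi_h = face_h // 2                     # half the face height
    roi_x = face_x + (face_w - roi_w) // 2  # horizontally centered
    roi_y = face_y + face_h - roi_h         # lower half of the face
    return roi_x, roi_y, roi_w, roi_h
```

For a 100x100 face box at the origin this yields a 50x50 ROI starting at (25, 50), i.e. centered in the bottom half of the face.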
Fig.6.4 Face detection using the Viola-Jones method on the skin color
region
Fig.6.5 Face and facial feature detection
Fig.6.6 Detection of yawning state of test users
Fig.6.7 Closed eyes detection of test users
CHAPTER 7
CONCLUSION
This report describes a fatigue detection system based on students'
behaviour during lessons. The system uses skin-color pixel detection and the
Viola-Jones (VJ) method for face detection. Facial feature detection is
achieved by dividing the image into three parts and applying the VJ method to
each part. For accurate detection of yawning and eye status, a threshold value
is calculated dynamically, and each incoming frame is compared against this
threshold for drowsiness detection. The processing time for drowsiness
detection decreases due to the fast detection rate of the face and facial
features. The accuracy of face and facial feature detection is increased by the
hybrid method, which in turn increases the accuracy of the whole system. The
system is implemented in software only, and Matlab is used for simulation.
Since a VGA camera is used for image acquisition, the system works in daylight
only. Using a night-vision camera could, in future, enable the system to
detect the drowsiness level of a student at night as well.
CHAPTER 8
FUTURE ENHANCEMENT
This analysis will be useful in schools, colleges and other educational
institutions to record the presence of students in their class sessions. In
our project we chose to detect a single student's presence and activity in the
class by detecting his or her eyes; in future, the number of students observed
may be increased by using sensors. With sensors, the eye activity of each
student can easily be traced and an alert sent to the person handling the
class. This project helps teachers trace student attention in the classroom,
so that they can adapt their teaching style and help students concentrate in
class.
APPENDIX I
COMPARISON OF STUDENT DATA WITH STUDENT EYE
%% Drowsy Detection System
%% Clear and Close Everything
clc;
clear all;
close all;
%% Main Code
EyeDetect = vision.CascadeObjectDetector('EyePairBig');
MouthDetect = vision.CascadeObjectDetector('Mouth','MergeThreshold',150);
vidDevice = imaq.VideoDevice('winvideo', 1, 'YUY2_640x480', ... % Acquire input video stream
    'ROI', [1 1 640 480], ...
    'ReturnedColorSpace', 'rgb');
vidInfo = imaqhwinfo(vidDevice); % Acquire input video property
reqToolboxes = {'Computer Vision System Toolbox', 'Image Processing Toolbox'};
hVideoIn = vision.VideoPlayer('Name', 'Final Video', ... % Output video player
    'Position', [100 100 vidInfo.MaxWidth+20 vidInfo.MaxHeight+30]);
nFrame = 0; % Frame number initialization
while (nFrame < 12)
    if nFrame == 0
        img = step(vidDevice); % Acquire single frame
        pause(2);
    end
    img = step(vidDevice); % Acquire single frame
    BB = step(EyeDetect, img);   % Eye-pair bounding box
    CC = step(MouthDetect, img); % Mouth bounding box
    imshow(img);
    rectangle('Position', BB, 'LineWidth', 2, 'LineStyle', '-', 'EdgeColor', 'b');
    rectangle('Position', CC, 'LineWidth', 2, 'LineStyle', '-', 'EdgeColor', 'r');
    title('Drowsy Detection');
    eye = imcrop(img, BB);
    mouth = imcrop(img, CC);
    imwrite(mouth, ['mouth', num2str(nFrame), '.jpg']);
    imwrite(eye, ['eye', num2str(nFrame), '.jpg']);
    %step(hVideoIn, BB); % Output video stream
    eye = rgb2gray(eye);
    eye = im2bw(eye, .15); % Binarize: dark pixels (pupil/iris) become 0
    %imshow(eye);
    [m, n] = size(eye);
    White_pix = 0;
    Black_pix = 0;
    for j = 1:n
        for i = 1:m
            if eye(i, j) == 1
                White_pix = White_pix + 1;
            else
                Black_pix = Black_pix + 1;
            end
        end
    end
    Black_pix % Display black-pixel count for the eye region
    mouth = rgb2gray(mouth);
    mouth = im2bw(mouth, .15);
    [mm, nn] = size(mouth);
    White_pix1 = 0;
    Black_pix1 = 0;
    for j = 1:nn
        for i = 1:mm
            if mouth(i, j) == 1
                White_pix1 = White_pix1 + 1;
            else
                Black_pix1 = Black_pix1 + 1;
            end
        end
    end
    Black_pix1 % Display black-pixel count for the mouth region
    if Black_pix < 1000 || Black_pix1 < 1000
        msgbox('Alert! Alert!');
    end
    pause(.5)
    nFrame = nFrame + 1;
end
%% Clearing Memory
release(hVideoIn); % Release all memory and buffer used
release(vidDevice);
APPENDIX II
SCREEN SHOT

 

employee job satisfaction project

  • 1. INTELLECTUAL ANALYSIS OF STUDENT'S OBSERVATION USING DATA MINING AND PATTERN MATCHING A PROJECT REPORT Submitted by PRADHEEBA.B (422711104060) SASIREKA.R (422711104082) VANMATHI.S (422711104100) In partial fulfilment for the award of the degree of BACHELOR OF ENGINEERING in COMPUTER SCIENCE AND ENGINEERING V.R.S. COLLEGE OF ENGINEERING AND TECHNOLOGY ARASUR ANNA UNIVERSITY :: CHENNAI 600 025 APRIL 2015 ANNA UNIVERSITY : CHENNAI 600 025 BONAFIDE CERTIFICATE
  • 2. Certified that this project report “INTELLECTUAL ANALYSIS OF STUDENT'S OBSERVATION USING DATA MINING AND PATTERN MATCHING” is the bonafide work of R.SASIREKA [422711104082] who carried out the project work under my supervision. SIGNATURE SIGNATURE Mr. A.PARTHASARATHY, Prof. J.K.JOTHIKALPANA, M.E., MBA., (Ph.D.,) M.Tech., (Ph.D.,) HEAD OF THE DEPARTMENT SUPERVISOR Associate Professor, Professor, Department of Computer Science and Engineering, Department of Computer Science and Engineering, V.R.S. College of Engineering and Technology, V.R.S. College of Engineering and Technology, Arasur - 607107. Arasur - 607107. Submitted for the University Examination held on 8.4.2015 INTERNAL EXAMINER EXTERNAL EXAMINER ACKNOWLEDGEMENT We express our sincere and deep sense of gratitude to our project guide Prof. J.K.JothiKalpana M.Tech., (Ph.D.,), Associate Professor, Department of Computer Science and Engineering, for her proficient and meticulous guidance in consummating our project.
  • 3. We acknowledge our grateful and sincere thanks to our project co-ordinator and Head of the Department Mr. A.Parthasarathy M.E., M.B.A., (Ph.D.,), Computer Science and Engineering, for his valuable suggestions and help. With profoundness, we are indebted to our Principal Dr. N.Anbazhaghan M.E., Ph.D., for giving constant motivation in achieving our goal. We extend our deep sense of thanks to our Chief Executive Officer Er. M.Saravanan M.E., (Ph.D.,), for providing us an opportunity to take up his valuable advice to develop technical knowledge in planning our project confidently. We express our extreme gratitude and heartfelt thanks to our Chairperson Tmt. Vijaya Muthuvanan, Secretary and Correspondent Rtn. S.R.Ramanathan, and Director, Board of Governors, Thiru. N.Muthuvanan for providing the facilities to complete our project successfully. We wish to express our sincere thanks to all those who helped us in making this project successful.
  • 4. ABSTRACT In this project, a fatigue detection technique based on computer vision is presented. Fatigue is detected from the face and facial features of a student. A hybrid method is used for face and facial feature detection, which not only increases the accuracy of the system but also decreases the processing time. Skin-colour pixel detection and the Viola-Jones method are used for face detection, and a knowledge-based division method is used to increase the accuracy of facial feature detection. A dynamic threshold value is used for yawning and eye-status detection. The project also addresses two issues for mitigating student distraction/inattention using novel video analysis techniques: (a) inside an ego classroom, student inattention is monitored by first tracking the student's face/eye region using particle filters, followed by recognition of dynamic eye states using a computer vision system. The frequencies of eye blinking and eye closure are used as indications of sleepiness, and a warning sign is then generated as a recommendation; (b) outside an ego classroom, surrounding classrooms (in both directions) are tracked and their states are analyzed by sensors. These pieces of information are provided for mitigating student inattention. The main novelties of the proposed scheme include facial-geometry-based eye region detection for eye closure identification and combined tracking and detection of classrooms. CHAPTER NO. TITLE PAGE NO.
  • 5. TABLE OF CONTENTS ABSTRACT i LIST OF FIGURES ii LIST OF TABLES iii LIST OF ABBREVIATIONS iv 1 INTRODUCTION 1 1.1 Objectives 1 2 LITERATURE REVIEW 2 2.1 EXISTING SYSTEM 2 2.1.1 Demerits of Existing System 2 2.2 PROPOSED SYSTEM 2 2.2.1 Merits of Proposed System 3 2.3 APPLICATIONS 3 2.4 BLOCK DIAGRAM 4
  • 6. 3 SYSTEM REQUIREMENTS 5 3.1 HARDWARE REQUIREMENT 5 3.2 SOFTWARE REQUIREMENT 5 4 SOFTWARE PROFILE 6 4.1 INTRODUCTION 6 4.2 DEFINITION OF MATLAB 7 4.2.1 Starting MATLAB 7 4.2.2 Setting your initial current directory 8 4.2.3 Setting up MATLAB environment options 8 4.2.4 Configuring certain products 8 4.2.5 Excel link versions 9 4.2.6 Finding information about MATLAB 10 4.2.7 Syntax 11
  • 7. 4.2.8 Variables 11 4.2.9 Vectors/Matrices 12 4.2.10 Graphical User Interface Programming 15 4.2.11 Object Oriented Programming 17 4.2.12 Interfacing with other Languages 17 4.2.13 License 18 4.2.14 Alternatives 19 5 COMPUTER VISION TOOL BOX 20 5.1 COMPUTER VISION SYSTEM TOOLBOX 20 5.2 IMAGE PROCESSING TOOL BOX 21 5.3 REGION OF INTEREST 22 6 MODULE DESCRIPTION 23 6.1 DETECTED FACIAL FEATURES IN FACE REGION 23 6.2 DETECTION OF OPEN EYES 24 6.3 DETECTION OF CLOSED EYES 26
  • 8. 6.4 FACE AND MOUTH DETECTION 28 7 CONCLUSION 34 8 FUTURE ENHANCEMENT 35 APPENDIX I 36 APPENDIX II 39 REFERENCES 43
  • 9. LIST OF FIGURES FIGURE NO. TITLE PAGE NO. 2.4 Block Diagram 04 4.1 Three dimensional graphics 15 4.2 Normalized sinc function 16 4.3 Unnormalized sinc function 16 6.1 Detected Facial Feature in Face Region 24 6.2 Detection of Open Eyes 25 6.3 Detection of Closed Eyes 28 6.4 Face detection using the Viola-Jones method on the skin colour region 31 6.5 Face and facial feature detection 32
  • 10. LIST OF TABLES TABLE NO. TITLE PAGE NO. 4.1 Products and Commands 9 4.2 Excel files used with each Excel version 9 4.3 Finding information about MATLAB 10 6.3 Yawning state 27 6.6 Detection of yawning state of test users 33 6.7 Closed eyes detection of test users 34
  • 11. LIST OF ABBREVIATIONS MATLAB MATrix LABoratory GPU Graphics Processing Unit DCT Discrete Cosine Transform FFT Fast Fourier Transform LINPACK LINear equations PACKage EISPACK EIgen System PACKage LAPACK Linear Algebra PACKage IDL Interactive Data Language APL A Programming Language GUIDE GUI Development Environment
  • 12. CHAPTER 1 INTRODUCTION Nowadays, students' attitudes change according to their interest in their studies. In the classroom, the staff members deliver lectures and observe how well the students understand the concepts. For the students, the class should be interesting and more interactive when the staff delivers the lecture. From this we can analyze the average student's face and eyes; finally, we conclude with both qualitative and quantitative results. When a face image acts as input to the system, face detection is performed to locate the rough face region. The second step is to locate the two rough eye regions in the face. There are two default eye states: open and closed. In our project we discuss different eye stages such as partially opened, fully opened, and fully closed. If a person can see something, his eyes are considered open. Our criterion is that if the iris and the white of the eye are visible, the eye is open; otherwise, the eye is closed. The student data is compared with the student's eye data. The CGPA (Cumulative Grade Point Average) attribute in the data set contains a large number of continuous values. For
  • 13. example, we grouped all GPAs into five categorical segments: Excellent, Very Good, Good, Average and Poor. 1.1 OBJECTIVE The main objective of our project is to analyze student performance in a classroom. We track the face, detect the eye region to determine whether the eyes are open, and detect the mouth region to determine whether the mouth is yawning. CHAPTER 2 LITERATURE REVIEW 2.1 EXISTING SYSTEM In the existing system, the Riemannian manifold approach cannot produce the desired output and requires more time for calculation. For differentiable manifolds, it is impossible to define the derivatives of the curves on the manifold. 2.1.1 Demerits of Existing System • The desired output cannot be obtained • Calculations take more time • It cannot be designed for other objects
  • 14. 2.2 PROPOSED SYSTEM In the proposed system, we use the student's face for the analysis. The face is tracked and the eyes are detected for the execution. We use the Viola-Jones algorithm for the face tracking system. The fatigue detection system consists of six levels, which can be classified as • Image acquisition • Face detection • Facial feature detection • Eye status detection • Yawning status detection • Drowsiness detection 2.2.1 Merits of Proposed System • Extremely fast feature computation • Efficient feature selection • Scale- and location-invariant detector • Instead of scaling the image itself (e.g. pyramid filters), we scale the features 2.3 APPLICATIONS
  • 15. • It can be used in schools, colleges and other institutions. • It can be used in railways, airways and roadways. • It is used for analysis of the face. 2.4 BLOCK DIAGRAM Fig. 2.4 Block Diagram
  • 16. CHAPTER 3 REQUIREMENTS SPECIFICATION 3.1 HARDWARE REQUIREMENTS The hardware requirements may serve as the basis for a contract for the implementation of the system and should therefore be a complete and consistent specification of the whole system. They are used by software engineers as the starting point for the system design. They should state what the system should do rather than how it should be implemented. • Processor : Core i3 • RAM : 5.2 GB • Hard Disk : 40 GB and above • Monitor : 15" colour • Web Camera : 9 megapixels (MP)
  • 17. 3.2 SOFTWARE REQUIREMENTS The software requirements document is the specification of the system. It should include both a definition and a specification of requirements. It is the basis for creating the software, and is useful in estimating cost, planning team activities, performing tasks, and tracking the team's progress throughout the development activity. • MATLAB 8.1.0 Version R2013a • Windows 7
  • 18. CHAPTER 4 SOFTWARE PROFILE 4.1 INTRODUCTION Cleve Moler, the chairman of the computer science department at the University of New Mexico, started developing MATLAB in the late 1970s. He designed it to give his students access to LINPACK and EISPACK without them having to learn Fortran. It soon spread to other universities and found a strong audience within the applied mathematics community. Jack Little, an engineer, was exposed to it during a visit Moler made to Stanford University in 1983. Recognizing its commercial potential, he joined with Moler and Steve Bangert. They rewrote MATLAB in C and founded MathWorks in 1984 to continue its development. These rewritten libraries were known as JACKPAC. In 2000, MATLAB was rewritten to use a newer set of libraries for matrix manipulation, LAPACK. MATLAB was first adopted by researchers and practitioners in control engineering, Little's specialty, but quickly spread to many other domains. It is now also used in education, in particular the teaching of linear algebra,
  • 19. numerical analysis, and is popular amongst scientists involved in image processing. 4.2 DEFINITION OF MATLAB MATLAB (MATrix LABoratory) is a programming language for technical computing from MathWorks, Natick, MA. It is used for a wide variety of scientific and engineering calculations, especially for automatic control and signal processing. MATLAB runs on Windows, Mac and a variety of Unix-based systems. Developed by Cleve Moler in the late 1970s and based on the original LINPACK and EISPACK Fortran libraries, it was initially used for factoring matrices and solving linear equations. Moler commercialized the product with two colleagues in 1984. MATLAB is also noted for its extensive graphics capabilities. 4.2.1 Starting MATLAB To start MATLAB, you can use any of these methods. Double-click on the MATLAB icon (called a "shortcut") that the installer creates on your desktop.
  • 20. Click on the Start button, select Programs, and click on the MATLAB 6.5 entry. Select MATLAB 6.5 from this menu. Using Windows Explorer, open your top-level MATLAB installation directory and double-click on the shortcut to the MATLAB executable, MATLAB 6.5. 4.2.2 Setting your initial current directory By default, when you start MATLAB using the shortcut the installer puts on your desktop, the initial current directory is the $MATLAB\work directory, where $MATLAB represents the name of your installation directory. The work directory is a good place to store the M-files you modify and create, because the MATLAB uninstaller does not delete the work directory if it contains files. You can, however, use any directory as your MATLAB initial current directory. To specify another directory as your initial current directory, right-click on the MATLAB shortcut that the installer creates on your desktop and select the Properties option. Specify the name of the directory in the Start in field. 4.2.3 Setting up MATLAB environment options To include welcome messages, default definitions, or any MATLAB expressions that you want executed every time MATLAB is invoked, create a file named startup.m in the $MATLAB\toolbox\local directory. MATLAB executes this file each time it is invoked. 4.2.4 Configuring certain products Certain products require additional configuration. The following table lists these products and the commands used to configure them. If you installed
  • 21. any of these products, see the documentation for that product for detailed configuration information. PRODUCT COMMAND MATLAB Notebook notebook -setup MATLAB Runtime Server rtsetup Real-Time Windows Target rtwintgt -setup Table 4.1 Products and the Commands 4.2.5 Excel Link versions By default, Excel Link (a separately orderable product) supports two versions of Excel. This table lists which Excel Link files to use with each Excel version. Use the appropriate files for your version of Excel. You can find these files in the $MATLAB\toolbox\exlink subdirectory. EXCEL VERSION EXCEL LINK FILE
  • 22. Excel 97 (default) excllink.xla ExliSamp.xls Excel 7 excllink95.xla ExliSamp95.xls Table 4.2 Excel files used with each Excel version 4.2.6 Finding information about MATLAB After successfully installing MATLAB, you are probably eager to get started using it. This list provides pointers to sources of information and other features you may find helpful in getting started with MATLAB. TASK DESCRIPTION To get an overview of MATLAB and its capabilities Read the Release Notes documentation To find out what's new in this release Read the Release Notes documentation To start a product or run one of the demonstration programs Use the Launch Pad in the MATLAB desktop To get information about a specific MATLAB feature Choose the Help item in the MATLAB menu bar to view reference and tutorial information in hyperlinked HTML form.
  • 23. To get help with specific questions you can't find answered in the documentation Go to the Math Works Web site (www.mathworks.com), click on Support, and use the Technical Support solution search area to find more information Table4.3 Finding information about MATLAB 4.2.7 Syntax The MATLAB application is built around the MATLAB language, and most use of MATLAB involves typing MATLAB code into the Command Window (as an interactive mathematical shell), or executing text files containing MATLAB code, including scripts and/or functions. 4.2.8 Variables Variables are defined using the assignment operator, =. MATLAB is a weakly typed programming language because types are implicitly converted. It is an inferred typed language because variables can be assigned without declaring their type, except if they are to be treated as symbolic objects, and that their type can change. Values can come from constants, from computation involving values of other variables, or from the output of a function. For example: >> x = 17 x = 17 >> x = 'hat' x = hat >> y = x + 0
  • 24. y = 104 97 116 >> x = [3*4, pi/2] x = 12.0000 1.5708 >> y = 3*sin(x) y = -1.6097 3.0000 4.2.9 Vectors/Matrices A simple array is defined using the colon syntax: init:increment:terminator. For instance: >> array = 1:2:9 array = 1 3 5 7 9 defines a variable named array (or assigns a new value to an existing variable with the name array) which is an array consisting of the values 1, 3, 5, 7, and 9. That is, the array starts at 1 (the init value), increments from the previous value by 2 (the increment value), and stops once it reaches (or would exceed) 9 (the terminator value). >> array = 1:3:9 array = 1 4 7 The increment value can be left out of this syntax (along with one of the colons) to use a default value of 1. >> ari = 1:5 ari = 1 2 3 4 5 assigns to the variable named ari an array with the values 1, 2, 3, 4, and 5, since the default value of 1 is used as the increment.
  • 25. Indexing is one-based, which is the usual convention for matrices in mathematics, although not for some programming languages such as C, C++, and Java. Matrices can be defined by separating the elements of a row with blank space or comma and using a semicolon to terminate each row. The list of elements should be surrounded by square brackets: []. Parentheses: () are used to access elements and subarrays (they are also used to denote a function argument list). >> A = [16 3 2 13; 5 10 11 8; 9 6 7 12; 4 15 14 1] A = 16 3 2 13 5 10 11 8 9 6 7 12 4 15 14 1 >> A(2,3) ans = 11 Sets of indices can be specified by expressions such as "2:4", which evaluates to [2, 3, 4]. For example, a submatrix taken from rows 2 through 4 and columns 3 through 4 can be written as: >> A(2:4,3:4) ans = 11 8 7 12 14 1 A square identity matrix of size n can be generated using the function eye, and matrices of any size with zeros or ones can be generated with the functions zeros and ones, respectively.
  • 26. >> eye(3,3) ans = 1 0 0 0 1 0 0 0 1 >> zeros(2,3) ans = 0 0 0 0 0 0 >> ones(2,3) ans = 1 1 1 1 1 1 Most MATLAB functions can accept matrices and will apply themselves to each element. For example, mod(2*J,n) will multiply every element in "J" by 2, and then reduce each element modulo "n". MATLAB does include standard "for" and "while" loops, but (as in other similar applications such as R), using the vectorized notation often produces code that is faster to execute. This code, excerpted from the function magic.m, creates a magic square M for odd values of n (MATLAB function meshgrid is used here to generate square matrices I and J containing 1:n). [J,I] = meshgrid(1:n); A = mod(I + J - (n + 3) / 2, n); B = mod(I + 2 * J - 2, n); M = n * A + B + 1;
  • 27. Method call behavior is different between value and reference classes. For example, a call to a method object.method(); can alter any member of object only if object is an instance of a reference class. 4.2.10 Graphics and graphical user interface programming MATLAB supports developing applications with graphical user interface features. MATLAB includes GUIDE (GUI development environment) for graphically designing GUIs. It also has tightly integrated graph-plotting features. For example the function plot can be used to produce a graph from two vectors x and y. The code: x = 0:pi/100:2*pi; y = sin(x); plot(x,y) Fig.4.1 Three Dimensional Graphics
  • 28. A MATLAB program can produce three-dimensional graphics using the functions surf, plot3 or mesh. [X,Y] = meshgrid(-10:0.25:10, -10:0.25:10); f = sinc(sqrt((X/pi).^2+(Y/pi).^2)); mesh(X,Y,f); axis([-10 10 -10 10 -0.3 1]) xlabel('{\bf x}') ylabel('{\bf y}') zlabel('{\bf sinc} ({\bf R})') hidden off [X,Y] = meshgrid(-10:0.25:10, -10:0.25:10); f = sinc(sqrt((X/pi).^2+(Y/pi).^2)); surf(X,Y,f); axis([-10 10 -10 10 -0.3 1]) xlabel('{\bf x}') ylabel('{\bf y}') zlabel('{\bf sinc} ({\bf R})') Fig. 4.2 Two-dimensional normalized sinc function Fig. 4.3 Two-dimensional unnormalized sinc function
  • 29. In MATLAB, graphical user interfaces can be programmed with the GUI design environment (GUIDE) tool. 4.2.11 Object-oriented programming MATLAB's support for object-oriented programming includes classes, inheritance, virtual dispatch, packages, pass-by-value semantics, and pass-by- reference semantics. classdef hello methods function greet(this) disp('Hello!') end end end When put into a file named hello.m, this can be executed with the following commands: >> x = hello; >> x.greet(); Hello! 4.2.12 Interfacing with other languages MATLAB can call functions and subroutines written in the C programming language or Fortran. A wrapper function is created allowing
  • 30. MATLAB data types to be passed and returned. The dynamically loadable object files created by compiling such functions are termed "MEX-files" (for MATLAB executable). Libraries written in Perl, Java, ActiveX or .NET can be directly called from MATLAB, and many MATLAB libraries (for example XML or SQL support) are implemented as wrappers around Java or ActiveX libraries. Calling MATLAB from Java is more complicated, but can be done with a MATLAB toolbox which is sold separately by MathWorks, or using an undocumented mechanism called JMI (Java-to-MATLAB Interface), which should not be confused with the unrelated Java Metadata Interface that is also called JMI. As alternatives to the MuPAD-based Symbolic Math Toolbox available from MathWorks, MATLAB can be connected to Maple or Mathematica. 4.2.13 License MATLAB is a proprietary product of MathWorks, so users are subject to vendor lock-in. Although MATLAB Builder products can deploy MATLAB functions as library files which can be used with .NET or Java application-building environments, future development will still be tied to the MATLAB language. Libraries also exist to import and export MathML. Each toolbox is purchased separately. If an evaluation license is requested, the MathWorks sales department requires detailed information about the project for which MATLAB is to be evaluated. If granted (which it often is), the evaluation license is valid for two to four weeks. A student version of MATLAB is available, as is a home-use license for MATLAB, SIMULINK, and
  • 31. a subset of MathWorks toolboxes at substantially reduced prices. It has been reported that EU competition regulators are investigating whether MathWorks refused to sell licenses to a competitor. 4.2.14 Alternatives See also: list of numerical analysis software and comparison of numerical analysis software. MATLAB has a number of competitors. Commercial competitors include Mathematica, TK Solver, Maple, and IDL. There are also free open-source alternatives to MATLAB, in particular GNU Octave, Scilab, FreeMat, Julia, and Sage, which are intended to be mostly compatible with the MATLAB language. Among other languages that treat arrays as basic entities (array programming languages) are APL, Fortran 90 and higher, S-Lang, as well as the statistical languages R and S. There are also libraries that add similar functionality to existing languages, such as IT++ for C++, Perl Data Language for Perl, ILNumerics for .NET, NumPy/SciPy for Python, and Numeric.js for JavaScript. GNU Octave stands out as it treats incompatibility with MATLAB as a bug (see GNU Octave#Matlab); therefore it aims to provide a software clone.
  • 32. CHAPTER 5 COMPUTER VISION TOOL BOX Using images and video to detect, classify, and track objects or events in order to "understand" a real-world scene. Image processing: remove noise, adjust contrast, measure. Computer vision: detect, identify, classify, recognize, track. 5.1 COMPUTER VISION SYSTEM TOOLBOX Design and simulate computer vision and video processing systems: • Feature detection • Feature extraction and matching • Feature-based registration • Motion estimation and tracking • Stereo vision • Video processing • Video file I/O, display, and graphics
  • 33. 5.2 IMAGE PROCESSING TOOLBOX Image Processing Toolbox™ provides a comprehensive set of reference-standard algorithms, functions, and apps for image processing, analysis, visualization, and algorithm development. You can perform image analysis, image segmentation, image enhancement, noise reduction, geometric transformations, and image registration. Many toolbox functions support multicore processors, GPUs, and C-code generation. Image Processing Toolbox supports a diverse set of image types, including high dynamic range, gigapixel resolution, embedded ICC profile, and tomographic. Visualization functions and apps let you explore images and videos, examine a region of pixels, adjust color and contrast, create contours or histograms, and manipulate regions of interest (ROIs). The toolbox supports workflows for processing, displaying, and navigating large images. Key Features • Image analysis, including segmentation, morphology, statistics, and measurements. • Image enhancement, filtering, and deblurring. • Geometric transformations and intensity-based image registration methods. • Image transforms, including FFT, DCT, Radon, and fan-beam projection. • Large image workflows, including block processing, tiling, and multiresolution display.
  • 34. • Visualization apps, including Image Viewer and Video Viewer. • Multicore- and GPU-enabled functions, and C-codegeneration support. 5.3 A REGION OF INTEREST (ROI) A region of interest (ROI) is a portion of an image that you want to filter or perform some other operation on. You define an ROI by creating a binary mask, which is a binary image that is the same size as the image you want to process with pixels that define the ROI set to 1 and all other pixels set to 0. You can define more than one ROI in an image. The regions can be geographic in nature, such as polygons that encompass contiguous pixels, or they can be defined by a range of intensities. In the latter case, the pixels are not necessarily contiguous.
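The two kinds of ROI mask described above (rectangular and intensity-based) can be sketched in Python with NumPy; the function names and the sample values are illustrative, not Image Processing Toolbox APIs:

```python
import numpy as np

def rect_roi_mask(shape, top, left, bottom, right):
    """Binary mask the same size as the image: pixels inside the
    rectangle are set to 1, all others to 0."""
    mask = np.zeros(shape, dtype=np.uint8)
    mask[top:bottom, left:right] = 1
    return mask

def intensity_roi_mask(image, lo, hi):
    """Intensity-based ROI: selected pixels need not be contiguous."""
    return ((image >= lo) & (image <= hi)).astype(np.uint8)

image = np.arange(25, dtype=np.uint8).reshape(5, 5)
rect = rect_roi_mask(image.shape, 1, 1, 4, 4)       # 3x3 interior block
bright = intensity_roi_mask(image, 10, 20)          # values 10..20
masked = image * rect    # keep ROI pixels, zero out the rest
```

More than one such mask can be defined for the same image, matching the toolbox behaviour described above.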
  • 35. CHAPTER 6 MODULE DESCRIPTION 6.1 DETECTED FACIAL FEATURES IN FACE REGION Once the face is detected, the second step is to detect the eyes and mouth in the face area. The detected face area is extracted from the input image. The system divides the extracted face image into three sub-portions: the upper-left half, the upper-right half, and the lower half. These divisions are made on the basis of the physical approximation of the eye and mouth locations. After division of the image into three parts, the Viola-Jones method is applied to the upper half of the image for eye detection and to the lower half for mouth detection. The advantage of dividing the image into three parts is that when the Viola-Jones method is applied for one-eye detection, the detector searches only the upper-left part of the image for the left eye instead of searching the whole image. Similarly, for the second eye the detector searches the upper-right part of the image, and for the mouth it searches only the lower part. This not only increases the accuracy of facial feature detection but also decreases the processing time.
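The knowledge-based division into three search regions can be sketched in Python; the exact half-and-half proportions are our illustrative assumption, since the report only states that the divisions approximate the eye and mouth locations:

```python
def face_subregions(x, y, w, h):
    """Split a detected face box (x, y, width, height) into the three
    search regions used before running the eye and mouth detectors:
    upper-left half (left eye), upper-right half (right eye), and
    lower half (mouth)."""
    upper_left  = (x,          y,          w // 2, h // 2)
    upper_right = (x + w // 2, y,          w // 2, h // 2)
    lower       = (x,          y + h // 2, w,      h // 2)
    return upper_left, upper_right, lower

# Example: a 100x120 face box detected at the image origin.
ul, ur, low = face_subregions(0, 0, 100, 120)
```

Restricting each detector to its sub-region is what yields the accuracy and speed gains described above.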
  • 36. The detection of the facial features is shown in Fig. 6.1. After detection of the facial features, the next step is to check whether the eyes are open or closed. Since every person has his own facial features, it is difficult to determine the eye status using general data. To do so, the system collects eye information from the first fifty frames of every incoming one thousand frames. The detected eye region is extracted and converted into a binary image. Then the numbers of black and white pixels in that region are calculated and stored. In the case of open eyes, the ratio of black pixels in the region is greater than in the case of closed eyes. Since blinking is negligible, the data from the first fifty frames can provide the approximate ratio of black to white pixels for open eyes. Fig.6.1 Detected Facial Feature in Face Region
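The black/white pixel counting on the binarised eye region can be sketched in Python with NumPy; the binarisation level and the toy patch are our assumptions, not values from the report:

```python
import numpy as np

def black_white_ratio(eye_region, level=128):
    """Binarise the extracted eye region and return the black/white
    pixel ratio; open eyes (visible iris) give a higher black share
    than closed eyes. The global threshold `level` is an assumption."""
    white = np.count_nonzero(eye_region >= level)
    black = eye_region.size - white
    return black / max(white, 1)

# Toy 4x4 "eye" patch: dark iris pixels surrounded by bright sclera.
patch = np.array([[200, 200, 200, 200],
                  [200,  30,  40, 200],
                  [200,  35,  25, 200],
                  [200, 200, 200, 200]], dtype=np.uint8)
ratio = black_white_ratio(patch)   # 4 black pixels / 12 white pixels
```

Averaging this ratio over the calibration frames gives the per-person baseline from which the threshold is derived.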
  • 37. Fig.6.1 Methodology 6.2 DETECTION OF OPEN EYES This approximate ratio is used to calculate a threshold value which can be used in the coming frames to determine the state of the eyes. The mathematical expression for the threshold value is: Threshold_eyes = (black/white)_avg of 30 frames − ((black/white)_avg of 30 frames) / 3 (1) Once the threshold value is calculated, the ratio of black and white pixels in the coming frames is compared to the threshold value. If the ratio of black and
  • 38. white pixels in the coming frame is less than the threshold value, the eyes are considered closed; if the ratio is greater than the threshold value, the eyes are open. The mathematical representation of closed-eye detection is given in equation (2). Fig. 6.3 shows the detection of closed eyes after comparing the black and white pixels of the eye region with the threshold value. Real-time image = Closed eyes if (black/white)_eye region < Threshold_eyes; Open eyes otherwise (2) Fig.6.2 Detection of Open Eyes
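Equations (1) and (2) can be sketched in Python; the calibration ratios below are invented for illustration:

```python
def eye_threshold(ratios):
    """Equation (1): Threshold_eyes = avg - avg/3, where avg is the
    black/white pixel ratio averaged over the calibration frames."""
    avg = sum(ratios) / len(ratios)
    return avg - avg / 3

def eye_state(ratio, threshold):
    """Equation (2): a frame's ratio below the threshold means the
    eyes are closed; otherwise they are open."""
    return "closed" if ratio < threshold else "open"

# Invented black/white ratios from open-eye calibration frames.
calibration = [0.30, 0.33, 0.27]
t = eye_threshold(calibration)   # average 0.30 -> threshold 0.20
```

Subtracting a third of the average gives a margin below the open-eye baseline, so ordinary frame-to-frame variation does not trigger a "closed" decision.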
  • 39. 6.3 DETECTION OF CLOSED EYES The next step is to analyze the yawning condition of the mouth. Yawning can be identified by calculating the height of the mouth. When a person is yawning, the height of his/her mouth increases by a specific amount. Since every person has a different mouth size, the mouth height in the yawning state differs from person to person; that is why we need a dynamic threshold value for each person separately, which can be compared to that person's normal mouth size. To calculate the threshold value dynamically, the system stores the information from the first fifty frames of the video. The threshold value is then calculated from the first fifty frames as shown in equation (3): Threshold_mouth = (mouth height)_avg of 30 frames + ((mouth height)_avg of 30 frames) / 3 (3) Once the threshold value is calculated, the system compares the height of the mouth in the coming frames with the threshold value. If the height of the mouth in the coming frame is greater than the threshold value, the mouth is in the yawning state; if it is less than the threshold value, the mouth is in the normal condition. This can be expressed in mathematical form as,
  • 40. Real-time image = Yawning if current height > Threshold_mouth; Not yawning otherwise (4) Table 6.3 shows that in the case of a closed mouth the height of the mouth is 26 pixels, and in the case of an open mouth it is 39 pixels. The system compares the value of the mouth height with the threshold value (which is calculated from the first fifty frames): if the height of the mouth is less than the threshold, it is detected as the normal condition; otherwise it is reported as the yawning state.
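Equations (3) and (4) can be sketched the same way; the calibration heights are invented, while 26 and 39 pixels are the closed- and open-mouth heights reported in Table 6.3:

```python
def mouth_threshold(heights):
    """Equation (3): Threshold_mouth = avg + avg/3 over the
    calibration frames' mouth heights (in pixels)."""
    avg = sum(heights) / len(heights)
    return avg + avg / 3

def mouth_state(height, threshold):
    """Equation (4): a mouth height above the threshold means yawning."""
    return "yawning" if height > threshold else "normal"

# Invented calibration: normal mouth height of 26 px in every frame.
t = mouth_threshold([26, 26, 26])   # 26 + 26/3, roughly 34.7 px
```

Note the sign difference from the eye threshold: the margin is added rather than subtracted, because yawning raises the measured height above the baseline.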
• 41. Table 6.3 Yawning state The system continuously monitors the status of the eyes and mouth. If the eyes are detected as closed for more than 1.5 seconds, or if the yawning condition is detected during the lesson, the system reports drowsiness and gives an alarm. Fig.6.3 Detection of Closed Eyes Algorithm for Segmentation 1. Define the neighbourhood of each feature (random variable in MRF terms); generally this includes the 1st-order or 2nd-order neighbours. 2. Set the initial probability P(f_i) for each feature as 0 or 1, where f_i is the set of features extracted for pixel i, and define an initial set of clusters. 3. Using the training data, compute the mean (μ_ℓi) and variance (σ_ℓi) for each label ℓ; this is termed the class statistics. 4. Compute the marginal distribution P(f_i | ℓ_i) for the given labelling scheme using Bayes' theorem and the class statistics calculated earlier. A Gaussian model is used for the marginal distribution.
• 42. 5. Calculate the probability of each class label given the neighbourhood defined previously; clique potentials are used to model the social impact in labelling. 6. Iterate over new prior probabilities and redefine the clusters such that these probabilities are maximized; this is done using a variety of optimization algorithms. 7. Stop when the probability is maximized and the labelling scheme does not change. The calculations can be implemented in log-likelihood terms as well. 6.4 FACE AND MOUTH DETECTION A cascade of boosted classifiers working with Haar-like features is used to detect the user's face in images captured by a web camera. It is a very efficient and effective algorithm for visual object detection. Each classifier in the cascade consists of a set of weak classifiers, each based on one image feature. The features used for face detection are grey-level differences between sums of pixel values in different rectangular regions of an image window. The window slides over the image and changes its scale. Image features can be computed rapidly for any scale and location in a video frame using integral images. For each window, a decision is made whether the window contains a face; all classifiers in the cascade must detect a face for the classification result to be positive. If any classifier fails to detect a face, the classification process is halted and the final result is negative. The classifiers in the cascade are trained with the AdaBoost algorithm, which is tuned to minimize the false-negative error ratio.
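The integral-image trick mentioned above, which lets the sum over any rectangular region be computed with four lookups regardless of the rectangle's size, can be sketched as follows (an illustrative Python sketch, independent of the report's MATLAB code):

```python
import numpy as np

def integral_image(img):
    """Integral image with a zero border: ii[y, x] = sum of img[:y, :x]."""
    ii = np.zeros((img.shape[0] + 1, img.shape[1] + 1), dtype=np.int64)
    ii[1:, 1:] = np.cumsum(np.cumsum(img, axis=0), axis=1)
    return ii

def rect_sum(ii, y0, x0, y1, x1):
    """Sum of img[y0:y1, x0:x1] via four lookups in the integral image."""
    return ii[y1, x1] - ii[y0, x1] - ii[y1, x0] + ii[y0, x0]
```

Because every Haar-like feature is a difference of such rectangle sums, this is what makes feature evaluation constant-time at any window scale and location.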
• 43. Classifiers in the cascade are combined in order of increasing complexity; the initial classifiers are based on a few features only. This makes it possible for the algorithm to work in real time, because it allows background regions of the image to be quickly discarded while more computation is spent on promising regions. The face detection algorithm finds the locations of all faces in every video frame. It is assumed that only one person is present in the camera's field of view, therefore only the first face location is used for further processing. In order to increase the speed of face detection and to make sure that the face is large enough to recognize lip gestures, the minimal width of a face was set to half of the image frame width. Sample results of face detection and mouth region finding are pictured in the figure. The mouth region is localized arbitrarily in the lower part of the detected face region. It is defined by a half-ellipse horizontally centred in the lower half of the face region. The width and the height of the half-ellipse are equal to half of the width and half of the height of the face region, respectively.
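The mouth-region placement described above (a half-ellipse horizontally centred in the lower half of the detected face, with half the face's width and height) can be sketched as a bounding-box computation. This is an illustrative Python sketch; the `(x, y, width, height)` face-box format is an assumption, not something fixed by the report:

```python
def mouth_region(face_box):
    """Bounding box of the half-ellipse mouth region inside a face box.

    face_box: (x, y, w, h) of the detected face.
    The region is horizontally centred in the lower half of the face,
    with width and height equal to half the face width and height.
    """
    x, y, w, h = face_box
    rw, rh = w // 2, h // 2      # half-ellipse bounding width and height
    rx = x + (w - rw) // 2       # horizontally centred in the face box
    ry = y + h - rh              # anchored to the lower half of the face
    return (rx, ry, rw, rh)
```

For a 100x100 face box at the origin this yields a 50x50 region starting at (25, 50), i.e. centred in the bottom half of the face.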
• 44. Fig.6.4 Face detection using the Viola-Jones method on the skin color region. Fig.6.5 Face and facial feature detection
  • 45. Fig.6.6 Detection of yawning state of test users
• 46. Fig.6.7 Closed eyes detection of test users CHAPTER 7 CONCLUSION This project describes a fatigue detection system based on the student's facial behaviour. The system uses skin-color pixel detection and the Viola-Jones (VJ) method for face detection. Facial feature detection is achieved by dividing the image into three parts and applying the VJ method on each part of the image. For accurate detection of yawning and eye status, a threshold value is calculated dynamically and each incoming frame is compared with the threshold value for drowsiness detection. The processing time for drowsiness detection decreases due to the fast detection rate of the face and facial features. The accuracy of face and facial feature detection increases due to the application of the hybrid method, which in turn increases the accuracy of the whole system. The system is implemented in software only, and MATLAB is used for simulation. Since a VGA camera is used for
• 47. image acquisition, the system works in daylight only. Using a night-vision camera in future might make the system able to detect the drowsiness level of a student at night as well. CHAPTER 8 FUTURE ENHANCEMENT This analysis will be useful in schools, colleges and other educational institutions to record the presence of students in their class sessions. In our project we chose to detect a single student's presence or activity in the class by detecting his eyes, but in future we may increase the number of students observed by using sensors. Using sensors we can easily trace the eye actions of each student, and an alert will be sent to the class-handling person. This project is very helpful for teachers to trace student observation in the classroom, so that they can change their teaching style and make students concentrate in the class.
• 48. APPENDIX I COMPARISON OF STUDENT DATA WITH STUDENT EYE

%% Drowsy Detection System
%% Clear and Close Everything
clc;
clear all;
close all;

%% Main Code
EyeDetect = vision.CascadeObjectDetector('EyePairBig');
MouthDetect = vision.CascadeObjectDetector('Mouth', 'MergeThreshold', 150);
vidDevice = imaq.VideoDevice('winvideo', 1, 'YUY2_640x480', ... % Acquire input video stream
    'ROI', [1 1 640 480], ...
    'ReturnedColorSpace', 'rgb');
vidInfo = imaqhwinfo(vidDevice); % Acquire input video properties
reqToolboxes = {'Computer Vision System Toolbox', 'Image Processing Toolbox'};
hVideoIn = vision.VideoPlayer('Name', 'Final Video', ... % Output video player
    'Position', [100 100 vidInfo.MaxWidth+20 vidInfo.MaxHeight+30]);
nFrame = 0; % Frame number initialization
while (nFrame < 12)
    if nFrame == 0
        img = step(vidDevice); % Acquire single frame
        pause(2);
    end
    img = step(vidDevice); % Acquire single frame
    BB = step(EyeDetect, img);   % Eye-pair bounding box
    CC = step(MouthDetect, img); % Mouth bounding box
    imshow(img);
    rectangle('Position', BB, 'LineWidth', 2, 'LineStyle', '-', 'EdgeColor', 'b');
    rectangle('Position', CC, 'LineWidth', 2, 'LineStyle', '-', 'EdgeColor', 'r');
    title('Drowsy Detection');
    eye = imcrop(img, BB);
    mouth = imcrop(img, CC);
    imwrite(mouth, ['mouth', num2str(nFrame), '.jpg']);
    imwrite(eye, ['eye', num2str(nFrame), '.jpg']);
    %step(hVideoIn, BB); % Output video stream
    eye = rgb2gray(eye);
    eye = im2bw(eye, .15);
    %imshow(eye);
    [m, n] = size(eye);
    White_pix = 0;
    Black_pix = 0;
    for j = 1:n
        for i = 1:m
            if eye(i, j) == 1
                White_pix = White_pix + 1;
            else
                Black_pix = Black_pix + 1;
            end
        end
    end
    Black_pix
    mouth = rgb2gray(mouth);
    mouth = im2bw(mouth, .15);
    [mm, nn] = size(mouth);
    White_pix1 = 0;
    Black_pix1 = 0;
    for j = 1:nn
        for i = 1:mm
            if mouth(i, j) == 1
                White_pix1 = White_pix1 + 1;
            else
                Black_pix1 = Black_pix1 + 1;
            end
        end
    end
    Black_pix1
    if Black_pix < 1000 || Black_pix1 < 1000
        msgbox('Alert! Alert!');
    end
    pause(.5)
    nFrame = nFrame + 1;
end

%% Clearing Memory
release(hVideoIn); % Release all memory and buffers used
release(vidDevice);

APPENDIX II SCREEN SHOT
  • 52.
  • 53.
  • 54.
• 55. REFERENCES [1] S. Abtahi, B. Hariri, and S. Shirmohammadi, "Driver drowsiness monitoring based on yawning detection," in Instrumentation and Measurement Technology Conference (I2MTC), 2011 IEEE, pp. 1-4, 2011.
• 56. [2] L. M. Bergasa, J. Nuevo, M. A. Sotelo, R. Barea, and M. E. Lopez, "Real-time system for monitoring driver vigilance," IEEE Transactions on Intelligent Transportation Systems, vol. 7, pp. 63-77, 2006. [3] C. C. Liu, S. G. Hosking, and M. G. Lenné, "Predicting driver drowsiness using vehicle measures: Recent insights and future challenges," Journal of Safety Research, vol. 40, pp. 239-245, 2009. [4] D. Liu, P. Sun, Y. Xiao, and Y. Yin, "Drowsiness Detection Based on Eyelid Movement," in Education Technology and Computer Science (ETCS), 2010 Second International Workshop on, pp. 49-52, 2010. [5] H. J. Baek, G. S. Chung, K. K. Kim, and K. S. Park, "A Smart Health Monitoring Chair for Nonintrusive Measurement of Biological Signals," IEEE Transactions on Information Technology in Biomedicine, vol. 16, pp. 150-158, 2012. [6] L. Hartley, T. Horberry, and N. Mabbott, "Review of Fatigue Detection and Prediction Technologies," Institute for Research in Safety & Transport, Murdoch University, Western Australia, and Gerald P. Krueger, Krueger Ergonomics Consultants, 2000. [7] P. Viola and M. J. Jones, "Robust Real-Time Face Detection," International Journal of Computer Vision, vol. 57, pp. 137-154, 2004.
• 57. [8] S. S. Kee, S. B. M. Tamrin, and Y. M. Goh, "Driving Fatigue and Performance among Occupational Drivers in Simulated Prolonged Driving," Global Journal of Health Science, vol. 2, 2010. [9] W. Liu, H. Sun, and W. Shen, "Driver Fatigue Detection through Pupil Detection and Yawing Analysis," in Bioinformatics and Biomedical Technology (ICBBT), 2010 International Conference, 2010. [10] Y. Wang, "Detecting driver yawning in successive images," in Proc. First International Conf. on Bioinformatics and Biomedical Engineering, pp. 581-583, 2007. [11] Y.-S. Wu, T.-W. Lee, Q.-Z. Wu, and H.-S. Liu, "An Eye State Recognition Method for Drowsiness Detection," IEEE International Conference, 2010. [12] Z. Zhang and J. Zhang, "A new real-time eye tracking based on nonlinear unscented Kalman filter for monitoring driver fatigue," Journal of Control Theory and Applications, vol. 8, pp. 181-188, 2010.