Senior Design
Driver Distraction Detector
By Anthony Cabrera and Indrit Vogli
Overview
 This project builds upon last semester's work on driver behavior recognition, which used a Kinect sensor and its infrared sensors to monitor the driver's motion.
 However, universal use of the Kinect sensor is problematic because it is a proprietary, copyrighted device made specifically for Microsoft's gaming console and cannot be modified for commercial use due to copyright restrictions. Additionally, its size makes it too bulky to use practically.
 Using the Seeing Machines API, we addressed these issues. The software tracks the position and orientation of the subject's head using most webcams, provided it runs on a Windows operating system.
Introduction
 With the API, we created the direction window, which tracks the movement of the head and changes color when there is a distraction.
 The head movement can be visualized with two graphs: one displays the pitch and the other displays the yaw.
 The device can be calibrated for different mounting positions to account for differences in subject height. The distraction threshold can also be adjusted for extended movement.
 We also record the time when a distraction happens, so the video can be replayed at any time.
Libraries
Seeing Machines FaceTrackingAPI
Qt
QCustomPlot
OpenGL
User Interface
 The frame on the left displays the subject in real time. Under the frame are the labels and graphs that display the pitch and yaw, along with the roll values and the calibrated values.
 The middle frame with the green bars is the direction window, with the corresponding threshold input.
 Under the direction window are the calibration input and the calibration button.
 The right frame is the video playback for the recorded distractions, which are selected from the playlist underneath.
Layout
• We created MiddleTopLayout, which includes the direction window (vertical_dw) and the calibration controls (calibrationControls).
• We then created LeftTopLayout, a QHBoxLayout, which includes (ar_window) and (MiddleTopLayout).
• After that we created MainleftLayout, which includes (LeftTopLayout), (hbox_pitch), (plot_pitch), (hbox_yaw), and (plot_yaw).
• The right layout holds (videoDisplay) and (listWidget).
• Everything was then assembled in the final box layout from (MainleftLayout) and (rightLayout), as sketched below.
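A minimal sketch of this nesting, assuming Qt box layouts throughout. Only the layout and widget names come from the slides; their exact types, and the placeholder widgets standing in for the camera frame, graphs, video display, and playlist, are our assumptions.

```cpp
#include <QApplication>
#include <QHBoxLayout>
#include <QVBoxLayout>
#include <QLabel>
#include <QListWidget>
#include <QWidget>

int main(int argc, char *argv[])
{
    QApplication app(argc, argv);
    QWidget window;

    // Placeholder widgets standing in for the real camera frame, graphs,
    // video playback frame, and distraction playlist.
    QWidget *ar_window    = new QLabel("camera frame");
    QWidget *plot_pitch   = new QLabel("pitch graph");
    QWidget *plot_yaw     = new QLabel("yaw graph");
    QWidget *videoDisplay = new QLabel("video playback");
    QListWidget *listWidget = new QListWidget;

    QVBoxLayout *vertical_dw         = new QVBoxLayout; // direction window lights
    QHBoxLayout *calibrationControls = new QHBoxLayout; // calibration input + button
    QHBoxLayout *hbox_pitch          = new QHBoxLayout; // pitch labels and values
    QHBoxLayout *hbox_yaw            = new QHBoxLayout; // yaw labels and values

    // MiddleTopLayout: direction window stacked above the calibration controls.
    QVBoxLayout *MiddleTopLayout = new QVBoxLayout;
    MiddleTopLayout->addLayout(vertical_dw);
    MiddleTopLayout->addLayout(calibrationControls);

    // LeftTopLayout: live frame next to the direction window.
    QHBoxLayout *LeftTopLayout = new QHBoxLayout;
    LeftTopLayout->addWidget(ar_window);
    LeftTopLayout->addLayout(MiddleTopLayout);

    // MainleftLayout: left frame above the pitch and yaw rows and graphs.
    QVBoxLayout *MainleftLayout = new QVBoxLayout;
    MainleftLayout->addLayout(LeftTopLayout);
    MainleftLayout->addLayout(hbox_pitch);
    MainleftLayout->addWidget(plot_pitch);
    MainleftLayout->addLayout(hbox_yaw);
    MainleftLayout->addWidget(plot_yaw);

    // rightLayout: video playback above the distraction playlist.
    QVBoxLayout *rightLayout = new QVBoxLayout;
    rightLayout->addWidget(videoDisplay);
    rightLayout->addWidget(listWidget);

    // Final top-level box layout combining the left and right halves.
    QHBoxLayout *finalLayout = new QHBoxLayout(&window);
    finalLayout->addLayout(MainleftLayout);
    finalLayout->addLayout(rightLayout);

    window.show();
    return app.exec();
}
```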
Pitch & Yaw
The head-tracking code provides us with the raw yaw, raw pitch, and raw roll values. For our purpose, the roll value was not needed.
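The slide itself shows a screenshot of the faceAPI call, which we do not reproduce here; the fragment below is a hypothetical stand-in (HeadPose and onHeadPose are made-up names, not the real API) that only illustrates keeping yaw and pitch and discarding roll.

```cpp
// Hypothetical head-pose callback: the real faceAPI types and entry points differ,
// but the idea is the same - each tracked frame yields Euler angles for the head.
struct HeadPose {
    float yaw;    // rotation left/right
    float pitch;  // rotation up/down
    float roll;   // tilt toward a shoulder (unused in this project)
};

// Called once per tracked frame; we keep only yaw and pitch.
void onHeadPose(const HeadPose &pose, float &rawYaw, float &rawPitch)
{
    rawYaw   = pose.yaw;
    rawPitch = pose.pitch;
    // pose.roll is ignored - the roll value was not needed for distraction detection.
}
```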
QLabel & QLineEdit
• To display the uncalibrated and calibrated yaw and pitch, we used QLabel and QLineEdit. The raw pitch and raw yaw correspond to the red curve on the graphs, so we made their labels red.
• We did the same for the calibrated values, but made those labels blue, as in the sketch below.
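A minimal sketch of one such row, assuming a read-only QLineEdit next to each colored QLabel; the helper names are illustrative, not the project's identifiers.

```cpp
#include <QHBoxLayout>
#include <QLabel>
#include <QLineEdit>

// Builds the pitch row: red label/value for the raw pitch, blue label/value for the
// calibrated pitch (the yaw row is built the same way).
QHBoxLayout *makePitchRow(QLineEdit *&rawPitchEdit, QLineEdit *&calPitchEdit)
{
    QLabel *rawPitchLabel = new QLabel("Raw pitch:");
    rawPitchLabel->setStyleSheet("color: red;");   // matches the red graph curve
    rawPitchEdit = new QLineEdit;
    rawPitchEdit->setReadOnly(true);

    QLabel *calPitchLabel = new QLabel("Calibrated pitch:");
    calPitchLabel->setStyleSheet("color: blue;");  // matches the blue graph curve
    calPitchEdit = new QLineEdit;
    calPitchEdit->setReadOnly(true);

    QHBoxLayout *hbox_pitch = new QHBoxLayout;
    hbox_pitch->addWidget(rawPitchLabel);
    hbox_pitch->addWidget(rawPitchEdit);
    hbox_pitch->addWidget(calPitchLabel);
    hbox_pitch->addWidget(calPitchEdit);
    return hbox_pitch;
}

// Per-frame update with the latest raw value and the calibration offset.
void updatePitchRow(QLineEdit *rawPitchEdit, QLineEdit *calPitchEdit,
                    double rawPitch, double pitchOffset)
{
    rawPitchEdit->setText(QString::number(rawPitch, 'f', 2));
    calPitchEdit->setText(QString::number(rawPitch - pitchOffset, 'f', 2));
}
```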
Calibration
By default the calibration runs for 10 seconds; the user can also input a different duration. The calibration executes once the “Calibrate” button is pressed.
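The slides show the calibration code only as screenshots. The sketch below assumes calibration simply averages the raw pitch and yaw over the chosen window to obtain neutral offsets; that reading matches the calibrated-versus-raw values elsewhere in the deck, but it is an assumption, not the project's exact code.

```cpp
#include <QObject>
#include <QTimer>
#include <numeric>
#include <vector>

// Collects raw pitch/yaw samples for a fixed window and averages them into offsets.
class Calibrator : public QObject {
public:
    explicit Calibrator(QObject *parent = nullptr) : QObject(parent) {}

    // Called when the "Calibrate" button is pressed; the window defaults to 10 seconds.
    void start(int seconds = 10) {
        pitchSamples.clear();
        yawSamples.clear();
        collecting = true;
        QTimer::singleShot(seconds * 1000, this, &Calibrator::finish);
    }

    // Called once per tracked frame while calibration is running.
    void addSample(double rawPitch, double rawYaw) {
        if (!collecting) return;
        pitchSamples.push_back(rawPitch);
        yawSamples.push_back(rawYaw);
    }

    double pitchOffset = 0.0;  // subtracted from raw pitch to get the calibrated value
    double yawOffset   = 0.0;  // subtracted from raw yaw to get the calibrated value

private:
    void finish() {
        collecting = false;
        if (pitchSamples.empty()) return;
        pitchOffset = std::accumulate(pitchSamples.begin(), pitchSamples.end(), 0.0)
                      / pitchSamples.size();
        yawOffset   = std::accumulate(yawSamples.begin(), yawSamples.end(), 0.0)
                      / yawSamples.size();
    }

    bool collecting = false;
    std::vector<double> pitchSamples, yawSamples;
};
```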
Creating the Graphs
We used the QCustomPlot library to create the graphs. There are two graphs: one displays the pitch and the other displays the yaw. Both graphs display the calibrated and raw values.
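A minimal sketch of how one such plot could be set up with QCustomPlot (graph 0 in red for the raw values, graph 1 in blue for the calibrated ones); the axis labels, ranges, and degree units are assumptions.

```cpp
#include "qcustomplot.h"

// Build one plot with a red raw curve (graph 0) and a blue calibrated curve (graph 1).
QCustomPlot *makePitchPlot(QWidget *parent)
{
    QCustomPlot *plot = new QCustomPlot(parent);
    plot->addGraph();                              // graph 0: raw pitch
    plot->graph(0)->setPen(QPen(Qt::red));
    plot->graph(0)->setName("Raw pitch");
    plot->addGraph();                              // graph 1: calibrated pitch
    plot->graph(1)->setPen(QPen(Qt::blue));
    plot->graph(1)->setName("Calibrated pitch");
    plot->xAxis->setLabel("time (s)");
    plot->yAxis->setLabel("pitch (deg)");          // degrees assumed
    plot->yAxis->setRange(-90, 90);
    return plot;
}

// Called once per tracked frame: append the newest samples and redraw.
void appendPitchSample(QCustomPlot *plot, double t, double rawPitch, double pitchOffset)
{
    plot->graph(0)->addData(t, rawPitch);
    plot->graph(1)->addData(t, rawPitch - pitchOffset);
    plot->xAxis->setRange(t - 10.0, t);            // keep the last 10 seconds visible
    plot->replot();
}
```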
Distraction Display Window
This window displays the distraction state. The lights are grey prior to calibration. Once calibration is complete, the lights turn green. If a movement passes the desired threshold and its duration lasts longer than two seconds, the distraction is recorded and the lights turn red. The user can adjust the threshold by entering a different value in the corresponding box. By default, the value is 25.
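A minimal sketch of that rule as described above, assuming the threshold applies to the calibrated yaw and pitch and that the default of 25 is in degrees; the class and member names are illustrative.

```cpp
#include <QElapsedTimer>
#include <cmath>

enum class LightState { Grey, Green, Red };

class DistractionMonitor {
public:
    double thresholdDeg = 25.0;  // default from the threshold input box (degrees assumed)
    bool calibrated = false;

    // Called once per tracked frame with calibrated (offset-removed) angles.
    LightState update(double calYaw, double calPitch) {
        if (!calibrated)
            return LightState::Grey;                        // grey until calibration completes

        bool beyond = std::fabs(calYaw) > thresholdDeg ||
                      std::fabs(calPitch) > thresholdDeg;
        if (!beyond) {
            timing = false;
            return LightState::Green;                       // attentive
        }
        if (!timing) {                                      // movement just crossed the threshold
            timing = true;
            timer.start();
        }
        if (timer.elapsed() > 2000)                         // longer than two seconds: distraction
            return LightState::Red;                         // caller records the clip here
        return LightState::Green;
    }

private:
    bool timing = false;
    QElapsedTimer timer;
};
```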
Distraction Display Window
We used OpenGL’s GL_POLYGON primitive to draw the four lights.
Distraction Display Window
We created lights in three different colors: grey, green, and red, as sketched below.
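A minimal sketch of one light drawn with legacy OpenGL, assuming each light is a filled circle approximated by a GL_POLYGON fan and using the same three states as the monitor sketch above; the project's actual geometry may differ.

```cpp
#include <GL/gl.h>
#include <cmath>

enum class LightState { Grey, Green, Red };  // same states as in the monitor sketch

void drawLight(float cx, float cy, float radius, LightState state)
{
    // Pick the color for the current state.
    switch (state) {
        case LightState::Grey:  glColor3f(0.5f, 0.5f, 0.5f); break; // before calibration
        case LightState::Green: glColor3f(0.0f, 1.0f, 0.0f); break; // attentive
        case LightState::Red:   glColor3f(1.0f, 0.0f, 0.0f); break; // distraction detected
    }

    // A GL_POLYGON with many vertices approximates a filled circle.
    glBegin(GL_POLYGON);
    const int segments = 32;
    for (int i = 0; i < segments; ++i) {
        float angle = 2.0f * 3.14159265f * i / segments;
        glVertex2f(cx + radius * std::cos(angle), cy + radius * std::sin(angle));
    }
    glEnd();
}

// The direction window calls drawLight() four times, once per light.
```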
Video Display
Distraction List
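The video display and distraction list slides contain only screenshots. The sketch below is a hedged reconstruction of the playback side using Qt5 Multimedia: each recorded distraction is added to the playlist with its timestamp, and double-clicking an entry replays the saved clip in the video frame. The file handling and function names are assumptions, not the project's code.

```cpp
#include <QDateTime>
#include <QListWidget>
#include <QMediaPlayer>
#include <QUrl>
#include <QVideoWidget>

// Add one recorded distraction to the playlist, labeled with the time it happened.
void addDistractionEntry(QListWidget *listWidget, const QString &videoPath)
{
    QListWidgetItem *item = new QListWidgetItem(
        QDateTime::currentDateTime().toString("hh:mm:ss"), listWidget);
    item->setData(Qt::UserRole, videoPath);   // remember which clip this entry replays
}

// Wire the playlist to the video frame: double-clicking an entry plays its clip.
void setupPlayback(QListWidget *listWidget, QVideoWidget *videoDisplay, QMediaPlayer *player)
{
    player->setVideoOutput(videoDisplay);
    QObject::connect(listWidget, &QListWidget::itemDoubleClicked,
                     [player](QListWidgetItem *item) {
        player->setMedia(QUrl::fromLocalFile(item->data(Qt::UserRole).toString()));
        player->play();
    });
}
```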
Troubleshooting
 The first challenge was getting the provided code to work on our machine.
 Creating the calibration was difficult.
 The direction window took a long time to complete.
 Saving the video was the biggest challenge.
 Displaying each recorded distraction in the list also took considerable effort.
Future Projects
 Integrating this software with existing safety systems: if the driver is not paying attention, the car can brake and steer away from danger.
 Facial controls: using the driver’s eyes or facial movements to control the vehicle.
 Implementing the software on a mobile device, which would allow for portability and use in multiple vehicles while relying on the device’s built-in cameras and other sensors.
Improvements
 Making the code run faster.
 Compressing the videos so they take up less storage space.
 Making the interface more aesthetically pleasing.
Conclusions
 We improved our skills working with Qt.
 We learned to use another API and graft it onto a user interface.