Driver Drowsiness Detection
CHAPTER 1:
INTRODUCTION
1. INTRODUCTION
In today's fast-moving world, people depend heavily on their means of transport. Feeling drowsy and fatigued during a long drive, or after a short night's sleep, is common. This tiredness lowers the driver's level of concentration. Such conditions are dangerous while driving and lead to more accidents. Driver drowsiness and exhaustion are among the leading causes of road accidents, and the number of crashes caused by drowsy driving is increasing at an alarming pace. Recent figures indicate that 10% to 40% of all road accidents are due to drivers feeling exhausted and sleepy, and in the trucking industry about 60% of fatal accidents are attributed to driver fatigue.
For these reasons, it is important to develop systems that continuously monitor the driver's attention to the road and level of drowsiness, and that alert the driver when needed. Researchers and innovators have been working on such systems for years. The most direct way of predicting drowsiness is from physiological factors such as breathing, heart rate, pulse rate and brain waves. However, systems based on these signals never reached public use because they required attaching sensors and electrodes to the driver's body, which drivers found frustrating. Representative projects in this line are the MIT Smart Car and the ASV (Advanced Safety Vehicle) project carried out by Toyota, Nissan and Honda. Other proposed systems monitored the movement of the pupils and the head using specialised helmets and optical lenses; although these were not intrusive, they were not accepted either, because their production costs were prohibitive. Indirect methods were also introduced to detect drowsiness by analysing the manoeuvring of the steering wheel, the positioning of the wheel axles, and so on.
These systems were not adopted either, because their accuracy depends on factors such as the type of vehicle, environmental conditions, driver experience, geometric aspects and the state of the road. Moreover, the time required to analyse these driving behaviours is too long, so such methods cannot react to eye blinks or micro-sleeps. In this line we can find an important Spanish project called TCD (Tech CO Driver) and the Mitsubishi advanced safety vehicle system. Drivers suffering from exhaustion or fatigue show visual behaviours that are easily observable from changes in facial features such as the eyes and the movement of the face and head. Computer vision is therefore a non-intrusive and natural approach for monitoring the driver's vigilance.
In this context, it is critical to use new and better technologies to design and build systems that monitor drivers and compute their level of attention during the whole driving process. In this project, a module for Advanced Assistance to Driver Drowsiness (AADD) is presented in order to reduce the number of accidents caused by driver drowsiness and thus improve transport safety. The system detects driver drowsiness automatically using machine vision and artificial intelligence. We present an algorithm that captures, locates and analyses both the driver's face and eyes in order to measure PERCLOS (percentage of eye closure).
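For illustration, the sketch below shows one way a PERCLOS value could be computed over a sliding window of frames. The window length, the 80% alarm threshold and the class and method names are assumptions made for this example only and are not taken from the project code.

    import java.util.ArrayDeque;
    import java.util.Deque;

    // Minimal PERCLOS sketch: fraction of recent frames in which the eyes were closed.
    // Window length and alarm threshold are illustrative assumptions, not project values.
    public class PerclosMonitor {
        private static final int WINDOW_SIZE = 150;        // roughly 5 s at 30 fps (assumption)
        private static final double ALARM_THRESHOLD = 0.8; // warn when eyes closed 80% of the window

        private final Deque<Boolean> window = new ArrayDeque<>();
        private int closedCount = 0;

        /** Record one frame and return the current PERCLOS value in [0, 1]. */
        public double update(boolean eyesClosed) {
            window.addLast(eyesClosed);
            if (eyesClosed) {
                closedCount++;
            }
            if (window.size() > WINDOW_SIZE && window.removeFirst()) {
                closedCount--;
            }
            return (double) closedCount / window.size();
        }

        /** True when the PERCLOS value indicates the driver is likely drowsy. */
        public boolean isDrowsy() {
            return window.size() == WINDOW_SIZE
                    && (double) closedCount / window.size() >= ALARM_THRESHOLD;
        }
    }

In such a sketch, update() would be called once per processed frame with the eye state reported by the eye detector, and isDrowsy() would be consulted to decide whether to raise a warning.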
Drowsy driving crashes are usually of high severity due to the drivers’ significant loss of
control, often leading to unpredicted vehicle trajectory and no braking response. Reliable
safety systems are needed to mitigate these crashes. The most important challenge is to
detect the driver’s condition sufficiently early, prior to the onset of sleep, to avoid
collisions.
1.3 OBJECTIVE
To detect whether the vehicle driver is falling asleep while driving, and to detect driver fatigue, so that an early warning can be given to avoid an accident.
CHAPTER 2:
SYSTEM ANALYSIS
2. SYSTEM ANALYSIS
2.1 EXISTING SYSTEM
The existing system evaluates whether changes in the eye-steering correlation can indicate distraction. The auto-correlation and cross-correlation of horizontal eye position and steering wheel angle show that the eye movements associated with road scanning produce a low eye-steering correlation. On a straight road, the correlation between steering movements and eye glances is low. The aim of this system is to detect driver distraction based on the driver's visual behaviour or driving performance; for this purpose, it defines the relationship between visual behaviour and vehicle control. The system evaluates the eye-steering correlation on straight roads under the assumption that it shows a qualitatively and quantitatively different relationship compared with curvy roads, and that it is sensitive to distraction. The relationship between visual behaviour and vehicle control reflects a fundamental perception-control mechanism that plays a major role in driving, and a strong eye-steering correlation associated with this process has been observed on curvy roads.
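For illustration only, the snippet below sketches how a correlation between horizontal eye position and steering wheel angle could be computed from two sampled signals. The class name and the array-based interface are assumptions made for this sketch; the existing system's actual computation is not reproduced here.

    // Illustrative Pearson correlation between two equally long signals,
    // e.g. horizontal eye position and steering wheel angle sampled once per frame.
    public final class EyeSteeringCorrelation {
        private EyeSteeringCorrelation() {}

        public static double pearson(double[] eye, double[] steering) {
            if (eye.length != steering.length || eye.length == 0) {
                throw new IllegalArgumentException("Signals must have the same non-zero length");
            }
            double meanX = 0, meanY = 0;
            for (int i = 0; i < eye.length; i++) {
                meanX += eye[i];
                meanY += steering[i];
            }
            meanX /= eye.length;
            meanY /= eye.length;

            double cov = 0, varX = 0, varY = 0;
            for (int i = 0; i < eye.length; i++) {
                double dx = eye[i] - meanX;
                double dy = steering[i] - meanY;
                cov += dx * dy;
                varX += dx * dx;
                varY += dy * dy;
            }
            return cov / Math.sqrt(varX * varY); // tends to be low on straight roads, higher on curves
        }
    }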
DISADVANTAGES
To estimate the driver's fatigue level, detecting the eye state is more important than detecting the eye-steering correlation.
The eye-steering correlation is complex and may produce wrong results in undetermined cases.
2.2 PROPOSED SYSTEM
In the proposed system, driver fatigue and distraction are detected by processing only the eye region. The main symptoms of driver fatigue and distraction appear in the driver's eyes, since drowsiness directly affects eye behaviour while driving. Among the many fatigue detection methods available today, the most practical is capturing the eyes in real time with a web camera and detecting the physical responses of the eyes. Moreover, processing only the eye region instead of the whole face region has a lower computational complexity.
ADVANTAGES
Accidents can be avoided by alerting the driver to distraction and drowsiness using warning signals.
Compared with algorithms that process the whole face, this PERCLOS-based system reduces the time complexity.
The warning alarm alerts the driver, and also the passengers, to be conscious of the driver's behaviour.
CHAPTER 3:
SYSTEM
CONFIGURATION
3. SYSTEM CONFIGURATION
CHAPTER 4:
SOFTWARE
DESCRIPTION
4. SOFTWARE DESCRIPTION
WAMP Server
WAMPs are packages of independently-created programs installed on computers that
use a Microsoft Windows operating system. The interaction of these programs enables
dynamic web pages to be served over a computer network, such as the internet or a private
network. The equivalent installation on a Linux operating system is known as LAMP. The
equivalent installation on a Mac operating system is known as MAMP. The equivalent
installation on a Solaris operating system is known as SAMP. The equivalent installation
on a FreeBSD operating system is known as FAMP.
"WAMP" is an acronym formed from the initials of the operating system (Windows) and
the package's principal components: Apache, MySQL and PHP (or Perl or Python).
• Apache is a web server, which allows people with web browsers like Internet Explorer or Firefox to connect to a computer and see information there as web pages.
• MySQL is a database manager (that is, it keeps track of data in a highly organized way).
• PHP is a scripting language which can manipulate information held in a database and generate web pages afresh each time an element of content is requested from a browser.
JAVA:
Java is a small, simple, safe, object-oriented, interpreted (or dynamically optimized), byte-coded, architecture-neutral, garbage-collected, multithreaded programming language with strongly typed exception handling, designed for writing distributed and dynamically extensible programs.
• It helps to create user-friendly interfaces.
• It is very dynamic.
• It supports multithreading.
• It is platform independent.
• It is highly secure and robust.
• It supports internet programming.
The Java language was created by James Gosling in June 1991 for use in a set top box
project. The language was initially called Oak, after an oak tree that stood outside
Gosling's office - and also went by the name Green - and ended up later being renamed to
Java, from a list of random words. Gosling's goals were to implement a virtual machine
and a language that had a familiar C/C++ style of notation.
Primary goals
There were five primary goals in the creation of the Java language:
• It must be simple, object-oriented and familiar.
• It must be robust and secure.
• It must be architecture-neutral and portable.
• It must execute with high performance.
• It must be interpreted, threaded and dynamic.
The Java platform is the name for a bundle of related programs, or platform, from Sun
which allow for developing and running programs written in the Java programming
language. The platform is not specific to any one processor or operating system, but rather
an execution engine (called a virtual machine) and a compiler with a set of standard
libraries which are implemented for various hardware and operating systems so that Java
programs can run identically on all of them. Different "editions" of the platform are
available, including:
• Java SE (Standard Edition), for general-purpose desktop and server applications
• Java EE (Enterprise Edition), for large distributed enterprise applications
• Java ME (Micro Edition), for mobile and embedded devices
The essential components in the platform are the Java language compiler, the libraries, and
the runtime environment in which Java intermediate bytecode "executes" according to the
rules laid out in the virtual machine specification.
The heart of the Java Platform is the concept of a "virtual machine" that executes Java
bytecode programs. This bytecode is the same no matter what hardware or operating
system the program is running under. There is a JIT compiler within the Java Virtual
Machine, or JVM. The JIT compiler translates the Java bytecode into native processor
instructions at run-time and caches the native code in memory during execution.
Although Java programs are platform independent, the code of the Java Virtual Machine (JVM) that executes these programs is not. Every operating system has its own JVM.
The Java Runtime Environment, or JRE, is the software required to run any
application deployed on the Java Platform. End-users commonly use a JRE in software
packages and Web browser plugins. Sun also distributes a superset of the JRE called the
Java 2 SDK (more commonly known as the JDK), which includes development tools such
as the Java compiler, Javadoc, Jar and debugger.
One of the unique advantages of the concept of a runtime engine is that errors (exceptions)
should not 'crash' the system. Moreover, in runtime engine environments such as Java
there exist tools that attach to the runtime engine and every time that an exception of
interest occurs they record debugging information that existed in memory at the time the
exception was thrown (stack and heap values). These Automated Exception Handling tools
provide 'root-cause' information for exceptions in Java programs that run in production,
testing or development environments.
CHAPTER 5:
SYSTEM DESIGN
5. SYSTEM DESIGN
5.1 GENERAL
System design is the process or art of defining the architecture, components, modules, interfaces, and data of a system to satisfy specified requirements. This chapter deals with the various design aspects and functions of the system.
Use case diagram: eye detection and driver intimation.
A sequence diagram is an interaction diagram that shows how objects operate with
one another and in what order. It is a construct of a message sequence chart. A sequence
diagram shows object interactions arranged in time sequence.
Sequence diagram: the driver starts the camera, and the camera reports that a face is detected.
A class diagram in the Unified Modeling Language (UML) is a type of static structure diagram that describes the structure of a system by showing the system's classes, their attributes and their operations.
Collaboration diagram: 1: start camera, 4: face detected (between Driver and camera).
CHAPTER 6:
SYSTEM
IMPLEMENTATION
6. SYSTEM IMPLEMENTATION
6.1 MODULES
• Start Detection (Camera OpenCV)
• Driver Eye Detection
• Eye Parameters Calculation
• Drowsiness Level Determination
• Intimation
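To make the interaction between the modules listed above concrete, the sketch below shows one possible way the Drowsiness Level Determination and Intimation steps could be connected. The frame-count threshold and the beep-based alert are assumptions made for this illustration and are not values taken from the project.

    import java.awt.Toolkit;

    // Illustrative glue between eye detection, drowsiness determination and intimation.
    // The threshold and the beep alert are assumptions for this sketch.
    public class DrowsinessController {
        private int consecutiveClosedFrames = 0;
        private static final int CLOSED_FRAME_LIMIT = 45; // about 1.5 s at 30 fps (assumption)

        /** Called once per frame with the eye state produced by the eye detection module. */
        public void onFrame(boolean eyesClosed) {
            if (eyesClosed) {
                consecutiveClosedFrames++;
            } else {
                consecutiveClosedFrames = 0;
            }
            if (consecutiveClosedFrames >= CLOSED_FRAME_LIMIT) {
                intimateDriver();
            }
        }

        /** Intimation step: raise an audible warning for the driver and passengers. */
        private void intimateDriver() {
            Toolkit.getDefaultToolkit().beep();
            System.out.println("Drowsiness detected: alerting driver");
        }
    }

In this sketch, onFrame() would be fed by the Driver Eye Detection and Eye Parameters Calculation modules, and the intimation step simply raises an audible warning.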
CHAPTER 7:
SYSTEM TESTING
7. SYSTEM TESTING
The purpose of testing is to discover errors. Testing is the process of trying to discover every conceivable fault or weakness in a work product. It provides a way to check the functionality of components, subassemblies, assemblies, and/or a finished product. It is the process of exercising software with the intent of ensuring that the software system meets its requirements and user expectations; each type of test addresses a specific testing requirement.
UNIT TESTING
Unit testing involves the design of test cases that validate that the internal program logic
is functioning properly, and that program inputs produce valid outputs. All decision
branches and internal code flow should be validated. It is the testing of individual software
units of the application. It is done after the completion of an individual unit before
integration. This is structural testing, which relies on knowledge of the unit's construction and is invasive. Unit tests perform basic tests at the component level and test a specific business
process, application, and/or System configuration. Unit tests ensure that each unique path
of a business process performs accurately to the documented specifications and contains
clearly defined inputs and expected results. Unit testing is usually conducted as part of a
combined code and test phase of the software lifecycle, although it is not uncommon for
coding and unit testing to be conducted as two distinct phases.
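As an illustration, a JUnit-style unit test for the hypothetical PerclosMonitor helper sketched in the introduction might look as follows; the class and method names are carried over from that sketch and remain assumptions.

    import org.junit.Assert;
    import org.junit.Test;

    // Illustrative unit test for the hypothetical PerclosMonitor sketched earlier.
    public class PerclosMonitorTest {

        @Test
        public void allOpenFramesGiveZeroPerclos() {
            PerclosMonitor monitor = new PerclosMonitor();
            double perclos = 0.0;
            for (int i = 0; i < 10; i++) {
                perclos = monitor.update(false); // eyes open in every frame
            }
            Assert.assertEquals(0.0, perclos, 1e-9);
        }

        @Test
        public void allClosedFramesGiveFullPerclos() {
            PerclosMonitor monitor = new PerclosMonitor();
            double perclos = 0.0;
            for (int i = 0; i < 10; i++) {
                perclos = monitor.update(true); // eyes closed in every frame
            }
            Assert.assertEquals(1.0, perclos, 1e-9);
        }
    }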
INTEGRATION TESTING
Integration tests are designed to test integrated software components to determine whether they actually run as one program. Testing is event driven and is more concerned with the basic outcome of screens or fields. Integration tests demonstrate that, although the components were individually satisfactory, as shown by successful unit testing, the combination of components is correct and consistent. Integration testing is specifically aimed at exposing the problems that arise from the combination of components.
FUNCTIONAL TEST
Functional tests provide systematic demonstrations that functions tested are available as
specified by the business and technical requirements, system documentation, and user
manuals.
SYSTEM TEST
System testing ensures that the entire integrated software system meets requirements. It tests a configuration to ensure known and predictable results. An example of system testing is the configuration-oriented system integration test. System testing is based on process descriptions and flows, emphasizing pre-driven process links and integration points.
CHAPTER 8:
APPENDICES
8. APPENDICES
package gui;

import java.awt.Graphics;
import java.awt.Image;
import java.awt.image.BufferedImage;
import java.io.ByteArrayInputStream;
import javax.imageio.ImageIO;
import org.opencv.core.Core;
import org.opencv.core.Mat;
import org.opencv.core.MatOfByte;
import org.opencv.core.MatOfRect;
import org.opencv.core.Point;
import org.opencv.core.Rect;
import org.opencv.core.Scalar;
import org.opencv.imgcodecs.Imgcodecs;
import org.opencv.imgproc.Imgproc;
import org.opencv.objdetect.CascadeClassifier;
import org.opencv.videoio.VideoCapture;
public class FaceDetection extends javax.swing.JFrame {

    private DaemonThread myThread = null;
    VideoCapture webSource = null;
    Mat frame = new Mat();
    MatOfByte mem = new MatOfByte();
    // Haar cascades for face and eye detection (loaded from the classpath)
    CascadeClassifier faceDetector = new CascadeClassifier(
            FaceDetection.class.getResource("haarcascade_frontalface_alt.xml").getPath().substring(1));
    CascadeClassifier eyeDetector = new CascadeClassifier(
            FaceDetection.class.getResource("haarcascade_eye.xml").getPath().substring(1));
    //CascadeClassifier smileDetector = new CascadeClassifier(
    //        FaceDetection.class.getResource("haarcascade_smile.xml").getPath().substring(1));
    // Detection results reused on every frame
    MatOfRect faceDetections = new MatOfRect();
    MatOfRect eyeDetections = new MatOfRect();
    //MatOfRect smileDetections = new MatOfRect();
    //VideoCapture camera = new VideoCapture(0);
    /** Background worker that grabs webcam frames and runs face and eye detection. */
    class DaemonThread implements Runnable {

        protected volatile boolean runnable = false;

        @Override
        public void run() {
            synchronized (this) {
                while (runnable) {
                    if (webSource.grab()) {
                        try {
                            webSource.retrieve(frame);
                            Graphics g = jPanel1.getGraphics();
                            // Detect faces in the current frame
                            faceDetector.detectMultiScale(frame, faceDetections);
                            //System.out.println("frame " + frame);
                            for (Rect rect : faceDetections.toArray()) {
                                // Draw a green rectangle around the detected face
                                Imgproc.rectangle(frame, new Point(rect.x, rect.y),
                                        new Point(rect.x + rect.width, rect.y + rect.height),
                                        new Scalar(0, 255, 0));
                                //System.out.println("point 1= " + new Point(rect.x, rect.y));
                                //smileDetector.detectMultiScale(frame, smileDetections);
                                //for (Rect rect3 : smileDetections.toArray()) {
                                //    if ((rect3.x > rect.x & rect3.x < (rect.x + rect.width)) && (rect3.y > rect.y & rect3.y < (rect.y + rect.height))) {
                                //        //jLabel2.setText("You are smiling");
                                //        System.out.println("smiling");
                                //    } else {
                                //        System.out.println("----------------------------->not smiling");
                                //        //jLabel2.setText("");
                                //    }
                                //}

                                // Detect eyes and analyse each eye region
                                eyeDetector.detectMultiScale(frame, eyeDetections);
                                for (Rect rect2 : eyeDetections.toArray()) {
                                    Imgproc.rectangle(frame, new Point(rect2.x, rect2.y),
                                            new Point(rect2.x + rect2.width, rect2.y + rect2.height),
                                            new Scalar(255, 0, 0));
                                    // Crop the eye region for further processing
                                    Rect rectCrop = rect2;
                                    Mat image_roi = new Mat(frame, rectCrop);
                                    Imgcodecs.imwrite("D:\\CapturePictures\\new3.jpg", image_roi);
                                    //System.out.println("fine");
                                    // Only analyse eye rectangles that lie inside the detected face
                                    if ((rect2.x > rect.x & rect2.x < (rect.x + rect.width))
                                            && (rect2.y > rect.y & rect2.y < (rect.y + rect.height))) {
                                        // Grey-scale and smooth the eye region before circle detection
                                        Imgproc.cvtColor(image_roi, image_roi, Imgproc.COLOR_BGR2GRAY);
                                        Imgcodecs.imwrite("D:\\CapturePictures\\new4.jpg", image_roi);
                                        //Imgproc.Sobel(frame, frame, 9, 9, 9);
                                        Imgproc.medianBlur(image_roi, image_roi, 9);
                                        // Look for the iris/pupil as a circle: a detected circle means the eye is open
                                        Mat circles = new Mat();
                                        Imgproc.HoughCircles(image_roi, circles, Imgproc.HOUGH_GRADIENT,
                                                1, 10, 50, 20, 0, 0);
                                        //System.out.println("circle size = " + circles.size());
                                        //System.out.println("circle cols = " + circles.cols());
                                        float circle[] = new float[3];
                                        for (int i = 0; i < circles.cols(); i++) {
                                            org.opencv.core.Point center = new org.opencv.core.Point();
                                            circles.get(0, i, circle);
                                            center.x = circle[0];
                                            center.y = circle[1];
                                            Imgproc.circle(image_roi, center, (int) circle[2],
                                                    new Scalar(255, 255, 100, 1), 4);
                                            System.out.println("eyes open");
                                        }
                                    } else {
                                        System.out.println("eyes closed-------------------------------->sleeping");
                                    }
                                } // end eye loop
                            } // end face loop
//Mat greyframe = null;
// Imgproc.cvtColor(frame, frame, Imgproc.COLOR_BGR2GRAY);
// Imgcodecs.imwrite("D:\\CapturePictures\\new4.jpg", frame);
// System.out.println(":o0");
// //System.out.println("greyframe = " + frame);
// //Imgproc.Sobel(frame, frame, 9, 9, 9);
// Imgproc.medianBlur(frame, frame, 9);
// System.out.println("noise done");
// //System.out.println("rows= "+image_roi.rows());
// //Mat newframe=new Mat();
// Mat circles = new Mat();
// Imgproc.HoughCircles(frame, circles, Imgproc.HOUGH_GRADIENT, 1, 10, 30, 5, 5, 5);
// System.out.println("circles found");
//// System.out.println("circle x= "+circles.get(0,0));
//// System.out.println("circle y= "+circles.get(1,0));
//// System.out.println("circle r= "+circles.get(2,0));
// System.out.println("circle size = " + circles.size());
// System.out.println("circle cols = " + circles.cols());
// float circle[] = new float[3];
// for (int i = 0; i < circles.cols(); i++) {
// org.opencv.core.Point center = new org.opencv.core.Point();
// circles.get(0, i, circle);
// System.out.println("r= "+circle[2]);
// Imgproc.circle(frame, center, (int) circle[2], new Scalar(255, 255, 100, 1), 4);
                            // Encode the annotated frame and paint it on the Swing panel
                            Imgcodecs.imencode(".bmp", frame, mem);
                            Image im = ImageIO.read(new ByteArrayInputStream(mem.toArray()));
                            BufferedImage buff = (BufferedImage) im;
                            Imgcodecs.imwrite("D:\\CapturePictures\\new.jpg", frame);
                            //Imgcodecs.imwrite("D:\\CapturePictures\\greynew.jpg", greyframe);
                            if (g.drawImage(buff, 0, 0, getWidth(), getHeight() - 150, 0, 0,
                                    buff.getWidth(), buff.getHeight(), null)) {
                                if (runnable == false) {
                                    System.out.println("Paused ..... ");
                                    this.wait();
                                }
                            }
                        } catch (Exception ex) {
                            System.out.println("Error");
                        }
                    }
                }
            }
        }
    }
    /**
     * Creates new form FaceDetection
     */
    public FaceDetection() {
        initComponents();
        System.out.println(FaceDetection.class
                .getResource("haarcascade_frontalface_alt.xml").getPath().substring(1));
    }

    /**
     * This method is called from within the constructor to initialize the form.
     * WARNING: Do NOT modify this code. The content of this method is always
     * regenerated by the Form Editor.
     */
    @SuppressWarnings("unchecked")
    // <editor-fold defaultstate="collapsed" desc="Generated Code">//GEN-BEGIN:initComponents
private void initComponents() {
jPanel1 = new javax.swing.JPanel();
jButton1 = new javax.swing.JButton();
jButton2 = new javax.swing.JButton();
setDefaultCloseOperation(javax.swing.WindowConstants.EXIT_ON_CLOSE);
javax.swing.GroupLayout jPanel1Layout = new javax.swing.GroupLayout(jPanel1);
jPanel1.setLayout(jPanel1Layout);
jPanel1Layout.setHorizontalGroup(
jPanel1Layout.createParallelGroup(javax.swing.GroupLayout.Alignment.LEADING)
.addGap(0, 0, Short.MAX_VALUE)
);
jPanel1Layout.setVerticalGroup(
jPanel1Layout.createParallelGroup(javax.swing.GroupLayout.Alignment.LEADING)
.addGap(0, 376, Short.MAX_VALUE)
);
jButton1.setText("Start");
jButton1.addActionListener(new java.awt.event.ActionListener() {
public void actionPerformed(java.awt.event.ActionEvent evt) {
jButton1ActionPerformed(evt);
} });
jButton2.setText("Pause");
jButton2.addActionListener(new java.awt.event.ActionListener() {
public void actionPerformed(java.awt.event.ActionEvent evt) {
jButton2ActionPerformed(evt);
} });
javax.swing.GroupLayout layout = new javax.swing.GroupLayout(getContentPane());
getContentPane().setLayout(layout);
layout.setHorizontalGroup(
layout.createParallelGroup(javax.swing.GroupLayout.Alignment.LEADING)
.addGroup(layout.createSequentialGroup()
.addGap(24, 24, 24)
.addComponent(jPanel1, javax.swing.GroupLayout.DEFAULT_SIZE,
javax.swing.GroupLayout.DEFAULT_SIZE, Short.MAX_VALUE)
.addContainerGap())
.addGroup(layout.createSequentialGroup()
.addGap(255, 255, 255)
.addComponent(jButton1)
.addGap(86, 86, 86)
.addComponent(jButton2)
.addContainerGap(258, Short.MAX_VALUE))
);
layout.setVerticalGroup(
layout.createParallelGroup(javax.swing.GroupLayout.Alignment.LEADING)
.addGroup(layout.createSequentialGroup()
.addContainerGap()
.addComponent(jPanel1, javax.swing.GroupLayout.PREFERRED_SIZE,
javax.swing.GroupLayout.DEFAULT_SIZE,
javax.swing.GroupLayout.PREFERRED_SIZE)
.addPreferredGap(javax.swing.LayoutStyle.ComponentPlacement.RELATED)
.addGroup(layout.createParallelGroup(javax.swing.GroupLayout.Alignment.BASELINE)
.addComponent(jButton1)
.addComponent(jButton2))
.addContainerGap(javax.swing.GroupLayout.DEFAULT_SIZE, Short.MAX_VALUE)) );
pack(); }// </editor-fold>//GEN-END:initComponents
    private void jButton2ActionPerformed(java.awt.event.ActionEvent evt) {//GEN-FIRST:event_jButton2ActionPerformed
        myThread.runnable = false;   // stop the capture thread
        jButton2.setEnabled(false);  // deactivate the Pause button
        jButton1.setEnabled(true);   // reactivate the Start button
        webSource.release();         // stop capturing from the camera
    }//GEN-LAST:event_jButton2ActionPerformed
    private void jButton1ActionPerformed(java.awt.event.ActionEvent evt) {//GEN-FIRST:event_jButton1ActionPerformed
        webSource = new VideoCapture(0);   // capture video from the default camera
        myThread = new DaemonThread();     // create the capture/detection thread
        Thread t = new Thread(myThread);
        t.setDaemon(true);
        myThread.runnable = true;
        t.start();                         // start the thread
        jButton1.setEnabled(false);        // deactivate the Start button
        jButton2.setEnabled(true);         // activate the Pause button
    }//GEN-LAST:event_jButton1ActionPerformed
/**
* @param args the command line arguments */
public static void main(String args[]) {
System.loadLibrary(Core.NATIVE_LIBRARY_NAME);
/* Set the Nimbus look and feel */
//<editor-fold defaultstate="collapsed" desc=" Look and feel setting code (optional) ">
/* If Nimbus (introduced in Java SE 6) is not available, stay with the default look and feel.
* For details see http://download.oracle.com/javase/tutorial/uiswing/lookandfeel/plaf.html
*/
try {
for (javax.swing.UIManager.LookAndFeelInfo info :
javax.swing.UIManager.getInstalledLookAndFeels()) {
if ("Nimbus".equals(info.getName())) {
javax.swing.UIManager.setLookAndFeel(info.getClassName());
break;
}
}
        } catch (ClassNotFoundException ex) {
            java.util.logging.Logger.getLogger(FaceDetection.class.getName())
                    .log(java.util.logging.Level.SEVERE, null, ex);
        } catch (InstantiationException ex) {
            java.util.logging.Logger.getLogger(FaceDetection.class.getName())
                    .log(java.util.logging.Level.SEVERE, null, ex);
        } catch (IllegalAccessException ex) {
            java.util.logging.Logger.getLogger(FaceDetection.class.getName())
                    .log(java.util.logging.Level.SEVERE, null, ex);
        } catch (javax.swing.UnsupportedLookAndFeelException ex) {
            java.util.logging.Logger.getLogger(FaceDetection.class.getName())
                    .log(java.util.logging.Level.SEVERE, null, ex);
        }
//</editor-fold>
/* Create and display the form */
java.awt.EventQueue.invokeLater(new Runnable() {
public void run() {
new FaceDetection().setVisible(true);
}});}
    // Variables declaration - do not modify//GEN-BEGIN:variables
    private javax.swing.JButton jButton1;
    private javax.swing.JButton jButton2;
    private javax.swing.JPanel jPanel1;
    // End of variables declaration//GEN-END:variables
}
EYE CASCADE FILES
<rects>
<_>
10 14 4 6 -1.</_>
<_>
10 16 4 2 3.</_></rects></_>
<_>
<rects>
<_>
9 7 3 2 -1.</_>
<_>
10 7 1 2 3.</_></rects></_>
<_>
<rects>
<_>
6 9 6 2 -1.</_>
<_>
6 9 3 1 2.</_>
<_>
9 10 3 1 2.</_></rects></_>
<_>
<rects>
<_>
0 2 1 12 -1.</_>
<_>
0 6 1 4 3.</_></rects></_> <rects>
<_>
4 0 15 1 -1.</_>
9 0 5 1 3.</_></rects></_>
<_>
<rects>
<_>
9 0 8 2 -1.</_>
<_>
9 0 4 1 2.</_>
<_>
13 1 4 1 2.</_></rects></_>
<_>
<rects>
<_>
12 2 8 1 -1.</_>
<_>
16 2 4 1 2.</_></rects></_>
<_>
<rects>
<_>
7 1 10 6 -1.</_>
<_>
7 3 10 2 3.</_></rects></_>
<_>
<rects>
<_>
18 6 2 3 -1.</_>
<_>
18 7 2 1 3.</_></rects></_> <_>
<rects>
<_>
4 12 2 2 -1.</_>
4 12 1 1 2.</_>
<_>
5 13 1 1 2.</_></rects></_>
<_>
<rects>
6 6 6 2 -1.</_>
<_>
8 6 2 2 3.</_></rects></_>
<_>
<rects>
<_>
0 9 9 6 -1.</_>
<_>
3 9 3 6 3.</_></rects></_>
<_>
<rects>
<_>
17 18 2 2 -1.</_>
<_>
18 18 1 2 2.</_></rects></_>
<_>
<rects>
<_>
11 2 6 16 -1.</_>
<_>
13 2 2 16 3.</_></rects></_>
<_>
<rects>
<_>
2 4 15 13 -1.</_> <_>
7 4 5 13 3.</_></rects></_>
<_>
<rects>
<_>
16 2 3 10 -1.</_>
<_>
17 2 1 10 3.</_></rects></_>
<rects> <_>
6 10 2 1 -1.</_>
7 10 1 1 2.</_></rects></_>
<_>
<rects>
<_>
1 1 18 16 -1.</_>
<_>
10 1 9 16 2.</_></rects></_> <_>
<rects> <_>
14 4 3 15 -1.</_>
<_>
15 4 1 15 3.</_></rects></_>
<_>
<rects>
<_>
19 13 1 2 -1.</_>
<_>
19 14 1 1 2.</_></rects></_>
<_>
<rects>
<_>
2 6 5 8 -1.</_>
<_>
2 10 5 4 2.</_></rects></_></features></cascade>
</opencv_storage>
Home Page (screenshot)
CHAPTER 9:
CONCLUSION AND
FUTURE WORK
9. CONCLUSION AND FUTURE WORK
9.1 CONCLUSION
In this project, we have presented the concept and implemented a system that detects driver drowsiness using computer vision and notifies the driver when drowsiness is detected. The proposed system is capable of detecting the real-time state of the driver in both day and night conditions with the help of a camera. The detection of the face and eyes is based on facial symmetry. We have developed a non-intrusive prototype of a computer-vision-based system for real-time monitoring of the driver's drowsiness.
9.2 FUTURE WORK
For future work, the objective will be to reduce the percentage error, that is, to reduce the number of false alarms. To achieve this, additional experiments will be carried out with a larger set of drivers, and new analysis modules will be incorporated, for example the detection of facial expressions such as yawning.