Driver Drowsiness Detection

This document provides an overview of an advanced driver assistance system to detect driver drowsiness. The existing systems that detect eye-steering correlation have disadvantages like being complex and producing incorrect results. The proposed system detects driver fatigue and distraction by only processing the eye region using a webcam in real-time, which has less computational complexity compared to full face processing. The system configuration requires hardware like Intel i3 processor, 2GB RAM, and software like Windows 8, Netbeans IDE, OpenCV, and Java for the front end. WAMP server is used to serve dynamic web pages. The software aims to analyze the driver's eyes and alert them if drowsiness is detected to help avoid accidents.


CHAPTER 1:
INTRODUCTION

1. INTRODUCTION

1.1 ABOUT THE PROJECT

In today’s fast-moving world, people depend heavily on their means of transport. Feeling
drowsy and fatigued during a long drive, or after a short night’s sleep, is common. This
tiredness lowers the driver’s level of concentration, an unfavourable condition while
driving that results in more accidents. Driver drowsiness and exhaustion are among the
primary causes of road accidents, and the number of car accidents caused by drowsy
driving is increasing at an alarming pace. Recent figures indicate that 10% to 40% of all
road accidents are due to drivers feeling exhausted and sleepy, and in the trucking
industry about 60% of fatal accidents are attributed to driver fatigue.

For these reasons, it is important to develop systems that continuously monitor the
driver’s concentration on the road and level of drowsiness, and alert the driver when
needed. Researchers and innovators have been working on such systems for the
betterment of the human race. Years of research show that such behaviour is best
predicted from physiological factors like breathing, heart rate, pulse rate, brain waves,
etc. Systems based on these measures never reached public use, however, because they
required sensors and electrodes to be attached to the driver’s body, causing frustration.
Representative projects in this line are the MIT Smart Car and the ASV (Advanced
Safety Vehicle) project carried out by Toyota, Nissan and Honda. Other proposed
systems monitored the movement of the pupils and of the head using specialised helmets
and optical lenses. Although less intrusive, these systems were not accepted either,
because production costs were prohibitive. Indirect methods were also introduced that
infer drowsiness from the manoeuvring of the steering wheel, the positioning of the
wheel axles, and so on.

These systems were not adopted either, as they suffer from other difficulties, such as
dependence on the type of vehicle, environmental conditions, driver experience,
geometric aspects and the state of the road. Moreover, the time taken to analyse these
driving behaviours is too long, so such methods cannot react to eye blinks or
micro-sleeps. Notable efforts in this line include an important Spanish project called
TCD (Tech CO Driver) and the Mitsubishi advanced safety vehicle system. People
suffering from exhaustion or fatigue show visual behaviours that are easily observable
from changes in facial features such as the eyes and the movement of the face and head.
Computer vision is therefore a non-intrusive and natural approach to monitoring the
driver’s vigilance.

In this context, it is critical to use new and better technologies to design and build
systems that monitor drivers and compute their level of concentration throughout the
driving process. In this project, a module for Advanced Assistance to Driver
Drowsiness (AADD) is presented in order to reduce the number of accidents caused by
driver drowsiness and thus improve transport safety. The system automatically detects
driver drowsiness using machine vision and artificial intelligence. We present an
algorithm to capture, locate and analyse both the driver’s face and eyes in order to
measure PERCLOS (percentage of eye closure).
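The PERCLOS measure described above can be sketched as a sliding-window ratio of closed-eye frames to total frames. The window length, the sample data and the class name below are illustrative assumptions for the sketch, not values fixed by this project:

```java
// Hypothetical sketch: PERCLOS as the fraction of recent frames in which
// the eye was detected as closed. Window size and data are assumed values.
import java.util.ArrayDeque;
import java.util.Deque;

public class Perclos {
    private final int windowSize;
    private final Deque<Boolean> window = new ArrayDeque<>();
    private int closedCount = 0;

    public Perclos(int windowSize) {
        this.windowSize = windowSize;
    }

    /** Record one frame's eye state (true = closed) and return the current PERCLOS value. */
    public double update(boolean eyeClosed) {
        window.addLast(eyeClosed);
        if (eyeClosed) closedCount++;
        if (window.size() > windowSize && window.removeFirst()) closedCount--;
        return (double) closedCount / window.size();
    }

    public static void main(String[] args) {
        Perclos p = new Perclos(10);
        double perclos = 0;
        // Ten frames, eyes detected as closed in three of them.
        boolean[] frames = {false, true, false, false, true, false, false, true, false, false};
        for (boolean f : frames) perclos = p.update(f);
        System.out.println(perclos); // prints 0.3
    }
}
```

A drowsiness alarm would then compare this value against a chosen threshold over the monitoring window.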

1.2 PROBLEM IDENTIFICATION

Drowsy driving crashes are usually of high severity due to the drivers’ significant loss of
control, often leading to unpredicted vehicle trajectory and no braking response. Reliable
safety systems are needed to mitigate these crashes. The most important challenge is to
detect the driver’s condition sufficiently early, prior to the onset of sleep, to avoid
collisions.

1.3 OBJECTIVE

To detect whether the vehicle's driver has fallen asleep during driving and, if the driver
shows fatigue, to give an early warning so that an accident can be avoided.

CHAPTER 2:
SYSTEM ANALYSIS

2. SYSTEM ANALYSIS

2.1 EXISTING SYSTEM

The existing system evaluates whether changes in the eye-steering correlation can
indicate distraction. The auto-correlation and cross-correlation of horizontal eye position
and steering-wheel angle show that eye movements associated with road scanning
produce a low eye-steering correlation. On a straight road, the correlation between
steering movement and eye glances is low. The aim of this system is to detect driver
distraction from the driver's visual behaviour and driving performance, and for this
purpose it defines the relationship between visual behaviour and vehicle control. The
system evaluates the eye-steering correlation on straight roads under the assumption that
it shows a qualitatively and quantitatively different relationship from that on curvy roads,
and that it is sensitive to distraction. The relationship between visual behaviour and
vehicle control reflects a fundamental perception-control mechanism that plays a major
role in driving, and a strong eye-steering correlation associated with this process has
been observed on curvy roads.
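The eye-steering correlation used by the existing system can be illustrated with a plain Pearson correlation between a horizontal eye-position signal and a steering-wheel-angle signal. The signals below are made-up sample data for the sketch; the real system would compute this over windows of recorded driving data:

```java
// Illustrative eye-steering correlation: Pearson correlation coefficient
// between two equal-length signals. Sample data are assumed, not measured.
public class EyeSteeringCorrelation {

    /** Pearson correlation coefficient of two equal-length signals. */
    public static double correlation(double[] x, double[] y) {
        int n = x.length;
        double meanX = 0, meanY = 0;
        for (int i = 0; i < n; i++) { meanX += x[i]; meanY += y[i]; }
        meanX /= n;
        meanY /= n;
        double sxy = 0, sxx = 0, syy = 0;
        for (int i = 0; i < n; i++) {
            double dx = x[i] - meanX, dy = y[i] - meanY;
            sxy += dx * dy;
            sxx += dx * dx;
            syy += dy * dy;
        }
        return sxy / Math.sqrt(sxx * syy);
    }

    public static void main(String[] args) {
        double[] eyePosition   = {0.1, 0.3, 0.2, 0.5, 0.4};
        double[] steeringAngle = {1.0, 3.0, 2.0, 5.0, 4.0}; // proportional to eye position
        // A curvy road would give a value near 1; road scanning drives it toward 0.
        System.out.println(correlation(eyePosition, steeringAngle)); // close to 1.0
    }
}
```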

DISADVANTAGES

 For assessing the driver’s fatigue level, detecting the eye state is more important than
detecting the eye-steering correlation.

 The eye-steering correlation is complex and may produce wrong results in
undetermined cases.

 Steering-movement analysis involves large calculations and graphs, and it is difficult
to merge with eye movements on all types of roads.

 Lane determination is not necessary for detecting driver distraction.

2.2 PROPOSED SYSTEM
In the proposed system, driver fatigue and distraction are detected by processing only
the eye region, since the main symptoms of fatigue and distraction appear in the driver's
eyes. Among today's many fatigue-detection methods, the most practical is to capture
the eyes in real time with a web camera and detect their physical responses. Moreover,
processing the eye region instead of the whole face region has lower computational
complexity.

ADVANTAGES

 Accidents can be avoided by alerting the driver to distraction and drowsiness using
warning signals.

 Compared to full-face detection algorithms, the PERCLOS approach reduces time
complexity.

 The system tracks the alertness of the driver throughout the journey.

 The warning alarm alerts the driver, as well as the passengers, to be conscious of the
driver's behaviour.

 The system behaves as a user-friendly application.

CHAPTER 3:
SYSTEM
CONFIGURATION

3. SYSTEM CONFIGURATION

3.1 HARDWARE REQUIREMENTS

 System : Intel i3 processor

 Hard Disk : 250 GB

 RAM : 2 GB

 Monitor : 14" Color monitor

 Mouse : Optical mouse

3.2 SOFTWARE REQUIREMENTS

 Operating system : Windows 8 (64-bit)

 IDE : NetBeans

 Library : OpenCV

 Front End : Java

CHAPTER 4:
SOFTWARE
DESCRIPTION

4. SOFTWARE DESCRIPTION

WAMP Server
WAMPs are packages of independently-created programs installed on computers that
use a Microsoft Windows operating system. The interaction of these programs enables
dynamic web pages to be served over a computer network, such as the internet or a private
network. The equivalent installation on a Linux operating system is known as LAMP. The
equivalent installation on a Mac operating system is known as MAMP. The equivalent
installation on a Solaris operating system is known as SAMP. The equivalent installation
on a FreeBSD operating system is known as FAMP.
"WAMP" is an acronym formed from the initials of the operating system (Windows) and
the package's principal components: Apache, MySQL and PHP (or Perl or Python).
 Apache is a web server, which allows people with web browsers like Internet
Explorer or Firefox to connect to a computer and see information there as web pages.
 MySQL is a database manager (that is, it keeps track of data in a highly organized
way).
 PHP is a scripting language which can manipulate information held in a database
and generate web pages afresh each time an element of content is requested from a
browser.
JAVA:

Java is a small, simple, safe, object-oriented, interpreted (or dynamically optimised),
byte-coded, architecture-neutral, garbage-collected, multithreaded programming
language with strongly typed exception handling, for writing distributed and
dynamically extensible programs.

Java is a high-level, third-generation language like C, FORTRAN, Smalltalk, Perl and
many others. You can use Java to write computer applications that crunch numbers,
process words, play games, store data, or do any of the thousands of other things
computer software can do. Special programs called applets can be downloaded from the
internet and run safely within a web browser. Java supports these applications, and the
following features make it one of the best programming languages:

 It is simple and object-oriented.
 It helps to create user-friendly interfaces.
 It is very dynamic.
 It supports multithreading.
 It is platform independent.
 It is highly secure and robust.
 It supports internet programming.

Java is a programming language originally developed by Sun Microsystems and
released in 1995 as a core component of Sun's Java platform. The language derives
much of its syntax from C and C++ but has a simpler object model and fewer low-level
facilities. Java applications are typically compiled to bytecode, which can run on any
Java virtual machine (JVM) regardless of computer architecture. The original and
reference implementations of the Java compilers, virtual machines and class libraries
were developed by Sun from 1995. As of May 2007, in compliance with the
specifications of the Java Community Process, Sun had made available most of its Java
technologies as free software under the GNU General Public License. Others have also
developed alternative implementations of these Sun technologies, such as the GNU
Compiler for Java and GNU Classpath.

The Java language was created by James Gosling in June 1991 for use in a set-top box
project. The language was initially called Oak, after an oak tree that stood outside
Gosling's office (it also went by the name Green), and was later renamed Java, from a
list of random words. Gosling's goals were to implement a virtual machine and a
language that had a familiar C/C++ style of notation.

Primary goals
There were five primary goals in the creation of the Java language:

1. It should use the object-oriented programming methodology.
2. It should allow the same program to be executed on multiple operating systems.
3. It should contain built-in support for using computer networks.
4. It should be designed to execute code from remote sources securely.
5. It should be easy to use, by selecting what were considered the good parts of other
object-oriented languages.

The Java platform is the name for a bundle of related programs, or platform, from Sun
which allow for developing and running programs written in the Java programming
language. The platform is not specific to any one processor or operating system, but rather
an execution engine (called a virtual machine) and a compiler with a set of standard
libraries which are implemented for various hardware and operating systems so that Java
programs can run identically on all of them. Different "editions" of the platform are
available, including:

 Java ME (Micro Edition): Specifies several different sets of libraries (known as
profiles) for devices which are sufficiently limited that supplying the full set of Java
libraries would take up unacceptably large amounts of storage.
 Java SE (Standard Edition): For general-purpose use on desktop PCs, servers and
similar devices.
 Java EE (Enterprise Edition): Java SE plus various APIs useful for multi-tier
client-server enterprise applications.
The Java Platform consists of several programs, each of which provides a distinct
portion of its overall capabilities. For example, the Java compiler, which converts Java
source code into Java bytecode (an intermediate language for the Java Virtual Machine
(JVM)), is provided as part of the Java Development Kit (JDK). The sophisticated Java
Runtime Environment (JRE), complementing the JVM with a just-in-time (JIT) compiler,
converts intermediate bytecode into native machine code on the fly. Also supplied are
extensive libraries (pre-compiled into Java bytecode) containing reusable code, as well as
numerous ways for Java applications to be deployed, including being embedded in a web
page as an applet. There are several other components, some available only in certain
editions.

The essential components in the platform are the Java language compiler, the libraries,
and the runtime environment in which Java intermediate bytecode "executes" according
to the rules laid out in the virtual machine specification.

Java Virtual Machine

The heart of the Java Platform is the concept of a "virtual machine" that executes Java
bytecode programs. This bytecode is the same no matter what hardware or operating
system the program is running under. There is a JIT compiler within the Java Virtual
Machine, or JVM. The JIT compiler translates the Java bytecode into native processor
instructions at run-time and caches the native code in memory during execution.

The use of bytecode as an intermediate language permits Java programs to run on any
platform that has a virtual machine available. The use of a JIT compiler means that Java
applications, after a short delay during loading and once they have "warmed up" by
being all or mostly JIT-compiled, tend to run about as fast as native programs. Since
JRE version 1.2, Sun's JVM implementation has included a just-in-time compiler
instead of an interpreter.

Although Java programs are platform independent, the code of the Java Virtual
Machine (JVM) that executes these programs is not. Every operating system has its own
JVM.

Java Runtime Environment

The Java Runtime Environment, or JRE, is the software required to run any application
deployed on the Java Platform. End users commonly use a JRE in software packages
and web-browser plugins. Sun also distributes a superset of the JRE called the Java 2
SDK (more commonly known as the JDK), which includes development tools such as
the Java compiler, Javadoc, Jar and a debugger.

One of the unique advantages of a runtime engine is that errors (exceptions) should not
'crash' the system. Moreover, in runtime-engine environments such as Java, there exist
tools that attach to the runtime engine and, every time an exception of interest occurs,
record the debugging information that existed in memory at the time the exception was
thrown (stack and heap values). These automated exception-handling tools provide
'root-cause' information for exceptions in Java programs running in production, testing
or development environments.
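The attachment point such tools use can be sketched with the standard JVM hook for uncaught exceptions. The handler below only formats the stack trace; real automated exception-handling tools capture much richer state, and the class and method names here are illustrative only:

```java
// Minimal sketch of attaching to the runtime to record exception details:
// a default uncaught-exception handler formats the stack trace before the
// failing thread dies.
public class ExceptionHook {

    /** Format a simple report (message plus stack trace) for a throwable. */
    static String report(Thread thread, Throwable ex) {
        StringBuilder sb = new StringBuilder("Uncaught in " + thread.getName() + ": " + ex + "\n");
        for (StackTraceElement frame : ex.getStackTrace()) {
            sb.append("  at ").append(frame).append("\n");
        }
        return sb.toString();
    }

    public static void main(String[] args) throws InterruptedException {
        // Install the hook for every thread that lacks its own handler.
        Thread.setDefaultUncaughtExceptionHandler((t, ex) -> System.out.print(report(t, ex)));
        Thread worker = new Thread(() -> { throw new IllegalStateException("demo failure"); }, "worker");
        worker.start();
        worker.join(); // the handler runs as the worker thread terminates
    }
}
```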

CHAPTER 5:
SYSTEM DESIGN

5. SYSTEM DESIGN

5.1 GENERAL

System design is the process, or art, of defining the architecture, components, modules,
interfaces and data of a system so that it satisfies the specified requirements. This
chapter deals with the various designs and functions of the system.

5.2 STRUCTURE OF DESIGN DOCUMENT

5.2.1 SYSTEM ARCHITECTURE

Figure 5.1 Architecture diagram

5.2.2 USE-CASE DIAGRAM

Use-case diagrams are usually referred to as behaviour diagrams, used to describe a set
of actions that a system should or can perform in collaboration with one or more
external users of the system.

[Use cases: start detection (OpenCV), eye detection, eye parameter calculation,
drowsiness level determination, intimation; actor: driver]

Figure 5.2 Use-case diagram

5.2.3 SEQUENCE DIAGRAM

A sequence diagram is an interaction diagram that shows how objects interact with one
another and in what order. It is a construct of a message sequence chart, and shows
object interactions arranged in time sequence.

[Messages: start camera → load OpenCV library → find face and eye with help of Haar
cascade → face detected → drowsiness level calculated → intimation via alert;
lifelines: driver, camera]

Figure 5.3 Sequence diagram

5.2.4 CLASS DIAGRAM

A class diagram in the Unified Modeling Language (UML) is a type of static structure
diagram that describes the structure of a system by showing its classes, attributes and
operations.

Figure 5.4 Class diagram

5.2.5 COLLABORATION DIAGRAM

A collaboration diagram, also known as a communication diagram, is an illustration of
the relationships and interactions among software objects in the Unified Modeling
Language (UML).

[Messages: 1: start camera; 2: load OpenCV library; 3: find face and eye with help of
Haar cascade; 4: face detected; 5: drowsiness level calculated; 6: intimation via alert;
objects: driver, camera]

CHAPTER 6:
SYSTEM
IMPLEMENTATION

6. SYSTEM IMPLEMENTATION

6.1 MODULES
• Start Detection (Camera Opencv)
• Driver Eye Detection
• Eye Parameters Calculation
• Drowsiness Level Determination
• Intimation

START DETECTION (CAMERA OPENCV)

• This is the first module of the system; it opens the camera using the OpenCV library.
• Once the camera is initialised, the system is ready to detect the driver's face.

DRIVER EYE DETECTION

• This is the second module; it detects the driver's face and eyes with the help of the
haarcascade_frontalface_alt.xml cascade file.
• The cascade file is then used to find the x, y coordinates of the eyes.

EYE PARAMETERS CALCULATION

• This module processes the detected eye regions to determine the eye state, for
example whether the pupil is visible (eyes open) or not (eyes closed), which feeds the
drowsiness-level determination.

CHAPTER 7:
SYSTEM TESTING

7. SYSTEM TESTING

The purpose of testing is to discover errors. Testing is the process of trying to discover
every conceivable fault or weakness in a work product. It provides a way to check the
functionality of components, subassemblies, assemblies and/or the finished product. It
is the process of exercising software with the intent of ensuring that the software system
meets its requirements and user expectations; each type of test addresses a specific
testing requirement.

7.1 UNIT TESTING

Unit testing involves the design of test cases that validate that the internal program logic
is functioning properly and that program inputs produce valid outputs. All decision
branches and internal code flow should be validated. It is the testing of the individual
software units of the application, done after the completion of an individual unit and
before integration. This is structural testing that relies on knowledge of the unit's
construction and is invasive. Unit tests perform basic tests at component level and test a
specific business process, application and/or system configuration. They ensure that
each unique path of a business process performs accurately to the documented
specifications and contains clearly defined inputs and expected results. Unit testing is
usually conducted as part of a combined code-and-test phase of the software lifecycle,
although it is not uncommon for coding and unit testing to be conducted as two distinct
phases.
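As a concrete illustration, the sketch below unit-tests a single component in isolation: a hypothetical drowsiness decision based on a PERCLOS value. The method name and the 0.15 threshold are assumed for the example; they are not taken from this project's code:

```java
// Unit-test sketch for one component with documented inputs and expected
// results. isDrowsy and its 0.15 threshold are hypothetical examples.
// Run with assertions enabled: java -ea DrowsinessCheckTest
public class DrowsinessCheckTest {

    /** Component under test: flags drowsiness when eye closure exceeds a threshold. */
    static boolean isDrowsy(double perclos, double threshold) {
        return perclos > threshold;
    }

    public static void main(String[] args) {
        // Each case pairs a defined input with its expected result.
        assert !isDrowsy(0.05, 0.15) : "alert driver must not be flagged";
        assert isDrowsy(0.40, 0.15)  : "mostly-closed eyes must be flagged";
        assert !isDrowsy(0.15, 0.15) : "boundary value is not above the threshold";
        System.out.println("all unit tests passed");
    }
}
```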

TEST STRATEGY AND APPROACH


Field testing will be performed manually and functional tests will be written in detail.

TEST OBJECTIVES

 All field entries must work properly.

 Pages must be activated from the identified link.

 The entry screen, message and response must not be delayed.


FEATURES TO BE TESTED

 Verify that the entries are of the correct format.

 No duplicate entries should be allowed.

 All links should take the user to the correct page.

7.2 INTEGRATION TESTING

Integration tests are designed to test integrated software components to determine
whether they actually run as one program. Testing is event driven and is more
concerned with the basic outcome of screens or fields. Integration tests demonstrate
that although the components were individually satisfactory, as shown by successful
unit testing, the combination of components is also correct and consistent. Integration
testing is specifically aimed at exposing the problems that arise from the combination of
components.

FUNCTIONAL TEST
Functional tests provide systematic demonstrations that functions tested are available as
specified by the business and technical requirements, system documentation, and user
manuals.

Functional testing is centered on the following items:

 Valid Input : Identified classes of valid input must be accepted.
 Invalid Input : Identified classes of invalid input must be rejected.
 Functions : Identified functions must be exercised.
 Output : Identified classes of application outputs must be exercised.
 Systems/Procedures : Interfacing systems or procedures must be invoked.

Organization and preparation of functional tests is focused on requirements, key
functions, and special test cases. In addition, systematic coverage pertaining to
identified business process flows, data fields, predefined processes, and successive
processes must be considered for testing.

SYSTEM TEST
System testing ensures that the entire integrated software system meets requirements. It
tests a configuration to ensure known and predicate results. An Example of system testing
is the configuration oriented system integration test. System testing is based on process
descriptions and flow emphasizing pre-driven Process links and integration points.

7.3 ACCEPTANCE TESTING


User acceptance testing is a critical phase of any project and requires significant
participation by the end user. It also ensures that the system meets the functional
requirements.

CHAPTER 8:
APPENDICES

8. APPENDICES

8.1 SOURCE CODE

package gui;
import java.awt.Graphics;
import java.awt.Image;
import java.awt.image.BufferedImage;
import java.io.ByteArrayInputStream;
import javax.imageio.ImageIO;
import org.opencv.core.Core;
import org.opencv.core.Mat;
import org.opencv.core.MatOfByte;
import org.opencv.core.MatOfRect;
import org.opencv.core.Point;
import org.opencv.core.Rect;
import org.opencv.core.Scalar;
import org.opencv.imgcodecs.Imgcodecs;
import org.opencv.imgproc.Imgproc;
import org.opencv.objdetect.CascadeClassifier;
import org.opencv.videoio.VideoCapture;
public class FaceDetection extends javax.swing.JFrame {
private DaemonThread myThread = null;
VideoCapture webSource = null;
Mat frame = new Mat();
MatOfByte mem = new MatOfByte();
CascadeClassifier faceDetector = new CascadeClassifier(
        FaceDetection.class.getResource("haarcascade_frontalface_alt.xml").getPath().substring(1));
CascadeClassifier eyeDetector = new CascadeClassifier(
        FaceDetection.class.getResource("haarcascade_eye.xml").getPath().substring(1));
// CascadeClassifier smileDetector = new CascadeClassifier(
//         FaceDetection.class.getResource("haarcascade_smile.xml").getPath().substring(1));

MatOfRect faceDetections = new MatOfRect();


MatOfRect eyeDetections = new MatOfRect();

//MatOfRect smileDetections = new MatOfRect();
//VideoCapture camera=new VideoCapture(0);
class DaemonThread implements Runnable {
protected volatile boolean runnable = false;
@Override
public void run() {
synchronized (this) {
while (runnable) {
if (webSource.grab()) {
try {
webSource.retrieve(frame);
Graphics g = jPanel1.getGraphics();
faceDetector.detectMultiScale(frame, faceDetections);
//System.out.println("frame " +frame);
for (Rect rect : faceDetections.toArray()) {
//System.out.println("ttt");
Imgproc.rectangle(frame, new Point(rect.x, rect.y), new Point(rect.x + rect.width, rect.y +
rect.height),
new Scalar(0, 255, 0));
//System.out.println("point 1= "+ new Point(rect.x, rect.y));
// smileDetector.detectMultiScale(frame, smileDetections);
// for (Rect rect3 : smileDetections.toArray()) {
// if ((rect3.x > rect.x & rect3.x < (rect.x + rect.width)) && (rect3.y > rect.y & rect3.y < (rect.y + rect.height))) {
// //jLabel2.setText("You are smiling");
// System.out.println("smiling");
// else{
// System.out.println("----------------------------->not smiling");
// //jLabel2.setText("");
eyeDetector.detectMultiScale(frame, eyeDetections);
for (Rect rect2 : eyeDetections.toArray()) {
Imgproc.rectangle(frame, new Point(rect2.x, rect2.y), new Point(rect2.x + rect2.width,
rect2.y + rect2.height),

new Scalar(255, 0, 255));


Rect rectCrop = new Rect(rect2.x, rect2.y, rect2.width, rect2.height); // crop the image to take the eye
Mat image_roi = new Mat(frame, rectCrop);
Imgcodecs.imwrite("D:\\CapturePictures\\new3.jpg", image_roi);
//System.out.println("fine");
if ((rect2.x > rect.x & rect2.x < (rect.x + rect.width)) && (rect2.y > rect.y & rect2.y <
(rect.y + rect.height))) {
Imgproc.cvtColor(image_roi, image_roi, Imgproc.COLOR_BGR2GRAY);
Imgcodecs.imwrite("D:\\CapturePictures\\new4.jpg", image_roi);
//System.out.println("greyframe = " + frame);
//Imgproc.Sobel(frame, frame, 9, 9, 9);
Imgproc.medianBlur(image_roi, image_roi, 9);
//System.out.println("noise done");
//System.out.println("rows= "+image_roi.rows());
Mat circles = new Mat();
Imgproc.HoughCircles(image_roi, circles, Imgproc.HOUGH_GRADIENT, 1, 10, 50, 20, 0, 0);
//System.out.println("circles found");
// System.out.println("circle x= "+circles.get(0,0));
// System.out.println("circle y= "+circles.get(1,0));
//// System.out.println("circle r= "+circles.get(2,0));
// System.out.println("circle size = " + circles.size());
// System.out.println("circle cols = " + circles.cols());
float circle[] = new float[3];
for (int i = 0; i < circles.cols(); i++) {
org.opencv.core.Point center = new org.opencv.core.Point();
circles.get(0, i, circle);
center.x = circle[0];
center.y = circle[1];
Imgproc.circle(image_roi, center, (int) circle[2], new Scalar(255, 255, 100, 1), 4);
System.out.println("eyes open");
}
} else {
System.out.println("eyes closed-------------------------------->sleeping");

}}
//Mat greyframe = null;

// Imgproc.cvtColor(frame, frame, Imgproc.COLOR_BGR2GRAY);
// Imgcodecs.imwrite("D:\\CapturePictures\\new4.jpg", frame);
// System.out.println(":o0");
// //System.out.println("greyframe = " + frame);
// //Imgproc.Sobel(frame, frame, 9, 9, 9);
// Imgproc.medianBlur(frame, frame, 9);
// System.out.println("noise done");
// //System.out.println("rows= "+image_roi.rows());
// //Mat newframe=new Mat();
// Mat circles = new Mat();
// Imgproc.HoughCircles(frame, circles, Imgproc.HOUGH_GRADIENT, 1, 10, 30, 5, 5, 5);
// System.out.println("circles found");
//// System.out.println("circle x= "+circles.get(0,0));
//// System.out.println("circle y= "+circles.get(1,0));
//// System.out.println("circle r= "+circles.get(2,0));
// System.out.println("circle size = " + circles.size());
// System.out.println("circle cols = " + circles.cols());
// float circle[] = new float[3];
// for (int i = 0; i < circles.cols(); i++) {
// org.opencv.core.Point center = new org.opencv.core.Point();
// circles.get(0, i, circle);
// System.out.println("r= "+circle[2]);
// Imgproc.circle(frame, center, (int) circle[2], new Scalar(255, 255, 100, 1), 4);
Imgcodecs.imencode(".bmp", frame, mem);
Image im = ImageIO.read(new ByteArrayInputStream(mem.toArray()));
BufferedImage buff = (BufferedImage) im;
Imgcodecs.imwrite("D:\\CapturePictures\\new.jpg", frame);
//Imgcodecs.imwrite("D:\\CapturePictures\\greynew.jpg", greyframe);
if (g.drawImage(buff, 0, 0, getWidth(), getHeight() - 150, 0, 0, buff.getWidth(),
buff.getHeight(), null)) {

if (runnable == false) {
System.out.println("Paused ..... ");

this.wait();
}
}
} catch (Exception ex) {
System.out.println("Error");
}
}
}
}
}
/**
* Creates new form FaceDetection
*/
public FaceDetection() {
initComponents();
System.out.println(FaceDetection.class.getResource("haarcascade_frontalface_alt.xml").getPath().substring(1));
}
/**
* This method is called from within the constructor to initialize the form.
* WARNING: Do NOT modify this code. The content of this method is always
* regenerated by the Form Editor.
*/
@SuppressWarnings("unchecked")
// <editor-fold defaultstate="collapsed" desc="Generated Code">//GEN-BEGIN:initComponents
private void initComponents() {
jPanel1 = new javax.swing.JPanel();
jButton1 = new javax.swing.JButton();
jButton2 = new javax.swing.JButton();
setDefaultCloseOperation(javax.swing.WindowConstants.EXIT_ON_CLOSE);
javax.swing.GroupLayout jPanel1Layout = new javax.swing.GroupLayout(jPanel1);
jPanel1.setLayout(jPanel1Layout);

jPanel1Layout.setHorizontalGroup(
jPanel1Layout.createParallelGroup(javax.swing.GroupLayout.Alignment.LEADING)

.addGap(0, 0, Short.MAX_VALUE)
);
jPanel1Layout.setVerticalGroup(
jPanel1Layout.createParallelGroup(javax.swing.GroupLayout.Alignment.LEADING)
.addGap(0, 376, Short.MAX_VALUE)
);
jButton1.setText("Start");
jButton1.addActionListener(new java.awt.event.ActionListener() {
public void actionPerformed(java.awt.event.ActionEvent evt) {
jButton1ActionPerformed(evt);
} });
jButton2.setText("Pause");
jButton2.addActionListener(new java.awt.event.ActionListener() {
public void actionPerformed(java.awt.event.ActionEvent evt) {
jButton2ActionPerformed(evt);
} });
javax.swing.GroupLayout layout = new javax.swing.GroupLayout(getContentPane());
getContentPane().setLayout(layout);
layout.setHorizontalGroup(
layout.createParallelGroup(javax.swing.GroupLayout.Alignment.LEADING)
.addGroup(layout.createSequentialGroup()
.addGap(24, 24, 24)
.addComponent(jPanel1, javax.swing.GroupLayout.DEFAULT_SIZE,
javax.swing.GroupLayout.DEFAULT_SIZE, Short.MAX_VALUE)
.addContainerGap())
.addGroup(layout.createSequentialGroup()
.addGap(255, 255, 255)
.addComponent(jButton1)
.addGap(86, 86, 86)
.addComponent(jButton2)
.addContainerGap(258, Short.MAX_VALUE))
);

layout.setVerticalGroup(
layout.createParallelGroup(javax.swing.GroupLayout.Alignment.LEADING)

.addGroup(layout.createSequentialGroup()
.addContainerGap()
.addComponent(jPanel1, javax.swing.GroupLayout.PREFERRED_SIZE,
javax.swing.GroupLayout.DEFAULT_SIZE,
javax.swing.GroupLayout.PREFERRED_SIZE)
.addPreferredGap(javax.swing.LayoutStyle.ComponentPlacement.RELATED)
.addGroup(layout.createParallelGroup(javax.swing.GroupLayout.Alignment.BASELINE)
.addComponent(jButton1)
.addComponent(jButton2))
.addContainerGap(javax.swing.GroupLayout.DEFAULT_SIZE, Short.MAX_VALUE)) );
pack(); }// </editor-fold>//GEN-END:initComponents
private void jButton2ActionPerformed(java.awt.event.ActionEvent evt) {//GEN-FIRST:event_jButton2ActionPerformed
myThread.runnable = false; // stop the capture thread
jButton2.setEnabled(false); // deactivate the pause button
jButton1.setEnabled(true); // activate the start button
webSource.release(); // stop capturing from the camera
}//GEN-LAST:event_jButton2ActionPerformed
private void jButton1ActionPerformed(java.awt.event.ActionEvent evt) {//GEN-FIRST:event_jButton1ActionPerformed
webSource = new VideoCapture(0); // capture video from the default camera
myThread = new DaemonThread(); // create an object of the thread class
Thread t = new Thread(myThread);
t.setDaemon(true);
myThread.runnable = true;
t.start(); // start the thread
jButton1.setEnabled(false); // deactivate the start button
jButton2.setEnabled(true); // activate the pause button
}//GEN-LAST:event_jButton1ActionPerformed
/**
* @param args the command line arguments */
public static void main(String args[]) {

System.loadLibrary(Core.NATIVE_LIBRARY_NAME);
/* Set the Nimbus look and feel */
//<editor-fold defaultstate="collapsed" desc=" Look and feel setting code (optional) ">
/* If Nimbus (introduced in Java SE 6) is not available, stay with the default look and feel.
* For details see http://download.oracle.com/javase/tutorial/uiswing/lookandfeel/plaf.html
*/
try {
for (javax.swing.UIManager.LookAndFeelInfo info :
javax.swing.UIManager.getInstalledLookAndFeels()) {
if ("Nimbus".equals(info.getName())) {
javax.swing.UIManager.setLookAndFeel(info.getClassName());
break;
}
}
} catch (ClassNotFoundException ex) {
java.util.logging.Logger.getLogger(FaceDetection.class.getName()).log(java.util.logging.Level.SEVERE, null, ex);
} catch (InstantiationException ex) {
java.util.logging.Logger.getLogger(FaceDetection.class.getName()).log(java.util.logging.Level.SEVERE, null, ex);
} catch (IllegalAccessException ex) {
java.util.logging.Logger.getLogger(FaceDetection.class.getName()).log(java.util.logging.Level.SEVERE, null, ex);
} catch (javax.swing.UnsupportedLookAndFeelException ex) {
java.util.logging.Logger.getLogger(FaceDetection.class.getName()).log(java.util.logging.Level.SEVERE, null, ex);
}
//</editor-fold>
/* Create and display the form */
java.awt.EventQueue.invokeLater(new Runnable() {
public void run() {
new FaceDetection().setVisible(true);
}});}
// Variables declaration - do not modify//GEN-BEGIN:variables
private javax.swing.JButton jButton1;
private javax.swing.JButton jButton2;
private javax.swing.JPanel jPanel1;
// End of variables declaration//GEN-END:variables
}
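The DaemonThread referenced in the listing above (the runnable started by the Start button) is not shown here; it grabs frames from webSource and runs the face and eye cascades on each one. A minimal, self-contained sketch of the frame-level decision logic such a thread might apply is given below. The class name, the update method, and the frame threshold are illustrative assumptions for this sketch, not the project's actual code:

```java
// Illustrative sketch: alarm after N consecutive frames with no eyes detected.
// The real thread also performs frame capture and cascade detection (omitted).
class DrowsinessCounter {
    private final int closedFrameLimit; // consecutive "eyes closed" frames before alarm
    private int closedFrames = 0;

    DrowsinessCounter(int closedFrameLimit) {
        this.closedFrameLimit = closedFrameLimit;
    }

    // Feed one frame's eye-detection result; returns true when the alarm should sound.
    boolean update(boolean eyesDetected) {
        if (eyesDetected) {
            closedFrames = 0;   // eyes found: reset the counter
        } else {
            closedFrames++;     // eyes not found: possibly closed
        }
        return closedFrames >= closedFrameLimit;
    }

    public static void main(String[] args) {
        DrowsinessCounter c = new DrowsinessCounter(3);
        System.out.println(c.update(false)); // false (1 closed frame)
        System.out.println(c.update(false)); // false (2 closed frames)
        System.out.println(c.update(false)); // true  (3 closed frames -> alarm)
        System.out.println(c.update(true));  // false (eyes open, counter reset)
    }
}
```

Resetting on every open-eye frame means only a sustained closure triggers the alarm, which filters out normal blinks at typical webcam frame rates.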
EYE CASCADE FILES
<rects>
<_>
10 14 4 6 -1.</_>
<_>
10 16 4 2 3.</_></rects></_>
<_>
<rects>
<_>
9 7 3 2 -1.</_>
<_>
10 7 1 2 3.</_></rects></_>
<_>
<rects>
<_>
6 9 6 2 -1.</_>
<_>
6 9 3 1 2.</_>
<_>
9 10 3 1 2.</_></rects></_>
<_>
<rects>
<_>
0 2 1 12 -1.</_>
<_>
0 6 1 4 3.</_></rects></_>
<_>
<rects>
<_>
4 0 15 1 -1.</_>
<_>
9 0 5 1 3.</_></rects></_>
<_>
<rects>
<_>
9 0 8 2 -1.</_>
<_>
9 0 4 1 2.</_>
<_>
13 1 4 1 2.</_></rects></_>
<_>
<rects>
<_>
12 2 8 1 -1.</_>
<_>
16 2 4 1 2.</_></rects></_>
<_>
<rects>
<_>
7 1 10 6 -1.</_>
<_>
7 3 10 2 3.</_></rects></_>
<_>
<rects>
<_>
18 6 2 3 -1.</_>
<_>
18 7 2 1 3.</_></rects></_>
<_>
<rects>
<_>
4 12 2 2 -1.</_>
<_>
4 12 1 1 2.</_>
<_>
5 13 1 1 2.</_></rects></_>
<_>
<rects>
<_>
6 6 6 2 -1.</_>
<_>
8 6 2 2 3.</_></rects></_>
<_>
<rects>
<_>
0 9 9 6 -1.</_>
<_>
3 9 3 6 3.</_></rects></_>
<_>
<rects>
<_>
17 18 2 2 -1.</_>
<_>
18 18 1 2 2.</_></rects></_>
<_>
<rects>
<_>
11 2 6 16 -1.</_>
<_>
13 2 2 16 3.</_></rects></_>
<_>
<rects>
<_>
2 4 15 13 -1.</_>
<_>
7 4 5 13 3.</_></rects></_>
<_>
<rects>
<_>
16 2 3 10 -1.</_>
<_>
17 2 1 10 3.</_></rects></_>
<_>
<rects>
<_>
6 10 2 1 -1.</_>
<_>
7 10 1 1 2.</_></rects></_>
<_>
<rects>
<_>
1 1 18 16 -1.</_>
<_>
10 1 9 16 2.</_></rects></_>
<_>
<rects>
<_>
14 4 3 15 -1.</_>
<_>
15 4 1 15 3.</_></rects></_>
<_>
<rects>
<_>
19 13 1 2 -1.</_>
<_>
19 14 1 1 2.</_></rects></_>
<_>
<rects>
<_>
2 6 5 8 -1.</_>
<_>
2 10 5 4 2.</_></rects></_></features></cascade>
</opencv_storage>
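Each row in the cascade data above encodes a rectangle as x, y, width, height followed by a weight. A weak classifier's feature value is the weighted sum of the pixel sums over its rectangles, and each pixel sum can be read in constant time from an integral image. The sketch below illustrates that computation; the class and method names are illustrative (OpenCV's detector performs this internally):

```java
// Illustrative evaluation of one Haar-like feature from the cascade data above.
class HaarFeature {
    // ii[y][x] holds the sum of img over all rows < y and columns < x
    static long[][] integral(int[][] img) {
        int h = img.length, w = img[0].length;
        long[][] ii = new long[h + 1][w + 1];
        for (int y = 0; y < h; y++)
            for (int x = 0; x < w; x++)
                ii[y + 1][x + 1] = img[y][x] + ii[y][x + 1] + ii[y + 1][x] - ii[y][x];
        return ii;
    }

    // pixel sum of the rectangle (x, y, w, h), four lookups regardless of size
    static long rectSum(long[][] ii, int x, int y, int w, int h) {
        return ii[y + h][x + w] - ii[y][x + w] - ii[y + h][x] + ii[y][x];
    }

    // weighted sum over rectangles, each given as {x, y, w, h, weight}
    static double feature(long[][] ii, double[][] rects) {
        double v = 0;
        for (double[] r : rects)
            v += r[4] * rectSum(ii, (int) r[0], (int) r[1], (int) r[2], (int) r[3]);
        return v;
    }

    public static void main(String[] args) {
        int[][] img = new int[20][20];
        for (int[] row : img) java.util.Arrays.fill(row, 1); // constant test image
        long[][] ii = integral(img);
        // the two-rectangle feature from the first entry above:
        // "10 14 4 6 -1." and "10 16 4 2 3."
        double[][] rects = { {10, 14, 4, 6, -1}, {10, 16, 4, 2, 3} };
        System.out.println(feature(ii, rects)); // -1*24 + 3*8 = 0 on a constant image
    }
}
```

A value of zero on a constant image is expected: the weights are chosen so the rectangles cancel unless there is a contrast difference (e.g. a dark pupil against lighter sclera) inside the detection window.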

8.2 SCREEN SHOTS

Home Page
CHAPTER 9:
CONCLUSION AND FUTURE WORK
9. CONCLUSION AND FUTURE WORK

9.1 CONCLUSION

In this project, we have presented the concept and implementation of a system that detects
driver drowsiness using computer vision and alerts the driver when drowsiness is detected.
The proposed system can detect the driver's state in real time, in both day and night
conditions, with the help of a camera. Face and eye detection is performed based on
facial symmetry. We have developed a non-intrusive prototype of a computer
vision-based system for real-time monitoring of the driver's drowsiness.

9.2 FUTURE ENHANCEMENT

For future work, the objective will be to reduce the error rate, that is, to reduce the
number of false alarms. To achieve this, additional experiments will be conducted with a
larger pool of drivers, and new analysis modules will be incorporated, for example,
detection of facial expressions such as yawning.
