
CN106530623B - Fatigue driving detection device and detection method - Google Patents

Fatigue driving detection device and detection method (Download PDF)

Info

Publication number
CN106530623B
CN106530623B (application CN201611264042.1A)
Authority
CN
China
Prior art keywords
image
state
face
driver
fatigue
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN201611264042.1A
Other languages
Chinese (zh)
Other versions
CN106530623A (en)
Inventor
曹兵
李鹏
王许生
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Hongchang Up Power Technology Co Ltd
Nanjing University of Science and Technology
Original Assignee
Shenzhen Hongchang Up Power Technology Co Ltd
Nanjing University of Science and Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Hongchang Up Power Technology Co Ltd, Nanjing University of Science and Technology filed Critical Shenzhen Hongchang Up Power Technology Co Ltd
Priority to CN201611264042.1A priority Critical patent/CN106530623B/en
Publication of CN106530623A publication Critical patent/CN106530623A/en
Application granted granted Critical
Publication of CN106530623B publication Critical patent/CN106530623B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G — PHYSICS
    • G08 — SIGNALLING
    • G08B — SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B 21/00 — Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
    • G08B 21/02 — Alarms for ensuring the safety of persons
    • G08B 21/06 — Alarms for ensuring the safety of persons indicating a condition of sleep, e.g. anti-dozing alarms

Landscapes

  • Business, Economics & Management (AREA)
  • Emergency Management (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
  • Image Processing (AREA)
  • Emergency Alarm Devices (AREA)

Abstract

The invention discloses a fatigue driving detection device and detection method. The device comprises an ARM processor unit, an SD card, a USB camera, a data cable, a cooling fan and an alarm device; the ARM processor unit further includes a face localization module, an eye state recognition module and a fatigue determination module. The detection method comprises the following steps: 1. initialize the camera; 2. acquire an image and convey the image information to the ARM processor unit; 3. preprocess the image; 4. locate the face region with a trained face feature classifier; 5. judge from the face state whether the driver is fatigued; 6. locate the eye region with an eye feature classifier and judge from the eye state whether the driver is fatigued; 7. combine both judgments to determine the driver's fatigue state and start the corresponding alarm. The adaptive eye-image binarization method used in the invention segments the pupil and iris regions more reliably, and the combined detection achieves higher recognition accuracy than any single-cue method.

Description

Fatigue driving detection device and detection method
Technical field
The invention belongs to the technical field of vehicle driving safety and involves image processing, pattern recognition and neural network technologies; it relates in particular to a fatigue driving detection system.
Background art
With rapid social and economic development, the number of automobiles keeps growing alongside the transportation industry, and traffic accidents caused by fatigue driving show an increasing trend. To address this, various fatigue driving detection techniques have been developed, which fall broadly into contact and non-contact methods. Contact methods usually measure the driver's electrocardiogram, electroencephalogram and similar signals; such measurements not only interfere with the driver's operation of the vehicle but are also costly. A fatigued driver exhibits physiological signs such as nodding and an increased frequency of eye closure, and non-contact techniques detect these signs with a monitoring device. Non-contact fatigue detection is low in cost and high in accuracy and is therefore widely adopted in current fatigue driving detection devices.
Most existing non-contact fatigue detection devices based on the driver's physiological features locate the face by image processing and then analyze the eye state within the face region to judge fatigue. Chinese invention patents CN101593425A and CN201681470U disclose fatigue driving detection methods that judge the driver's fatigue state solely from the eye state. Although such single-cue detection can attain a certain accuracy, it is easily disturbed by illumination, eyeglasses and other factors, leading to false detections. In particular, patent CN101593425A uses the maximum between-class variance method, which is easily affected by eyelash shadows and is unfavorable for iris segmentation.
Summary of the invention
The purpose of the present invention is to provide a fatigue driving detection device and detection method that makes a final judgment of the driver's fatigue state by combining the face state and the eye state, triggers different alarm modes according to the combined result, and uses an adaptive eye-image binarization method that segments the iris region more reliably.
The technical solution for realizing the aim of the invention is as follows:
A fatigue driving detection device comprises an SD card for storing the embedded system, a USB camera for acquiring frontal images of the driver, a data cable for powering the device, an alarm device for prompting the driver, and a cooling fan for dissipating heat from the processor. It further comprises an ARM processor unit for image processing and fatigue determination and an alarm device supporting different alarm modes; the alarm device includes a horn alarm and an LED lamp alarm that sound different alarms according to the different results of the combined judgment of the driver's fatigue state.
The ARM processor unit includes a face localization module, an eye state recognition module and a fatigue determination module. After receiving the image, the face localization module locates the face with the Adaboost algorithm and identifies the face state from the coordinate difference. The eye state recognition module takes the face region image from the face localization module, locates the eye region with the Adaboost algorithm, and identifies the eye state from the integral projections of the adaptively binarized eye image. The fatigue determination module makes the fatigue judgment from the face state and the eye state, converts the judgment result into an electrical signal, and sends it to the alarm device through the I/O interface.
The face localization module, eye state recognition module and fatigue determination module are specified as follows:
Face localization module: locates the face region with a trained face feature classifier, computes the vertical deviation between the face center and the image center, and judges whether the driver is fatigued from the deviation magnitude and the oscillation frequency of the face center within a set time window;
Eye state recognition module: after the face region is located, takes the upper 3/5 of the face as the region of interest according to the distribution of facial features and locates the eye region with a trained eye classifier; extracts features from the driver's eyes, computes the ratios of the maxima of the vertical and horizontal integral projections of the binarized eye image to the widths of their projection regions, combines the two ratios to determine the current eye state, i.e. the degree of eye closure, and decides from the set criterion whether the driver's current mental state is fatigued;
Fatigue determination module: makes the final judgment of the driver's mental state from the face state and eye state results.
A fatigue driving detection method comprises the following steps:
Step 1, initialize the camera and set the attributes of the captured images;
Step 2, the USB camera acquires an image and conveys the image information to the ARM processor unit;
Step 3, preprocess the image, i.e. downscale it and convert it to grayscale;
Step 4, load the face and eye feature classifiers provided by the OpenCV machine vision library and locate the face region with the pre-trained face feature classifier;
Step 5, determine the fatigue state from the face state: detect the position of the driver's face with the Adaboost algorithm, compute the vertical deviation between the face center and the image center, compare the deviation with a set threshold, and judge whether the driver is fatigued from the oscillation frequency of the face center, i.e. the nodding frequency, within a fixed time window;
Step 6, determine the fatigue state from the eye state: detect the position of the driver's eyes with the Adaboost algorithm, extract features from the driver's eyes, compute the ratio of the maximum of the integral projection of the eye region to the width of the projection region, and judge whether the driver is fatigued by comparison with the set threshold T2;
Step 7, according to the combined fatigue judgment of steps 5 and 6, return to image acquisition or start one of the different alarm modes.
Compared with the prior art, the present invention has the following remarkable advantages:
(1) the invention is implemented as an embedded system, which is compact and easy to use; (2) it reduces the influence of individual differences on the detection result, improves the accuracy of the fatigue judgment, and has good practicability; (3) by combining two notable cues, the driver's head and eyes, it performs a compound judgment and achieves higher recognition accuracy than any single-cue method; (4) the adaptive binarization method used for the eye image segments the pupil and iris regions well and is not easily affected by eyelashes; (5) by discriminating the driver's mental state and starting the alarm device to prompt the driver whenever a fatigue state is detected, traffic accidents can be effectively reduced, providing a strong guarantee for the safety of people's lives and property.
The present invention is described in further detail below with reference to the accompanying drawings.
Brief description of the drawings
Fig. 1 is a structural schematic diagram of the fatigue driving detection device.
Fig. 2 is a schematic diagram of the internal connections of the ARM processor unit.
Fig. 3 is a flow diagram of the fatigue driving detection method.
Fig. 4 is the vertical integral projection of the binarized eye image.
Fig. 5 is the horizontal integral projection of the binarized eye image.
Fig. 6 illustrates the PERCLOS measurement principle.
Fig. 7 is an eye image.
Fig. 8 is the adaptively binarized version of the eye image.
Specific embodiment
With reference to Fig. 1, a fatigue driving detection device comprises an SD card for storing the embedded system, a USB camera for acquiring frontal images of the driver, a data cable for powering the device, an alarm device for prompting the driver, and a cooling fan for dissipating heat from the processor. It further comprises an ARM processor unit for image processing and fatigue determination and an alarm device supporting different alarm modes; the alarm device includes a horn alarm and an LED lamp alarm that sound different alarms according to the different results of the combined judgment of the driver's fatigue state.
The ARM processor unit includes a face localization module, an eye state recognition module and a fatigue determination module. After receiving the image, the face localization module locates the face with the Adaboost algorithm and identifies the face state from the coordinate difference. The eye state recognition module takes the face region image from the face localization module, locates the eye region with the Adaboost algorithm, and identifies the eye state from the integral projections of the adaptively binarized eye image. The fatigue determination module makes the fatigue judgment from the face state and the eye state, converts the judgment result into an electrical signal, and sends it to the alarm device through the I/O interface.
The face localization module, eye state recognition module and fatigue determination module are specified as follows:
Face localization module: locates the face region with a trained face feature classifier, computes the vertical deviation between the face center and the image center, and judges whether the driver is fatigued from the deviation magnitude and the oscillation frequency of the face center within a set time window;
Eye state recognition module: after the face region is located, takes the upper 3/5 of the face as the region of interest according to the distribution of facial features and locates the eye region with a trained eye classifier; extracts features from the driver's eyes, computes the ratios of the maxima of the vertical and horizontal integral projections of the binarized eye image to the widths of their projection regions, combines the two ratios to determine the current eye state, i.e. the degree of eye closure, and decides from the set criterion whether the driver's current mental state is fatigued;
Fatigue determination module: makes the final judgment of the driver's mental state from the face state and eye state results.
As shown in Fig. 2, the ARM processor unit includes a power conversion circuit, a USB interface circuit, a crystal oscillator circuit, an enable signal circuit and an SD card reading circuit.
In the power conversion circuit, the USB power supply circuit first converts the supply interface: a 5V voltage is obtained through an NXP transistor, a resettable fuse is connected in series to protect the device, and the output of the USB power supply circuit feeds a voltage regulator circuit whose regulator chip produces three different stable voltages. The ARM chip performs the image processing and fatigue determination. The USB interface circuit is connected to a LAN9512 chip for data transmission; it receives the video stream from the USB camera, and a video decoder passes the information to the ARM processor. The crystal oscillator circuit generates the clock signal and provides the timing for the processor. The enable signal circuit produces two signals from a 5V input voltage and a field-effect transistor, supplying the SDRAM enable signal and the RUN signal required by the ARM chip; the SDRAM provides the running space for the system and stores the compressed image information of the most recent 3-5 minutes. An LED status indicator displays the device state. The SD card reading circuit loads the Linux system from the SD card into the device, giving the device its running environment and enabling data read-in and storage. HDMI is a reserved interface for screen display. The I/O interface provides the trigger signal for the alarm device.
The USB camera converts the video stream into digital information through the conversion circuit; the ARM chip, as the main processing unit, starts the face localization module to perform face localization and state analysis, and the processed image information is temporarily stored in the SDRAM. The eye state recognition module obtains the image information processed by the face localization module from the SDRAM and performs eye state recognition; the fatigue determination module then makes the fatigue judgment from the state results of the face localization module and the eye state recognition module, and finally converts the judgment into an electrical signal sent from the I/O interface to the alarm device.
With reference to Figs. 3-5, the detection method of the fatigue driving detection device comprises the following steps:
Step 1, initialize the camera and set the attributes of the captured images, i.e. set the resolution of the captured image to 640 × 480;
Step 2, the USB camera acquires an image and conveys the image information to the ARM embedded processor;
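A minimal sketch of steps 1-2 with OpenCV's Python bindings is given below; the device index, the error handling and the way frames are handed to the processing pipeline are illustrative assumptions, not part of the patent.

```python
import cv2

def open_camera(device_index=0, width=640, height=480):
    """Initialize the USB camera and set the frame attributes (step 1)."""
    cap = cv2.VideoCapture(device_index)
    cap.set(cv2.CAP_PROP_FRAME_WIDTH, width)    # 640 x 480 as in the embodiment
    cap.set(cv2.CAP_PROP_FRAME_HEIGHT, height)
    if not cap.isOpened():
        raise RuntimeError("USB camera could not be opened")
    return cap

def grab_frame(cap):
    """Acquire one frame for the ARM-side processing (step 2)."""
    ok, frame = cap.read()
    return frame if ok else None
```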
Step 3, preprocess the image, i.e. downscale it and convert it to grayscale;
As a preferred scheme, the downscaling method is as follows: the image is reduced to 1/2 of its original size by local averaging, which preserves the original image information well while reducing the image size; the smaller image reduces the computational load and improves real-time performance.
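As a rough illustration of this preferred preprocessing, the sketch below halves the image by area averaging and converts it to grayscale; using cv2.INTER_AREA as the local-averaging resampler is an assumption about the unspecified implementation.

```python
import cv2

def preprocess(frame):
    """Step 3: downscale to 1/2 by local averaging, then convert to grayscale."""
    half = cv2.resize(frame, None, fx=0.5, fy=0.5, interpolation=cv2.INTER_AREA)
    gray = cv2.cvtColor(half, cv2.COLOR_BGR2GRAY)
    return gray
```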
Step 4, load the face and eye feature classifiers provided by the OpenCV machine vision library and locate the face region with the pre-trained face feature classifier: obtain the key facial feature points by Haar feature detection, locate the face region from the feature points, obtain the coordinates of the upper-left and lower-right corners of the face region (x1, y1) and (x2, y2), compute the center point of the face region image (Hx, Hy), where Hx = (x1 + x2)/2 and Hy = (y1 + y2)/2, and outline the face region with a rectangular frame from the two corner coordinates;
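A hedged sketch of step 4 using OpenCV's bundled Haar cascades is shown below; the cascade file name and the choice of the largest detection when several faces are returned are illustrative assumptions.

```python
import cv2

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def locate_face(gray):
    """Step 4: locate the face region and return its corners and center."""
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return None
    x, y, w, h = max(faces, key=lambda r: r[2] * r[3])   # keep the largest face
    x1, y1, x2, y2 = x, y, x + w, y + h
    hx, hy = (x1 + x2) / 2.0, (y1 + y2) / 2.0            # Hx = (x1+x2)/2, Hy = (y1+y2)/2
    return (x1, y1, x2, y2), (hx, hy)
```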
Step 5, determine the fatigue state from the face state: detect the position of the driver's face with the Adaboost algorithm and return to image acquisition if no face is detected. Once a face is collected, compute the vertical deviation Diff between the center of the face region (Hx, Hy) and the center of the initially acquired image (Ix, Iy). Within a unit period (10 seconds), count the number of images whose deviation Diff exceeds the set threshold T1 (1/4 of the height of the initially captured image) and the total number of images, and compute the oscillation frequency of the face center, i.e. the nodding frequency, to judge whether the driver is fatigued;
Determining the fatigue state from the face state comprises the following steps:
5.1, after the center of the face region is obtained, compute the difference between the ordinate Hy of the face center and the ordinate Iy of the image center: Diff = |Hy - Iy|;
5.2, within a unit period (10 seconds), count the number of images n whose difference Diff exceeds the set threshold T1 and the total number of image frames N;
5.3, compute the ratio of the image count n to the total frame count N and compare it with the set threshold T; if the ratio exceeds the threshold T, the driver is judged to be fatigued. The threshold T is set to 0.68, obtained experimentally.
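Steps 5.1-5.3 can be sketched as the small decision function below, applied to a buffer of face-center ordinates collected over the 10-second window; the buffering itself and the default values are taken from the description and are otherwise assumptions.

```python
def face_state_fatigued(face_centers_y, image_center_y, image_height,
                        t1=None, t_ratio=0.68):
    """Steps 5.1-5.3: fraction of frames in the window whose vertical
    deviation Diff = |Hy - Iy| exceeds T1, compared with the ratio threshold T."""
    if t1 is None:
        t1 = image_height / 4.0          # T1 = 1/4 of the initial image height
    n_total = len(face_centers_y)
    if n_total == 0:
        return False
    n_exceed = sum(1 for hy in face_centers_y
                   if abs(hy - image_center_y) > t1)
    return (n_exceed / n_total) > t_ratio   # T = 0.68 in the description
```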
Step 6, determine the fatigue state from the eye state: detect the position of the driver's eyes with the Adaboost algorithm and return to image acquisition if no eye is detected. Once the eye is collected, extract features from the driver's eyes, compute the ratio of the maximum of the integral projection of the binarized eye image to the width of the projection region, and compare it with the set threshold T2: if it exceeds T2, the driver's eyes are judged to be closed. Within a unit period (10 seconds), compute the ratio of the number of closed-eye images to the total number of images; if this ratio exceeds the set threshold T, the driver is judged to be fatigued. The value of T2 is obtained experimentally and is 2.0 in the present invention.
Determining the fatigue state from the eye state comprises the following specific steps:
6.1 locate the eyes within the face region with the eye feature classifier;
The eye region is located within the face region image with the pre-trained eye feature classifier; the localization process is identical to that of the face region, only the loaded feature classifier differs. The center point of the eye region image is computed in the same way as the center point of the face region image. It is then checked whether the ratio of the distance from the eye center to the lower boundary of the face rectangle to the height of the rectangle matches the normal distribution of facial features; if the deviation is large, an eyebrow may have been detected, and the detection is repeated.
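Step 6.1 can be sketched with the OpenCV eye cascade; the cascade file, the upper-3/5 region of interest and the numeric plausibility band used to reject eyebrows are illustrative assumptions, not values taken from the patent.

```python
import cv2

eye_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_eye.xml")

def locate_eye(gray, face_box):
    """Step 6.1: search the upper part of the face for an eye and reject
    detections whose vertical position does not fit a normal face layout."""
    x1, y1, x2, y2 = face_box
    face_h = y2 - y1
    roi = gray[y1:y1 + int(0.6 * face_h), x1:x2]       # upper portion of the face
    eyes = eye_cascade.detectMultiScale(roi, 1.1, 5)
    for ex, ey, ew, eh in eyes:
        eye_cy = y1 + ey + eh / 2.0
        ratio = (y2 - eye_cy) / float(face_h)          # distance to lower face edge / face height
        if 0.5 < ratio < 0.8:                          # assumed plausibility band for real eyes
            return (x1 + ex, y1 + ey, ew, eh)
    return None                                        # possibly an eyebrow; re-detect
```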
6.2 apply Gaussian filtering to the eye region image to remove noise, then apply adaptive binarization;
The binarized image differs for different thresholds. The present invention performs the image binarization with an adaptive threshold, which segments the pupil region well: a minimum threshold low and a maximum threshold high are set, the threshold τ used for binarization is incremented in steps of 0.05, the same image is binarized repeatedly to obtain a series of binarized images, and the one with the fewest non-zero connected components is selected as the image to be examined. The adaptive binarization of step 6.2 proceeds as follows:
6.2.1. set the minimum threshold low = 0.1 (0.1 times the minimum gray value of the image), the maximum threshold high = 0.5 (0.5 times the maximum gray value of the image), and the threshold increment step = 0.05;
6.2.2. repeatedly binarize the eye region image via imgs = binary(I, τ);
6.2.3. from the series of binarized images imgs, find the binarized image img1 with the fewest non-zero connected components and its corresponding threshold τ;
6.2.4. apply an opening operation to the binarized image img1 with the fewest non-zero connected components and fill its inner holes;
6.2.5 find the largest connected component in the binarized image img1 and remove the other components to obtain the final binarized image img;
where imgs denotes the binarized images, binary denotes the binarization process, I denotes the eye region image, and τ is the binarization threshold incremented stepwise from the minimum threshold low as in step 6.2.1; imgs = binary(I, τ) is the notation used in the present invention for the image binarization.
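The threshold search of steps 6.2.1-6.2.5 is sketched below with NumPy and OpenCV; interpreting the thresholds as fractions of the gray-value range, treating the dark pupil as foreground, approximating hole filling with a closing, and the kernel size are all assumptions made for illustration.

```python
import cv2
import numpy as np

def adaptive_binarize(eye_gray, low=0.1, high=0.5, step=0.05):
    """Steps 6.2.1-6.2.5: scan thresholds, keep the binarization with the
    fewest non-zero connected components, then clean it up."""
    eye_gray = cv2.GaussianBlur(eye_gray, (5, 5), 0)        # step 6.2 noise removal
    lo, hi = float(eye_gray.min()), float(eye_gray.max())
    best_img, best_count = None, None
    for tau in np.arange(low, high + 1e-9, step):
        thresh = lo + tau * (hi - lo)                       # assumed meaning of the fractional threshold
        img = (eye_gray < thresh).astype(np.uint8) * 255    # dark pupil -> foreground (assumption)
        count, _ = cv2.connectedComponents(img)
        if best_count is None or count - 1 < best_count:    # ignore the background label
            best_img, best_count = img, count - 1
    kernel = np.ones((3, 3), np.uint8)
    opened = cv2.morphologyEx(best_img, cv2.MORPH_OPEN, kernel)    # 6.2.4 opening
    filled = cv2.morphologyEx(opened, cv2.MORPH_CLOSE, kernel)     # rough hole filling
    n, labels, stats, _ = cv2.connectedComponentsWithStats(filled)
    if n > 1:
        largest = 1 + np.argmax(stats[1:, cv2.CC_STAT_AREA])       # 6.2.5 largest component
        filled = np.where(labels == largest, 255, 0).astype(np.uint8)
    return filled
```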
6.3 compute the integral projections of the binarized image img, and compute the ratio rate1 of the maximum of the vertical integral projection to its projection width and the ratio rate2 of the maximum of the horizontal integral projection to its projection width;
where the vertical integral projection is computed with the formula Sv(x) = Σ I(x, y), summed over y from Y1 to Y2;
and the horizontal integral projection is computed with the formula Sh(y) = Σ I(x, y), summed over x from X1 to X2;
As shown in Figs. 4 and 5, find the maximum value Vmax of the vertical integral projection from Sv(x) and the maximum value Hmax of the horizontal integral projection from Sh(y), compute the region widths Vwidth and Hwidth of the vertical and horizontal integral projections respectively, and then compute:
rate1 = Vmax/Vwidth;
rate2 = Hmax/Hwidth;
where Sv(x) is the sum of the pixel values in the column of unit width x of the binarized image img, Sh(y) is the sum of the pixel values in the row of unit height y of the binarized image img, Y1 = 1, Y2 equals the height of the eye region image, X1 = 1, X2 equals the width of the eye region image, I(x, y) is the pixel value at (x, y) in the binarized eye image, (x, y) are the coordinates of the binarized image, and rate1 and rate2 denote the ratio of the vertical integral projection maximum to the vertical projection region width and the ratio of the horizontal integral projection maximum to the horizontal projection region width, respectively;
6.4 combine rate1 and rate2 and compute rate = ω1·rate1 + ω2·rate2 to judge whether the driver is fatigued. The value of rate is compared with the set threshold T2 to determine the open/closed state of the eye: if rate exceeds T2, i.e. the eye closure reaches 80%, the eye is regarded as fully closed at that moment. The size of the threshold T2 is obtained by collecting eye image data of drivers during normal driving: images of different drivers driving normally are acquired for 10 minutes, a series of rate values is computed according to the above steps, and the final value T2 is obtained by averaging.
where rate denotes the combined value of the vertical and horizontal integral projection height-to-width ratios, and ω1 and ω2 are the weight coefficients of rate1 and rate2, set to 0.4 and 0.6 respectively in the present invention; their values are obtained experimentally. Both the vertical and the horizontal projections are considered, first to prevent a one-sided fatigue decision based on the projection in a single direction, and second because the combination better distinguishes whether the eyes are closed or open.
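Steps 6.3-6.4 can be sketched as follows; the projections here are computed on a 0/1 foreground mask rather than on raw 0/255 pixel values, so the stated threshold T2 = 2.0 is only indicative of the scale, and the all-zero safeguard is an added assumption.

```python
import numpy as np

def eye_closure_rate(bin_img, w1=0.4, w2=0.6):
    """Steps 6.3-6.4: compute rate1, rate2 and rate = w1*rate1 + w2*rate2."""
    mask = (bin_img > 0).astype(np.float64)
    sv = mask.sum(axis=0)                        # vertical projection Sv(x): sum over rows
    sh = mask.sum(axis=1)                        # horizontal projection Sh(y): sum over columns

    def max_over_width(proj):
        nonzero = np.flatnonzero(proj)
        if nonzero.size == 0:                    # safeguard: empty projection
            return 0.0
        width = nonzero[-1] - nonzero[0] + 1     # width of the projection region
        return proj.max() / width

    rate1 = max_over_width(sv)
    rate2 = max_over_width(sh)
    return w1 * rate1 + w2 * rate2

def eye_closed(bin_img, t2=2.0):
    """The eye is regarded as closed (>= 80% closure) when rate exceeds T2."""
    return eye_closure_rate(bin_img) > t2
```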
6.5 perform the fatigue judgment with the P80 method of PERCLOS: within the set unit period (10 seconds), compute the percentage of time during which the eyes are closed; if this ratio exceeds the preset threshold T, the current driver is regarded as driving while fatigued;
The P80 method of PERCLOS is further described with reference to Fig. 6:
PERCLOS has three standards in this application, P70, P80 and EM, corresponding to eye closure degrees of 70%, 80% and 50% respectively. Experiments show that the P80 standard works best, so the present invention uses the P80 criterion to judge the degree of fatigue. t1 is the initial moment in the normal open-eye state, i.e. the moment at which the eye opening degree is 80%; t2 is the moment during eye closing at which the opening degree falls to 20%; t3 is the moment during reopening after full closure at which the opening degree reaches 20% again; t4 is the moment at which one blink is completed and the eye is restored to the normal open state;
After t1, t2, t3 and t4 are obtained, the PERCLOS value f is computed as f = (t3 - t2)/(t4 - t1);
f is the percentage of the set time period during which the eyes are closed;
Counting, within the unit period (10 seconds), the number of image frames with eyes closed and the total number of image frames, the value f of PERCLOS equals the number of closed-eye frames divided by the total number of frames;
In the present invention, if the value of f is greater than T, the driver is judged to be fatigued.
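A minimal sketch of the frame-based PERCLOS statistic in step 6.5 is shown below; it operates on a buffer of per-frame closed/open flags covering the 10-second window, and the default threshold borrows the T = 0.68 stated earlier for the face-state ratio, which is an assumption since the PERCLOS comparison only names the symbol T.

```python
def perclos_fatigued(closed_flags, threshold_t=0.68):
    """Step 6.5: f = closed-eye frames / total frames over the window;
    the driver is judged fatigued when f exceeds the threshold T."""
    if not closed_flags:
        return False
    f = sum(closed_flags) / float(len(closed_flags))
    return f > threshold_t
```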
Step 7, according to the combined fatigue judgment of steps 5 and 6, return to image acquisition if no fatigue is determined, or start the alarm device if fatigue is determined.
The corresponding alarm mode is started from the combined judgment of step 5 and step 6 (a small decision sketch follows this list):
when step 5 judges the driver to be fatigued, the LED lamp alarm is started;
when step 6 judges the driver to be fatigued, the horn alarm is started;
when both step 5 and step 6 judge the driver to be fatigued, the LED lamp alarm and the horn alarm are started simultaneously.
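The alarm selection of step 7 can be sketched as a small decision function; the callback functions driving the LED and horn are placeholders, since the patent only specifies an electrical trigger signal on the I/O interface.

```python
def trigger_alarms(face_fatigued, eye_fatigued, led_on, horn_on):
    """Step 7: choose the alarm mode from the two judgments.
    led_on / horn_on are placeholder callbacks driving the I/O interface."""
    if face_fatigued:
        led_on()                 # face-state fatigue -> LED lamp alarm
    if eye_fatigued:
        horn_on()                # eye-state fatigue -> horn alarm
    return face_fatigued or eye_fatigued   # False -> return to image acquisition
```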
Figs. 7 and 8 show an eye image and the corresponding adaptively binarized image. The binarization method finds, from a series of thresholds, the optimal threshold that minimizes the number of non-zero connected components of the binarized eye image; inner holes are then filled and the largest connected component, which is exactly the region to be segmented, is retained. The pupil region is segmented well and is not affected by the eyelashes, so the resulting binarized image serves the subsequent integral projection computation well.
The fatigue driving detection device of the invention has a simple design, is compact, and is easy to install in an automobile. In use, the image acquisition device is preferably mounted on the windshield, above and to the left of the driver's position, or on the roof at the front right; it may also be placed in the center of the instrument panel, where it can capture the frontal image of the driver through the steering wheel. The image processing and alarm devices can be fixed in front of the windshield and powered by a vehicle-mounted USB charger.

Claims (8)

1. A fatigue driving detection device, comprising an SD card for storing the embedded system, a USB camera for acquiring frontal images of the driver, a data cable for powering the device, an alarm device for prompting the driver, and a cooling fan for dissipating heat from the processor; characterized in that it further comprises an ARM processor unit for image processing and fatigue determination and an alarm device supporting different alarm modes; the alarm device includes a horn alarm and an LED lamp alarm that sound different alarms according to the different results of the combined judgment of the driver's fatigue state;
the ARM processor unit includes a face localization module, an eye state recognition module and a fatigue determination module; after receiving the image, the face localization module locates the face with the Adaboost algorithm and identifies the face state from the coordinate difference; the eye state recognition module takes the face region image from the face localization module, locates the eye region with the Adaboost algorithm, and identifies the eye state from the integral projections of the adaptively binarized eye image; the fatigue determination module makes the fatigue judgment from the face state and the eye state, converts the judgment result into an electrical signal, and sends it to the alarm device through the I/O interface;
the face localization module, eye state recognition module and fatigue determination module are specified as follows:
face localization module: locates the face region with a trained face feature classifier, computes the vertical deviation between the face center and the image center, and judges whether the driver is fatigued from the deviation magnitude and the oscillation frequency of the face center within a set time window;
eye state recognition module: after the face region is located, takes the upper 3/5 of the face as the region of interest according to the distribution of facial features and locates the eye region with a trained eye classifier; extracts features from the driver's eyes, computes the ratios of the maxima of the vertical and horizontal integral projections of the binarized eye image to the widths of their projection regions, combines the two ratios to determine the current eye state, i.e. the degree of eye closure, and decides from the set criterion whether the driver's current mental state is fatigued;
the eye state recognition module judges whether the driver is in a fatigue state through the following steps:
4.1 detect the position of the driver's eyes with the Adaboost algorithm and locate the eyes within the face region with the eye feature classifier;
4.2 apply Gaussian filtering to the eye region image to remove noise, then apply adaptive binarization;
4.3 compute the integral projections of the binarized eye image img, and compute the ratio rate1 of the maximum of the vertical integral projection to its projection width and the ratio rate2 of the maximum of the horizontal integral projection to its projection width;
where the vertical integral projection is computed as Sv(x) = Σ I(x, y), summed over y from Y1 to Y2, and the horizontal integral projection is computed as Sh(y) = Σ I(x, y), summed over x from X1 to X2;
find the maxima Vmax and Hmax of the vertical and horizontal integral projections, compute the region widths Vwidth and Hwidth of the vertical and horizontal integral projections, and then compute:
rate1 = Vmax/Vwidth;
rate2 = Hmax/Hwidth;
where Sv(x) is the sum of the pixel values in the column of unit width x of the binarized image img, Sh(y) is the sum of the pixel values in the row of unit height y of the binarized image img, Y1 = 1, Y2 equals the height of the eye region image, X1 = 1, X2 equals the width of the eye region image, I(x, y) is the pixel value at (x, y) in the binarized eye image, (x, y) are the coordinates of the binarized image, and rate1 and rate2 denote the ratios of the integral projection maxima to the corresponding projection widths;
4.4 combine rate1 and rate2 and compute rate = ω1·rate1 + ω2·rate2 to judge whether the driver is fatigued; compare the value of rate with the set threshold T2 to determine the open/closed state of the eye, i.e. if rate exceeds T2, meaning the eye closure reaches 80%, the eye is regarded as fully closed; where rate denotes the combined value of the vertical and horizontal integral projection height-to-width ratios, and ω1 and ω2 are the weight coefficients of rate1 and rate2;
4.5 perform the fatigue judgment with the P80 method of PERCLOS: within the set unit period, compute the percentage of time during which the eyes are closed; if the ratio exceeds the preset threshold T, the current driver is regarded as driving while fatigued; compute the PERCLOS value f: within the unit time, count the number of frames with eyes closed and the total number of frames; the PERCLOS value f equals the ratio of closed-eye frames to total frames; f is the percentage of the set time period during which the eyes are closed; if f exceeds the set threshold T, the driver is judged to be fatigued;
fatigue determination module: makes the final judgment of the driver's mental state from the face state and eye state results.
2. The fatigue driving detection device according to claim 1, characterized in that the ARM processor unit comprises a power conversion circuit, a USB interface circuit, a crystal oscillator circuit, an enable signal circuit and an SD card reading circuit;
in the power conversion circuit, the USB power supply circuit first converts the supply interface: a 5V voltage is obtained through an NXP transistor, a resettable fuse is connected in series, and the output of the USB power supply circuit feeds a voltage regulator circuit whose regulator chip produces three different stable voltages; the ARM chip performs the image processing and fatigue determination; the USB interface circuit is connected to a LAN9512 chip for data transmission, receives the video stream from the USB camera, and passes the information to the ARM processor through a video decoder; the crystal oscillator circuit generates the clock signal and provides the timing for the processor; the enable signal circuit produces two signals from a 5V input voltage and a field-effect transistor, supplying the SDRAM enable signal and the RUN signal required by the ARM chip; the SDRAM provides the running space for the system and stores the compressed image information of the most recent 3-5 minutes; an LED status indicator displays the device state; the SD card reading circuit loads the Linux system from the SD card into the device, giving the device its running environment and enabling data read-in and storage; HDMI is a reserved interface for screen display; the I/O interface provides the trigger signal for the alarm device;
the USB camera converts the video stream into digital information through the conversion circuit; the ARM chip, as the main processing unit, starts the face localization module to perform face localization and state analysis, and the processed image information is temporarily stored in the SDRAM; the eye state recognition module obtains the image information processed by the face localization module from the SDRAM and performs eye state recognition; the fatigue determination module then makes the fatigue judgment from the state results of the face localization module and the eye state recognition module, and finally converts the judgment into an electrical signal sent from the I/O interface to the alarm device.
3. The fatigue driving detection device according to claim 1, characterized in that the face localization module judges whether the driver is in a fatigue state through the following steps:
3.1 detect the position of the driver's face with the Adaboost algorithm and obtain the vertical difference Diff between the center of the face region (Hx, Hy) and the center of the initially acquired image (Ix, Iy);
3.2 after the center of the face region is obtained, compute the difference between the ordinate Hy of the face center and the ordinate Iy of the image center: Diff = |Hy - Iy|;
3.3 within a unit period, count the number of images n whose difference Diff exceeds the set threshold T1 and the total number of image frames N;
3.4 compute the ratio of the image count n to the total frame count N and compare it with the set threshold T3; if the ratio exceeds T3, the driver is judged to be fatigued.
4. The fatigue driving detection device according to claim 1, characterized in that the fatigue determination module makes the final judgment of the driver's mental state from the following combined result: three alarm modes are triggered from the combined judgment of the face localization module and the eye state recognition module: when the face localization module judges the driver to be fatigued, the LED lamp alarm is started; when the eye state recognition module judges the driver to be fatigued, the horn alarm is started; when both the face localization module and the eye state recognition module judge the driver to be fatigued, the LED lamp alarm and the horn alarm are started simultaneously.
5. A fatigue driving detection method implemented with the fatigue driving detection device according to claim 1, characterized by comprising the following steps:
step 1, initialize the camera and set the attributes of the captured images;
step 2, the USB camera acquires an image and conveys the image information to the ARM processor unit;
step 3, preprocess the image, i.e. downscale it and convert it to grayscale;
step 4, load the face and eye feature classifiers provided by the OpenCV machine vision library and locate the face region with the pre-trained face feature classifier: obtain the key facial feature points by Haar feature detection, locate the face region from the feature points, obtain the coordinates of the upper-left and lower-right corners of the face region (x1, y1) and (x2, y2), compute the center point of the face region image (Hx, Hy), where Hx = (x1 + x2)/2 and Hy = (y1 + y2)/2, and outline the face region with a rectangular frame from the two corner coordinates;
step 5, determine the fatigue state from the face state: detect the position of the driver's face with the Adaboost algorithm, compute the vertical deviation between the face center and the image center, and judge whether the driver is fatigued by comparing the deviation with the set threshold T1 and counting the oscillation frequency of the face center, i.e. the nodding frequency, within a fixed time window;
step 6, determine the fatigue state from the eye state: detect the position of the driver's eyes with the Adaboost algorithm, extract features from the driver's eyes, compute the ratio of the maximum of the integral projection of the eye region to the width of the projection region, and compare it with the set threshold T2 to judge whether the driver is fatigued; the step specifically comprises:
6.1 detect the position of the driver's eyes with the Adaboost algorithm and locate the eyes within the face region with the eye feature classifier;
6.2 apply Gaussian filtering to the eye region image to remove noise, then apply adaptive binarization;
6.3 compute the integral projections of the binarized image img, and compute the ratio rate1 of the maximum of the vertical integral projection to its projection width and the ratio rate2 of the maximum of the horizontal integral projection to its projection width;
where the vertical integral projection is computed as Sv(x) = Σ I(x, y), summed over y from Y1 to Y2, and the horizontal integral projection is computed as Sh(y) = Σ I(x, y), summed over x from X1 to X2; find the maxima Vmax and Hmax of the vertical and horizontal integral projections, compute the region widths Vwidth and Hwidth, and then compute:
rate1 = Vmax/Vwidth;
rate2 = Hmax/Hwidth;
where Sv(x) is the sum of the pixel values in the column of unit width x of the binarized image img, Sh(y) is the sum of the pixel values in the row of unit height y of the binarized image img, Y1 = 1, Y2 equals the height of the eye region image, X1 = 1, X2 equals the width of the eye region image, I(x, y) is the pixel value at (x, y) in the binarized eye image, (x, y) are the coordinates of the binarized image, and rate1 and rate2 denote the ratio of the vertical integral projection maximum to the vertical projection region width and the ratio of the horizontal integral projection maximum to the horizontal projection region width, respectively;
6.4 combine rate1 and rate2 and compute rate = ω1·rate1 + ω2·rate2 to judge whether the driver is fatigued; compare the value of rate with the set threshold T2 to determine the open/closed state of the eye, i.e. if rate exceeds T2, meaning the eye closure reaches 80%, the eye is regarded as fully closed; where rate denotes the combined value of the vertical and horizontal integral projection height-to-width ratios, and ω1 and ω2 are the weight coefficients of rate1 and rate2;
6.5 perform the fatigue judgment with the P80 method of PERCLOS: within the set unit period, compute the percentage of time during which the eyes are closed; if the ratio exceeds the preset threshold T, the current driver is regarded as driving while fatigued; compute the PERCLOS value f: within the unit time, count the number of frames with eyes closed and the total number of frames per unit time; the PERCLOS value f equals the ratio of closed-eye frames to total frames; f is the percentage of the set time period during which the eyes are closed; if f exceeds the set threshold T, the driver is judged to be fatigued;
step 7, according to the combined fatigue judgment of steps 5 and 6, return to image acquisition or start one of the different alarm modes.
6. The fatigue driving detection method according to claim 5, characterized in that determining the fatigue state from the face state in step 5 specifically comprises the following steps:
5.1 detect the position of the driver's face with the Adaboost algorithm and obtain the vertical difference Diff between the center of the face region (Hx, Hy) and the center of the initially acquired image (Ix, Iy);
5.2 after the center of the face region is obtained, compute the difference between the ordinate Hy of the face center and the ordinate Iy of the image center: Diff = |Hy - Iy|;
5.3 within a unit period, count the number of images n whose difference Diff exceeds the set threshold T1 and the total number of image frames N;
5.4 compute the ratio of the image count n to the total frame count N and compare it with the set threshold T3; if the ratio exceeds T3, the driver is judged to be fatigued.
7. The fatigue driving detection method according to claim 6, characterized in that the binarization of step 6.2 proceeds specifically as follows:
6.2.1 set the minimum threshold low = 0.1, the maximum threshold high = 0.5, and the threshold increment step = 0.05;
6.2.2 repeatedly binarize the eye region image via imgs = binary(I, τ);
6.2.3 from the series of binarized images imgs, find the binarized image img1 with the fewest non-zero connected components and its corresponding threshold τ;
6.2.4 apply an opening operation to the binarized image img1 with the fewest non-zero connected components and fill its inner holes;
6.2.5 find the largest connected component in the binarized image img1 and remove the other components to obtain the final binarized image img;
where imgs denotes the binarized images, binary denotes the binarization process, I denotes the eye region image, and τ is the binarization threshold incremented stepwise from the minimum threshold low as in step 6.2.1; imgs = binary(I, τ) is the notation used in the present invention for the image binarization.
8. The fatigue detection method according to claim 5, in which the judgment is made by combining steps 5 and 6, characterized in that the different alarm modes of step 7 are: when step 5 judges the driver to be fatigued, the LED lamp alarm is started; when step 6 judges the driver to be fatigued, the horn alarm is started; when both step 5 and step 6 judge the driver to be fatigued, the LED lamp alarm and the horn alarm are started simultaneously.
CN201611264042.1A 2016-12-30 2016-12-30 A kind of fatigue driving detection device and detection method Expired - Fee Related CN106530623B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201611264042.1A CN106530623B (en) 2016-12-30 2016-12-30 A kind of fatigue driving detection device and detection method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201611264042.1A CN106530623B (en) 2016-12-30 2016-12-30 A kind of fatigue driving detection device and detection method

Publications (2)

Publication Number Publication Date
CN106530623A CN106530623A (en) 2017-03-22
CN106530623B (en) 2019-06-07

Family

ID=58336475

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201611264042.1A Expired - Fee Related CN106530623B (en) 2016-12-30 2016-12-30 A kind of fatigue driving detection device and detection method

Country Status (1)

Country Link
CN (1) CN106530623B (en)

Families Citing this family (34)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107085715A (en) * 2017-05-19 2017-08-22 武汉理工大学 System and method for TV intelligent detection of user's sleep state
CN107392153B (en) * 2017-07-24 2020-09-29 中国科学院苏州生物医学工程技术研究所 Human fatigue determination method
CN107730834A (en) * 2017-08-07 2018-02-23 西北工业大学 A kind of antifatigue Intelligent worn device being combined based on attitude detection with image recognition
CN107562199B (en) * 2017-08-31 2020-10-09 北京金山安全软件有限公司 Page object setting method and device, electronic equipment and storage medium
CN107571735A (en) * 2017-10-13 2018-01-12 苏州小黄人汽车科技有限公司 A kind of vehicle drivers status monitoring system and monitoring method
CN108022256A (en) * 2017-12-25 2018-05-11 虹德科技(杭州)有限公司 A kind of video intelligent analyzer
CN108021911B (en) * 2018-01-04 2018-09-11 重庆公共运输职业学院 A kind of driver tired driving monitoring method
CN108446644A (en) * 2018-03-26 2018-08-24 刘福珍 A kind of virtual display system for New-energy electric vehicle
CN108805049A (en) * 2018-05-25 2018-11-13 郑州目盼智能科技有限公司 A kind of embedded human face detection terminal
CN109063686A (en) * 2018-08-29 2018-12-21 安徽华元智控科技有限公司 A kind of fatigue of automobile driver detection method and system
CN109340607A (en) * 2018-10-23 2019-02-15 电子科技大学 A multifunctional desk lamp for preventing visual fatigue
CN109513112B (en) * 2018-12-03 2023-01-20 四川瑞精特科技有限公司 Full-automatic test system and test method for defibrillation pacemaker
CN111291590B (en) * 2018-12-06 2021-03-19 广州汽车集团股份有限公司 Driver fatigue detection method, driver fatigue detection device, computer equipment and storage medium
CN109977930B (en) * 2019-04-29 2021-04-02 中国电子信息产业集团有限公司第六研究所 Fatigue driving detection method and device
CN110194174B (en) * 2019-05-24 2021-02-12 江西理工大学 Fatigue driving monitoring system
US12061971B2 (en) 2019-08-12 2024-08-13 Micron Technology, Inc. Predictive maintenance of automotive engines
US12249189B2 (en) 2019-08-12 2025-03-11 Micron Technology, Inc. Predictive maintenance of automotive lighting
US10993647B2 (en) * 2019-08-21 2021-05-04 Micron Technology, Inc. Drowsiness detection for vehicle control
US12210401B2 (en) 2019-09-05 2025-01-28 Micron Technology, Inc. Temperature based optimization of data storage operations
CN110879973A (en) * 2019-10-31 2020-03-13 安徽普华灵动机器人科技有限公司 Driver fatigue state facial feature recognition and detection method
CN111062292B (en) * 2019-12-10 2022-07-29 哈尔滨工程大学 Fatigue driving detection device and method
CN116740686A (en) * 2019-12-31 2023-09-12 广东科学技术职业学院 A fatigue driving detection method applied to unmanned driving equipment and unmanned driving equipment
CN111274963A (en) * 2020-01-20 2020-06-12 西南科技大学 Fatigue driving warning system based on image processing
CN112381871B (en) * 2020-10-16 2024-10-15 华东交通大学 Implementation method of locomotive vigilance device based on face recognition
CN112418002B (en) * 2020-11-05 2023-10-24 中国航空工业集团公司西安飞行自动控制研究所 Method for identifying own airport by unmanned aerial vehicle
CN112668393A (en) * 2020-11-30 2021-04-16 海纳致远数字科技(上海)有限公司 Fatigue degree detection device and method based on face recognition and key point detection
CN112528906B (en) * 2020-12-18 2022-10-14 武汉理工大学 Driver state detection equipment
CN113066264A (en) * 2021-02-22 2021-07-02 广州铁路职业技术学院(广州铁路机械学校) A kind of fatigue state identification method and table lamp
CN112966664A (en) * 2021-04-01 2021-06-15 科世达(上海)机电有限公司 Fatigue driving detection method, system and device and readable storage medium
CN113468956A (en) * 2021-05-24 2021-10-01 北京迈格威科技有限公司 Attention judging method, model training method and corresponding device
CN113989887A (en) * 2021-10-22 2022-01-28 南京理工大学 Fatigue state detection method for equipment operators based on visual feature information fusion
CN114140865A (en) * 2022-01-29 2022-03-04 深圳市中讯网联科技有限公司 Intelligent early warning method and device, storage medium and electronic equipment
CN114572330B (en) * 2022-02-22 2024-03-08 深圳飞亮智能科技有限公司 Electric vehicle intelligent vehicle-mounted terminal based on millimeter wave radar
CN116805405B (en) * 2023-08-25 2023-10-27 南通大学 Intelligent protection method and system for milling machine equipment

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102436715A (en) * 2011-11-25 2012-05-02 大连海创高科信息技术有限公司 Fatigue driving detection method
CN204576754U (en) * 2015-03-27 2015-08-19 孙建利 A kind of vehicle-mounted car steering giving fatigue pre-warning equipment
CN106205052A (en) * 2016-07-21 2016-12-07 上海仰笑信息科技有限公司 A kind of driving recording method for early warning

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20140025812A (en) * 2012-08-22 2014-03-05 삼성전기주식회사 Apparatus and method for sensing drowsy driving

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102436715A (en) * 2011-11-25 2012-05-02 大连海创高科信息技术有限公司 Fatigue driving detection method
CN204576754U (en) * 2015-03-27 2015-08-19 孙建利 A kind of vehicle-mounted car steering giving fatigue pre-warning equipment
CN106205052A (en) * 2016-07-21 2016-12-07 上海仰笑信息科技有限公司 A kind of driving recording method for early warning

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Design and Implementation of a Kinect-Based Fatigue Driving Detection System; Chen Ping; China Master's Theses Full-text Database, Engineering Science and Technology II; 2016-02-15 (No. 2); C035-125

Also Published As

Publication number Publication date
CN106530623A (en) 2017-03-22

Similar Documents

Publication Publication Date Title
CN106530623B (en) A kind of fatigue driving detection device and detection method
CN101593425B (en) Machine vision based fatigue driving monitoring method and system
Mbouna et al. Visual analysis of eye state and head pose for driver alertness monitoring
CN106846734B (en) A kind of fatigue driving detection device and method
CN104809445B (en) method for detecting fatigue driving based on eye and mouth state
CN106687037B (en) Device, method and computer program for detecting transient sleep
CN109389806B (en) Method, system and medium for fatigue driving detection and early warning based on multi-information fusion
Tayab Khan et al. Smart Real‐Time Video Surveillance Platform for Drowsiness Detection Based on Eyelid Closure
Zhang et al. A new real-time eye tracking based on nonlinear unscented Kalman filter for monitoring driver fatigue
CN104361332B (en) A kind of face eye areas localization method for fatigue driving detection
CN112183502B (en) Method for determining driving state of driver, computer storage medium, and electronic device
Anjali et al. Real-time nonintrusive monitoring and detection of eye blinking in view of accident prevention due to drowsiness
Luo et al. The driver fatigue monitoring system based on face recognition technology
Huda et al. Mobile-based driver sleepiness detection using facial landmarks and analysis of EAR values
Vinoth et al. A drowsiness detection using smart sensors during driving and smart message alert system to avoid accidents
Ranjan et al. Driver drowsiness detection system using computer vision
Cheng et al. A fatigue detection system with eyeglasses removal
Liu et al. A practical driver fatigue detection algorithm based on eye state
Ashwini et al. Deep Learning Based Drowsiness Detection With Alert System Using Raspberry Pi Pico
Alioua et al. Driver’s fatigue and drowsiness detection to reduce traffic accidents on road
Zeng et al. A driving assistant safety method based on human eye fatigue detection
CN204706141U (en) Wearable device
CN114419606A (en) Driver state detection method, electronic device, and storage medium
Shewale et al. Real time driver drowsiness detection system
Doppala et al. A machine intelligence model to detect drowsiness for preventing road accidents

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20190607

Termination date: 20201230