
CN101872171B - Driver fatigue state recognition method and system based on information fusion - Google Patents


Info

Publication number
CN101872171B
Authority
CN
China
Prior art keywords
fatigue
driver
steering wheel
eye
fusion
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN2009100825662A
Other languages
Chinese (zh)
Other versions
CN101872171A (en)
Inventor
宋正河
朱忠祥
谢斌
毛恩荣
张俊
成波
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
China Agricultural University
Original Assignee
China Agricultural University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by China Agricultural University
Priority to CN2009100825662A
Publication of CN101872171A
Application granted
Publication of CN101872171B

Landscapes

  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

The invention discloses a driver fatigue state recognition method and system based on information fusion. The method includes: collecting the driver's eye-state information, steering-wheel manipulation information, and vehicle-trajectory information; processing these three information sources to obtain the driver's fatigue performance information; and performing weighted fusion on the fatigue performance information to obtain the fatigue membership degrees corresponding to the driver's fatigue levels, from which the driver's fatigue level is determined. The invention improves the reliability and accuracy of driver fatigue monitoring, can reduce the number of traffic accidents caused by fatigued driving, and helps ensure road traffic safety. Moreover, acquiring the multi-source information does not interfere with the driver's driving, so the method is highly practical.

Description

Driver Fatigue State Recognition Method and System Based on Information Fusion

Technical Field

The invention relates to traffic engineering, and in particular to a driver fatigue state recognition method and system based on information fusion.

Background

With economic development and the sharp increase in the number of vehicles, the incidence of road traffic accidents has risen steadily, and fatigued driving is an important contributing factor. Scientific, effective monitoring of a driver's fatigue state is therefore of great significance for ensuring road traffic safety.

Existing driver-fatigue monitoring relies mainly on single-source information, for example judging whether the driver is fatigued by monitoring physiological signals. EEG and ECG measurements have long been regarded as the "gold standard" for monitoring driver fatigue. Brain waves in different frequency bands change correspondingly at different stages of fatigue: in early fatigue the slow delta and alpha waves change, in intermediate fatigue the beta wave increases, and in extreme fatigue the delta, theta, and alpha waves increase substantially. In ECG indicators, both the high-frequency (HF) and low-frequency (LF) components of heart rate variability change with fatigue.

However, such single-source monitoring has two main technical drawbacks. First, a single information source offers limited information and low recognition accuracy, stability, and reliability: if that source is inaccurate, the recognition method fails and the driver's fatigue cannot be monitored, leaving a serious safety hazard. Second, the measuring equipment (such as EEG or ECG devices) significantly interferes with the driver's driving, making it unsuitable for monitoring fatigue in real driving and poor in practicability.

Summary of the Invention

The purpose of the present invention is to provide a driver fatigue state recognition method and system based on information fusion that effectively overcomes the defects of the prior art: unsuitability for practical application, reliance on a single information source, and low recognition accuracy and reliability.

The invention provides a driver fatigue state recognition method based on information fusion, comprising:

Step 1: collect the driver's eye-state information, steering-wheel manipulation information, and vehicle-trajectory information.

Step 2: process the driver's eye-state information, steering-wheel manipulation information, and vehicle-trajectory information to obtain the driver's fatigue performance information, which includes: the proportion of time per unit interval during which eye closure exceeds 80%, the average eye-opening degree, the longest eye-closure duration, the proportion of time the steering wheel is held still, the angle change of sudden steering-wheel turns, the standard deviation of lateral displacement during serpentine driving, and the proportion of time the lateral displacement stays within the safe-distance range.

Step 3: perform weighted fusion on the fatigue performance information to obtain the fatigue membership degrees corresponding to the fatigue levels, and determine the driver's fatigue level from those membership degrees.

The invention also provides a driver fatigue state recognition system based on information fusion, comprising:

an acquisition module for collecting the driver's eye-state information, steering-wheel manipulation information, and vehicle-trajectory information;

a processing module for processing the driver's eye-state information, steering-wheel manipulation information, and vehicle-trajectory information to obtain the driver's fatigue performance information, which includes: the proportion of time per unit interval during which eye closure exceeds 80%, the average eye-opening degree, the longest eye-closure duration, the proportion of time the steering wheel is held still, the angle change of sudden steering-wheel turns, the standard deviation of lateral displacement during serpentine driving, and the proportion of time the lateral displacement stays within the safe-distance range; and

a fusion module for performing weighted fusion on the fatigue performance information to obtain the fatigue membership degrees corresponding to the fatigue levels, and determining the driver's fatigue level from those membership degrees.

The invention thus provides a driver fatigue state recognition method and system based on information fusion. First, the driver's eye-state information, steering-wheel manipulation information, and vehicle-trajectory information are collected; these are then processed to obtain the driver's fatigue performance information; finally, weighted fusion of the fatigue performance information yields the fatigue membership degrees corresponding to the fatigue levels, from which the driver's fatigue level is determined. Because the invention evaluates driving fatigue by combining multiple information sources that each reflect the driver's fatigue state, it improves the reliability and accuracy of fatigue recognition, can effectively reduce the number of traffic accidents caused by fatigued driving, and helps ensure road traffic safety. Compared with the prior art, the method and system do not interfere with the driver's driving, are practical and easy to implement, and have broad application prospects.

Brief Description of the Drawings

Figure 1 is a flowchart of the first embodiment of the driver fatigue state recognition method based on information fusion;

Figure 2 is a flowchart of the second embodiment of the driver fatigue state recognition method based on information fusion;

Figure 3 is a schematic structural diagram of the first embodiment of the driver fatigue state recognition system based on information fusion;

Figure 4 is a schematic structural diagram of the second embodiment of the driver fatigue state recognition system based on information fusion.

Detailed Description

In the course of making the invention, the inventors found that a driver's fatigue is reflected in many ways: physiological state, manipulation behavior, and vehicle state are all correlated with the driver's fatigue. For example, eyelid movement, head movement, gaze direction, and facial expression all change noticeably when the driver becomes fatigued; on the vehicle side, time- and frequency-domain indicators of the steering-wheel angle, steering-wheel grip force, angular velocity, vehicle speed, pedal force, and the lateral offset of the vehicle trajectory are all correlated with the driver's fatigue state to some degree. Based on this, the invention captures the key external symptoms of fatigue by fusing three kinds of information — eye state, steering-wheel manipulation, and vehicle trajectory — and applies the fused result to driver fatigue monitoring. When the reliability of one parameter's data is low, that parameter can be discarded and the remaining, more reliable information used instead, compensating for the shortcomings of single-source monitoring and improving the accuracy and stability of driver fatigue monitoring.

The technical solution of the present invention is described in further detail below with reference to the accompanying drawings and embodiments.

Figure 1 is a flowchart of the first embodiment of the driver fatigue state recognition method based on information fusion. As shown in Figure 1, this embodiment comprises:

Step 101: collect the driver's eye-state information, steering-wheel manipulation information, and vehicle-trajectory information.

The driver's eye-state information can be collected in this step by any of several existing image-acquisition methods, such as still photography or video. Still photography captures one image of the driver's eyes at a set interval (e.g., every minute); video continuously records the driver's eyes and then extracts still frames at a set interval (e.g., every minute). The steering-wheel manipulation information can be collected with various existing sensors, for example an encoder or angle sensor mounted on the steering wheel. The vehicle-trajectory information can be collected with various existing measurement methods, for example a position-measuring device that measures the vehicle's lateral offset from the road centerline. In this embodiment, the driver's eye parameters are obtained by statistically processing eye images captured by a camera; the steering-wheel parameters are obtained by statistically processing instantaneous steering-wheel angles from a steering-wheel encoder; and the vehicle-trajectory parameters are obtained by statistically processing the trajectory's lateral offset from the road centerline, measured by image processing.

Step 102: process the driver's eye-state information, steering-wheel manipulation information, and vehicle-trajectory information to obtain the driver's fatigue performance information.

The collected eye-state, steering-wheel, and vehicle-trajectory information is statistically processed to obtain the driver's fatigue performance information, which includes: the proportion of time per unit interval during which eye closure exceeds 80%, the average eye-opening degree, the longest eye-closure duration, the proportion of time the steering wheel is held still, the angle change of sudden steering-wheel turns, the standard deviation of lateral displacement during serpentine driving, and the proportion of time the lateral displacement stays within the safe-distance range.
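The eye-state indicators above can be computed from a per-frame eye-closure sequence. The sketch below is illustrative and not taken from the patent: the 80% closure threshold follows the text, while the function name, the frame interval `dt`, and the input format are assumptions.

```python
import numpy as np

# Illustrative sketch (not from the patent): the three eye-state indicators
# from a per-frame eye-closure sequence. The 80% threshold follows the text;
# `dt` (seconds per frame) and the input format are assumptions.
def eye_metrics(closure, dt=1.0):
    closure = np.asarray(closure, dtype=float)   # per-frame closure in [0, 1]
    closed = closure > 0.8                       # frame counts as "eyes closed"
    perclos80 = closed.mean()                    # PERCLOS80: fraction of time >80% closed
    longest = run = 0                            # longest consecutive closed run
    for c in closed:
        run = run + 1 if c else 0
        longest = max(longest, run)
    max_closure_time = longest * dt              # longest eye-closure duration (s)
    mean_opening = 1.0 - closure.mean()          # average eye-opening degree
    return perclos80, max_closure_time, mean_opening
```

For instance, the sequence [0.9, 0.9, 0.1, 0.9, 0.1] at one sample per second gives PERCLOS80 = 0.6 and a longest closure of 2 s.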

Step 103: perform weighted fusion on the fatigue performance information to obtain the fatigue membership degrees corresponding to the fatigue levels, and determine the driver's fatigue level from those membership degrees.

By combining multiple information sources that each reflect the driver's fatigue state, this embodiment improves the reliability and accuracy of fatigue monitoring and can reduce the number of traffic accidents caused by fatigued driving, helping to ensure road traffic safety. Moreover, acquiring the multi-source information does not interfere with the driver's driving, so the method is highly practical.

Figure 2 is a flowchart of the second embodiment of the driver fatigue state recognition method based on information fusion. As shown in Figure 2, this embodiment comprises:

Step 201: obtain eye-state, steering-wheel, and vehicle-trajectory information reflecting the driver's fatigue state, and process it into fatigue performance information.

In this embodiment, expert video scoring is used first: the recording of the driver's face over the whole drive is cut into one-minute segments, and the scores given by several experts for each segment are statistically processed into a fatigue score for the whole drive. Fatigue is divided here into four levels: awake, mild fatigue, moderate fatigue, and severe fatigue. With the expert scores as the standard, the required types of information reflecting the driver's fatigue state are determined and acquired: eye-state information, steering-wheel manipulation information, and vehicle-trajectory information. On this basis, statistical methods such as correlation analysis and analysis of variance are applied to the sensor data to extract the fatigue performance information most strongly correlated with the experts' fatigue evaluation. For example, the eye parameters are obtained by statistically processing the per-frame eye-closure degree extracted from the captured eye images; per unit time they include PERCLOS80 (the proportion of time the eyes are more than 80% closed, the indicator most strongly correlated with fatigue), the longest eye-closure duration (a frame more than 80% closed counts as closed), and the average eye-opening degree (expressible as a proportion). The steering-wheel parameters are obtained by statistically processing the instantaneous steering-wheel angle from the encoder and include the proportion of time per unit interval the wheel is held still and the angle change of sudden turns. The vehicle-trajectory parameters are obtained by statistically processing the lateral offset of the trajectory from the road centerline and include the standard deviation of lateral displacement during serpentine driving and the proportion of time the lateral displacement stays within the safe-distance range, the safe range being a lateral displacement of less than 100 mm/s.
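The steering-wheel and trajectory indicators can be sketched in the same spirit. This is illustrative, not from the patent: the `still_thresh` and `sudden_thresh` values are hypothetical, since the text does not give numeric thresholds for "held still" or "sudden turn"; only the 100 mm safe bound echoes the text.

```python
import numpy as np

# Illustrative sketch (not from the patent). `still_thresh` and
# `sudden_thresh` are hypothetical thresholds; the patent gives no values.
def steering_metrics(angle, still_thresh=0.5, sudden_thresh=10.0):
    d = np.abs(np.diff(np.asarray(angle, dtype=float)))   # per-sample angle change (deg)
    still_ratio = (d < still_thresh).mean()   # fraction of time the wheel is held still
    sudden = d[d > sudden_thresh]             # sudden turns
    sudden_change = float(sudden.max()) if sudden.size else 0.0
    return still_ratio, sudden_change

def trajectory_metrics(lateral, safe=100.0):
    # lateral offsets (mm) from the lane centerline; `safe` mirrors the
    # 100 mm bound mentioned in the text
    lateral = np.asarray(lateral, dtype=float)
    return float(lateral.std()), float((np.abs(lateral) < safe).mean())
```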

Step 202: input the acquired fatigue performance information into artificial neural networks to obtain primary fatigue membership degrees for the different measurement periods in each parameter space.

The eye-state, steering-wheel, and vehicle-trajectory parameters in the fatigue performance information are each fed into a trained neural network model to obtain the primary evaluation result, i.e., the primary membership degree for each fatigue level. This embodiment can use a three-layer BP feed-forward network, whose model is given by formula (1):

y_k = f_2( ω^(2) · f_1( ω^(1) · x_k + b^(1) ) + b^(2) )        (1)

where f_1(·) and f_2(·) are the transfer functions of the hidden-layer and output-layer neurons; ω^(1) and ω^(2) are the weight matrices connecting the input layer to the hidden layer and the hidden layer to the output layer; b^(1) and b^(2) are the bias vectors of the hidden and output layers; x_k is a parameter vector from one parameter space (eye state, steering-wheel manipulation, or vehicle trajectory); and y_k is the primary membership degree for each fatigue level. The output y_k is represented as in Table 1 below:

Table 1. Fatigue level representation

[Table 1 appears only as an image in the source and is not reproducible as text.]

For example, in the first row of Table 1, the values 0.9, 0.1, 0.1, and 0.1 are the membership degrees for the levels awake, mild fatigue, moderate fatigue, and severe fatigue; since the membership 0.9 for "awake" is the highest, the output corresponds to the fatigue level "awake".
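The forward pass of formula (1) and the maximum-membership decision can be sketched as follows. This is illustrative: the sigmoid transfer functions are an assumption (the patent does not fix f_1 and f_2), and the layer sizes are arbitrary.

```python
import numpy as np

LEVELS = ["awake", "mild fatigue", "moderate fatigue", "severe fatigue"]

# Illustrative forward pass of the three-layer BP network of formula (1):
# y = f2(W2 · f1(W1 · x + b1) + b2). Sigmoid transfer functions are an
# assumption; the patent does not fix f1 and f2.
def forward(x, W1, b1, W2, b2):
    sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
    hidden = sigmoid(W1 @ x + b1)        # hidden-layer activations
    return sigmoid(W2 @ hidden + b2)     # 4 outputs: membership per fatigue level

# Maximum-membership decision, as in the Table 1 example:
# memberships (0.9, 0.1, 0.1, 0.1) map to "awake".
def fatigue_level(membership):
    return LEVELS[int(np.argmax(membership))]
```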

The neural networks in this embodiment are classified in advance according to drivers' physiological and operating characteristics. The BP networks are trained offline with training samples from the different driver types, and the weights and biases are stored in a file. During real-time judgment, the stored parameters are loaded and, following the idea of template matching, the training parameters for the matching driver type are used for online judgment; the model parameters therefore include separate parameter sets matched to different driver types.

Step 203: fuse the fatigue-level membership degrees output by the neural network over the different measurement periods with multi-period D-S evidence theory, obtaining the basic credibility of each parameter space.

A sensor's measurement spans multiple measurement periods, and the membership degrees of the different periods are fused by single-sensor, multi-period D-S evidence theory. First, the primary fatigue membership degrees output by the neural network for the several measurement periods are normalized into basic credibility assignments, the key input of D-S evidence theory; then formula (2) below fuses the basic credibility assignments of the several periods to give the basic credibility of each parameter space. The D-S fusion formula over multiple measurement periods is:

M(A_i) = c⁻¹ · Σ_{∩A_j = A_i} Π_{1 ≤ s ≤ n} M_s(A_j),   i = 1, 2, ..., k        (2)

where

c = 1 − Σ_{∩A_i = Φ} Π_{1 ≤ s ≤ n} M_s(A_i) = Σ_{∩A_i ≠ Φ} Π_{1 ≤ s ≤ n} M_s(A_i)

This is the formula by which the sensor assigns fused posterior credibility to the k propositions from the accumulated measurements of n periods, where M_1(A_i), M_2(A_i), ..., M_n(A_i) are the posterior credibility assignments the sensor obtains in each measurement period from the evolving target situation and a fixed prior credibility assignment, i = 1, 2, ..., k, and M_j(A_i) denotes the credibility assigned to proposition A_i in the j-th period. Formula (2) yields the basic credibility of each parameter space: the credibility of the eye-state parameters, of the steering-wheel manipulation parameters, and of the vehicle-trajectory parameters.
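Formula (2) is Dempster's rule of combination. A minimal illustrative implementation (not from the patent) is shown below; each basic credibility assignment maps a frozenset of fatigue levels (a proposition) to its mass, and total conflict (c = 0) is assumed not to occur.

```python
from itertools import product

# Illustrative implementation of the combination rule of formula (2):
# Dempster's rule over n bodies of evidence. Each assignment maps a
# frozenset of fatigue levels (a proposition) to its basic credibility;
# total conflict (c = 0) is assumed not to occur.
def dempster_combine(masses):
    combined = masses[0]
    for m in masses[1:]:
        fused, conflict = {}, 0.0
        for a, b in product(combined, m):
            w = combined[a] * m[b]
            inter = a & b
            if inter:                               # compatible propositions
                fused[inter] = fused.get(inter, 0.0) + w
            else:                                   # empty intersection
                conflict += w
        c = 1.0 - conflict                          # normalisation constant of (2)
        combined = {k: v / c for k, v in fused.items()}
    return combined
```

Combining two identical single-period assignments concentrates credibility on the proposition they agree on, which is the intended reinforcing effect of the multi-period fusion.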

The neural network and D-S evidence theory of steps 202-203 complete the subsystem-level fusion; steps 204-205 then perform the system-level fusion.

Step 204: determine the fusion weights for the weighted fusion of the fatigue performance information.

The eye-state, steering-wheel, and vehicle-trajectory parameters are all correlated with the driver's fatigue state, but to different degrees, so the fusion weights of the three parameter types should also differ. The fusion weight is determined from a constant weight combined with the parameter credibility obtained in the subsystem-level fusion.

First, the constant weight is obtained by normalizing a class-separability index from pattern recognition: the larger the index, the better the classes are separated by that parameter. At the system level, the constant weight is determined by how well the parameters of each parameter space discriminate between fatigue levels. The constant weight, denoted J, is computed with formula (3):

J = Σ_{i=1}^{M} Σ_{j≠i}^{M} (μ_i − μ_j)² / (σ_i² + σ_j²)        (3)

where J measures a parameter's discriminative ability; for example, the constant weight J of the eye-state parameters measures how well they separate the classes. μ_i and μ_j are the means of adjacent classes, σ_i² and σ_j² their variances, and M is the total number of classes. The J values are normalized to give the constant weights. Adjacent classes are adjacent fatigue levels; mild fatigue and moderate fatigue, for instance, are two adjacent classes. Computing the means and variances of the parameters of adjacent classes in each of the three parameter spaces gives μ_i, μ_j, σ_i, and σ_j. With four fatigue levels in this embodiment, M = 4. Formula (3) then gives the constant weight of each of the three parameter types: the eye-state, steering-wheel, and vehicle-trajectory constant weights.
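The separability index and the resulting constant weights can be sketched as follows; the function names and the use of per-class means and variances as inputs are assumptions for illustration.

```python
import numpy as np

# Illustrative computation of formula (3): J sums
# (mu_i - mu_j)^2 / (sigma_i^2 + sigma_j^2) over all ordered pairs of the
# M fatigue classes for one parameter space.
def separability(mu, var):
    mu, var = np.asarray(mu, float), np.asarray(var, float)
    M = len(mu)
    return sum((mu[i] - mu[j]) ** 2 / (var[i] + var[j])
               for i in range(M) for j in range(M) if j != i)

# Normalise the J values of the three parameter spaces into constant
# weights that sum to 1.
def constant_weights(Js):
    Js = np.asarray(Js, float)
    return Js / Js.sum()
```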

Next, the fusion weight is determined from the constant weight together with the parameter credibility obtained in the subsystem-level fusion. Upper and lower reference-credibility bounds can be preset; in this embodiment the upper bound is 0.8 and the lower bound is 0.5. Each parameter space reports its parameters and their credibility to the fusion center. When a parameter's credibility is below 0.5, its fusion weight is zero: that parameter space is excluded from the fusion and its parameters are discarded. When the credibility exceeds 0.8, the weight is not varied and the constant weight is used directly. When the credibility lies between 0.5 and 0.8, the fusion weight is computed with formulas (4)-(9); it is then linearly proportional to the credibility, with the constant-weight ratio as the proportionality coefficient. The weights are related as follows:

W_1 / W_3 = w_13 · (r_1 / r_3)        (4)

W_2 / W_3 = w_23 · (r_2 / r_3)        (5)

W_1 + W_2 + W_3 = 1        (6)

W_1 = r_1·w_12 / (r_1·w_12 + r_2 + r_3·w_32)        (7)

W_2 = r_2 / (r_1·w_12 + r_2 + r_3·w_32)        (8)

WW 33 == rr 33 rr 11 ww 1313 ++ rr 22 ww 23twenty three ++ rr 33 -- -- -- (( 99 ))

其中，W1、W2、W3分别表示眼部状态参数融合权重、方向盘操纵状态参数融合权重和车辆轨迹行驶状态参数融合权重；r1、r2、r3分别表示眼部状态参数可信度、方向盘操纵状态参数可信度和车辆轨迹行驶状态参数可信度；w12、w23、w13分别表示眼部状态参数常权重与方向盘操纵状态参数常权重之比、方向盘操纵状态参数常权重与车辆轨迹行驶状态参数常权重之比、眼部状态参数常权重与车辆轨迹行驶状态参数常权重之比。Here W1, W2 and W3 are the fusion weights of the eye state parameters, the steering wheel manipulation state parameters and the vehicle trajectory driving state parameters respectively; r1, r2 and r3 are the corresponding parameter credibilities; and w12, w23 and w13 are constant-weight ratios: w12 is the ratio of the eye state constant weight to the steering wheel manipulation constant weight, w23 the ratio of the steering wheel manipulation constant weight to the vehicle trajectory constant weight, and w13 the ratio of the eye state constant weight to the vehicle trajectory constant weight (w32 is the reciprocal of w23).
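Since w12, w23 and w13 are ratios of the constant weights, formulas (7)-(9) are equivalent to scaling each constant weight by its credibility and renormalizing (multiply the numerator and denominator of each formula by the appropriate constant weight to see this). A sketch of the whole threshold rule follows; the handling of mixed cases, where one credibility exceeds the upper limit while another lies between the limits, is an assumption the text does not spell out.

```python
def fusion_weights(r, w, lower=0.5, upper=0.8):
    """Variable fusion weights from the credibilities r = (r1, r2, r3)
    and the constant weights w = (w1, w2, w3) of the three parameter
    spaces. Effective credibility: 0 below the lower limit (the space
    is discarded), 1 above the upper limit (the constant weight is
    kept), and the raw credibility in between, per formulas (4)-(9)."""
    eff = [0.0 if ri < lower else 1.0 if ri > upper else ri for ri in r]
    scaled = [e * wi for e, wi in zip(eff, w)]
    total = sum(scaled)
    if total == 0.0:
        raise ValueError("every parameter space was discarded")
    return [s / total for s in scaled]
```

For example, with all credibilities between the limits the ratio W1/W3 reproduces formula (4), and with all credibilities above 0.8 the constant weights are returned unchanged.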

步骤205、对疲劳表现信息进行变权重的自身D_S融合，得到与驾驶员疲劳等级对应的疲劳隶属度，并根据所述疲劳隶属度得出驾驶员的疲劳等级。Step 205: perform variable-weight D_S fusion on the fatigue performance information to obtain the fatigue membership degrees corresponding to the driver fatigue levels, and derive the driver's fatigue level from them.

根据步骤204中得到的三类不同参数的融合权重将眼部状态参数、方向盘操纵状态参数和车辆轨迹行驶状态参数进行权重的自身D_S融合，即在系统级进行不同参数空间的多传感器D_S证据理论融合，得到与驾驶员疲劳等级对应的疲劳隶属度，并根据该疲劳隶属度得出驾驶员的疲劳等级。该步骤中的多传感器D_S证据理论融合依然采用公式(2)，只是将多个测量周期的数据融合改变为多个参数空间的数据融合。According to the fusion weights of the three parameter types obtained in step 204, weighted D_S fusion is performed on the eye state parameters, the steering wheel manipulation state parameters and the vehicle trajectory driving state parameters; that is, multi-sensor D_S evidence theory fusion across the different parameter spaces is carried out at the system level, yielding the fatigue membership degrees corresponding to the driver fatigue levels, from which the driver's fatigue level is derived. This step still uses formula (2), except that fusion over multiple measurement periods is replaced by fusion over multiple parameter spaces.
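Formula (2) is not reproduced in this excerpt; the sketch below assumes the standard Dempster combination rule restricted to singleton fatigue-level hypotheses, applied here across the three parameter spaces instead of across measurement periods. The label names are placeholders.

```python
from functools import reduce

def dempster_combine(m1, m2):
    """Dempster's rule for two basic credibility assignments given as
    dicts mapping fatigue-level labels to masses. With singleton
    hypotheses only, all mass placed on differing labels is conflict K,
    removed by the 1/(1 - K) normalization."""
    fused, K = {}, 0.0
    for h1, a in m1.items():
        for h2, b in m2.items():
            if h1 == h2:
                fused[h1] = fused.get(h1, 0.0) + a * b
            else:
                K += a * b
    if K >= 1.0:
        raise ValueError("totally conflicting evidence")
    return {h: v / (1.0 - K) for h, v in fused.items()}

def combine_spaces(bpas):
    """Fuse the assignments of the three parameter spaces in turn."""
    return reduce(dempster_combine, bpas)
```

Agreeing evidence is reinforced: two assignments that both favor the same fatigue level yield a fused mass on that level larger than either input.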

本实施例通过对反映驾驶员驾驶疲劳状况的多源信息进行综合来评价驾驶员的驾驶疲劳状况，三方面信息可以更加全面地评价疲劳，如果单方面信息失效也不会使得评价系统瘫痪，仍然可以得到疲劳状态，从而提高了驾驶员疲劳状态监测的可靠性和准确性，可降低驾驶员疲劳驾驶导致的交通事故数量，保证道路交通安全；并且该多源信息的获取不对驾驶员的驾驶造成干扰，实用性强。In this embodiment, the driver's driving fatigue is evaluated by combining multi-source information that reflects it. The three types of information evaluate fatigue more comprehensively, and if one of them fails the evaluation system is not paralyzed: the fatigue state can still be obtained. This improves the reliability and accuracy of driver fatigue monitoring, can reduce the number of traffic accidents caused by fatigued driving, and helps ensure road traffic safety. Moreover, acquiring the multi-source information does not interfere with the driver's driving, so the method is highly practical.

图3为本发明基于信息融合的驾驶员疲劳状态识别系统第一实施例的结构示意图，如图3所示，本实施例驾驶员疲劳监测系统可以包括采集模块301、处理模块302和融合模块303，其中，采集模块301用于采集驾驶员的眼部状态信息、方向盘操纵状态信息和车辆轨迹行驶状态信息；处理模块302用于对所述驾驶员的眼部状态信息、方向盘操纵状态信息和车辆轨迹行驶状态信息进行处理，获得驾驶员的疲劳表现信息；融合模块303用于对所述疲劳表现信息进行加权融合处理得到与驾驶员疲劳等级对应的疲劳隶属度，并根据所述疲劳隶属度得出驾驶员的疲劳等级。Figure 3 is a schematic structural diagram of the first embodiment of the driver fatigue state recognition system based on information fusion according to the present invention. As shown in Figure 3, the driver fatigue monitoring system of this embodiment may include an acquisition module 301, a processing module 302 and a fusion module 303. The acquisition module 301 collects the driver's eye state information, steering wheel manipulation state information and vehicle trajectory driving state information; the processing module 302 processes this information to obtain the driver's fatigue performance information; and the fusion module 303 performs weighted fusion on the fatigue performance information to obtain the fatigue membership degrees corresponding to the driver fatigue levels and derives the driver's fatigue level from them.

具体的，采集模块301采集反映驾驶员疲劳状态的眼部状态信息、方向盘操纵状态信息和车辆轨迹行驶状态信息，处理模块302将采集模块301获取的上述信息进行分析处理，得到驾驶员的疲劳表现信息，该疲劳表现信息包括单位时间内眼睛闭合程度超过80%所占的时间比例、平均闭眼时间、最长闭眼时间、方向盘保持不动的时间比例、方向盘突然转动的角度变化量、车辆蛇形行驶状态的横向位移标准差和所述横向位移处于安全距离范围的时间比例。融合模块303将上述疲劳表现信息进行加权融合处理后即可得到与疲劳等级对应的综合疲劳隶属度，并根据该综合疲劳隶属度得出驾驶员的疲劳等级。Specifically, the acquisition module 301 collects eye state information, steering wheel manipulation state information and vehicle trajectory driving state information reflecting the driver's fatigue state, and the processing module 302 analyzes this information to obtain the driver's fatigue performance information, which includes the proportion of time per unit time during which eye closure exceeds 80%, the average eye-closure duration, the longest eye-closure duration, the proportion of time the steering wheel remains still, the angle change of sudden steering wheel turns, the standard deviation of the lateral displacement in the serpentine driving state, and the proportion of time the lateral displacement stays within the safe distance range. The fusion module 303 performs weighted fusion on this fatigue performance information to obtain the comprehensive fatigue membership degrees corresponding to the fatigue levels, and derives the driver's fatigue level from them.
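The indicators listed here lend themselves to direct computation from sampled signals. A minimal sketch follows, assuming uniformly sampled inputs; the sampling period, the stationary-steering threshold and the safe lateral range are illustrative assumptions, not values from the patent.

```python
import numpy as np

def fatigue_indicators(eye_closure, steer_angle, lateral_pos,
                       dt=0.1, still_thresh=0.5, safe_range=0.5):
    """Fatigue performance indicators from uniformly sampled signals:
    eye_closure  - eyelid closure fraction in [0, 1] per sample
    steer_angle  - steering wheel angle in degrees per sample
    lateral_pos  - lateral displacement from the lane center in meters
    dt           - sampling period in seconds (assumed)."""
    closed = np.asarray(eye_closure, dtype=float) > 0.8
    perclos = closed.mean()            # fraction of time closure > 80%
    runs, n = [], 0                    # consecutive closed-eye runs
    for c in closed:
        if c:
            n += 1
        elif n:
            runs.append(n)
            n = 0
    if n:
        runs.append(n)
    lat = np.asarray(lateral_pos, dtype=float)
    step = np.abs(np.diff(steer_angle))
    return {
        "perclos": perclos,
        "avg_close": float(np.mean(runs)) * dt if runs else 0.0,
        "max_close": max(runs) * dt if runs else 0.0,
        "still_ratio": (step < still_thresh).mean(),
        "sudden_turn": float(step.max()),
        "lat_std": lat.std(),
        "safe_ratio": (np.abs(lat) < safe_range).mean(),
    }
```

The closure fraction is the widely used PERCLOS (P80) measure; the run-length loop recovers the average and longest continuous eye-closure durations.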

本实施例驾驶员疲劳监测系统通过对反映驾驶员驾驶疲劳状况的多源信息进行综合来评价驾驶员的驾驶疲劳状况，提高了驾驶员疲劳状态监测的可靠性和准确性，可降低驾驶员疲劳驾驶导致的交通事故数量，保证道路交通安全；并且该多源信息的获取不对驾驶员的驾驶造成干扰，实用性强。The driver fatigue monitoring system of this embodiment evaluates the driver's driving fatigue by combining multi-source information that reflects it, which improves the reliability and accuracy of driver fatigue monitoring, can reduce the number of traffic accidents caused by fatigued driving, and helps ensure road traffic safety. Moreover, acquiring the multi-source information does not interfere with the driver's driving, so the system is highly practical.

图4为本发明基于信息融合的驾驶员疲劳状态识别系统第二实施例的结构示意图，如图4所示，本实施例在第一实施例的基础上，融合模块303可以包括子系统级融合单元3031和系统级融合单元3032，其中，子系统级融合单元3031用于将所述疲劳表现信息中的眼部状态信息、方向盘操纵状态信息和车辆轨迹行驶状态信息分别进行加权融合处理，得到眼部状态信息可信度、方向盘操纵状态信息可信度和车辆轨迹行驶状态信息可信度；系统级融合单元3032根据所述眼部状态信息可信度、方向盘操纵状态信息可信度和车辆轨迹行驶状态信息可信度分别确定眼部状态信息融合权重、方向盘操纵状态信息融合权重和车辆轨迹行驶状态信息融合权重，并根据所述眼部状态信息融合权重、方向盘操纵状态信息融合权重和车辆轨迹行驶状态信息融合权重对所述眼部状态信息、方向盘操纵状态信息和车辆轨迹行驶状态信息进行加权融合处理，得到驾驶员的疲劳等级。Figure 4 is a schematic structural diagram of the second embodiment of the driver fatigue state recognition system based on information fusion according to the present invention. As shown in Figure 4, on the basis of the first embodiment, the fusion module 303 may include a subsystem-level fusion unit 3031 and a system-level fusion unit 3032. The subsystem-level fusion unit 3031 performs weighted fusion on the eye state information, steering wheel manipulation state information and vehicle trajectory driving state information contained in the fatigue performance information, obtaining the credibility of each; the system-level fusion unit 3032 determines the eye state, steering wheel manipulation and vehicle trajectory information fusion weights from these credibilities, and performs weighted fusion on the three types of information according to those fusion weights to obtain the driver's fatigue level.

进一步地，子系统级融合单元3031可以进一步包括第一子单元、第二子单元，其中，第一子单元用于将所述眼部状态信息、方向盘操纵状态信息和车辆轨迹行驶状态信息依据神经网络参数限定关系进行加权融合得到对应驾驶员疲劳等级的初级疲劳隶属度，所述初级疲劳隶属度包括对应多个所述眼部状态信息、方向盘操纵状态信息和车辆轨迹行驶状态信息测量周期的初级疲劳隶属度；第二子单元用于将所述初级疲劳隶属度进行归一化处理转化为基本可信度分配参数，并将对应所述多个所述眼部状态信息、方向盘操纵状态信息和车辆轨迹行驶状态信息测量周期的基本可信度分配参数进行D_S证据理论融合，分别得到眼部状态参数可信度、方向盘操纵状态参数可信度和车辆轨迹行驶状态参数可信度。系统级融合单元3032包括权重确定子单元，用于根据所述眼部状态参数可信度、方向盘操纵状态参数可信度和车辆轨迹行驶状态参数可信度分别确定眼部状态参数融合权重、方向盘操纵状态参数融合权重和车辆轨迹行驶状态参数融合权重。Further, the subsystem-level fusion unit 3031 may include a first subunit and a second subunit. The first subunit performs weighted fusion on the eye state information, steering wheel manipulation state information and vehicle trajectory driving state information according to the relations defined by the neural network parameters, obtaining primary fatigue membership degrees corresponding to the driver fatigue levels, one set for each measurement period of the three types of information. The second subunit normalizes the primary fatigue membership degrees into basic credibility assignment parameters, and fuses the basic credibility assignment parameters of the multiple measurement periods with D_S evidence theory, obtaining the eye state parameter credibility, the steering wheel manipulation state parameter credibility and the vehicle trajectory driving state parameter credibility respectively. The system-level fusion unit 3032 includes a weight determination subunit that determines the eye state parameter fusion weight, the steering wheel manipulation state parameter fusion weight and the vehicle trajectory driving state parameter fusion weight from the corresponding parameter credibilities.

具体的，在第一实施例的基础上，融合模块303对上述疲劳表现信息进行加权融合处理的过程可以分为子系统级融合和系统级融合两个步骤，而子系统级融合又可以分为神经网络融合和D_S证据理论融合两个步骤。子系统级融合单元3031中的第一子单元先将眼部状态参数、方向盘操纵状态参数和车辆轨迹行驶状态参数输入已经训练好的人工神经网络，可以分别得到这三种参数空间对应的属于不同疲劳等级的初级疲劳隶属度，而且是不同测量周期的初级疲劳隶属度。第二子单元将上述初级疲劳隶属度进行归一化处理转化为对应不同测量周期的基本可信度分配参数，然后利用D_S证据理论对单一空间多测量周期的基本可信度分配参数进行融合，就可以得到每类参数空间的基本可信度，即眼部状态参数可信度、方向盘操纵状态参数可信度和车辆轨迹行驶状态参数可信度。在系统级融合单元3032对三类参数进行加权融合时，其中的权重确定子单元需要先确定三类参数的融合权重，该融合权重是根据第二子单元得出的基本可信度变动的，是通过常权重结合各类参数的可信度综合确定的，具体的确定方法可以参见方法实施例。在融合权重确定后，系统级融合单元3032则可以根据该融合权重对三类参数进行加权融合和自身变权重D_S证据理论融合，得到疲劳等级结果。Specifically, on the basis of the first embodiment, the weighted fusion performed by the fusion module 303 on the fatigue performance information can be divided into subsystem-level fusion and system-level fusion, and the subsystem-level fusion can in turn be divided into neural network fusion and D_S evidence theory fusion. The first subunit of the subsystem-level fusion unit 3031 first feeds the eye state parameters, steering wheel manipulation state parameters and vehicle trajectory driving state parameters into the trained artificial neural network, obtaining for each of the three parameter spaces the primary fatigue membership degrees of the different fatigue levels, for each measurement period. The second subunit normalizes these primary fatigue membership degrees into basic credibility assignment parameters for the different measurement periods, then fuses the multi-period parameters of each single space with D_S evidence theory, obtaining the basic credibility of each parameter space, i.e. the eye state, steering wheel manipulation and vehicle trajectory parameter credibilities. When the system-level fusion unit 3032 performs weighted fusion of the three parameter types, its weight determination subunit first determines their fusion weights; these vary with the basic credibilities produced by the second subunit and are determined jointly from the constant weights and the credibilities of the parameter types (see the method embodiment for details). Once the fusion weights are determined, the system-level fusion unit 3032 performs weighted, variable-weight D_S evidence theory fusion of the three parameter types according to them to obtain the fatigue level result.
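The normalization step performed by the second subunit, turning one measurement period's primary fatigue membership degrees into a basic credibility assignment, can be sketched as follows; the fatigue-level names are placeholders, not labels from the patent.

```python
def to_bpa(memberships, levels=("awake", "mild", "moderate", "severe")):
    """Normalize one measurement period's primary fatigue membership
    degrees (e.g. the neural network outputs for the four fatigue
    levels) into a basic credibility assignment whose masses sum to
    one, ready for D_S evidence theory fusion across periods."""
    total = float(sum(memberships))
    return {lvl: m / total for lvl, m in zip(levels, memberships)}
```

Each parameter space yields one such assignment per measurement period; fusing them across periods gives that space's parameter credibility.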

本实施例驾驶员疲劳监测系统通过对反映驾驶员驾驶疲劳状况的多源信息进行综合来评价驾驶员的驾驶疲劳状况，提高了驾驶员疲劳状态监测的可靠性和准确性，可降低驾驶员疲劳驾驶导致的交通事故数量，保证道路交通安全；并且该多源信息的获取不对驾驶员的驾驶造成干扰，实用性强。The driver fatigue monitoring system of this embodiment evaluates the driver's driving fatigue by combining multi-source information that reflects it, which improves the reliability and accuracy of driver fatigue monitoring, can reduce the number of traffic accidents caused by fatigued driving, and helps ensure road traffic safety. Moreover, acquiring the multi-source information does not interfere with the driver's driving, so the system is highly practical.

最后应说明的是：以上实施例仅用以说明本发明的技术方案而非对其进行限制，尽管参照较佳实施例对本发明进行了详细的说明，本领域的普通技术人员应当理解：其依然可以对本发明的技术方案进行修改或者等同替换，而这些修改或者等同替换亦不能使修改后的技术方案脱离本发明技术方案的精神和范围。Finally, it should be noted that the above embodiments are only intended to illustrate the technical solution of the present invention, not to limit it. Although the present invention has been described in detail with reference to preferred embodiments, those of ordinary skill in the art should understand that the technical solution may still be modified or equivalently replaced, and such modifications or equivalent replacements do not cause the modified technical solution to depart from the spirit and scope of the technical solution of the present invention.

Claims (6)

1.一种基于信息融合的驾驶员疲劳状态识别方法，其特征在于，包括：1. A driver fatigue state recognition method based on information fusion, characterized by comprising:

步骤1、采集驾驶员的眼部状态信息、方向盘操纵状态信息和车辆轨迹行驶状态信息；Step 1: collecting the driver's eye state information, steering wheel manipulation state information and vehicle trajectory driving state information;

步骤2、对所述驾驶员的眼部状态信息、方向盘操纵状态信息和车辆轨迹行驶状态信息进行处理，获得驾驶员的疲劳表现信息，所述疲劳表现信息包括单位时间内眼睛闭合程度超过80%所占的时间比例、眼睛平均睁开度、最长闭眼时间、方向盘保持不动的时间比例、方向盘突然转动的角度变化量、车辆蛇形行驶状态的横向位移标准差和所述横向位移处于安全距离范围的时间比例；Step 2: processing the driver's eye state information, steering wheel manipulation state information and vehicle trajectory driving state information to obtain the driver's fatigue performance information, the fatigue performance information including the proportion of time per unit time during which eye closure exceeds 80%, the average eye opening, the longest eye-closure duration, the proportion of time the steering wheel remains still, the angle change of sudden steering wheel turns, the standard deviation of the lateral displacement of the vehicle in a serpentine driving state, and the proportion of time the lateral displacement stays within a safe distance range;

步骤3、对所述疲劳表现信息进行加权融合处理得到与驾驶员疲劳等级对应的疲劳隶属度，并根据所述疲劳隶属度得出驾驶员的疲劳等级；Step 3: performing weighted fusion processing on the fatigue performance information to obtain a fatigue membership degree corresponding to the driver's fatigue level, and deriving the driver's fatigue level from the fatigue membership degree;

其中，所述步骤3包括：wherein step 3 comprises:

步骤31、将所述单位时间内眼睛闭合程度超过80%所占的时间比例、眼睛平均睁开度和最长闭眼时间进行融合得到眼部状态参数可信度，将所述方向盘保持不动的时间比例和方向盘突然转动的角度变化量进行融合得到方向盘操纵状态参数可信度，将所述车辆蛇形行驶状态的横向位移标准差和所述横向位移处于安全距离范围的时间比例进行融合得到车辆轨迹行驶状态参数可信度；Step 31: fusing the proportion of time eye closure exceeds 80% per unit time, the average eye opening and the longest eye-closure duration to obtain the eye state parameter credibility; fusing the proportion of time the steering wheel remains still and the angle change of sudden steering wheel turns to obtain the steering wheel manipulation state parameter credibility; and fusing the standard deviation of the lateral displacement in the serpentine driving state and the proportion of time the lateral displacement stays within the safe distance range to obtain the vehicle trajectory driving state parameter credibility;

步骤32、根据所述眼部状态参数可信度确定眼部状态参数融合权重，根据所述方向盘操纵状态参数可信度确定方向盘操纵状态参数融合权重，根据所述车辆轨迹行驶状态参数可信度得到车辆轨迹行驶状态参数融合权重；根据所述眼部状态参数融合权重、方向盘操纵状态参数融合权重和车辆轨迹行驶状态参数融合权重将所述疲劳表现信息进行加权融合处理得到与驾驶员疲劳等级对应的疲劳隶属度，并根据所述疲劳隶属度得出驾驶员的疲劳等级。Step 32: determining the eye state parameter fusion weight from the eye state parameter credibility, the steering wheel manipulation state parameter fusion weight from the steering wheel manipulation state parameter credibility, and the vehicle trajectory driving state parameter fusion weight from the vehicle trajectory driving state parameter credibility; and performing weighted fusion processing on the fatigue performance information according to these three fusion weights to obtain the fatigue membership degree corresponding to the driver's fatigue level, and deriving the driver's fatigue level from the fatigue membership degree.

2. The method according to claim 1, wherein step 31 specifically comprises: 步骤311、将所述疲劳表现信息按照神经网络模型进行融合，分别得到对应疲劳表现信息的不同测量周期的眼部状态参数初级疲劳隶属度、方向盘操纵状态参数初级疲劳隶属度和车辆轨迹行驶状态参数初级疲劳隶属度；Step 311: fusing the fatigue performance information according to a neural network model to obtain, for the different measurement periods of the fatigue performance information, the primary fatigue membership degrees of the eye state parameters, the steering wheel manipulation state parameters and the vehicle trajectory driving state parameters respectively; 步骤312、将不同测量周期的初级疲劳隶属度转化为基本可信度分配参数，并将所述不同测量周期的基本可信度分配参数按照D-S证据理论模型进行融合分别得到眼部状态参数可信度、方向盘操纵状态参数可信度和车辆轨迹行驶状态参数可信度。Step 312: converting the primary fatigue membership degrees of the different measurement periods into basic credibility assignment parameters, and fusing the basic credibility assignment parameters of the different measurement periods according to the D-S evidence theory model to obtain the eye state, steering wheel manipulation state and vehicle trajectory driving state parameter credibilities respectively.

3. The method according to claim 1, wherein the parameter credibilities comprise the eye state parameter credibility, the steering wheel manipulation state parameter credibility and the vehicle trajectory driving state parameter credibility, and step 32 comprises: 步骤321、提取预先设定的参考可信度上限和参考可信度下限；Step 321: retrieving a preset reference credibility upper limit and reference credibility lower limit; 步骤322、将所述参数可信度与参考可信度上限和参考可信度下限进行比较，若所述参数可信度小于所述参考可信度下限，则与所述参数可信度对应的参数融合权重为零；若所述参数可信度大于所述参考可信度上限，则与所述参数可信度对应的参数融合权重为用于标识参数对疲劳的区分能力的常权重；若所述参数可信度在所述参考可信度下限和参考可信度上限之间，则与所述参数可信度对应的参数融合权重与所述参数可信度成线性正比例关系。Step 322: comparing each parameter credibility with the reference upper and lower limits; if the parameter credibility is below the lower limit, the corresponding parameter fusion weight is zero; if it is above the upper limit, the corresponding fusion weight is the constant weight that characterizes the parameter's ability to discriminate fatigue; and if it lies between the two limits, the corresponding fusion weight is linearly proportional to the parameter credibility.

4. A driver fatigue state recognition system based on information fusion, characterized by comprising: 采集模块，用于采集驾驶员的眼部状态信息、方向盘操纵状态信息和车辆轨迹行驶状态信息；an acquisition module for collecting the driver's eye state information, steering wheel manipulation state information and vehicle trajectory driving state information; 处理模块，用于对所述驾驶员的眼部状态信息、方向盘操纵状态信息和车辆轨迹行驶状态信息进行处理，获得驾驶员的疲劳表现信息，所述疲劳表现信息包括单位时间内眼睛闭合程度超过80%所占的时间比例、眼睛平均睁开度、最长闭眼时间、方向盘保持不动的时间比例、方向盘突然转动的角度变化量、车辆蛇形行驶状态的横向位移标准差和所述横向位移处于安全距离范围的时间比例；a processing module for processing this information to obtain the driver's fatigue performance information, including the proportion of time per unit time during which eye closure exceeds 80%, the average eye opening, the longest eye-closure duration, the proportion of time the steering wheel remains still, the angle change of sudden steering wheel turns, the standard deviation of the lateral displacement in a serpentine driving state, and the proportion of time the lateral displacement stays within a safe distance range; 融合模块，用于对所述疲劳表现信息进行加权融合处理得到与驾驶员疲劳等级对应的疲劳隶属度，并根据所述疲劳隶属度得出驾驶员的疲劳等级；a fusion module for performing weighted fusion processing on the fatigue performance information to obtain the fatigue membership degree corresponding to the driver's fatigue level, and deriving the driver's fatigue level from the fatigue membership degree; 其中，所述融合模块包括：wherein the fusion module comprises: 子系统级融合单元，用于将所述单位时间内眼睛闭合程度超过80%所占的时间比例、眼睛平均睁开度和最长闭眼时间进行融合得到眼部状态参数可信度，将所述方向盘保持不动的时间比例和方向盘突然转动的角度变化量进行融合得到方向盘操纵状态参数可信度，将所述车辆蛇形行驶状态的横向位移标准差和所述横向位移处于安全距离范围的时间比例进行融合得到车辆轨迹行驶状态参数可信度；a subsystem-level fusion unit for fusing the listed eye, steering wheel and trajectory indicators into the eye state, steering wheel manipulation state and vehicle trajectory driving state parameter credibilities respectively; 系统级融合单元，用于根据与所述眼部状态参数可信度、方向盘操纵状态参数可信度和车辆轨迹行驶状态参数可信度相关的眼部状态参数融合权重、方向盘操纵状态参数融合权重和车辆轨迹行驶状态参数融合权重将所述疲劳表现信息进行加权融合处理得到与驾驶员疲劳等级对应的疲劳隶属度，并根据所述疲劳隶属度得出驾驶员的疲劳等级。a system-level fusion unit for performing weighted fusion processing on the fatigue performance information according to the eye state, steering wheel manipulation state and vehicle trajectory driving state parameter fusion weights, which are related to the respective parameter credibilities, to obtain the fatigue membership degree corresponding to the driver's fatigue level, and deriving the driver's fatigue level from the fatigue membership degree.

5.根据权利要求4所述的系统，其特征在于，所述子系统级融合单元包括：5. The system according to claim 4, wherein the subsystem-level fusion unit comprises: 第一子单元，用于将所述疲劳表现信息按照神经网络模型进行融合，分别得到对应疲劳表现信息的不同测量周期的眼部状态参数初级疲劳隶属度、方向盘操纵状态参数初级疲劳隶属度和车辆轨迹行驶状态参数初级疲劳隶属度；a first subunit for fusing the fatigue performance information according to a neural network model to obtain, for the different measurement periods, the primary fatigue membership degrees of the eye state, steering wheel manipulation state and vehicle trajectory driving state parameters respectively; 第二子单元，用于将不同测量周期的初级疲劳隶属度转化为基本可信度分配参数，并将所述不同测量周期的基本可信度分配参数按照D-S证据理论模型进行融合分别得到眼部状态参数可信度、方向盘操纵状态参数可信度和车辆轨迹行驶状态参数可信度。a second subunit for converting the primary fatigue membership degrees of the different measurement periods into basic credibility assignment parameters, and fusing these according to the D-S evidence theory model to obtain the eye state, steering wheel manipulation state and vehicle trajectory driving state parameter credibilities respectively.

6.根据权利要求4所述的系统，其特征在于，所述系统级融合单元包括：6. The system according to claim 4, wherein the system-level fusion unit comprises: 权重确定子单元，用于根据所述眼部状态参数可信度、方向盘操纵状态参数可信度和车辆轨迹行驶状态参数可信度分别确定眼部状态参数融合权重、方向盘操纵状态参数融合权重和车辆轨迹行驶状态参数融合权重。a weight determination subunit for determining the eye state parameter fusion weight, the steering wheel manipulation state parameter fusion weight and the vehicle trajectory driving state parameter fusion weight from the respective parameter credibilities.
CN2009100825662A 2009-04-24 2009-04-24 Driver fatigue state recognition method and system based on information fusion Expired - Fee Related CN101872171B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN2009100825662A CN101872171B (en) 2009-04-24 2009-04-24 Driver fatigue state recognition method and system based on information fusion


Publications (2)

Publication Number Publication Date
CN101872171A CN101872171A (en) 2010-10-27
CN101872171B true CN101872171B (en) 2012-06-27

Family

ID=42997091



Families Citing this family (37)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE112012006288B4 (en) * 2012-04-25 2020-09-03 Toyota Jidosha Kabushiki Kaisha Drift judging device
CN103473890B (en) * 2013-09-12 2015-09-16 合肥工业大学 Based on driver fatigue real-time monitoring system and the monitoring method of multi information
CN103578227B (en) * 2013-09-23 2015-10-07 吉林大学 Based on the method for detecting fatigue driving of GPS locating information
CN103989471B (en) * 2014-05-08 2015-11-04 东北大学 A fatigue driving detection method based on electroencephalogram recognition
DE102014214214A1 (en) * 2014-07-22 2016-01-28 Robert Bosch Gmbh Method and device for determining a degree of fatigue of a driver of a vehicle and vehicle
CN104391569A (en) * 2014-10-15 2015-03-04 东南大学 Brain-machine interface system based on cognition and emotional state multi-mode perception
CN104952209A (en) * 2015-04-30 2015-09-30 广州视声光电有限公司 Driving prewarning method and device
JP6399311B2 (en) * 2015-05-07 2018-10-03 スズキ株式会社 Dozing detection device
CN105160818B (en) * 2015-08-26 2019-10-29 厦门雅迅网络股份有限公司 It is a kind of based on preceding method for detecting fatigue driving and device to risk of collision
CN105139070B (en) * 2015-08-27 2018-02-02 南京信息工程大学 fatigue driving evaluation method based on artificial neural network and evidence theory
CN105303829A (en) * 2015-09-11 2016-02-03 深圳市乐驰互联技术有限公司 Vehicle driver emotion recognition method and device
JP6666705B2 (en) * 2015-09-28 2020-03-18 デルタ工業株式会社 Biological condition estimation device, biological condition estimation method, and computer program
CN105320966A (en) * 2015-10-30 2016-02-10 东软集团股份有限公司 Vehicle driving state recognition method and apparatus
CN105261153A (en) * 2015-11-03 2016-01-20 北京奇虎科技有限公司 Vehicle running monitoring method and device
DE102015225109A1 (en) * 2015-12-14 2017-06-14 Robert Bosch Gmbh A method and apparatus for classifying eye opening data of at least one eye of an occupant of a vehicle and method and apparatus for detecting drowsiness and / or instantaneous sleep of an occupant of a vehicle
CN105740847A (en) * 2016-03-02 2016-07-06 同济大学 Fatigue grade discrimination algorithm based on driver eye portion identification and vehicle driving track
CN106236047A (en) * 2016-09-05 2016-12-21 合肥飞鸟信息技术有限公司 The control method of driver fatigue monitoring system
CN106355837A (en) * 2016-09-09 2017-01-25 常州大学 Fatigue driving monitoring method on basis of mobile phone
CN106251583B (en) * 2016-09-30 2018-09-25 江苏筑磊电子科技有限公司 Fatigue driving discrimination method based on driving behavior and eye movement characteristics
CN106580349B (en) * 2016-12-07 2020-01-21 中国民用航空总局第二研究所 Controller fatigue detection method and device and controller fatigue response method and device
CN107233103B (en) * 2017-05-27 2020-11-20 西南交通大学 Method and system for evaluating the fatigue state of high-speed rail dispatchers
CN107379967A (en) * 2017-07-14 2017-11-24 南京信息工程大学 A kind of Drunken driving measure and control device based on multi-sensor information fusion
CN107844777B (en) * 2017-11-16 2021-06-11 百度在线网络技术(北京)有限公司 Method and apparatus for generating information
CN108108766B (en) * 2017-12-28 2021-10-29 东南大学 Driving behavior recognition method and system based on multi-sensor data fusion
JP6714036B2 (en) * 2018-04-25 2020-06-24 株式会社日立物流 Management support system
CN110641467A (en) * 2018-06-25 2020-01-03 广州汽车集团股份有限公司 A vehicle distance control method and device for an adaptive cruise system
CN110177299B (en) * 2018-08-30 2021-04-30 深圳市野生动物园有限公司 Video playing speed adjusting platform
CN109584507B (en) * 2018-11-12 2020-11-13 深圳佑驾创新科技有限公司 Driving behavior monitoring method, device, system, vehicle and storage medium
DE102018221063A1 (en) * 2018-12-05 2020-06-10 Volkswagen Aktiengesellschaft Configuration of a control system for an at least partially autonomous motor vehicle
CN110143202A (en) * 2019-04-09 2019-08-20 南京交通职业技术学院 A kind of dangerous driving identification and method for early warning and system
CN110371133A (en) * 2019-06-28 2019-10-25 神龙汽车有限公司 Fatigue of automobile driver detection system and method
CN112233276B (en) * 2020-10-13 2022-04-29 重庆科技学院 Steering wheel corner statistical characteristic fusion method for fatigue state recognition
CN114820692B (en) * 2022-06-29 2023-07-07 珠海视熙科技有限公司 State analysis method, device, storage medium and terminal for tracking target
CN115366909B (en) * 2022-10-21 2023-04-07 四川省公路规划勘察设计研究院有限公司 Dynamic early warning method and device for driver accidents in long and large longitudinal slope section and electronic equipment
CN116098622A (en) * 2023-02-22 2023-05-12 北汽福田汽车股份有限公司 Fatigue detection method, device and vehicle
CN116682264B (en) * 2023-07-11 2023-12-26 营口港信科技有限公司 Active safety prevention and control system for port vehicle
CN117351648B (en) * 2023-10-08 2024-06-25 海南大学 Driver fatigue monitoring and early warning method and system

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1407916A2 (en) * 2002-10-11 2004-04-14 Audi Ag Vehicle
CN1830389A (en) * 2006-04-21 2006-09-13 太原理工大学 Fatigue driving state monitoring device and method
CN101089906A (en) * 2006-06-14 2007-12-19 吴玉沈 Active vehicle driver fatigue driving alarm device and method


Non-Patent Citations (1)

Title
Li Airong. "Research on Driving State Monitoring Based on Multi-Sensor Data Fusion." China Master's Theses Full-text Database (Electronic Journal), 2008, No. 4. *

Also Published As

Publication number Publication date
CN101872171A (en) 2010-10-27

Similar Documents

Publication Publication Date Title
CN101872171B (en) Driver fatigue state recognition method and system based on information fusion
CN113743471B (en) Driving evaluation method and system
Abou Elassad et al. A proactive decision support system for predicting traffic crash events: A critical analysis of imbalanced class distribution
CN105139070B (en) fatigue driving evaluation method based on artificial neural network and evidence theory
CN110796207B (en) Fatigue driving detection method and system
Sajid et al. An efficient deep learning framework for distracted driver detection
Li et al. Drunk driving detection based on classification of multivariate time series
CN101540090B (en) Driver fatigue monitoring method based on multivariate information fusion
CN104494600B (en) A Driver Intention Recognition Method Based on SVM Algorithm
CN106934368A (en) Driving fatigue detection system and recognition method based on eye movement index data
CN111460950B (en) Cognitive distraction method based on head-eye evidence fusion in natural driving conversation behavior
CN113627740B (en) A driving load evaluation model construction system and construction method
CN110103816A (en) Driving condition detection method
CN109229108A (en) Driving behavior safety evaluation method based on driving fingerprint
CN101872419A (en) Method for detecting fatigue of automobile driver
CN110696835A (en) Automatic early warning method and automatic early warning system for dangerous driving behaviors of vehicle
CN108537161A (en) Driver distraction detection method based on visual characteristics
Li et al. Driver fatigue detection based on comprehensive facial features and gated recurrent unit
Chen et al. Deep learning approach for detection of unfavorable driving state based on multiple phase synchronization between multi-channel EEG signals
Zheng et al. Detection of perceived discomfort in sae l2 automated vehicles through driver takeovers and physiological spikes
Arfizurrahmanl et al. Real-Time Non-Intrusive Driver Fatigue Detection System using Belief Rule-Based Expert System.
Zhan et al. Rapidly and accurately estimating brain strain and strain rate across head impact types with transfer learning and data fusion
Ma et al. Research on drowsy-driving monitoring and warning system based on multi-feature comprehensive evaluation
Sun et al. Recognition method of drinking-driving behaviors based on PCA and RBF neural network
CN104224205B (en) Fatigue detection method based on simple electrocardiogram acquisition and coupling

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
C17 Cessation of patent right
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20120627

Termination date: 20130424