CN117197880A - Concentration monitoring method and system - Google Patents
Concentration monitoring method and system
- Publication number: CN117197880A
- Application number: CN202310222460.8A
- Authority
- CN
- China
- Prior art keywords
- person
- screen
- tested
- concentration monitoring
- eye
- Prior art date: 2023-03-09
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Landscapes
- Eye Examination Apparatus (AREA)
- Alarm Systems (AREA)
- Monitoring And Testing Of Nuclear Reactors (AREA)
- Financial Or Insurance-Related Operations Such As Payment And Settlement (AREA)
Abstract
A concentration monitoring method and system relate to the technical field of automated monitoring. The concentration monitoring method includes the following steps: 1. collecting image information of the person to be tested and of the equipment with an image acquisition device; 2. performing micro-behavior recognition detection; 3. obtaining the gaze trajectory of the person to be tested on the monitor screen and, in combination with the attention heat map of the monitor screen, comparing the similarity between the two; 4. analyzing the movement trend of the gaze trajectory and judging whether it is consistent with prior knowledge; 5. detecting the eye features of the person to be tested, deriving the blink frequency, and judging whether it is normal; 6. combining the above detection results through a weighted calculation to judge whether the person to be tested is in a working state. The method is applicable to personnel distraction detection in environments such as aviation dispatching and nuclear power safety control, improves concealment and convenience, and increases detection accuracy.
Description
Technical field
The present invention relates to the technical field of automated monitoring, and in particular to a concentration monitoring method and system.
Background art
At present, when performing operations such as aviation dispatching and nuclear power safety control, staff usually need to use human-computer interaction output devices (such as monitors) and human-computer interaction input devices (such as keyboards and mice) to view the status of and control a host machine (such as a computer). During work, a worker's concentration is often affected by mood, mental state and other factors; distraction, mind-wandering or staring blankly easily occur under poor mood or fatigue, leading to poor concentration and impairing the timeliness and accuracy of operations. How to identify personnel concentration is therefore particularly important.
Chinese patent document CN109878527A discloses a distraction sensing system that uses contact sensors to collect galvanic skin signals to analyze whether a person is distracted; this method requires the person to be detected to wear a device and lacks concealment and convenience. Chinese patent document CN104706366A discloses a distraction detection method, device and system that collects electrocardiogram data of the person to be detected, and has the same problems. Chinese patent document CN115246405A discloses a method and system for detecting driver distraction behavior that collects the driver's eyeball features and converts them into a range of gaze attention; when a preset normal range is exceeded, the person is judged to be distracted. The detection granularity of this method is too coarse to distinguish normal driving from distraction. Chinese patent document CN114026611A discloses detecting driver attention using heat maps: a scene attention heat map and a driver gaze heat map are generated separately, and the difference between the two heat maps is analyzed to judge whether the driver is distracted. This method can only identify whether the line of sight of the person to be detected matches the region of interest; it lacks time-series information and can hardly discern situations such as the person staring blankly.
These distraction detection methods for car drivers still have many shortcomings, and the driving environment differs considerably from operating environments such as aviation dispatching and nuclear power safety control; they are therefore not suitable for personnel distraction detection in such working environments.
Summary of the invention
The technical problem to be solved by the present invention is to provide a personnel distraction detection method applicable to working environments such as aviation dispatching and nuclear power safety control.
In order to solve the above problem, the technical solution adopted by the present invention is as follows:
A concentration monitoring method includes the following steps:
1. Collecting image information of the person to be tested and of the equipment with an image acquisition device;
2. Taking the area where the human-computer interaction input device is located in the image as the micro-behavior recognition detection area, detecting whether the hand of the person to be tested appears in this area, performing micro-behavior recognition detection on any hand appearing in it, obtaining the state of the hand, and judging whether the hand is active;
Specifically, the area where the human-computer interaction input device is located in the image is taken as the micro-behavior recognition detection area, and it is detected whether hand features of the person to be tested appear in this area. If not, proceed to the next step; if so, perform micro-behavior recognition detection on the hand features to judge whether the hand of the person to be tested is active or stationary. If active, the person is judged to be in a working state; if stationary, proceed to the next step;
3. Obtaining the gaze trajectory of the person to be tested on the monitor screen and, in combination with the attention heat map of the monitor screen, comparing the similarity between the two;
4. Analyzing the movement trend of the gaze trajectory and judging whether it is consistent with prior knowledge;
5. Detecting the eye features of the person to be tested, deriving the blink frequency of the person, and judging whether it is normal;
6. Combining the above detection results through a weighted calculation to judge whether the person to be tested is in a working state.
In step 1, the image acquisition device includes a camera; the collected image information of the person to be tested includes image information of the eye features, hand features and facial features of the person, and the collected image information of the equipment includes image information of the human-computer interaction input device and the monitor screen.
In step 2, the human-computer interaction input devices are a keyboard and a mouse, and determining the micro-behavior recognition detection area includes the following steps:
a. Using object detection techniques from image processing, detect and determine the regional positions R_keyboard and R_mouse of the keyboard and the mouse in the image, where R_keyboard and R_mouse are:
R_keyboard = ((x_left, y_left), (x_right, y_right));
R_mouse = ((x_left, y_left), (x_right, y_right));
where (x_left, y_left) denotes the coordinate of the upper-left corner of region R_* and (x_right, y_right) denotes the coordinate of the lower-right corner of region R_*.
b. Determine the micro-behavior recognition detection area R_detection from the regional positions of the keyboard and the mouse; its area coordinate parameters are derived from R_keyboard and R_mouse.
In step 2, the micro-behavior recognition detection includes the following steps:
Detect key-point information of the hand features appearing in the micro-behavior recognition detection area, including the fingertips, finger joints, palm center position and wrist joint position, and then judge from the trajectories of these hand key points in the video sequence whether the hand of the person to be tested is active or stationary.
In step 3, obtaining the gaze trajectory includes the following steps:
a. Extract and analyze the eye features of the person to be tested to obtain the gaze direction;
b. Detect several facial feature values and compare them with the facial feature values of a standard pose to obtain the pose relationship between the face and the image acquisition device;
c. In combination with the pose relationship between the face of the person to be tested and the monitor screen, calculate the gaze trajectory of the person on the monitor screen:
(1) From the relationship between the spatial coordinates of the image acquisition device and the plane position of the monitor screen, obtain the transformation matrix R between the face and the screen;
(2) Calculate the pose of the face relative to the screen through the transformation matrix R, and calculate in real time the gaze position P_t on the screen from the calculated distance between the face and the screen;
(3) Obtain, by accumulation over time, the sequence of gaze positions (P_i) on the screen, where i ≥ 0.
In step 3, the distribution function of the screen attention heat map is Prob_S:
Prob_S(i,j) = Bright(i,j) / Σ_{m=1..width} Σ_{n=1..height} Bright(m,n);
where Bright(i,j) is the brightness value at screen coordinate (i,j), width and height are the screen resolution, and (i,j) is the coordinate of the sampling point on the screen.
In step 3, detecting the similarity between the gaze trajectory and the screen attention heat map includes the following steps:
Normalize the gaze trajectory and the screen attention heat map and sample them at the screen resolution to obtain the distribution function Prob_Eye of the gaze trajectory and the distribution function Prob_Screen of the screen attention heat map; the similarity between the two is obtained through a cross-entropy calculation:
Similarity = -Σ_{i=1..width} Σ_{j=1..height} Prob_Screen(i,j) · log(Prob_Eye(i,j));
where width and height are the screen resolution and (i,j) is the coordinate of the sampling point on the screen;
Compare the calculated similarity between the gaze trajectory and the screen attention heat map with a preset threshold to judge whether the person being tested is in a working state.
In step 4, the prior knowledge is that the gaze of the person being tested starts from the middle of the monitor screen and moves from the upper-left position to the lower-right position; judging the movement direction of the gaze trajectory includes the following steps:
Judge the direction in which the gaze moves on the screen by calculating the gradient of the gaze trajectory:
∇Trail_i = (Trail_{i+1}(x) - Trail_i(x), Trail_{i+1}(y) - Trail_i(y));
where Trail_i(x,y) is the coordinate of the i-th point of the gaze trajectory;
the direction of trajectory movement is judged from the signs of the two components of ∇Trail_i.
In step 5, detecting the blink frequency of the person to be tested includes the following steps:
a. Using image processing techniques, extract the face position in the image and segment the eye region;
b. From the eye region, detect the positions of several key points of the eye, including the inner canthus, the outer canthus, the upper eyelid and the lower eyelid;
c. Calculate the ratio of the upper-to-lower eyelid distance to the inner-to-outer canthus distance and, combined with time information, calculate the blink frequency and the duration of eye closure;
d. The blink discrimination parameter is calculated as:
Blink = Dis_H / Dis_W;
where Dis_W is the distance between the inner canthus P_1 and the outer canthus P_4, and Dis_H is the distance between the upper eyelid P_2 and the lower eyelid P_6;
The blink discrimination parameter Blink is calculated periodically to obtain its variation trend over time;
e. From the eye-closure duration Time_close, judge whether the person being tested is working while fatigued and judge the abnormal state of the person; the state is judged abnormal when Time_close exceeds threshold, where threshold is the threshold of the eye-closure duration.
Another object of the present invention is to provide a concentration monitoring system, which includes a storage unit and a computing unit; the storage unit stores a concentration monitoring program, and the concentration monitoring program is run by the computing unit to execute the steps of the concentration monitoring method.
The concentration monitoring method provided by the present invention differs from previously common contact-type distraction detection methods: it adopts a non-contact detection method, does not require the person to be tested to wear a device, and only needs non-contact equipment such as cameras to collect image information, giving better concealment and convenience. The state of the person to be tested is judged in multiple ways, including micro-behavior recognition, gaze trajectory and screen attention heat map detection, and blink feature detection, and the concentration of the person is analyzed comprehensively through a weighted calculation (the weight of each indicator can be determined experimentally). The detection result obtained in this way is more accurate than that of a single judgment method. Compared with existing car driver distraction detection methods, the concentration monitoring method provided by the present invention has better concealment and convenience and higher detection accuracy, and is applicable to personnel distraction detection in operating environments such as aviation dispatching and nuclear power safety control.
Brief description of the drawings
Figure 1 is a schematic flow chart of concentration monitoring in the embodiment;
Figure 2 is a schematic diagram of the eye features in the embodiment;
Figure 3 is a schematic diagram of several feature points of the human face in the embodiment;
Figure 4 is a schematic diagram of the facial features in the embodiment;
Figure 5 is a schematic diagram of calculating the gaze trajectory of the eyes of the person to be tested on the screen in the embodiment;
Figure 6 is a screen attention heat map in the embodiment;
Figure 7 is a schematic diagram of the micro-behavior recognition detection area determined in the embodiment;
Figure 8 is a schematic diagram of the key-point positions of the hand in the embodiment;
Figure 9 is a schematic diagram of the trajectories of the hand key points in the video sequence in the embodiment;
Figure 10 is a schematic diagram of the positions of the inner canthus, outer canthus, upper eyelid and lower eyelid of the eye region in the embodiment;
Figure 11 is a schematic diagram of blink state changes in the embodiment, where (a) shows the eyes open, (b) shows the eyes closed, and (c) shows the variation trend of the blink discrimination parameter over time.
Detailed description of the embodiments
In order to help those skilled in the art better understand the improvements of the present invention over the prior art, the present invention is further described below with reference to the accompanying drawings and embodiments.
In view of the extremely high requirements for worker concentration in operating environments such as aviation dispatching and nuclear power safety control, this embodiment provides a concentration monitoring method for monitoring worker distraction during operation, so as to eliminate as far as possible the unsafe factors caused by worker distraction and thus ensure the safe operation and running of equipment. Many existing distraction detection methods for car drivers use contact sensors to detect the driver's distraction state; their concealment and convenience are poor. If they were applied directly to operating environments such as aviation dispatching and nuclear power safety control, contact sensors worn on the body would obviously affect the workers and might even cause discomfort and distraction; contact-based acquisition is therefore not suitable for these operating environments. Although some existing driver distraction detection methods are non-contact, most of them still suffer from low judgment accuracy caused by overly coarse detection granularity, and from difficulty in discerning situations such as the person staring blankly because time-series information is missing. To solve these problems, this embodiment provides a non-contact concentration monitoring method with improved detection accuracy: through the weighted analysis of multiple detection means, including micro-behavior recognition, gaze trajectory and screen attention heat map detection, and blink feature detection, it can judge more accurately whether the person to be tested is distracted, thereby reducing misjudgments and missed judgments.
The present invention adopts a non-contact approach. Using preset non-contact sensors such as cameras together with a distraction detection and analysis system, the distraction state of an office worker in front of a computer can be detected without contacting the person, achieving automatic detection of distraction while the worker uses the computer. The concentration monitoring method uses several detection sub-functions; after the results are aggregated, the distraction state of the person to be detected can be analyzed through a weighted calculation (the weight of each indicator can be determined experimentally). When office workers (including not only operators in aviation dispatching and nuclear power safety control, but also other office workers who work with computers) are distracted, they show characteristics such as motionless behavior, abnormal gaze trajectories and abnormal blink features, so the detection method comprises the following parts: 1. gaze trajectory and screen attention heat map detection; 2. micro-behavior recognition; 3. blink feature detection. The whole detection process is shown in Figure 1. The following takes an office worker sitting at a desk and working in front of a computer as an example; see Figures 1 to 11.
1. Collecting image information.
Image information of the person to be tested and of the equipment is collected with an image acquisition device. The image acquisition device includes a camera and may also include a video camera. The collected image information of the person to be tested includes image information of the eye features, hand features and facial features of the person, and the collected image information of the equipment includes image information of the human-computer interaction input device and the monitor screen. The human-computer interaction input devices are mainly a keyboard and a mouse, but may also be other devices such as control joysticks or handles.
2. Micro-behavior recognition detection.
When office workers use a computer, the amplitude of their movements is small, involving only operations such as typing on the keyboard and manipulating the mouse. Micro-behavior recognition detection can therefore be performed on the worker's hands: the camera captures the desk area and the state of the worker's hands is detected.
The area where the human-computer interaction input device is located in the image is taken as the micro-behavior recognition detection area, and it is detected whether hand features of the person to be tested appear in it. If not, proceed to the next step and perform the other detections; if so, perform micro-behavior recognition detection on the hand features to judge whether the hand of the person to be tested is active or stationary. If active, the person is judged to be in a working state; if stationary, proceed to the next step and perform the other detections.
Determining the micro-behavior recognition detection area includes the following steps:
a. Using object detection techniques from image processing, detect and determine the regional positions R_keyboard and R_mouse of the keyboard and the mouse in the image, where R_keyboard and R_mouse are defined as:
R_keyboard = ((x_left, y_left), (x_right, y_right));
R_mouse = ((x_left, y_left), (x_right, y_right));
where (x_left, y_left) denotes the coordinate of the upper-left corner of region R_* and (x_right, y_right) denotes the coordinate of the lower-right corner of region R_*.
b. From the keyboard and mouse regions, determine the image area R_detection in which micro-behaviors are to be detected; its area coordinate parameters are computed from R_keyboard and R_mouse.
The white rectangular area in Figure 7 is the calculated micro-behavior recognition detection area.
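For illustration, a minimal sketch of how R_detection could be assembled from the two detected boxes; since the original formula is reproduced only as an image, the combination rule used below (the smallest axis-aligned box covering both regions, optionally enlarged by a margin) is an assumption, not the patent's exact definition:

```python
def union_box(r_keyboard, r_mouse, margin=0):
    """Smallest axis-aligned box covering both regions, optionally enlarged by a margin.

    Each region is ((x_left, y_left), (x_right, y_right)) in image pixels.
    NOTE: the combination rule is an assumption; the patent's exact formula is not shown here.
    """
    (kx1, ky1), (kx2, ky2) = r_keyboard
    (mx1, my1), (mx2, my2) = r_mouse
    top_left = (min(kx1, mx1) - margin, min(ky1, my1) - margin)
    bottom_right = (max(kx2, mx2) + margin, max(ky2, my2) + margin)
    return top_left, bottom_right

# Example with hypothetical detector outputs for the keyboard and the mouse:
r_detection = union_box(((120, 400), (620, 560)), ((660, 430), (740, 520)), margin=20)
```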
The micro-behavior recognition detection includes the following steps:
Detect the key-point information of the hand features appearing in the micro-behavior recognition detection area (including the fingertips, finger joints, palm center position and wrist joint position), and then judge from the trajectories of these hand key points in the video sequence whether the hand of the person to be tested is active or stationary.
Specifically, detect whether both hands are present in the area and detect the key-point information of both hands. The key-point information of both hands includes the fingertip and joints of each finger, the position of the palm center and the position of the wrist joint; see Figure 8.
From the trajectories of the hand key points in the video sequence, it can be judged whether the hands are stationary or are typing on the keyboard or controlling the mouse. If the hands are operating the mouse and keyboard, the person being detected is working and has not become distracted. See Figure 9, and the sketch below.
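A minimal sketch of the active/stationary decision from the key-point trajectories; the window length, the displacement threshold and the assumption that the key points come from a hand-landmark detector are illustrative choices, not values from the patent:

```python
import numpy as np

def hands_active(keypoint_frames, pixel_threshold=3.0):
    """Decide whether the hands are moving from their key-point trajectories.

    keypoint_frames: list of (N, 2) arrays, one per video frame, holding the N hand
    key points (fingertips, finger joints, palm center, wrist) in image pixels.
    Returns True when the mean per-frame key-point displacement exceeds the threshold.
    """
    if len(keypoint_frames) < 2:
        return False
    traj = np.stack(keypoint_frames)                       # (T, N, 2)
    step = np.linalg.norm(np.diff(traj, axis=0), axis=2)   # per-point displacement between frames
    return float(step.mean()) > pixel_threshold
```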
3. Detection of the gaze trajectory of the eyes of the person to be tested on the monitor screen and of the screen attention heat map.
Gaze trajectory and screen attention heat map detection means obtaining the gaze trajectory of the person to be detected on the screen and, in combination with the attention regions of the computer screen, judging whether the person is distracted by comparing the similarity between the two.
Obtaining the gaze trajectory includes the following steps:
a. Extract and analyze the eye features of the person to be tested and infer the gaze direction from them. A deep neural network can be chosen as the tool: from the detected eye images, two angle parameters of the eyeball are fitted, pitch (vertical direction) and yaw (horizontal direction); see Figure 2.
b. Detect several facial feature values (such as the 68 feature points commonly used at present) and compare them with the facial feature values of a standard pose, thereby estimating the pose relationship between the face and the camera; see Figures 3 and 4.
In the real facial image, several unoccluded feature points are selected and head pose estimation is performed with the PnP method to obtain the pose information of the face relative to the camera imaging plane.
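A sketch of this head-pose step using OpenCV's PnP solver; the generic 3D face model points, the camera intrinsic matrix and the choice of landmarks are placeholders rather than values specified by the patent:

```python
import cv2
import numpy as np

def head_pose(image_points, model_points, camera_matrix):
    """Estimate the head pose from 2D facial landmarks and a generic 3D face model.

    image_points: (N, 2) unoccluded landmarks detected in the image.
    model_points: (N, 3) corresponding points of a standard-pose 3D face model.
    Returns the rotation matrix and translation vector of the face w.r.t. the camera.
    """
    dist_coeffs = np.zeros((4, 1))  # assume negligible lens distortion
    ok, rvec, tvec = cv2.solvePnP(model_points.astype(np.float64),
                                  image_points.astype(np.float64),
                                  camera_matrix, dist_coeffs,
                                  flags=cv2.SOLVEPNP_ITERATIVE)
    if not ok:
        raise RuntimeError("PnP estimation failed")
    rotation, _ = cv2.Rodrigues(rvec)  # rotation vector -> 3x3 rotation matrix
    return rotation, tvec
```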
c. In combination with the pose relationship between the face of the person to be tested and the monitor screen, calculate the gaze trajectory of the person on the monitor screen; see Figure 5.
(1) From the relationship between the spatial coordinates of the installed camera and the plane position of the computer screen, the transformation matrix R between the face and the screen can be derived;
(2) The pose of the face relative to the screen can be calculated through the transformation matrix, and the gaze position P_t on the screen can be calculated in real time from the estimated distance between the face and the screen;
(3) By accumulation over time, the sequence of gaze positions (P_i) on the screen can be obtained, where i ≥ 0; the sequence length depends on the sampling frequency and the sampling duration. A sketch of one possible realization of these sub-steps follows.
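One way to realize sub-steps (1) to (3) is to express the eye position in a screen-fixed coordinate system (via the camera-to-screen transform) and intersect the gaze ray, built from the fitted pitch and yaw angles, with the screen plane; the axis conventions and the millimetre-to-pixel conversion below are assumptions:

```python
import numpy as np

def gaze_on_screen(eye_pos_mm, pitch, yaw, px_per_mm):
    """Intersect the gaze ray with the screen plane z = 0.

    eye_pos_mm: 3D eye position in screen coordinates (x right, y down, z towards the user), in mm.
    pitch, yaw: gaze angles in radians (vertical / horizontal), as fitted from the eye image.
    Returns the gaze point P_t in screen pixels.
    """
    direction = np.array([np.cos(pitch) * np.sin(yaw),    # horizontal component
                          np.sin(pitch),                   # vertical component
                          -np.cos(pitch) * np.cos(yaw)])   # towards the screen (convention assumed)
    t = -eye_pos_mm[2] / direction[2]    # ray parameter at which the ray reaches z = 0
    hit = eye_pos_mm + t * direction     # intersection point on the screen plane, in mm
    return hit[0] * px_per_mm, hit[1] * px_per_mm

# Accumulating P_t over time yields the gaze position sequence (P_i).
```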
The detection program captures the screen content and analyzes it with an attention detection network to generate the screen attention heat map; see Figure 6.
The heat map has the same size as the screen, and the brightness at each position is proportional to the importance of the corresponding content. The brightness information at each position can be converted into a two-dimensional distribution Prob_S as follows:
Prob_S(i,j) = Bright(i,j) / Σ_{m=1..width} Σ_{n=1..height} Bright(m,n);
where Bright(i,j) is the brightness value at screen coordinate (i,j); it follows from the calculation that the values of Prob_S sum to 1, where width and height are the screen resolution and (i,j) is the coordinate of the sampling point on the screen.
Then the degree of agreement between the attention heat map and the gaze trajectory is calculated and compared with a set threshold. When the threshold is exceeded, the person being detected is considered to be working attentively; otherwise distraction is possible.
The calculation regards both the gaze trajectory of the eyes on the screen and the screen attention heat map as two-dimensional distributions. After the two distributions are normalized and sampled at the screen resolution, the distribution function Prob_Eye of the gaze trajectory and the distribution Prob_Screen of the screen attention heat map are obtained, and the similarity between the two can be calculated through cross-entropy:
Similarity = -Σ_{i=1..width} Σ_{j=1..height} Prob_Screen(i,j) · log(Prob_Eye(i,j));
where width and height are the screen resolution and (i,j) is the coordinate of the sampling point on the screen.
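A sketch of the normalization and the cross-entropy comparison; treating Prob_Screen as the reference distribution and Prob_Eye as the gaze histogram, and thresholding the raw cross-entropy value, are assumptions about details the source leaves open:

```python
import numpy as np

def normalize(dist2d, eps=1e-12):
    """Turn a non-negative 2D map (heat-map brightness or gaze-position histogram)
    into a probability distribution that sums to 1."""
    dist2d = dist2d.astype(np.float64) + eps   # eps avoids log(0) later
    return dist2d / dist2d.sum()

def gaze_heatmap_cross_entropy(bright, gaze_hist):
    """Cross-entropy between the screen attention distribution (Prob_Screen) and the
    gaze distribution (Prob_Eye); a lower value means a closer match, and the value
    (or a similarity derived from it) is compared with a preset threshold."""
    prob_screen = normalize(bright)      # from heat-map brightness, width x height
    prob_eye = normalize(gaze_hist)      # from gaze positions sampled at screen resolution
    return float(-np.sum(prob_screen * np.log(prob_eye)))
```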
4. Checking the trend of the gaze trajectory of the eyes of the person to be tested on the monitor screen against prior knowledge.
After the degree of agreement has been checked, the directionality of the gaze trajectory must also be judged. Because of the particularities of computer operation, the operator's gaze generally starts from the middle of the screen and moves from the upper left to the lower right. The movement trend of the gaze trajectory is analyzed to see whether it is consistent with this prior knowledge.
Let the coordinate of the i-th point of the gaze trajectory be Trail_i(x,y). The direction in which the gaze moves on the screen can be judged by calculating the gradient of the gaze trajectory:
∇Trail_i = (Trail_{i+1}(x) - Trail_i(x), Trail_{i+1}(y) - Trail_i(y));
the direction of trajectory movement can then be judged from the signs of the two components of ∇Trail_i.
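A minimal sketch of this direction check; mapping the component signs to an upper-left-to-lower-right movement uses the usual image convention (x grows rightwards, y grows downwards), and taking a majority vote over the points is an illustrative choice:

```python
import numpy as np

def moves_upper_left_to_lower_right(trail):
    """trail: (N, 2) array of gaze points Trail_i = (x, y) in screen pixels.

    The discrete gradient Trail_{i+1} - Trail_i gives the local movement direction;
    the trajectory is taken to move from the upper left to the lower right when both
    components are positive for the majority of the points.
    """
    trail = np.asarray(trail, dtype=float)
    if len(trail) < 2:
        return False
    grad = np.diff(trail, axis=0)            # (N-1, 2) gradient of the trajectory
    down_right = (grad > 0).all(axis=1)
    return bool(down_right.mean() > 0.5)
```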
5. Blink feature detection and analysis.
Blink features are blink-related information analyzed, after computer vision techniques have captured the eye feature points of the person to be detected, from the parameters of these feature points.
1. Using image processing techniques, extract the face position in the image and segment the eye region;
2. From the eye region, detect the positions of several key points of the eye, including the inner canthus, the outer canthus, the upper eyelid and the lower eyelid; see Figure 10;
3. Calculate the ratio of the upper-to-lower eyelid distance to the inner-to-outer canthus distance. Because the distance between the two canthi does not change while the distance between the upper and lower eyelids depends on whether the eye is open or closed, the blink frequency and the duration of eye closure can be calculated from this ratio combined with time information;
Let the distance between the inner canthus P_1 and the outer canthus P_4 be Dis_W and the distance between the upper eyelid P_2 and the lower eyelid P_6 be Dis_H; the blink discrimination parameter is then calculated as Blink = Dis_H / Dis_W.
By calculating the blink discrimination parameter Blink periodically (for example at 10 Hz) and plotting its variation over time, the blink state change diagram can be obtained; see Figure 11 and the sketch below.
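A sketch of the blink statistics derived from the eye key points; the open/closed threshold on Blink, the 10 Hz sampling rate and the landmark naming are illustrative assumptions:

```python
import numpy as np

def blink_parameter(p1, p2, p4, p6):
    """Blink = Dis_H / Dis_W, with Dis_W = |P1 - P4| (inner to outer canthus)
    and Dis_H = |P2 - P6| (upper to lower eyelid)."""
    dis_w = np.linalg.norm(np.asarray(p1, dtype=float) - np.asarray(p4, dtype=float))
    dis_h = np.linalg.norm(np.asarray(p2, dtype=float) - np.asarray(p6, dtype=float))
    return dis_h / dis_w

def blink_stats(blink_series, fs=10.0, closed_threshold=0.15):
    """blink_series: Blink values sampled at fs Hz.

    Returns (blinks per minute, longest continuous eye-closure duration Time_close in seconds).
    """
    closed = np.asarray(blink_series) < closed_threshold
    blink_count = int(np.sum(~closed[:-1] & closed[1:]))   # open -> closed transitions
    minutes = len(blink_series) / fs / 60.0
    longest_run, run = 0, 0
    for is_closed in closed:                                # longest run of closed-eye samples
        run = run + 1 if is_closed else 0
        longest_run = max(longest_run, run)
    return blink_count / max(minutes, 1e-9), longest_run / fs
```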
4. From the eye-closure duration Time_close, it can be judged whether the monitored person is working while fatigued, and a judgment can be made on the abnormal state of the person to be detected.
6. After the detection results are aggregated, the distraction state of the person to be detected can be analyzed through a weighted calculation.
In addition, let the gaze trajectory detection result be Result_traj, the hand micro-behavior detection result be Result_action, and the blink detection result be Result_blink; the detection result of the whole system is then
Result = α × Result_traj + β × Result_action + (1 - α - β) × Result_blink;
where 0 ≤ α ≤ 1, 0 ≤ β ≤ 1 and 0 ≤ α + β ≤ 1.
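A sketch of this final weighted fusion; the individual results are assumed to already be normalized scores in [0, 1] with 1 meaning "working", and the decision threshold is an illustrative choice:

```python
def fuse_results(result_traj, result_action, result_blink,
                 alpha=0.4, beta=0.3, threshold=0.5):
    """Result = alpha*Result_traj + beta*Result_action + (1-alpha-beta)*Result_blink,
    with 0 <= alpha <= 1, 0 <= beta <= 1 and alpha + beta <= 1 (weights can be tuned experimentally).
    Returns the fused score and whether the person is judged to be in a working state."""
    assert 0.0 <= alpha <= 1.0 and 0.0 <= beta <= 1.0 and alpha + beta <= 1.0
    score = alpha * result_traj + beta * result_action + (1.0 - alpha - beta) * result_blink
    return score, score >= threshold
```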
The concentration monitoring method provided by the present invention differs from previously common contact-type distraction detection methods: it adopts a non-contact detection method, does not require the person to be tested to wear a device, and only needs non-contact equipment such as cameras to collect image information, giving better concealment and convenience. The state of the person to be tested is judged in multiple ways, including micro-behavior recognition, gaze trajectory and screen attention heat map detection, and blink feature detection, and the concentration of the person is analyzed comprehensively through a weighted calculation (the weight of each indicator can be determined experimentally). The detection result obtained in this way is more accurate than that of a single judgment method. Compared with existing car driver distraction detection methods, the concentration monitoring method provided by the present invention has better concealment and convenience and higher detection accuracy, and is applicable to personnel distraction detection in operating environments such as aviation dispatching and nuclear power safety control.
It should be noted that the concentration monitoring method of the present invention uses non-contact devices such as cameras to collect the required image information, for example images of the hands of the person to be tested and of the keyboard and mouse for micro-behavior recognition detection, images of the eyes of the person for blink detection, and images of the face for gaze trajectory detection. A preliminary judgment is made according to whether the hands of the person appear in the micro-behavior recognition detection area; a further judgment is made according to the activity state of the hands appearing in that area; a further judgment is made by comparing the gaze trajectory of the person on the monitor screen with the screen attention heat map; a further judgment is made by comparing the gaze trajectory with the prior knowledge; a further judgment is made according to the blink frequency of the person; and finally the detection results are collected and a weighted calculation is performed to judge the current state of the person (working state or distracted state). By combining multiple detection means, the present invention can avoid many misjudgments and missed judgments and effectively improves detection accuracy. In particular, during micro-behavior recognition, hands that are not operating the keyboard and mouse may or may not indicate distraction; directly judging the person's state by a traditional single criterion (for example, judging whether the person is working or distracted only by whether the hands are operating the keyboard and mouse) would obviously lead to misjudgments, whereas the present invention considers multiple possibilities and, through multi-stage detection and step-by-step optimization, makes distraction detection more accurate.
In addition, this embodiment also provides a concentration monitoring system, which includes a storage unit and a computing unit; the storage unit stores a concentration monitoring program, and the concentration monitoring program is run by the computing unit to execute the steps of the concentration monitoring method described above.
The above embodiment is a preferred implementation of the present invention; in addition, the present invention can also be implemented in other ways, and any obvious substitution made without departing from the concept of the technical solution falls within the protection scope of the present invention.
In order to allow those of ordinary skill in the art to understand the improvements of the present invention over the prior art more conveniently, some drawings and descriptions of the present invention have been simplified, and for the sake of clarity some other elements have been omitted from this application document; those of ordinary skill in the art should realize that these omitted elements may also constitute part of the content of the present invention.
Claims (10)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310222460.8A CN117197880B (en) | 2023-03-09 | 2023-03-09 | Concentration monitoring method and system |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310222460.8A CN117197880B (en) | 2023-03-09 | 2023-03-09 | Concentration monitoring method and system |
Publications (2)
Publication Number | Publication Date |
---|---|
CN117197880A (en) | 2023-12-08 |
CN117197880B (en) | 2025-02-11 |
Family
ID=88987521
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202310222460.8A (granted as CN117197880B, active) | Concentration monitoring method and system | 2023-03-09 | 2023-03-09 |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN117197880B (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN118135621A (en) * | 2024-04-10 | 2024-06-04 | 连云港智源电力设计有限公司 | Intelligent building site management and control method and system based on machine vision technology |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107818310A (en) * | 2017-11-03 | 2018-03-20 | 电子科技大学 | A kind of driver attention's detection method based on sight |
CN112836630A (en) * | 2021-02-01 | 2021-05-25 | 清华大学深圳国际研究生院 | Attention detection system and method based on CNN |
WO2021098454A1 (en) * | 2019-11-21 | 2021-05-27 | 深圳云天励飞技术股份有限公司 | Region of concern detection method and apparatus, and readable storage medium and terminal device |
CN113688733A (en) * | 2021-08-25 | 2021-11-23 | 深圳龙岗智能视听研究院 | Eye detection and tracking method, system, equipment and application based on event camera |
CN113743471A (en) * | 2021-08-05 | 2021-12-03 | 暨南大学 | Driving evaluation method and system |
CN113780051A (en) * | 2021-06-29 | 2021-12-10 | 华为技术有限公司 | Methods and devices for assessing students' concentration |
CN114998870A (en) * | 2022-05-31 | 2022-09-02 | 福思(杭州)智能科技有限公司 | Driving behavior state recognition method, device, equipment and storage medium |
WO2023017595A1 (en) * | 2021-08-12 | 2023-02-16 | 三菱電機株式会社 | Occupant state determining device, occupant state determining method, and occupant state determining system |
Also Published As
Publication number | Publication date |
---|---|
CN117197880B (en) | 2025-02-11 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Liu et al. | Effects of dataset characteristics on the performance of fatigue detection for crane operators using hybrid deep neural networks | |
CN103340637B (en) | Move and driver's Alertness intelligent monitor system of brain electro' asion and method based on eye | |
EP1799105B1 (en) | System and method for mental workload measurement based on rapid eye movement | |
US9566004B1 (en) | Apparatus, method and system for measuring repetitive motion activity | |
CN102749991B (en) | A kind of contactless free space sight tracing being applicable to man-machine interaction | |
CN110221699A (en) | A kind of eye movement Activity recognition method of front camera video source | |
CN114742090B (en) | Cabin man-machine interaction system based on mental fatigue monitoring | |
CN110495895A (en) | A fatigue detection method and system based on eye movement tracking | |
CN114821753B (en) | Eye movement interaction system based on visual image information | |
CN112000227A (en) | Working state monitoring feedback system | |
CN115937928A (en) | Learning status monitoring method and system based on multi-visual feature fusion | |
CN117197880A (en) | Concentration monitoring method and system | |
Mehmood et al. | Non-invasive detection of mental fatigue in construction equipment operators through geometric measurements of facial features | |
Alagarsamy et al. | Control the movement of mouse using computer vision technique | |
Parmar et al. | Facial-feature based Human-Computer Interface for disabled people | |
Sivaramakrishnan et al. | Eye‐Based Cursor Control and Eye Coding Using Hog Algorithm and Neural Network | |
Zhou et al. | Driver fatigue tracking and detection method based on OpenMV | |
CN114399752A (en) | Eye movement multi-feature fusion fatigue detection system and method based on micro eye jump characteristics | |
Yunardi et al. | Visual and gyroscope sensor for head movement controller system on meal-assistance application | |
JP5181060B2 (en) | calculator | |
Li et al. | EM-SAM: Eye-Movement-Guided Segment Anything Model for Object Detection and Recognition in Complex Scenes | |
Matsuno et al. | An analysis method for eye motion and eye blink detection from colour images around ocular region | |
CN116189101B (en) | Method and system for identifying, judging and guiding visual operation specification of security inspector | |
Rashidan et al. | Frontal Face Tracking in the Thermal Infrared Imaging for Autism Spectrum Disorder Children | |
Aditia et al. | COMPARISON OF ROBUSTNESS TEST RESULTS OF THE EYE ASPECT RATIO METHOD AND IRIS-SCLERA PATTERN ANALYSIS TO DETECT DROWSINESS WHILE DRIVING |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |
GR01 | Patent grant | |