CN119924833A - Non-contact mental state recognition method, device, robot and medium - Google Patents
- Publication number: CN119924833A (application CN202411994012.0A)
- Authority: CN (China)
- Legal status: Pending
- Prior art keywords: detection period, fluctuation, subject, pulse rate, eye
- Classifications: Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms; Measuring Pulse, Heart Rate, Blood Pressure Or Blood Flow
Abstract
An embodiment of the application provides a non-contact psychological state identification method, device, robot and medium, belonging to the technical field of artificial intelligence. The method comprises: acquiring a plurality of facial skin images of a subject in each detection period, and acquiring pulse rate data of the subject in each detection period from the plurality of facial skin images of that period; acquiring a plurality of eye images of the subject in each detection period, and acquiring eye change data of the subject in each detection period from the plurality of eye images of that period; generating a vital sign fluctuation rate of the subject from the pulse rate data and the eye change data of each detection period; determining the fluctuation deviation between the two vital sign fluctuation rates corresponding to two adjacent detection periods; and identifying the psychological state of the subject according to the fluctuation deviation. In this way, the subject's vital sign fluctuation rate is obtained by combining eye change data with pulse rate data, the psychological state is identified from it, and the accuracy of psychological state assessment is improved.
Description
Technical Field
The application relates to the technical field of artificial intelligence, and in particular to a non-contact psychological state identification method, device, robot and medium.
Background
In general, under physical stress the pulse rate and blood pressure rise and the blink frequency increases, while during intense concentration vital signs and blink frequency fall below their everyday levels. At present, however, the industry has no solution for assessing a user's psychological state with a non-contact instrument.
Disclosure of Invention
To solve this technical problem, embodiments of the application provide a non-contact psychological state identification method, device, robot and medium.
In a first aspect, an embodiment of the present application provides a non-contact mental state recognition method, where the method includes:
acquiring a plurality of facial skin images of a subject in each detection period through a camera; acquiring pulse rate data of the subject in each detection period according to the plurality of facial skin images of each detection period;
acquiring a plurality of eye images of the subject in each detection period through an eye tracker; acquiring eye change data of the subject in each detection period according to a plurality of eye images in each detection period;
Generating vital sign fluctuation rates of the subject in each detection period according to the pulse rate data and the eye change data in each detection period;
determining the fluctuation deviation between the two vital sign fluctuation rates corresponding to two adjacent detection periods;
identifying the psychological state of the subject according to the fluctuation deviation.
In one embodiment, the acquiring pulse rate data of the subject in each detection period according to the plurality of facial skin images in each detection period includes:
acquiring green light image data corresponding to each facial skin image;
acquiring volume pulse waves of each detection period based on the green light image data;
and determining the pulse rate data of each detection period according to each volume pulse wave.
In an embodiment, the generating the vital sign fluctuation rate of the subject in each detection period according to the pulse rate data and the eye change data in each detection period includes:
Analyzing the pulse rate data in each detection period to obtain pulse rate fluctuation parameters of each detection period;
analyzing the eye change data of each detection period to obtain pupil fluctuation parameters and blink frequency comparison results of each detection period;
and carrying out fusion calculation according to the pulse rate fluctuation parameter, the pupil fluctuation parameter and the blink frequency comparison result to obtain the vital sign fluctuation rate in the detection period.
In an embodiment, the pulse rate fluctuation parameter comprises a pulse rate change rate and a pulse rate stability comparison result, and the pupil fluctuation parameter comprises a normal eye movement target contrast and a movement time interval of the eye movement target; the fusion calculation according to the pulse rate fluctuation parameter, the pupil fluctuation parameter and the blink frequency comparison result comprises the following step:
performing a weighted or logarithmic calculation on the pulse rate change rate, pulse rate stability comparison result, normal eye movement target contrast, movement time interval of the eye movement target and blink frequency comparison result of each detection period.
In one embodiment, the non-contact mental state recognition method further comprises: collecting voice feature data of the subject in each detection period, and obtaining voice fluctuation data according to the voice feature data of each detection period;
Generating a vital sign fluctuation rate of the subject in each detection period according to the pulse rate data and the eye change data in each detection period, including:
And generating vital sign fluctuation rates of the subject in each detection period according to the pulse rate data, the eye change data and the voice fluctuation data in each detection period.
In one embodiment, the identifying the psychological state of the subject according to the fluctuation deviation comprises:
judging whether the fluctuation deviation is greater than or equal to a preset fluctuation deviation threshold;
If the fluctuation deviation is greater than or equal to the preset fluctuation deviation threshold, determining that the subject is in a stress psychological state;
And if the fluctuation deviation is smaller than the preset fluctuation deviation threshold, determining that the subject is in a stable psychological state.
In one embodiment, the non-contact mental state recognition method further comprises determining an emotion fluctuation phase of the subject according to the fluctuation deviation;
And selecting a target emotion control strategy from preset emotion stability control strategies according to the emotion fluctuation stage, and executing the target emotion control strategy.
In a second aspect, an embodiment of the present application provides a non-contact mental state recognition apparatus, the apparatus including:
the first acquisition module is used for acquiring a plurality of facial skin images of a subject in each detection period through a camera, and for acquiring pulse rate data of the subject in each detection period according to the plurality of facial skin images of each detection period;
The second acquisition module is used for acquiring a plurality of eye images of the subject in each detection period through an eye tracker; acquiring eye change data of the subject in each detection period according to a plurality of eye images in each detection period;
the generation module is used for generating vital sign fluctuation rate of the subject in each detection period according to the pulse rate data and the eye change data in each detection period;
the determining module is used for determining fluctuation deviation of the fluctuation rates of the two vital signs corresponding to the two adjacent detection periods;
and the identification module is used for identifying the psychological state of the subject according to the fluctuation deviation.
In a third aspect, an embodiment of the present application provides a robot comprising a memory and a processor, where the memory is configured to store a computer program that, when executed by the processor, performs the non-contact mental state recognition method provided in the first aspect.
In a fourth aspect, an embodiment of the present application provides a computer readable storage medium storing a computer program which, when run on a processor, performs the non-contact mental state recognition method provided in the first aspect.
The non-contact psychological state identification method, device, robot and medium provided by the application acquire a plurality of facial skin images of a subject in each detection period through a camera and derive pulse rate data for each period from those images; acquire a plurality of eye images of the subject in each detection period through an eye tracker and derive eye change data for each period from those images; generate a vital sign fluctuation rate of the subject for each detection period from the pulse rate data and eye change data; determine the fluctuation deviation between the two vital sign fluctuation rates of two adjacent detection periods; and identify the psychological state of the subject from that deviation. By combining the collected eye change data and pulse rate data into a vital sign fluctuation rate of the current subject, the psychological state of the subject is identified, the accuracy of psychological state assessment is improved, and the method can be applied to clinical psychological assessment, saving labor costs.
Drawings
In order to more clearly illustrate the technical solutions of the present application, the drawings that are required for the embodiments will be briefly described, it being understood that the following drawings only illustrate some embodiments of the present application and therefore should not be considered as limiting the scope of the present application. Like elements are numbered alike in the various figures.
Fig. 1 is a schematic flow chart of a non-contact mental state recognition method according to an embodiment of the present application;
FIG. 2 is a schematic diagram showing the spectral generation depth of volume pulse waves according to an embodiment of the present application;
Fig. 3 is another flow chart of a non-contact mental state recognition method according to an embodiment of the present application;
FIG. 4 is a schematic flow chart of a non-contact mental state recognition method according to an embodiment of the present application;
fig. 5 is a schematic structural diagram of a non-contact mental state recognition device according to an embodiment of the present application.
Reference numerals: 500 - non-contact psychological state identification device; 501 - first acquisition module; 502 - second acquisition module; 503 - generation module; 504 - determination module; 505 - identification module.
Detailed Description
The following description of the embodiments of the present application will be made clearly and completely with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some embodiments of the present application, but not all embodiments.
The components of the embodiments of the present application generally described and illustrated in the figures herein may be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the application, as presented in the figures, is not intended to limit the scope of the application, as claimed, but is merely representative of selected embodiments of the application. All other embodiments, which can be made by a person skilled in the art without making any inventive effort, are intended to be within the scope of the present application.
The terms "comprises," "comprising," "including," or any other variation thereof, as used in the various embodiments of the present application, are intended to cover the presence of a stated feature, number, step, operation, element, component, or combination thereof, and do not preclude the presence or addition of one or more other features, numbers, steps, operations, elements, components, or combinations thereof.
Furthermore, the terms "first," "second," "third," and the like are used merely to distinguish between descriptions and should not be construed as indicating or implying relative importance.
Unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which the various embodiments of the application belong. Terms such as those defined in commonly used dictionaries are to be interpreted as having a meaning consistent with their meaning in the context of the relevant art, and are not to be interpreted in an idealized or overly formal sense unless expressly so defined herein.
Example 1
The embodiment of the application provides a non-contact psychological state identification method, which is used for acquiring eye change data and pulse rate change data of a subject, acquiring vital sign fluctuation rate of the current subject by combining the eye change data and the pulse rate change data, identifying psychological states of the subject and improving psychological state assessment accuracy.
In this embodiment, the non-contact mental state recognition method may be applied to a robot that includes a camera, an eye tracker and a display screen; the camera may be a far infrared camera. When the subject faces the robot, the robot controls the camera and the eye tracker to collect the corresponding data, and controls the display screen to perform the relevant display. The non-contact mental state recognition method is described below with reference to fig. 1.
Referring to fig. 1, the non-contact mental state recognition method includes:
Step S101, acquiring a plurality of facial skin images of the subject in each detection period through a camera, and acquiring pulse rate data of the subject in each detection period according to the plurality of facial skin images of the subject in each detection period.
In this embodiment, facial skin images of the subject are acquired by the robot's far infrared camera, and pulse rate data are derived from them. When the light beam emitted by the far infrared camera reaches the surface of the facial skin, blood in the facial skin absorbs and attenuates the beam, and the amount of attenuation depends on the blood volume. Under the action of the heartbeat, the blood volume of the arterial vessels in the facial skin changes, and the attenuation of the irradiating beam fluctuates accordingly, so the subject's heartbeat and respiratory rate can be obtained indirectly by detecting the change in light intensity reflected from the skin (i.e., the change in image brightness values) with a camera. Pulse rate data such as heartbeat and respiratory rate can therefore be measured from a time-lapse image sequence. For example, face images are collected continuously for 30 seconds; a specific region of the cheek is cropped from each frame and its average brightness computed, yielding a brightness time series that is processed in turn by first-order differencing, low-pass filtering and model-based power spectrum analysis; the two prominent peaks of the resulting power spectrum correspond to the heartbeat and respiratory frequencies respectively. The absorbed light may be red, green or blue light.
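As a rough illustration of the pipeline just described (first-order difference, low-pass filtering, power spectrum analysis, two spectral peaks), the following Python sketch recovers heartbeat and respiratory frequency from a synthetic brightness trace. The 30 fps sampling rate, moving-average filter window and frequency search bands are assumptions chosen for the example, not values fixed by the text.

```python
import numpy as np

def pulse_and_resp_freq(brightness, fs):
    """Estimate heartbeat and respiratory frequency (Hz) from a cheek-region
    brightness time series: first-order difference -> low-pass filter ->
    power spectrum -> pick one peak per physiological band."""
    x = np.diff(brightness)                    # first-order difference removes slow drift
    w = max(1, int(0.25 * fs))                 # ~0.25 s moving-average low-pass filter
    x = np.convolve(x, np.ones(w) / w, mode="same")
    spec = np.abs(np.fft.rfft(x)) ** 2         # power spectrum
    freqs = np.fft.rfftfreq(x.size, d=1.0 / fs)
    # assumed physiological bands: respiration 0.1-0.5 Hz, heartbeat 0.7-3 Hz
    resp_band = (freqs >= 0.1) & (freqs <= 0.5)
    heart_band = (freqs >= 0.7) & (freqs <= 3.0)
    resp_f = freqs[resp_band][np.argmax(spec[resp_band])]
    heart_f = freqs[heart_band][np.argmax(spec[heart_band])]
    return heart_f, resp_f

# Synthetic 30 s brightness trace: 72 bpm pulse plus 15 breaths/min respiration
fs = 30.0                                      # assumed 30 fps camera
t = np.arange(0, 30, 1 / fs)
brightness = 100 + 0.5 * np.sin(2 * np.pi * 1.2 * t) + 2.0 * np.sin(2 * np.pi * 0.25 * t)
hf, rf = pulse_and_resp_freq(brightness, fs)
print(hf * 60, rf * 60)                        # approximate pulse (bpm) and respiration (per min)
```

Separating the two peaks by physiological band rather than global magnitude keeps the estimate robust even though differencing attenuates the slower respiratory component.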
Referring to fig. 2, the spectral generation depth of the volume pulse wave is larger than its spectral penetration depth. This means that, in the visible-infrared band, the volume pulse wave is indeed generated by the pulsatile nature of the dermis. It also means that the volume pulse wave generation depth of blue light is smaller than that of green and red light, so the blue-light volume pulse wave is modulated less by the arterial blood pulse and is more susceptible to noise; this is one reason current physiological testing devices rarely use blue light. The generation depth of red light is large, but its volume pulse wave is modulated less by blood absorption than that of green light, so the magnitude of the red-light volume pulse wave is smaller than that of green light. From the spectral generation depth of the volume pulse wave it therefore follows that, within the visible-infrared band, green light is the most suitable for effectively measuring the volume pulse wave.
In one embodiment, acquiring pulse rate data of the subject in each detection period according to the plurality of facial skin images comprises: acquiring the green light image data corresponding to each facial skin image; acquiring the volume pulse wave of each detection period based on the green light image data; and determining the pulse rate data of each detection period from each volume pulse wave.
For example, the detection period may be 15 seconds, and the weighted average pulse rate over those 15 seconds is taken as the pulse rate data. The detection period may be set according to the actual application, e.g., 30, 45 or 60 seconds; in this embodiment the default is 15 seconds.
In this way, by adopting the green light algorithm, a more accurate volume pulse wave can be obtained, improving the accuracy of the pulse rate data.
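A minimal sketch of the green-channel step above, assuming frames arrive as RGB arrays: the green channel of a cheek region of interest is averaged per frame to form the raw volume-pulse-wave signal. The ROI coordinates and the function name are purely illustrative, not from the text.

```python
import numpy as np

def green_ppg_signal(frames, roi=(slice(100, 160), slice(80, 140))):
    """frames: iterable of HxWx3 RGB arrays.
    Returns a 1-D series of mean green-channel values over the cheek ROI,
    which serves as the raw volume pulse wave for the detection period."""
    return np.array([f[roi[0], roi[1], 1].mean() for f in frames])

# Two dummy 200x200 RGB frames with uniform green intensity
frames = [np.full((200, 200, 3), v, dtype=float) for v in (10.0, 20.0)]
signal = green_ppg_signal(frames)
```

The resulting series would then be fed to the spectral analysis of step S101 to obtain the pulse rate of the period.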
Step S102, acquiring a plurality of eye images of the subject in each detection period through an eye tracker, and acquiring eye change data of the subject in each detection period according to the plurality of eye images of the subject in each detection period.
It will be appreciated that an eye tracker is an instrument that recognizes the pupil position and blink state of a user in real time. The eye tracker includes a far infrared camera that tracks the subject's pupil in real time during each detection period and records the corresponding pupil position changes and other eye change data, such as the subject's blink frequency within the period.
For example, the eye tracker acquires eye images of the subject at a frequency of 120 Hz, i.e., 120 eye images per second, from which the subject's eye change data are obtained, including pupil position, the time intervals of pupil displacement, and the number of blinks during the detection period. The detection period of the eye tracker is the same as that of the camera: when the camera's detection period is 15 seconds, the eye tracker's detection period is also 15 seconds, and the pupil positions, pupil-displacement time intervals and blink count over those 15 seconds are taken as the eye change data of the period.
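One way the 120 Hz frame stream could be condensed into the eye change data named above (blink count and pupil-displacement intervals) is sketched below. The per-frame representation (pupil centre, or None while the eyelid occludes the pupil), the 5-pixel displacement threshold and the function name are all assumptions for illustration.

```python
def eye_change_data(pupil_positions, fs=120.0):
    """pupil_positions: per-frame (x, y) pupil centre, or None during a blink.
    Returns (blink count, list of time intervals between pupil displacements)."""
    blinks = 0
    in_blink = False
    move_intervals = []
    last_move_t = None
    prev = None
    for i, p in enumerate(pupil_positions):
        if p is None:                      # pupil lost -> a blink is in progress
            if not in_blink:
                blinks += 1
                in_blink = True
            continue
        in_blink = False
        if prev is not None:
            dx = ((p[0] - prev[0]) ** 2 + (p[1] - prev[1]) ** 2) ** 0.5
            if dx > 5.0:                   # displacement threshold in pixels (assumed)
                t = i / fs
                if last_move_t is not None:
                    move_intervals.append(t - last_move_t)
                last_move_t = t
        prev = p
    return blinks, move_intervals

# Synthetic trace: fixation, a saccade, one blink, fixation, a second saccade
frames = ([(0.0, 0.0)] * 5 + [(50.0, 0.0)] * 5 + [None] * 6
          + [(50.0, 0.0)] * 5 + [(100.0, 0.0)] * 3)
blinks, intervals = eye_change_data(frames)
```

Consecutive None frames are counted as a single blink, matching how an eye tracker reports one blink per eyelid closure rather than per dropped frame.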
Step S103, generating a vital sign fluctuation rate of the subject in each detection period according to the pulse rate data and the eye change data in each detection period.
Referring to fig. 3, step S103 includes:
Step S1031, analyzing the pulse rate data in each detection period to obtain pulse rate fluctuation parameters of each detection period.
In this embodiment, the pulse rate fluctuation parameter includes a pulse rate change rate and a pulse rate stability comparison result. The quotient of the pulse rate difference and the detection period may be taken as the pulse rate change rate, and the pulse rate value within the detection period may be compared with a normal pulse rate threshold to obtain the pulse rate stability comparison result. The normal pulse rate threshold is set according to the subject's age; for example, it may be set to a reference range of 50-100 beats per minute.
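The two pulse rate fluctuation parameters just described can be sketched as follows. The 50-100 beats-per-minute range follows the text; the function name and the 0/1 flag convention for the stability comparison are assumptions.

```python
def pulse_rate_fluctuation(prev_rate, cur_rate, period_s, lo=50.0, hi=100.0):
    """Returns (pulse rate change rate in bpm/s, stability comparison flag).
    Change rate = quotient of the pulse rate difference and the detection
    period; the flag is 0 when the current rate lies in the normal range."""
    change_rate = (cur_rate - prev_rate) / period_s
    stability = 0 if lo <= cur_rate <= hi else 1
    return change_rate, stability
```

For a 15-second period in which the rate moves from 72 to 78 bpm, this yields a change rate of 0.4 bpm/s with the rate still inside the normal range.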
Step S1032, analyzing the eye change data of each detection period to obtain the comparison result of pupil fluctuation parameters and blink frequency of each detection period.
The pupil fluctuation parameters include the normal eye movement target contrast and the movement time interval of the eye movement target. The blink frequency within the detection period is compared with a normal blink frequency threshold to obtain the blink frequency comparison result, where the normal blink frequency threshold defaults to a reference range of 15-20 blinks per minute.
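The blink frequency comparison can be sketched in the same spirit; the 15-20 blinks-per-minute range follows the text, while the -1/0/1 flag convention and function name are assumptions.

```python
def blink_frequency_flag(blinks, period_s, lo=15.0, hi=20.0):
    """Compare the blink frequency of one detection period with the normal
    range. Returns -1 below the range, 1 above it, 0 within it."""
    per_min = blinks * 60.0 / period_s
    if per_min < lo:
        return -1
    if per_min > hi:
        return 1
    return 0
```

Four blinks in a 15-second period (16 per minute) fall inside the normal range, while seven blinks (28 per minute) exceed it.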
Step S1033, performing fusion calculation according to the pulse rate fluctuation parameter, the pupil fluctuation parameter and the blink frequency comparison result, to obtain the vital sign fluctuation rate in the detection period.
In one embodiment, step S1033 comprises performing a weighted or logarithmic calculation on the pulse rate change rate, pulse rate stability comparison result, normal eye movement target contrast, movement time interval of the eye movement target and blink frequency comparison result of each detection period.
In this embodiment, the pulse rate change rate and blink frequency comparison result are used as main weighting parameters, the pulse rate stationarity comparison result, the normal eye movement target contrast and the movement time interval of the eye movement target are used as auxiliary weighting parameters, and the weighting calculation is performed.
Exemplary, the weighting calculation is performed on the pulse rate change rate, pulse rate stability comparison result, normal eye movement target contrast, movement time interval of the eye movement target and blink frequency comparison result in each detection period to obtain the vital sign fluctuation rate in the detection period, which comprises the following steps:
Setting a first main weighting parameter of the pulse rate change rate, a second main weighting parameter of the blink frequency, a first auxiliary weighting parameter of the pulse rate stability comparison result, a second auxiliary weighting parameter of the normal eye movement target contrast and a third auxiliary weighting parameter of the movement time interval of the eye movement target;
And carrying out weighted calculation according to the pulse rate change rate, the pulse rate stability comparison result, the normal eye movement target contrast, the movement time interval of the eye movement target and the blink frequency comparison result, the first main weighting parameter, the second main weighting parameter, the first auxiliary weighting parameter, the second auxiliary weighting parameter and the third auxiliary weighting parameter, and taking the weighted sum value as the vital sign fluctuation rate in the detection period.
In one embodiment, the logarithmic calculation takes the logarithm of each of the pulse rate change rate, the pulse rate stability comparison result, the normal eye movement target contrast, the movement time interval of the eye movement target and the blink frequency comparison result of each detection period, and adds the results; the resulting sum is taken as the vital sign fluctuation rate of the detection period.
It should be noted that, to improve the accuracy of the fluctuation rate, external change data of the subject may be obtained from other dimensions; for example, voice feature data of the subject may be collected and analyzed for fluctuation. The non-contact psychological state recognition method thus further comprises: collecting voice feature data of the subject in each detection period, and obtaining voice fluctuation data from the voice feature data of each period.
In one embodiment, step S103 may include generating a vital sign fluctuation rate of the subject for each of the detection periods based on the pulse rate data, the eye change data, and the voice fluctuation data for each of the detection periods.
It should be noted that the voice fluctuation data may include a voice amplitude fluctuation value, a voice frequency fluctuation value, and the like; the pulse rate change rate, pulse rate stability comparison result, normal eye movement target contrast, movement time interval of the eye movement target, blink frequency comparison result, voice amplitude fluctuation value and voice frequency fluctuation value of each detection period may then be combined by weighted or logarithmic calculation to obtain the vital sign fluctuation rate of each period.
Step S104, determining fluctuation deviation of the fluctuation rates of the two vital signs corresponding to the two adjacent detection periods.
Here the two adjacent detection periods are a first detection period and a second detection period, and the difference between the first vital sign fluctuation rate of the first period and the second vital sign fluctuation rate of the second period is taken as the fluctuation deviation.
Step S105, identifying a psychological state of the subject according to the fluctuation deviation.
It will be appreciated that the greater the fluctuation deviation, the more stressed the subject's psychological state; the smaller the fluctuation deviation, the more stable it is.
Referring to fig. 4, step S105 includes:
Step S1051, determining whether the fluctuation deviation is greater than or equal to a preset fluctuation deviation threshold.
In this embodiment, the preset fluctuation deviation threshold may be a fluctuation deviation ratio; for example, it may be set to 20%, though other ratios are possible and no limitation is intended here.
Step S1052, if the fluctuation deviation is greater than or equal to the preset fluctuation deviation threshold, determining that the subject is in a stress psychological state.
Step S1053, if the fluctuation deviation is smaller than the preset fluctuation deviation threshold, determining that the subject is in a steady psychological state.
For example, with a preset fluctuation deviation threshold of 20%, a calculated fluctuation deviation of 20% or more indicates large psychological fluctuation and a stress psychological state, while a deviation below 20% indicates small psychological fluctuation and a stable psychological state.
In this embodiment, when the fluctuation deviation is greater than or equal to the preset fluctuation deviation threshold and the subject is determined to be in a stress psychological state, an emotion stabilization control strategy needs to be started.
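Steps S104-S105 together can be sketched as below: the fluctuation deviation between two adjacent detection periods is compared against the 20% threshold from the text. Treating the deviation as a relative difference between the two rates is one possible reading, since the text describes the threshold as a ratio; the function name is illustrative.

```python
def classify_state(rate_prev, rate_cur, threshold=0.20):
    """Compute the fluctuation deviation between the vital sign fluctuation
    rates of two adjacent detection periods (as a relative difference) and
    classify the subject's psychological state against the threshold."""
    deviation = abs(rate_cur - rate_prev) / max(abs(rate_prev), 1e-9)
    return "stress" if deviation >= threshold else "stable"
```

A jump from 0.50 to 0.65 (30% relative change) crosses the 20% threshold and is classified as stress, while 0.50 to 0.55 (10%) remains stable.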
In the embodiment, the non-contact psychological state recognition method further comprises the steps of determining an emotion fluctuation stage of the subject according to the fluctuation deviation, selecting a target emotion control strategy from preset emotion stability control strategies according to the emotion fluctuation stage, and executing the target emotion control strategy.
The subject's emotion fluctuation stage is identified in real time; it may be an emotion stability stage, an emotion fluctuation rising stage or an emotion fluctuation falling stage, and a corresponding preset emotion stabilization control strategy, providing matching voice playback or human-machine visual content, is set for each stage. For example, during the emotion fluctuation rising stage, soothing audio or video is played to ease the subject's stress psychological state. This greatly improves the robot's human likeness and its human-machine interaction capability. The non-contact psychological state identification method can further serve psychological assessment in special settings such as medical clinical auxiliary diagnosis and criminal investigation, deepening the understanding of the subject's psychological state.
The non-contact psychological state recognition method provided by this embodiment acquires a plurality of facial skin images of the subject in each detection period through a camera and derives pulse rate data for each period from those images; acquires a plurality of eye images of the subject in each detection period through an eye tracker and derives eye change data for each period from those images; generates a vital sign fluctuation rate of the subject for each detection period from the pulse rate data and eye change data; determines the fluctuation deviation between the vital sign fluctuation rates of two adjacent detection periods; and identifies the psychological state of the subject from that deviation. By combining the collected eye change data and pulse rate data into a vital sign fluctuation rate of the current subject, the psychological state of the subject is identified, the accuracy of psychological state assessment is improved, and the method can be applied to clinical psychological assessment, saving labor costs.
Example 2
In addition, the embodiment of the application provides a non-contact psychological state recognition device.
As shown in fig. 5, the non-contact mental state recognition apparatus 500 includes:
The first acquisition module 501 is configured to acquire, through a camera, a plurality of facial skin images of the subject in each detection period, and to acquire pulse rate data of the subject in each detection period according to the plurality of facial skin images of each detection period;
The second acquisition module 502 is configured to acquire, through an eye tracker, a plurality of eye images of the subject in each detection period, and to acquire eye change data of the subject in each detection period according to the plurality of eye images of each detection period;
A generating module 503, configured to generate a vital sign fluctuation rate of the subject in each detection period according to the pulse rate data and the eye change data in each detection period;
A determining module 504, configured to determine fluctuation deviations of the two vital sign fluctuation rates corresponding to two adjacent detection periods;
An identification module 505 for identifying a psychological state of the subject based on the fluctuation deviation.
In an embodiment, the first obtaining module 501 is further configured to obtain green image data corresponding to each of the facial skin images;
acquiring volume pulse waves of each detection period based on the green light image data;
and determining the pulse rate data of each detection period according to each volume pulse wave.
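The green-channel pulse extraction in this embodiment can be illustrated with a minimal sketch. The brightness of the facial skin's green channel approximates a photoplethysmographic (volume pulse) wave, and pulse rate follows from counting its peaks. The function name, naive peak detection, and frame representation are assumptions for illustration; a real remote-PPG pipeline would add detrending and band-pass filtering.

```python
from statistics import mean

def pulse_rate_from_green(frames, fps):
    """Estimate pulse rate (beats per minute) from the mean green-channel
    value of each facial-skin frame. Each frame is given as a sequence of
    green pixel values; the per-frame means form the volume pulse wave,
    and local maxima above the signal mean are counted as beats."""
    signal = [mean(f) for f in frames]           # one green mean per frame
    base = mean(signal)
    centered = [s - base for s in signal]        # crude detrending
    peaks = sum(
        1 for i in range(1, len(centered) - 1)
        if centered[i] > 0
        and centered[i] > centered[i - 1]
        and centered[i] >= centered[i + 1]
    )
    duration_s = len(frames) / fps
    return peaks * 60.0 / duration_s
```

For a clean 1.2 Hz oscillation sampled at 30 fps over a 10 s detection period, the sketch recovers 72 beats per minute.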
In an embodiment, the generating module 503 is further configured to analyze the pulse rate data in each detection period to obtain a pulse rate fluctuation parameter of each detection period;
analyzing the eye change data of each detection period to obtain pupil fluctuation parameters and blink frequency comparison results of each detection period;
and carrying out fusion calculation according to the pulse rate fluctuation parameter, the pupil fluctuation parameter and the blink frequency comparison result to obtain the vital sign fluctuation rate in the detection period.
In an embodiment, the pulse rate fluctuation parameter includes a pulse rate change rate and a pulse rate stability comparison result, and the pupil fluctuation parameter includes a normal eye movement target contrast and a movement time interval of the eye movement target. The generating module 503 is further configured to weight or logarithmically combine the pulse rate change rate, the pulse rate stability comparison result, the normal eye movement target contrast, the movement time interval of the eye movement target, and the blink frequency comparison result in each detection period.
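The weighted variant of the fusion calculation can be sketched as below. The patent only states that the five indicators are weighted or logarithmically combined; the specific weights (and the requirement that they sum to one) are illustrative assumptions, not the patented coefficients.

```python
def vital_sign_fluctuation_rate(pulse_rate_change, pulse_stability,
                                eye_target_contrast, eye_move_interval,
                                blink_freq_ratio,
                                weights=(0.3, 0.2, 0.2, 0.15, 0.15)):
    """Weighted fusion of the five per-period indicators named in the
    text (pulse rate change rate, pulse rate stability comparison
    result, normal eye movement target contrast, movement time interval
    of the eye movement target, blink frequency comparison result) into
    a single vital sign fluctuation rate. Weights are illustrative."""
    indicators = (pulse_rate_change, pulse_stability,
                  eye_target_contrast, eye_move_interval, blink_freq_ratio)
    return sum(w * x for w, x in zip(weights, indicators))
```

Because the assumed weights sum to one, indicators that are all at a common level fuse to that same level, which keeps the rate comparable across detection periods.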
In one embodiment, the non-contact mental state recognition apparatus 500 further includes:
The third acquisition module is used for acquiring voice characteristic data of the subject in each detection period and acquiring voice fluctuation data according to the voice characteristic data in each detection period;
The generating module 503 is further configured to generate a vital sign fluctuation rate of the subject in each detection period according to the pulse rate data, the eye change data, and the voice fluctuation data in each detection period.
In one embodiment, the identifying module 505 is configured to determine whether the fluctuation deviation is greater than or equal to a preset fluctuation deviation threshold;
If the fluctuation deviation is greater than or equal to the preset fluctuation deviation threshold, determining that the subject is in a stress psychological state;
And if the fluctuation deviation is smaller than the preset fluctuation deviation threshold, determining that the subject is in a stable psychological state.
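The threshold decision described in this embodiment can be sketched directly. The 0.2 default threshold and the function name are illustrative assumptions; the patent only specifies a preset fluctuation deviation threshold.

```python
def classify_state(rate_prev, rate_curr, threshold=0.2):
    """Compare the fluctuation deviation between two adjacent detection
    periods against a preset threshold: at or above the threshold the
    subject is judged to be in a stress psychological state, below it a
    stable psychological state. The default threshold is illustrative."""
    deviation = abs(rate_curr - rate_prev)
    return "stress" if deviation >= threshold else "stable"
```

Note the boundary: a deviation exactly equal to the threshold counts as stress, matching the "greater than or equal to" wording.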
In one embodiment, the non-contact mental state recognition apparatus 500 further includes:
A processing module for determining an mood swing phase of the subject from the swing deviation;
And selecting a target emotion control strategy from preset emotion stability control strategies according to the emotion fluctuation stage, and executing the target emotion control strategy.
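The staging and strategy-selection step can be sketched as follows. The sign-based staging rule, the stability band, and the strategy names are all assumptions for illustration; the patent states only that a target strategy is selected from preset emotion stabilization control strategies according to the identified stage.

```python
def emotion_stage(deviation_sequence):
    """Map the latest signed fluctuation deviation to one of the three
    stages named in the text. A small absolute deviation is treated as
    the stability stage; otherwise the sign distinguishes the rising
    and falling fluctuation stages (an assumed simplification)."""
    latest = deviation_sequence[-1]
    if abs(latest) < 0.05:
        return "stable"
    return "rising" if latest > 0 else "falling"

# Preset emotion stabilization control strategies (illustrative names).
STRATEGIES = {
    "stable": "continue_interaction",
    "rising": "play_soothing_audio_video",
    "falling": "offer_encouraging_prompt",
}

def select_strategy(deviation_sequence):
    """Pick the target emotion control strategy for the current stage."""
    return STRATEGIES[emotion_stage(deviation_sequence)]
```

A rising deviation thus triggers soothing audio/video playback, mirroring the example given for the rising fluctuation stage in embodiment 1.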
The non-contact mental state recognition device 500 provided in this embodiment can implement the non-contact mental state recognition method provided in embodiment 1, and in order to avoid repetition, the description is omitted here.
The non-contact psychological state recognition device provided by this embodiment acquires, by a camera, a plurality of facial skin images of a subject in each detection period; obtains pulse rate data of the subject in each detection period from the facial skin images of that period; acquires, by an eye tracker, a plurality of eye images of the subject in each detection period; obtains eye change data of the subject in each detection period from the eye images of that period; generates a vital sign fluctuation rate of the subject for each detection period from the pulse rate data and eye change data of that period; determines the fluctuation deviation between the vital sign fluctuation rates of two adjacent detection periods; and recognizes the psychological state of the subject from the fluctuation deviation. By collecting both eye change data and pulse rate data and combining them into the subject's current vital sign fluctuation rate, the psychological state is recognized with improved assessment accuracy; the device can be applied to clinical psychological assessment and saves labor cost.
Example 3
Furthermore, an embodiment of the present application provides a robot including a memory and a processor, the memory storing a computer program which, when run on the processor, performs the non-contact mental state recognition method provided in embodiment 1.
The robot provided in this embodiment can implement the non-contact mental state recognition method provided in embodiment 1, and in order to avoid repetition, a detailed description is omitted here.
Example 4
The present application also provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the non-contact mental state recognition method provided by embodiment 1.
In the present embodiment, the computer-readable storage medium may be a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, an optical disc, or the like.
The computer readable storage medium provided in this embodiment can implement the non-contact mental state recognition method provided in embodiment 1, and in order to avoid repetition, the description is omitted here.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or terminal that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or terminal. Without further limitation, an element defined by the phrase "comprising a" does not exclude the presence of other like elements in a process, method, article, or terminal that comprises the element.
From the above description of the embodiments, it will be clear to those skilled in the art that the above-described embodiment methods may be implemented by means of software plus a necessary general-purpose hardware platform, or of course by hardware, though in many cases the former is the preferred embodiment. Based on such understanding, the technical solution of the present application, essentially or in the part contributing to the prior art, may be embodied in the form of a software product stored in a storage medium (e.g. ROM/RAM, magnetic disk, optical disc) comprising instructions for causing a terminal (which may be a mobile phone, a computer, a server, an air conditioner, a network device, etc.) to perform the methods according to the embodiments of the present application.
The embodiments of the present application have been described above with reference to the accompanying drawings, but the present application is not limited to the above-described embodiments, which are merely illustrative and not restrictive, and many forms may be made by those having ordinary skill in the art without departing from the spirit of the present application and the scope of the claims, which are to be protected by the present application.
Claims (10)
1. A method for non-contact mental state identification, the method comprising:
Acquiring, by a camera, a plurality of facial skin images of a subject in each detection period;
Acquiring pulse rate data of the subject in each detection period according to the plurality of facial skin images in each detection period;
acquiring a plurality of eye images of the subject in each detection period through an eye tracker; acquiring eye change data of the subject in each detection period according to a plurality of eye images in each detection period;
Generating vital sign fluctuation rates of the subject in each detection period according to the pulse rate data and the eye change data in each detection period;
Determining a fluctuation deviation of the two vital sign fluctuation rates corresponding to two adjacent detection periods;
Identifying a psychological state of the subject based on the fluctuation variance.
2. The method of claim 1, wherein the acquiring pulse rate data for the subject at each detection cycle from the plurality of facial skin images at each detection cycle comprises:
acquiring green light image data corresponding to each facial skin image;
acquiring volume pulse waves of each detection period based on the green light image data;
and determining the pulse rate data of each detection period according to each volume pulse wave.
3. The method of claim 1, wherein generating a vital sign fluctuation rate of the subject for each of the detection periods from the pulse rate data and the eye change data for each of the detection periods comprises:
Analyzing the pulse rate data in each detection period to obtain pulse rate fluctuation parameters of each detection period;
analyzing the eye change data of each detection period to obtain pupil fluctuation parameters and blink frequency comparison results of each detection period;
and carrying out fusion calculation according to the pulse rate fluctuation parameter, the pupil fluctuation parameter and the blink frequency comparison result to obtain the vital sign fluctuation rate in the detection period.
4. The method of claim 3, wherein the pulse rate fluctuation parameter comprises a pulse rate change rate and a pulse rate stability comparison result, wherein the pupil fluctuation parameter comprises a normal eye movement target contrast and a movement time interval of the eye movement target, and wherein the performing the fusion calculation according to the pulse rate fluctuation parameter, the pupil fluctuation parameter, and the blink frequency comparison result comprises:
And weighting or logarithmically calculating pulse rate change rate, pulse rate stability comparison results, normal eye movement target contrast, movement time interval of the eye movement target and blink frequency comparison results in each detection period.
5. The method as recited in claim 1, further comprising:
Collecting voice characteristic data of the subject in each detection period, and acquiring voice fluctuation data according to the voice characteristic data in each detection period;
Generating a vital sign fluctuation rate of the subject in each detection period according to the pulse rate data and the eye change data in each detection period, including:
And generating vital sign fluctuation rates of the subject in each detection period according to the pulse rate data, the eye change data and the voice fluctuation data in each detection period.
6. The method of claim 1, wherein said identifying the mental state of the subject from the fluctuation variance comprises:
judging whether the fluctuation deviation is larger than or equal to a preset fluctuation deviation threshold value or not;
If the fluctuation deviation is greater than or equal to the preset fluctuation deviation threshold, determining that the subject is in a stress psychological state;
And if the fluctuation deviation is smaller than the preset fluctuation deviation threshold, determining that the subject is in a stable psychological state.
7. The method according to claim 1, wherein the method further comprises:
Determining a mood swing phase of the subject from the swing deviation;
And selecting a target emotion control strategy from preset emotion stability control strategies according to the emotion fluctuation stage, and executing the target emotion control strategy.
8. A non-contact mental state recognition apparatus, the apparatus comprising:
A first acquisition module, configured to acquire, by a camera, a plurality of facial skin images of a subject in each detection period, and to acquire pulse rate data of the subject in each detection period according to the plurality of facial skin images in each detection period;
The second acquisition module is used for acquiring a plurality of eye images of the subject in each detection period through an eye tracker; acquiring eye change data of the subject in each detection period according to a plurality of eye images in each detection period;
the generation module is used for generating vital sign fluctuation rate of the subject in each detection period according to the pulse rate data and the eye change data in each detection period;
A determining module, configured to determine a fluctuation deviation of the two vital sign fluctuation rates corresponding to two adjacent detection periods;
and the identification module is used for identifying the psychological state of the subject according to the fluctuation deviation.
9. A robot comprising a memory and a processor, the memory storing a computer program which, when run by the processor, performs the non-contact mental state recognition method of any of claims 1 to 7.
10. A computer readable storage medium, characterized in that it stores a computer program which, when run on a processor, performs the non-contact mental state recognition method according to any one of claims 1 to 7.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202411994012.0A CN119924833A (en) | 2024-12-30 | 2024-12-30 | Non-contact mental state recognition method, device, robot and medium |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| CN119924833A true CN119924833A (en) | 2025-05-06 |
Family
ID=95544681
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| PB01 | Publication | | |
| SE01 | Entry into force of request for substantive examination | | |