Disclosure of Invention
The embodiments of the present invention mainly aim to provide an online social method, an online social device, and a storage medium based on emotion recognition, which can at least solve the problems in the related art that chat content cannot be accurately understood and chat interaction is ineffective because the two parties to a chat cannot accurately know each other's emotions.
In order to achieve the above object, a first aspect of the embodiments of the present invention provides an online social method based on emotion recognition, where the method includes:
acquiring physiological characteristic data of a local chat user through a sensor;
determining a corresponding user emotion recognition result based on the physiological characteristic data;
acquiring corresponding emotion expression content according to the user emotion recognition result; the emotion expression content comprises at least one of emotion expression animation and emotion expression dynamic characters;
and sending the emotion expression content to a chat client used by an opposite-end chat user in real time.
In order to achieve the above object, a second aspect of the embodiments of the present invention provides an online social device based on emotion recognition, including:
an acquisition module, configured to collect physiological characteristic data of a local chat user through a sensor;
a determining module, configured to determine a corresponding user emotion recognition result based on the physiological characteristic data;
an obtaining module, configured to obtain corresponding emotion expression content according to the user emotion recognition result, where the emotion expression content comprises at least one of emotion expression animation and emotion expression dynamic characters;
and a sending module, configured to send the emotion expression content in real time to a chat client used by an opposite-end chat user.
To achieve the above object, a third aspect of embodiments of the present invention provides an electronic apparatus, including: a processor, a memory, and a communication bus;
the communication bus is used for realizing connection communication between the processor and the memory;
the processor is configured to execute one or more programs stored in the memory to implement the steps of any one of the above-mentioned online social methods.
To achieve the above object, a fourth aspect of the embodiments of the present invention provides a computer-readable storage medium storing one or more programs, which are executable by one or more processors to implement the steps of any one of the above-mentioned online social methods.
According to the online social method, online social device, and storage medium based on emotion recognition provided by the embodiments of the present invention, physiological characteristic data of a local chat user is collected through a sensor; a corresponding user emotion recognition result is determined based on the physiological characteristic data; corresponding emotion expression content is obtained according to the user emotion recognition result; and the emotion expression content is sent in real time to the chat client used by the opposite-end chat user for display. Through the implementation of the present invention, the emotion of the local chat user is recognized based on the user's physiological characteristic data, and emotion expression content corresponding to the user's emotion, such as emotion expression animation and emotion expression dynamic characters, is sent in real time to the chat client used by the opposite-end chat user for display. The emotion information of the chat user is thus conveyed through the animation, the dynamic characters, and the like, which effectively ensures that both chat parties know each other's emotions and can significantly improve the effectiveness of their chat interaction.
Other features and corresponding effects of the present invention are set forth in the following portions of the specification, and it should be understood that at least some of the effects are apparent from the description of the present invention.
Detailed Description
In order to make the objects, features and advantages of the present invention more obvious and understandable, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present invention, and it is apparent that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
First embodiment:
In order to solve the problems in the related art that chat content cannot be accurately understood and chat interaction effectiveness is poor because the two chat parties cannot accurately know each other's emotions, this embodiment provides an online social method based on emotion recognition. Fig. 1 is a basic flow diagram of the online social method provided in this embodiment, and the method includes the following steps:
Step 101: collecting physiological characteristic data of the local chat user through a sensor.
In particular, linguistic studies indicate that the information conveyed by text chat accounts for only 20% to 30% of the information conveyed in conversation, which demonstrates the importance of non-verbal information such as emotion, which is essential for human cognition and affects many aspects of human life. For example, when we are excited, our perception tends to select happy events, whereas the reverse is true for negative emotions. This indicates that online chat would benefit from an understanding of the emotional state of others: with information about the user's emotional state, the system can interact with the user more efficiently.
In practical applications, the user connects a specific sensor during the chat process; the sensor performs digital sampling of the physiological signal, and data capture can be realized by a capture module written in Visual C++.
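To make the data capture step concrete, the following is a minimal, non-limiting Python sketch of the kind of sampling loop such a capture module might implement; the sensor reader (read_sample), the sampling rate, and the window length are illustrative assumptions only, and the embodiment itself describes the capture module as being written in Visual C++.

```python
# Minimal illustrative sketch, not part of the claimed embodiment.
# read_sample() is a hypothetical callable that returns one digital sample
# from the connected physiological sensor.
import time
from collections import deque

SAMPLE_RATE_HZ = 32        # assumed sampling rate
WINDOW_SECONDS = 10        # assumed analysis window

def capture_loop(read_sample, on_window):
    """Poll the sensor and hand sliding windows of samples to a callback."""
    window = deque(maxlen=SAMPLE_RATE_HZ * WINDOW_SECONDS)
    period = 1.0 / SAMPLE_RATE_HZ
    while True:
        window.append(read_sample())      # one digitized sample
        if len(window) == window.maxlen:
            on_window(list(window))       # e.g. forward to emotion recognition
        time.sleep(period)
```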
In an optional implementation manner of this embodiment, the physiological characteristic data may be galvanic skin response data and galvanic muscle response (electromyographic) data. The galvanic skin response data is collected by a galvanic skin response (GSR) instrument, which records changes in the galvanic skin response as a continuous waveform, and the galvanic muscle response data may be collected by an electromyograph, which records the bioelectrical activity of the muscles.
In another optional implementation manner of this embodiment, the physiological characteristic data may also be gaze tracking data, which may include pupil dilation data, eye movement data, and the like. In practical applications, an image sensor may be used to capture eye images of the user, and an image recognition algorithm may then analyze a plurality of consecutive eye images to obtain the gaze tracking data.
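As a non-limiting illustration of how consecutive eye images might be analyzed, the following Python sketch uses OpenCV to estimate a pupil position and radius per frame; the threshold values and the assumption that the pupil is the largest dark blob in the eye image are simplifications introduced here for illustration only.

```python
# Minimal illustrative sketch, not part of the claimed embodiment.
import cv2

def pupil_measure(eye_bgr, dark_thresh=40):
    """Return ((x, y), radius) of the largest dark blob, taken as the pupil."""
    gray = cv2.cvtColor(eye_bgr, cv2.COLOR_BGR2GRAY)
    gray = cv2.GaussianBlur(gray, (7, 7), 0)
    _, mask = cv2.threshold(gray, dark_thresh, 255, cv2.THRESH_BINARY_INV)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    pupil = max(contours, key=cv2.contourArea)
    (x, y), radius = cv2.minEnclosingCircle(pupil)
    return (x, y), radius     # gaze-position proxy and pupil radius in pixels

def gaze_tracking_data(frames):
    """Turn consecutive eye images into a list of (position, pupil_radius) samples."""
    return [m for m in (pupil_measure(f) for f in frames) if m is not None]
```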
Step 102: determining a corresponding user emotion recognition result based on the physiological characteristic data.
Specifically, in practical applications, when the emotion of the user changes, the physiological characteristics of the user generally change, and thus the present embodiment can adaptively sense the emotion of the user through the physiological characteristic data.
In the present embodiment, specific implementations of determining the corresponding emotion recognition result of the user based on the physiological characteristic data include, but are not limited to, the following two types:
The first type: determining corresponding emotional arousal data based on the galvanic skin response data, and determining corresponding emotional valence data based on the galvanic muscle response data; and determining the corresponding user emotion recognition result by combining the emotional arousal data and the emotional valence data.
Specifically, since GSR data contains less noise than blood volume pulse (BVP) data, this embodiment uses a GSR sensor to detect the emotional arousal data. The GSR sensor, which measures skin conductance (SC), is attached to the middle and index fingers of the user's non-dominant hand, and the signal can be recorded with a ProComp+ device. SC varies roughly linearly with the overall level of arousal, increasing with anxiety and stress. In this embodiment, the emotional arousal data may be detected by analyzing the peaks and troughs of the GSR data.
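Purely as an illustration of the peak-and-trough analysis mentioned above, the following Python sketch derives a coarse arousal score from a GSR skin-conductance trace; the prominence threshold and the weighting of peak rate against peak strength are assumptions chosen here for illustration and are not specified by the embodiment.

```python
# Minimal illustrative sketch, not part of the claimed embodiment.
import numpy as np
from scipy.signal import find_peaks

def arousal_from_gsr(gsr, prominence=0.05):
    """Return (arousal_score in [0, 1], peak_indices, trough_indices)."""
    gsr = np.asarray(gsr, dtype=float)
    peaks, props = find_peaks(gsr, prominence=prominence)
    troughs, _ = find_peaks(-gsr, prominence=prominence)
    rate = len(peaks) / max(len(gsr), 1)                    # peaks per sample
    strength = props["prominences"].mean() if len(peaks) else 0.0
    score = np.clip(rate * 50.0 + strength, 0.0, 1.0)       # ad-hoc weighting
    return float(score), peaks, troughs
```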
Fig. 2 shows an emotion recognition graph provided in this embodiment, in which the horizontal axis represents the emotional valence data, the vertical axis represents the emotional arousal data, and the elements in each quadrant represent different types of emotion recognition results, such as excitement, relaxation, sadness, and fear.
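One plausible way to read the quadrants of fig. 2 in code, assuming the valence and arousal values have been normalized to [-1, 1], is sketched below; the assignment of labels to quadrants follows the usual valence-arousal convention and is an illustrative assumption rather than a definition from the embodiment.

```python
# Minimal illustrative sketch, not part of the claimed embodiment.
def recognize_emotion(valence, arousal):
    """Map normalized (valence, arousal) onto one quadrant label of fig. 2."""
    if valence >= 0 and arousal >= 0:
        return "excitement"    # positive valence, high arousal
    if valence >= 0:
        return "relaxation"    # positive valence, low arousal
    if arousal < 0:
        return "sadness"       # negative valence, low arousal
    return "fear"              # negative valence, high arousal
```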
The second type: determining corresponding attention characterization data and interest characterization data based on the gaze tracking data; and determining the corresponding user emotion recognition result by combining the attention characterization data and the interest characterization data.
Specifically, this embodiment responds to the user's gaze information in real time and records eye fixation and pupil dilation data during the online chat. Eye movements may reveal the user's interest and attention; by detecting attention, interest, and similar information from the user's real-time eye movement data, non-verbal information is enriched, and the gaze and eye movement information is used to facilitate emotion-related reasoning.
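As a non-limiting sketch of how attention characterization data and interest characterization data might be derived from the recorded gaze samples, the following Python function applies a simple dispersion test for fixations and compares pupil radius with a baseline; the thresholds, the baseline, and the specific measures are illustrative assumptions.

```python
# Minimal illustrative sketch, not part of the claimed embodiment.
import numpy as np

def attention_and_interest(samples, baseline_radius, dispersion_px=30, min_len=5):
    """samples: list of ((x, y), pupil_radius) pairs from the gaze tracker."""
    positions = np.array([p for p, _ in samples], dtype=float)
    radii = np.array([r for _, r in samples], dtype=float)
    fixated, i = 0, 0
    while i + min_len <= len(positions):
        window = positions[i:i + min_len]
        dispersion = float((window.max(axis=0) - window.min(axis=0)).sum())
        if dispersion <= dispersion_px:   # gaze stayed within a small region
            fixated += min_len
            i += min_len
        else:
            i += 1
    attention = fixated / max(len(positions), 1)   # share of samples spent in fixations
    interest = float(radii.mean() / baseline_radius) if len(radii) else 0.0
    return attention, interest
```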
Step 103: acquiring corresponding emotion expression content according to the user emotion recognition result.
In this embodiment, the emotion expression content includes at least one of emotion expression animation and emotion expression dynamic characters; fig. 3 is a schematic diagram of the emotion expression animation provided in this embodiment. Unlike plain text, the emotion expression content of this embodiment is adapted to the user's current emotional state and can be used to express more detailed information, including the user's emotion.
In an optional implementation manner of this embodiment, the step of obtaining corresponding emotion expression content according to the emotion recognition result of the user specifically includes: determining a corresponding content type and a content display attribute according to a user emotion recognition result; and acquiring corresponding emotion expression content according to the content type and the content display attribute.
Specifically, the content types of the emotion expression content may include joy, anger, sadness, happiness, and the like, and the content display attributes may include speed of change, size, color, and the like. This embodiment may determine the display attributes of the emotion expression animation and the emotion expression dynamic characters according to the emotional arousal data, and determine the specific types of the emotion expression animation and the emotion expression dynamic characters according to the emotional valence data. In practical applications, for example, when the GSR data suddenly increases and peaks, the generated animation or dynamic characters may change faster, be larger, and be displayed in brighter colors; when the GSR data decreases, the generated emotion expression content changes more slowly and its display color becomes darker.
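The mapping described above can be illustrated with the following non-limiting Python sketch, in which the content type is chosen from the signs of the valence and arousal values and the display attributes (speed, size, color) scale with arousal; the specific thresholds, numbers, and color codes are assumptions introduced here for illustration.

```python
# Minimal illustrative sketch, not part of the claimed embodiment.
def emotion_expression_style(valence, arousal):
    """Return (content_type, display_attributes) for normalized inputs in [-1, 1]."""
    if valence >= 0:
        content_type = "joy" if arousal >= 0 else "happiness"
    else:
        content_type = "anger" if arousal >= 0 else "sadness"
    display = {
        "speed": 1.0 + max(arousal, 0.0),               # faster on a GSR surge
        "size": 48 + int(max(arousal, 0.0) * 32),       # larger on a GSR surge
        "color": "#FFD700" if arousal >= 0.5 else "#808080",  # brighter vs. dimmer
    }
    return content_type, display
```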
Step 104: sending the emotion expression content to the chat client used by the opposite-end chat user in real time.
Specifically, this embodiment uses the emotion expression content to describe and express the user's emotion to both chat parties, so as to convey emotion and subtle cues during the online chat; because both parties can effectively perceive each other's emotions, their engagement in the chat increases and the chat interaction becomes more effective. In practical applications, the user may create an emotion expression animation or emotion expression dynamic characters through buttons, shortcut keys, or markers embedded in the text message, and embed an emotion expression animation at a specific location of the text message (e.g., at the front) to accurately convey the emotions of both chat parties, such as "<Happy> I am happy!", where <Happy> denotes the emotion expression animation shown in the first row, first column of fig. 3, which fully conveys the user's pleasant emotion through the animation. In practical applications, different types of emotion expression animations can be flexibly adopted for different application scenarios; for example, when the user's emotion is sad, the <Sad> animation shown in the second row, third column of fig. 3 may be adopted.
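As a final non-limiting sketch, the following Python code shows one way a marker such as <Happy> could be embedded at the front of a text message and pushed to the opposite-end chat client in real time; the transport callable send_json, the asset names, and the message schema are hypothetical placeholders rather than part of the claimed embodiment.

```python
# Minimal illustrative sketch, not part of the claimed embodiment.
import json
import time

ANIMATIONS = {"Happy": "happy.gif", "Sad": "sad.gif"}   # assumed asset names

def build_message(text, emotion, display):
    """Embed the emotion marker in front of the text, e.g. '<Happy> I am happy!'."""
    return {
        "text": f"<{emotion}> {text}",
        "animation": ANIMATIONS.get(emotion),
        "display": display,               # speed / size / color attributes
        "timestamp": time.time(),
    }

def send_emotion_message(send_json, text, emotion, display):
    """send_json is any real-time channel (e.g. a websocket) provided by the chat app."""
    send_json(json.dumps(build_message(text, emotion, display)))
```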
It should be further noted that a usability study was also performed on a chat client implemented using the online social method of this embodiment; the study subjects were users who used both the chat client of this embodiment and a conventional chat client. The test subjects were randomly assigned to the two chat systems; they were in the same building but did not meet face to face. They were divided into three pairs and conversed separately with their partners, with each conversation lasting more than an hour and covering topics such as school lessons. After the conversation, the subjects answered a questionnaire and evaluated the chat system. First, the GSR data and the subjects' responses were examined: when the subjects were asked which part of the chat they felt they enjoyed most, their answers were compared with the GSR results, and a good correlation was found between the GSR data and the tension reported by the users. The subjects reported that they gradually focused their attention on the conversation, and the GSR showed a similar change. Fig. 4 is a schematic diagram of the GSR data changes provided in this embodiment (the graph records the user's GSR every minute); the GSR increases as the subjects become more focused on the conversation. This indicates that GSR can be used to determine changes in emotion in real time during an online conversation. The results of the study also indicate that emotional information can increase the subjects' engagement in the conversation: the emotional information from the chat partner gives users the feeling that they are not only exchanging text but also exchanging their own feelings with each other, which enables them to participate more fully in the conversation.
According to the online social method based on emotion recognition provided by the embodiment of the present invention, physiological characteristic data of a local chat user is collected through a sensor; a corresponding user emotion recognition result is determined based on the physiological characteristic data; corresponding emotion expression content is obtained according to the user emotion recognition result; and the emotion expression content is sent in real time to the chat client used by the opposite-end chat user for display. Through the implementation of the present invention, the emotion of the local chat user is recognized based on the user's physiological characteristic data, and emotion expression content corresponding to the user's emotion, such as emotion expression animation and emotion expression dynamic characters, is sent in real time to the chat client used by the opposite-end chat user for display. The emotion information of the chat user is thus conveyed through the animation, the dynamic characters, and the like, which effectively ensures that both chat parties know each other's emotions and can significantly improve the effectiveness of their chat interaction.
Second embodiment:
In order to solve the problems in the related art that chat content cannot be accurately understood and chat interaction effectiveness is poor because the two chat parties cannot accurately know each other's emotions, this embodiment provides an online social device based on emotion recognition. Referring to fig. 5, the online social device of this embodiment includes:
the acquisition module 501 is used for acquiring physiological characteristic data of the local chat user through a sensor;
a determining module 502 for determining a corresponding user emotion recognition result based on the physiological characteristic data;
an obtaining module 503, configured to obtain corresponding emotion expression content according to the user emotion recognition result, where the emotion expression content comprises at least one of emotion expression animation and emotion expression dynamic characters;
and a sending module 504, configured to send the emotion expression content to a chat client used by an opposite-end chat user in real time.
In some embodiments of this embodiment, the physiological characteristic data is galvanic skin response data and galvanic muscle response data. Correspondingly, the determining module 502 is specifically configured to: determining corresponding emotional arousal data based on the galvanic skin response data and corresponding emotional valence data based on the galvanic muscle response data; and determining a corresponding emotion recognition result of the user by combining the emotional arousal data and the emotional valence data.
In other embodiments of this embodiment, the physiological characteristic data is gaze tracking data. Correspondingly, the determining module 502 is specifically configured to: determining respective attention characterization data and interest characterization data based on the gaze tracking data; and determining a corresponding emotion recognition result of the user by combining the attention characterization data and the interest characterization data.
In some embodiments of this embodiment, the obtaining module 503 is specifically configured to: acquiring a corresponding content type and a content display attribute according to a user emotion recognition result; and acquiring corresponding emotion expression content according to the content type and the content display attribute.
It should be noted that, the online social method based on emotion recognition in the foregoing embodiments may be implemented based on the online social device based on emotion recognition provided in this embodiment, and it may be clearly understood by a person having ordinary skill in the art that, for convenience and brevity of description, the specific working process of the online social device described in this embodiment may refer to the corresponding process in the foregoing method embodiments, and details are not described here.
By adopting the online social device based on emotion recognition, physiological characteristic data of the local chat user is collected through a sensor; a corresponding user emotion recognition result is determined based on the physiological characteristic data; corresponding emotion expression content is obtained according to the user emotion recognition result; and the emotion expression content is sent in real time to the chat client used by the opposite-end chat user for display. Through the implementation of the present invention, the emotion of the local chat user is recognized based on the user's physiological characteristic data, and emotion expression content corresponding to the user's emotion, such as emotion expression animation and emotion expression dynamic characters, is sent in real time to the chat client used by the opposite-end chat user for display. The emotion information of the chat user is thus conveyed through the animation, the dynamic characters, and the like, which effectively ensures that both chat parties know each other's emotions and can significantly improve the effectiveness of their chat interaction.
Third embodiment:
the present embodiment provides an electronic device, as shown in fig. 6, which includes a processor 601, a memory 602, and a communication bus 603, wherein: the communication bus 603 is used for realizing connection communication between the processor 601 and the memory 602; the processor 601 is configured to execute one or more computer programs stored in the memory 602 to implement at least one step of the online social method in the first embodiment.
The present embodiments also provide a computer-readable storage medium including volatile or non-volatile, removable or non-removable media implemented in any method or technology for storage of information such as computer-readable instructions, data structures, computer program modules, or other data. Computer-readable storage media include, but are not limited to, RAM (Random Access Memory), ROM (Read-Only Memory), EEPROM (Electrically Erasable Programmable Read-Only Memory), flash memory or other memory technology, CD-ROM (Compact Disc Read-Only Memory), digital versatile discs (DVD) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by a computer.
The computer-readable storage medium in this embodiment may be used for storing one or more computer programs, and the stored one or more computer programs may be executed by a processor to implement at least one step of the method in the first embodiment.
The present embodiment also provides a computer program, which can be distributed on a computer readable medium and executed by a computing device to implement at least one step of the method in the first embodiment; and in some cases at least one of the steps shown or described may be performed in an order different than that described in the embodiments above.
The present embodiments also provide a computer program product comprising a computer-readable means on which a computer program as described above is stored. The computer-readable means in this embodiment may include the computer-readable storage medium described above.
It will be apparent to those skilled in the art that all or some of the steps of the methods, systems, functional modules/units in the devices disclosed above may be implemented as software (which may be implemented in computer program code executable by a computing device), firmware, hardware, and suitable combinations thereof. In a hardware implementation, the division between functional modules/units mentioned in the above description does not necessarily correspond to the division of physical components; for example, one physical component may have multiple functions, or one function or step may be performed by several physical components in cooperation. Some or all of the physical components may be implemented as software executed by a processor, such as a central processing unit, digital signal processor, or microprocessor, or as hardware, or as an integrated circuit, such as an application specific integrated circuit.
In addition, communication media typically embodies computer readable instructions, data structures, computer program modules or other data in a modulated data signal such as a carrier wave or other transport mechanism and includes any information delivery media as known to one of ordinary skill in the art. Thus, the present invention is not limited to any specific combination of hardware and software.
The foregoing is a more detailed description of embodiments of the present invention, and the present invention is not to be considered limited to such descriptions. For those skilled in the art to which the invention pertains, several simple deductions or substitutions can be made without departing from the spirit of the invention, and all shall be considered as belonging to the protection scope of the invention.