
    Sascha Gruss

    This study focuses on improving healthcare quality by introducing an automated system that continuously monitors patient pain intensity. The system analyzes the Electrodermal Activity (EDA) sensor modality, compares the results obtained from the EDA and facial expression modalities, and performs late fusion of the two. This work extends our previous studies of pain intensity monitoring via an expanded analysis of the two informative methods. The EDA sensor modality and facial expression analysis play a prominent role in pain recognition; the extracted features reflect the patient’s responses to different pain levels. Three different approaches were applied: Random Forest (RF) baseline methods, Long Short-Term Memory networks (LSTM), and LSTM with a sample-weighting method (LSTM-SW). Evaluation metrics included the micro-average F1-score for classification and the Mean Squared Error (MSE) and intraclass correlation coefficient (ICC [3, 1]) for both classification...
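    As a rough illustration of the evaluation metrics mentioned above, the sketch below computes a micro-averaged F1-score, an MSE, and an ICC(3,1) for hypothetical predictions. The toy labels, variable names, and the ICC helper are assumptions for illustration, not the study's actual evaluation code.

```python
# Sketch only: hypothetical predictions, not the study's data or code.
import numpy as np
from sklearn.metrics import f1_score, mean_squared_error

def icc_3_1(ratings):
    """ICC(3,1) after Shrout & Fliess: rows = targets, columns = raters."""
    ratings = np.asarray(ratings, dtype=float)
    n, k = ratings.shape
    grand_mean = ratings.mean()
    row_means = ratings.mean(axis=1)
    col_means = ratings.mean(axis=0)
    ss_total = ((ratings - grand_mean) ** 2).sum()
    ss_rows = k * ((row_means - grand_mean) ** 2).sum()   # between targets
    ss_cols = n * ((col_means - grand_mean) ** 2).sum()   # between raters
    ss_err = ss_total - ss_rows - ss_cols                 # residual
    ms_rows = ss_rows / (n - 1)
    ms_err = ss_err / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err)

y_true = np.array([0, 1, 2, 3, 2, 1, 0, 3])   # hypothetical ground-truth pain levels
y_pred = np.array([0, 1, 2, 2, 2, 1, 0, 3])   # hypothetical model output

print("micro F1:", f1_score(y_true, y_pred, average="micro"))
print("MSE:     ", mean_squared_error(y_true, y_pred))
# ICC(3,1) here treats ground truth and prediction as two "raters" of each sample.
print("ICC(3,1):", icc_3_1(np.column_stack([y_true, y_pred])))
```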
    Background: In the clinical context, the assessment of pain in patients with inadequate communication skills is standardly performed externally by trained medical staff. Automated pain recognition (APR) could make a significant contribution here. Pain responses are captured mainly using video cameras and biosignal sensors. The automated monitoring of pain during the onset of analgesic sedation has the highest relevance in intensive care medicine. In this context, facial electromyography (EMG) represents an alternative to recording facial expressions via video in terms of data security. In the present study, specific physiological signals were analyzed to determine whether a distinction can be made between pre- and post-analgesic administration in a postoperative setting. Explicitly, the significance of the facial EMG regarding the operationalization of the effect of analgesia was tested. Methods: N = 38 patients scheduled for surgical intervention were prospectively recrui...
    Background: The clinically used methods of pain diagnosis do not allow for objective and robust measurement, and physicians must rely on the patient’s report on the pain sensation. Verbal scales, visual analog scales (VAS) or numeric rating scales (NRS) count among the most common tools, which are restricted to patients with normal mental abilities. There also exist instruments for pain assessment in people with verbal and/or cognitive impairments and instruments for pain assessment in people who are sedated and automatically ventilated. However, all these diagnostic methods either have limited reliability and validity or are very time-consuming. In contrast, biopotentials can be automatically analyzed with machine learning algorithms to provide a surrogate measure of pain intensity. Methods: In this context, we created a database of biopotentials to advance an automated pain recognition system, determine its theoretical testing quality, and optimize its performance. Eighty-five participants were subjected to painful heat stimuli (baseline, pain threshold, two intermediate thresholds, and pain tolerance threshold) under controlled conditions and the signals of electromyography, skin conductance level, and electrocardiography were collected. A total of 159 features were extracted from the mathematical groupings of amplitude, frequency, stationarity, entropy, linearity, variability, and similarity. Results: We achieved classification rates of 90.94% for baseline vs. pain tolerance threshold and 79.29% for baseline vs. pain threshold. The most selected pain features stemmed from the amplitude and similarity group and were derived from facial electromyography. Conclusion: The machine learning measurement of pain in patients could provide valuable information for a clinical team and thus support the treatment assessment.
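    A minimal sketch, assuming synthetic signal windows and a Random Forest as a stand-in classifier, of the general pipeline described above: hand-crafted amplitude- and variability-style features are extracted per window and cross-validated with a machine learning classifier. Feature choices, window length, and data are illustrative only.

```python
# Sketch only: illustrative feature extraction on synthetic windows, not the study's code.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

def window_features(x):
    """A few amplitude/variability-style features for one signal window."""
    return np.array([
        np.ptp(x),                    # peak-to-peak amplitude
        np.mean(np.abs(x)),           # mean absolute value
        np.std(x),                    # variability
        np.mean(np.abs(np.diff(x))),  # mean absolute first difference
    ])

# Synthetic stand-ins for "baseline" vs. "pain tolerance" EMG windows.
baseline = [rng.normal(0.0, 0.5, 512) for _ in range(60)]
tolerance = [rng.normal(0.0, 1.5, 512) for _ in range(60)]
X = np.array([window_features(w) for w in baseline + tolerance])
y = np.array([0] * 60 + [1] * 60)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
print("CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```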
    Whether training classification algorithms in a within-subject design is inferior to training on between-subject data is discussed for an electrophysiological data set. Event-related potentials were recorded from 18 subjects, emotionally stimulated by a series of 18 negative, 18 positive and 18 neutral pictures of the International Affective Picture System. In addition to traditional averaging and group comparison of event-related potentials, the electroencephalographic data were intra- and inter-individually classified for emotional conditions using a Support Vector Machine. Support Vector Machine classifications based upon intraindividual data showed significantly higher classification rates [F(19.498), p < .001] than global ones. An effect size was calculated (d = 1.47) and the origin of this effect is discussed within the context of individual response specificities. This study clearly shows that classification accuracy can be boosted by using individual-specific settings.
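    The contrast between the two evaluation schemes can be sketched as below with synthetic data: a within-subject (intraindividual) cross-validation versus a between-subject, leave-one-subject-out evaluation of a linear SVM. Subject counts and the way subject-specific response patterns are simulated are assumptions, not the study's data.

```python
# Sketch only: synthetic data illustrating within- vs. between-subject evaluation.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import LeaveOneGroupOut, StratifiedKFold, cross_val_score

rng = np.random.default_rng(1)
n_subjects, trials_per_class, n_features = 18, 18, 20

X, y, groups = [], [], []
for s in range(n_subjects):
    offset = rng.normal(0, 2.0, n_features)      # subject-specific response pattern
    for label in (0, 1, 2):                      # negative / neutral / positive
        for _ in range(trials_per_class):
            X.append(offset + label * 0.5 + rng.normal(0, 1.0, n_features))
            y.append(label)
            groups.append(s)
X, y, groups = np.array(X), np.array(y), np.array(groups)

svm = SVC(kernel="linear", C=1.0)

# Between-subject: train on 17 subjects, test on the held-out one.
loso = cross_val_score(svm, X, y, groups=groups, cv=LeaveOneGroupOut())
print("between-subject (LOSO):", loso.mean())

# Within-subject: cross-validate inside each subject's own trials, then average.
within = [cross_val_score(svm, X[groups == s], y[groups == s],
                          cv=StratifiedKFold(5)).mean() for s in range(n_subjects)]
print("within-subject:", np.mean(within))
```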
    OPEN_EmoRec_II is an open multimodal corpus with experimentally induced emotions. In the first half of the experiment, emotions were induced with standardized picture material and in the second half during a human-computer interaction (HCI), realized with a wizard-of-oz design. The induced emotions are based on the dimensional theory of emotions (valence, arousal and dominance). These emotional sequences, recorded as multimodal data (facial reactions, speech, audio and physiological reactions) in a naturalistic-like HCI environment, can be used to improve classification methods on a multimodal level. This database is the result of an HCI experiment for which 30 subjects in total agreed to a publication of their data, including the video material, for research purposes*. The now available open corpus contains sensory signals of video, audio, physiology (SCL, respiration, BVP, EMG Corrugator supercilii, EMG Zygomaticus Major) and facial reaction annotations.
    In this work, different cognitive load situations are examined and classified in the context of a Human Computer Interaction (HCI) scenario. Machine learning methods were used to detect three cognitive load states (overload, underload, normal load) with the help of five different psychophysiological signals (ECG, EMG, Respiration, GSR, Temperature). First, it is shown that the three states under consideration can be clearly distinguished in the valence-arousal-dominance (VAD) space. Then, comparisons between a 10-fold cross-validation and a batch validation, as well as between three different classifiers (k-Nearest-Neighbour, Naive Bayes, Random Forest), are carried out. Finally, the influence of gender is examined in contrast to an overall analysis.
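    A minimal sketch of the classifier comparison described above, using synthetic features in place of the ECG, EMG, respiration, GSR and temperature features and scoring each model with 10-fold cross-validation. All parameter choices are illustrative assumptions.

```python
# Sketch only: synthetic feature data, illustrating the classifier comparison.
from sklearn.neighbors import KNeighborsClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.datasets import make_classification

# Hypothetical stand-in for features derived from the five psychophysiological signals.
X, y = make_classification(n_samples=300, n_features=15, n_informative=8,
                           n_classes=3, random_state=0)

classifiers = {
    "k-NN": KNeighborsClassifier(n_neighbors=5),
    "Naive Bayes": GaussianNB(),
    "Random Forest": RandomForestClassifier(n_estimators=100, random_state=0),
}
for name, clf in classifiers.items():
    scores = cross_val_score(clf, X, y, cv=10)   # 10-fold cross-validation
    print(f"{name}: {scores.mean():.3f} +/- {scores.std():.3f}")
```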
    The subjective nature of pain makes it a very challenging phenomenon to assess. Most of the current pain assessment approaches rely on an individual’s ability to recognise and report an observed pain episode. However, pain perception and expression are affected by numerous factors ranging from personality traits to physical and psychological health state. Hence, several approaches have been proposed for the automatic recognition of pain intensity, based on measurable physiological and audiovisual parameters. In the current paper, an assessment of several fusion architectures for the development of a multi-modal pain intensity classification system is performed. The contribution of the presented work is two-fold: (1) 3 distinctive modalities consisting of audio, video and physiological channels are assessed and combined for the classification of several levels of pain elicitation. (2) An extensive assessment of several fusion strategies is carried out in order to design a classification architecture that improves the performance of the pain recognition system. The assessment is based on the SenseEmotion Database and experimental validation demonstrates the relevance of the multi-modal classification approach, which achieves classification rates of 83.39%, 59.53% and 43.89% in a 2-class, 3-class and 4-class pain intensity classification task, respectively.
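    One of the fusion strategies such an assessment typically includes is decision-level (late) fusion, sketched below: each modality gets its own classifier and the per-class probabilities are summed before the final decision. The synthetic feature blocks standing in for audio, video and physiological channels, and the Random Forest base classifiers, are assumptions for illustration, not the paper's architecture.

```python
# Sketch only: a simple decision-level (late) fusion of per-modality classifiers.
from sklearn.ensemble import RandomForestClassifier
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=400, n_features=30, n_informative=12,
                           n_classes=3, random_state=0)
# Pretend the 30 features come from three modalities of 10 features each.
modalities = [X[:, 0:10], X[:, 10:20], X[:, 20:30]]   # "audio", "video", "physiology"

# Identical split for every modality (same random_state, same test_size).
splits = [train_test_split(m, y, test_size=0.3, random_state=0) for m in modalities]
prob_sum = None
for X_tr, X_te, y_tr, y_te in splits:
    clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
    p = clf.predict_proba(X_te)                       # per-modality class probabilities
    prob_sum = p if prob_sum is None else prob_sum + p

y_test = splits[0][3]                                 # test labels (same for all modalities)
fused_pred = prob_sum.argmax(axis=1)                  # average rule = sum then argmax
print("late-fusion accuracy:", (fused_pred == y_test).mean())
```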
    Automated pain recognition (APR) is a method for the objective measurement of pain. The focus of APR is the computer-aided objective recognition of pain, which is realized on the basis of machine learning. In recent years, several databases (BioVid, SenseEmotion, X-ITE pain) have been collected in the laboratory to test machine learning algorithms, and numerous studies have been published. However, it is now relevant to test the algorithms on clinical data sets with highly structured clinical protocols. In this respect, we recorded multimodal physiological signals during colonoscopy. Furthermore, clinical triggers (pain [Behavioral Pain Scale], movements, paralinguistics, and Propofol injections) were recorded synchronously. The database enables machine learning methods for pain recognition to be tested and optimized on clinical data. It is planned to make the dataset available to a broad community.
    In the future, automatic pain monitoring may enable health care professionals to assess and manage pain objectively. In this framework, the researchers created a database of biopotentials for the development of an automatic pain-recognition system and the optimization of its performance. The goal of the current research work was to optimize pain features with a Support Vector Machine (SVM) classifier and to obtain high recognition rates for pain quantification in a two- and multiclass problem. Data of 90 participants were acquired under the induction of phasic heat pain stimulation. Thirteen features were finally selected from the following categories: amplitude, stationarity, linearity, variability and similarity. Classification analyses were performed with an SVM for four classification problems (baseline vs. pain threshold; baseline vs. pain tolerance; pain threshold vs. pain tolerance; baseline vs. pain threshold vs. pain tolerance). High classification accuracies were obtained for bas...
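    The combination of feature selection and an SVM described above could look roughly like the sketch below, which keeps the 13 highest-scoring features of a synthetic feature matrix before fitting a multiclass SVM. The univariate F-test selector and all parameters are assumptions, not the study's actual selection procedure.

```python
# Sketch only: univariate feature selection followed by a multiclass SVM on synthetic data.
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

# Stand-in for amplitude/stationarity/linearity/variability/similarity features.
X, y = make_classification(n_samples=270, n_features=60, n_informative=13,
                           n_classes=3, random_state=0)

model = make_pipeline(
    StandardScaler(),
    SelectKBest(score_func=f_classif, k=13),   # keep the 13 highest-scoring features
    SVC(kernel="rbf", C=1.0, gamma="scale"),
)
print("CV accuracy:", cross_val_score(model, X, y, cv=5).mean())
```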
    Prior work on automated methods demonstrated that it is possible to recognize pain intensity from frontal faces in videos, while there is an assumption that humans are very adept at this task compared to machines. In this paper, we investigate whether such an assumption is correct by comparing the results achieved by two human observers with the results achieved by a Random Forest classifier (RFc) baseline model (called RFc-BL) and by three proposed automated models. The first proposed model is a Random Forest classifying descriptors of Action Unit (AU) time series; the second is a modified MobileNetV2 CNN classifying face images that combine three points in time; and the third is a custom deep network combining two CNN branches using the same input as for MobileNetV2 plus knowledge of the RFc. We conduct experiments with the X-ITE phasic pain database, which comprises videotaped responses to heat and electrical pain stimuli, each of three intensities. Distinguishing these six stimulati...
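    A minimal sketch of one plausible way to set up the second model type named above: a MobileNetV2 backbone whose classification head is replaced for six stimulus classes, with the three time points stacked as the three input channels. This input encoding and the untrained weights are assumptions, not the paper's exact architecture or training setup.

```python
# Sketch only: adapting MobileNetV2 to a hypothetical 6-class stimulus task.
import torch
import torch.nn as nn
from torchvision import models

model = models.mobilenet_v2(weights=None)                 # no pretrained weights in this sketch
model.classifier[1] = nn.Linear(model.last_channel, 6)    # 6 stimulus classes

# One fake batch: 8 face "images" whose channels are grayscale frames at 3 time points.
frames = torch.randn(8, 3, 224, 224)
logits = model(frames)
print(logits.shape)                                       # torch.Size([8, 6])
```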
    Background: Nurse assisted propofol sedation (NAPS) is a common method used for colonoscopies. It is safe and widely accepted by patients. Little is known, however, about the satisfaction of clinicians performing colonoscopies with NAPS and the factors that negatively influence this perception such as observer-reported pain events. In this study, we aimed to correlate observer-reported pain events with the clinicians' satisfaction with the procedure. Additionally, we aimed to identify patient biosignals from the autonomic nervous system (B-ANS) during an endoscopy that correlate with those pain events. Methods: Consecutive patients scheduled for a colonoscopy with NAPS were prospectively recruited. During the procedure, observer-reported pain events, which included movements and paralinguistic sounds, were simultaneously recorded with different B-ANS (facial electromyogram (EMG), skin conductance level, body temperature and electrocardiogram). After the procedure, the examiners ...
    The authors of this study take the scientific position that one of the major technologies of the future will be companion systems. This study examines which emotions and dispositions are relevant in this regard. It assesses which emotions and dispositions in experienced scenarios of man-machine interaction are retrospectively reflected in comparison with man-man interaction. The sample consisted of N=145 participants, who were divided into two groups. The first group described positive, and the second group negative, scenarios of man-machine and man-man interaction. Subsequently, the participants evaluated their respective scenarios with the help of 94 adjectives relating to emotions and dispositions. The correlations of the occurrence of emotions and dispositions in the man-man vs. man-machine interactions are very high. However, adjectives that are particularly relevant for man-man or man-machine interactions could also be identified. The results speak for a high similarity in the re...
    OPEN_EmoRec_II is an open multimodal corpus with experimentally induced emotions. In the first half of the experiment, emotions were induced with standardized picture material and in the second half during a human-computer interaction (HCI), realized with a wizard-of-oz design. The induced emotions are based on the dimensional theory of emotions (valence, arousal and dominance). These emotional sequences, recorded as multimodal data (facial reactions, speech, audio and physiological reactions) in a naturalistic-like HCI environment, can be used to improve classification methods on a multimodal level. This database is the result of an HCI experiment for which 30 subjects in total agreed to a publication of their data, including the video material, for research purposes*. The now available open corpus contains sensory signals of video, audio, physiology (SCL, respiration, BVP, EMG Corrugator supercilii, EMG Zygomaticus Major) and facial reaction annotations. Keywords—Open multimodal emot...
    In this brief report, we present the results of our investigation into the impact of age on reactions in the form of facial expressions to positive and negative feedback during human-computer interaction. In total, 30 subjects were analyzed after a video-recorded mental task in the style of a Wizard of Oz scenario. All subjects and their facial reactions were coded using the Facial Expression Coding System (FACES). To summarize briefly, we can conclude from our facial expression analysis that compared with their younger counterparts, elderly people show significantly lower levels of negative expression in response to positive feedback from the technical system ("Your performance is improving!"). This result indicates that elderly people seem to benefit more from praise during interaction than younger people, which is significant for the design of future companion technologies.
    Background Pain detection and treatment is a major challenge in the care of critically ill patients, rendered more complex by the need to take into consideration the risk of insufficient or excessive analgesia. The nociceptive flexion reflex threshold (NFRT) has become the established basis for measuring the level of analgesia in the perioperative context. However, it remains unclear whether NFRT measurement can be usefully applied to mechanically ventilated, analgosedated critically ill patients who are unable to communicate. Therefore, the aim of the present study was to investigate whether there is an association between the NFRT measurement and the Behavioral Pain Scale (BPS) in critically ill, analgosedated, and mechanically ventilated patients and whether the NFRT measurement can also detect potential excessive analgesia. Methods This prospective, observational, randomized single-center pilot study included patients admitted to the surgical Intensive Care Unit of University Ho...
    Automatic systems enable continuous monitoring of patients' pain intensity as shown in prior studies. Facial expression and physiological data such as electrodermal activity (EDA) are very informative for pain recognition. The features extracted from EDA indicate the stress and anxiety caused by different levels of pain. In this paper, we investigate using the EDA modality and fusing two modalities (frontal RGB video and EDA) for continuous pain intensity recognition with the X-ITE Pain Database. Further, we compare the performance of automated models before and after reducing the imbalance problem in heat and electrical pain datasets that include phasic (short) and tonic (long) stimuli. We use three distinct real-time methods: Random Forest (RF) baseline methods [a Random Forest classifier (RFc) and Random Forest regression (RFr)], a Long Short-Term Memory network (LSTM), and an LSTM using a sample-weighting method (called LSTM-SW). Experimental results (1) report the first results of...
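    The sample-weighting idea behind an approach like LSTM-SW can be sketched as below: each training sample receives a weight inversely proportional to its class frequency, so rare high-pain samples contribute more to the loss. The label distribution is hypothetical and the "balanced" weighting rule is a generic choice, not necessarily the exact scheme used in the paper.

```python
# Sketch only: per-sample weights inversely proportional to class frequency.
import numpy as np
from sklearn.utils.class_weight import compute_sample_weight

# Hypothetical imbalanced labels: mostly "no pain" (0), few high-pain samples (3).
y = np.array([0] * 700 + [1] * 150 + [2] * 100 + [3] * 50)

weights = compute_sample_weight(class_weight="balanced", y=y)
for label in np.unique(y):
    print(f"class {label}: weight {weights[y == label][0]:.2f}")
# These weights could then be passed to a training routine that accepts per-sample
# weights, e.g. Keras model.fit(..., sample_weight=weights).
```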
    Background: Over the last 12 years, the fundamentals of automated pain recognition using artificial intelligence (AI) algorithms have been investigated and optimized. The main target groups are patients with limited communicative abilities. To date, the extent to which anesthetists and nurses in intensive care units would benefit from an automated pain recognition system has not been investigated. Methods: N = 102 clinical employees were interviewed. To this end, they were shown a video in which the visionary technology of automated pain recognition, its basis and goals are outlined. Subsequently, questions were asked about: (1) the potential benefit of automated pain recognition in a clinical context, (2) preferences with regard to the modality used (physiological, paralinguistic, video-based, multimodal), (3) the maximum willingness to invest, (4) preferences concerning the required pain recognition rate and finally (5) willingness to use automated pain recognition. Results: The res...
    Affective computing aims at the detection of users' mental states, in particular, emotions and dispositions during human-computer interactions. Detection can be achieved by measuring multimodal signals, namely, speech, facial expressions and/or psychobiology. Over the past years, one major approach was to identify the best features for each signal using different classification methods. Although this is of high priority, other subject-specific variables should not be neglected. In our study, we analyzed the effect of gender, age, personality and gender roles on the extracted psychobiological features (derived from skin conductance level, facial electromyography and heart rate variability) as well as the influence on the classification results. In an experimental human-computer interaction, five different affective states with picture material from the International Affective Picture System and ULM pictures were induced. A total of 127 subjects participated in the study. Among al...
    The clinically used methods of pain diagnosis do not allow for objective and robust measurement, and physicians must rely on the patient's report on the pain sensation. Verbal scales, visual analog scales (VAS) or numeric rating scales (NRS) count among the most common tools, which are restricted to patients with normal mental abilities. There also exist instruments for pain assessment in people with verbal and/or cognitive impairments and instruments for pain assessment in people who are sedated and automatically ventilated. However, all these diagnostic methods either have limited reliability and validity or are very time-consuming. In contrast, biopotentials can be automatically analyzed with machine learning algorithms to provide a surrogate measure of pain intensity. In this context, we created a database of biopotentials to advance an automated pain recognition system, determine its theoretical testing quality, and optimize its performance. Eighty-five participants were subjected to painful heat stimuli (baseline, pain threshold, two intermediate thresholds, and pain tolerance threshold) under controlled conditions and the signals of electromyography, skin conductance level, and electrocardiography were collected. A total of 159 features were extracted from the mathematical groupings of amplitude, frequency, stationarity, entropy, linearity, variability, and similarity. We achieved classification rates of 90.94% for baseline vs. pain tolerance threshold and 79.29% for baseline vs. pain threshold. The most selected pain features stemmed from the amplitude and similarity group and were derived from facial electromyography. The machine learning measurement of pain in patients could provide valuable information for a clinical team and thus support the treatment assessment.
