From Data to Insight: A Wearable Multi-Sensor
Module for Real-time Mental Health Monitoring
Dr. Bharani BR, Cambridge Institute of Technology, Bengaluru, India (bharani.ise@cambridge.edu.in)
Dilip B, Cambridge Institute of Technology, Bengaluru, India (dilipseervi2308@gmail.com)
Nikhil Kumar, Cambridge Institute of Technology, Bengaluru, India (nikhilisatwork@gmail.com)
Debasis Maharana, Cambridge Institute of Technology, Bengaluru, India (dmaharana603@gmail.com)
G Jayaditya, Cambridge Institute of Technology, Bengaluru, India (jayaditya530@gmail.com)
Abstract—The escalating incidence of mental health challenges motivates the pursuit of new, non-invasive, and continuous monitoring mechanisms. This work details the design and implementation of a wearable multi-sensor module integrated into eyeglasses, forming a continuous multi-parameter platform for real-time monitoring of mental health. The device contains multiple physiological sensors (heart rate, galvanic skin response (GSR), temperature, and motion) and integrates with smartphones and smartwatches to collect additional health metrics. Using data fusion techniques, the system produces meaningful indicators of mental health from the integrated data streams. A dedicated mobile application provides data visualisation, early anomaly detection, and experiences tailored to the user. As a prototype, it can continuously monitor an individual without being invasive or causing discomfort, and it represents a potential tool for early detection and intervention in mental health management.

Index Terms—Mental health monitoring, wearable technology, multi-sensor integration, data fusion, real-time analytics, physiological signal processing.

I. INTRODUCTION

Chronic stress and mental health issues are increasing globally, creating a demand for continuous and personalized monitoring solutions. Modern wearable and mobile devices generate data on an individual's physiology and behavior in real time, potentially allowing for early detection of stress and emotional states [2], [3]. This paper presents a prototype system aimed at extending these possibilities through a glasses-type wearable device. By synergistically combining multiple sensors on eyeglasses with smartphone data, this concept attempts to give a more holistic perspective of the mental state of the person. It takes inspiration from recent advances in wearable emotion sensing: Kwon et al. have shown a glasses-type device that collects local facial images along with physiological signals (EDA and PPG) to delineate user emotion [1]. Along the same lines, our proposed system assesses stress and emotional arousal in daily life using electrodermal activity, cardiac measurements, and eye-tracking signals.

The core concept is to integrate several physiological sensors, namely photoplethysmography (PPG), electrodermal activity (EDA), an inertial measurement unit (IMU), and a temperature sensor, with an eye-tracking subsystem based on a near-infrared (NoIR) camera. An embedded microcontroller, such as the ESP32, controls all of these components. The data generated by the sensors, such as heart rate, heart rate variability (HRV), skin conductance, motion data, and pupil metrics, is streamed via Bluetooth to an app on a smartphone. The app then augments this data with contextual information sourced from the phone, e.g., step count, screen-on time, and sleep. The goal is to fuse this multimodal dataset into a single model that can infer the stress and well-being of the user. The salient contributions of this study include: 1) the design and architecture of a novel glasses wearable for multimodal physiological sensing; 2) an integrated software framework for acquisition, preprocessing, and fusion of data; and 3) a data-analysis approach for deriving mental well-being indicators from the integrated data streams.

II. RELATED WORK

Wearable devices for monitoring stress and emotion have been studied widely. Several consumer products exist, ranging from the Empatica E4 wristband, which tracks EDA and heart rate variability, to fitness rings such as the Oura, which track sleep to provide stress and recovery scores. Head-worn and glasses-worn platforms are also drawing attention. Kwon et al. [1] presented a prototype glasses device that measured EDA and PPG while simultaneously capturing the area around one eye with a small camera, and used these multi-channel data to improve emotion classification. Such studies demonstrate the suitability of eyeglass biosensors for continuous monitoring of user states without requiring wrist or chest straps.
Smartphone sensing has likewise been shown to support well-being. Phones include sensors and platform APIs that report attributes such as step counts, motion, and screen usage. Above-average screen time and reduced sleep have been linked to increased stress and anxiety [5]. Our design leverages this knowledge by querying phone APIs for step count and screen time to provide context for the wearable data. State-of-the-art reviews suggest that merging mobile phone data with physiological data from wearables is a promising avenue for psychological support [2], [3].

Physiological markers such as heart rate, HRV, and EDA are established indicators of stress. HRV typically decreases under stress and has been used in real-time stress prediction models [4]. EDA (skin conductance) increases with sympathetic arousal and has shown high accuracy for stress classification in lab studies [6]. We also include eye-based metrics: pupil dilation and blink patterns change with cognitive load and emotion, and can be tracked by a camera [3]. These insights motivate our sensor choices and multi-signal fusion approach.
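To make the HRV marker concrete, the short sketch below computes RMSSD, a common time-domain HRV measure, from a series of inter-beat intervals. It is a minimal illustration only; the prototype's actual HRV pipeline is not specified here, and the sample values are invented.

```python
import numpy as np

def rmssd(ibi_ms):
    """Root mean square of successive differences of inter-beat
    intervals (in milliseconds), a standard time-domain HRV
    measure that tends to decrease under stress."""
    diffs = np.diff(np.asarray(ibi_ms, dtype=float))
    return float(np.sqrt(np.mean(diffs ** 2)))

# Illustrative intervals around 800 ms (~75 bpm)
print(rmssd([812, 795, 830, 802, 818, 790]))
```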
III. METHODOLOGY
The proposed system comprises three layers: the wearable hardware, the smartphone software, and the data analysis/modeling component. The hardware layer acquires raw sensor data; the software layer on the phone collects and preprocesses all inputs; the analysis layer correlates the signals to infer stress or well-being.

A. Hardware Layer

The wearable module takes the familiar form of a lightweight glasses attachment, carefully engineered to blend comfort with advanced sensing capabilities. At its core lies an embedded microcontroller, acting as the central hub for all sensor inputs. This microcontroller continuously reads data from each onboard sensor, assigns precise time-stamps, and wirelessly transmits the information via Bluetooth Low Energy (BLE) to a paired device for further processing and analysis.

To capture subtle physiological and behavioral cues, the system incorporates a near-infrared (NoIR) camera paired with infrared LEDs. This configuration enables real-time tracking of the pupil's diameter and gaze direction under various lighting conditions, offering insight into cognitive load, stress levels, and fatigue. Complementing this is a photoplethysmography (PPG) sensor that measures heart rate and extracts pulse waveforms, enabling the detection of both short-term fluctuations and long-term trends in cardiovascular activity.

Emotional and stress-related changes are further monitored using an electrodermal activity (EDA) sensor, which detects variations in skin conductance caused by sweat gland activity, a well-established indicator of psychological arousal. An inertial measurement unit (IMU) combining a 3-axis accelerometer and a 3-axis gyroscope captures head motion patterns, which can reflect posture, attentiveness, and physical state. Finally, a precision skin temperature sensor adds another layer of physiological context, aiding in the interpretation of stress and fatigue markers.

Together, these components form a cohesive system capable of continuously monitoring both physiological and behavioral indicators, bridging the gap between raw sensory input and actionable mental health insights. Table I summarizes the complete set of hardware elements integrated into the prototype.

TABLE I
WEARABLE HARDWARE COMPONENTS

Component                     Description
ESP32 Microcontroller         Central controller (Bluetooth, sensor I/O).
NoIR Camera Module            Infrared camera for pupil and gaze tracking.
Near-IR LEDs                  IR illumination for eye imaging (invisible to the user).
PPG Sensor (Pulse Sensor)     Measures pulse waveform, heart rate, and HRV.
EDA Sensor                    Measures skin conductance (emotional arousal).
IMU (MPU-6050)                Accelerometer + gyroscope for motion/orientation.
Temp Sensor (DS18B20)         Skin temperature as an extra physiological measure.
LiPo Battery (300–500 mAh)    Powers the wearable device.
Battery Charger (TP4056)      Recharging module for the LiPo battery.
Glasses Frame                 3D-printed housing for the components.

B. Software Layer

The smartphone application functions as the intelligent control center for the entire system, seamlessly pairing with the glasses to receive and process live sensor streams. Incoming data from the wearable is captured in real time and securely stored for analysis, ensuring that no critical moment is missed. In parallel, the application collects contextual information from the user's existing devices, such as daily step count, sleep duration, and screen time, offering a holistic view of the user's mental and physical state.

The app also processes the feed from the integrated eye camera. Using computer vision algorithms, such as OpenCV-based pupil detection, it determines both the center and diameter of the pupil with high precision (a minimal sketch of this step is given at the end of this subsection). Simultaneously, raw physiological signals from the PPG, EDA, and IMU sensors are filtered, denoised, and feature-extracted directly on the phone, allowing for cleaner, more reliable data without unnecessary bandwidth usage.

To make the experience engaging and user-friendly, the system incorporates gamification mechanics, rewarding users for healthy habits, regular check-ins, and stress-management achievements. Progress is visualized in interactive dashboards, transforming mental health tracking into a motivating, goal-driven journey rather than a clinical chore.

Beyond basic monitoring, the application leverages on-device and cloud-based AI models to detect patterns, forecast stress or fatigue episodes, and provide personalized recommendations. Over time, these models learn the user's unique physiological and behavioral baseline, enabling proactive interventions such as breathing exercises, posture reminders, or gentle nudges to take breaks. The AI also powers an "issue tracker" module that categorizes potential concerns, ranging from early burnout indicators to irregular sleep patterns, and suggests actionable steps for resolution.

In essence, the app bridges advanced sensing technology with a human-centered design philosophy, ensuring that mental health monitoring is not only powerful and accurate but also accessible, engaging, and even enjoyable.
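The sketch below illustrates one plausible realization of the OpenCV-based pupil measurement mentioned above: it thresholds the dark pupil region in an infrared frame and fits a minimum enclosing circle. The threshold value and the largest-contour heuristic are illustrative assumptions, not the prototype's exact algorithm.

```python
import cv2

def detect_pupil(gray_frame, thresh_val=40):
    """Estimate pupil center (x, y) and diameter in pixels from a
    grayscale NoIR frame; returns None if no candidate is found."""
    blurred = cv2.GaussianBlur(gray_frame, (7, 7), 0)
    # Under IR illumination the pupil is typically the darkest blob.
    _, binary = cv2.threshold(blurred, thresh_val, 255, cv2.THRESH_BINARY_INV)
    # OpenCV 4.x signature: returns (contours, hierarchy)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None  # e.g., during a blink
    pupil = max(contours, key=cv2.contourArea)  # assume largest blob = pupil
    (x, y), radius = cv2.minEnclosingCircle(pupil)
    return (x, y), 2.0 * radius
```

Frames in which no candidate is found can additionally be counted as blink events, one simple way to obtain the blink-pattern metric mentioned earlier.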
C. Analysis Layer

The analysis layer is the brain of the system: it combines the incoming data streams to build a common mental well-being model. This rule-based engine assesses short-term features, including average heart rate, heart rate variability (HRV), electrodermal activity (EDA) level, pupil size, daily step count, and screen time, with all values calculated over rolling time windows. This enables subtle physiological and behavioral changes to be monitored continuously rather than through isolated measurements.
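The windowing step can be made concrete with the sketch below, which derives one summary feature per rolling window from time-stamped samples. The column names and the 60-second window length are illustrative assumptions rather than the system's actual configuration.

```python
import pandas as pd

def window_features(df, window="60s"):
    """df: DataFrame with a DatetimeIndex and columns 'hr' (bpm),
    'ibi_ms' (inter-beat interval), 'eda_us' (microsiemens) and
    'pupil_px' (pupil diameter). Returns rolling-window features."""
    feats = pd.DataFrame({
        "hr_mean":    df["hr"].rolling(window).mean(),
        # RMSSD over the window: sqrt of mean squared successive difference
        "hrv_rmssd":  df["ibi_ms"].diff().pow(2).rolling(window).mean().pow(0.5),
        "eda_mean":   df["eda_us"].rolling(window).mean(),
        "pupil_mean": df["pupil_px"].rolling(window).mean(),
    })
    return feats.dropna()
```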
Based on these measurements, an algorithm analyses the data and extracts patterns indicative of stress. For instance, an elevated heart rate with suppressed HRV and an increased EDA level is a typical physiological signature of heightened arousal. When this physiological pattern coincides with long uninterrupted screen use and poor sleep, the estimated probability of stress rises. A representative scenario is one in which HRV drops below the user's personal baseline while EDA climbs sharply and phone usage is recorded as excessive with minimal rest; the system then flags a high-stress state for prompt intervention.
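A minimal version of this rule can be written as a simple decision function. The thresholds below (a 20% HRV drop, a 30% EDA rise, and two hours of recent screen time) are illustrative assumptions, not validated parameters of the system.

```python
def high_stress(hrv, eda, screen_min, baseline_hrv, baseline_eda,
                hrv_drop=0.20, eda_rise=0.30, screen_limit=120):
    """Flag the high-stress scenario described above: HRV well below
    the personal baseline, EDA well above it, and excessive screen use."""
    hrv_low  = hrv < (1.0 - hrv_drop) * baseline_hrv
    eda_high = eda > (1.0 + eda_rise) * baseline_eda
    overuse  = screen_min > screen_limit
    return hrv_low and eda_high and overuse

# Example: HRV 32 ms vs. 45 ms baseline, EDA 8.2 uS vs. 5.1 uS, 150 min of screen time
print(high_stress(32, 8.2, 150, baseline_hrv=45, baseline_eda=5.1))  # True
```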
D. Discussion
The proposed design underscores the unique advantages of the glasses-based form factor, which enables continuous, unobtrusive monitoring while maintaining user comfort and social acceptability. Unlike wristbands or chest straps, smart glasses offer a stable vantage point for both physiological sensing (e.g., pupil tracking, PPG) and contextual capture (e.g., environmental cues, head motion), making them well suited to real-time mental well-being assessment.

A key strength of the proposed system lies in its multimodal fusion approach. By integrating physiological signals (heart rate, HRV, EDA), behavioral metrics (step count, screen time, sleep), and ocular indicators (pupil size and dynamics), the framework benefits from complementary data streams that improve robustness, reduce false positives, and adapt to individual baselines. This layered fusion allows the system to detect subtle signs of stress that might be overlooked by unimodal approaches.

Nevertheless, several practical challenges emerged during the design process. Motion-artifact mitigation remains a critical issue, particularly for PPG and EDA signals, where physical movement and environmental factors can introduce noise. Addressing this may require advanced filtering techniques, adaptive algorithms, or hybrid sensing strategies. Power management is another central concern: balancing continuous sensing and data transmission against battery life demands careful optimization of sampling rates, on-device preprocessing, and low-power communication protocols.

These considerations have important implications for real-world deployment. A successful implementation will need to strike a balance between accuracy, user comfort, and battery longevity while preserving data privacy and security. As the system evolves, opportunities exist to incorporate on-device AI for edge inference, further reducing latency and energy consumption while enhancing personalization.

IV. CONCLUSION AND FUTURE WORK

This paper has outlined a system that combines physiological sensors mounted on glasses with smartphone data to monitor mental well-being. The hardware design includes a camera for eye tracking and employs PPG, EDA, IMU, and temperature sensors operated via a microcontroller. A mobile application handles data collection and processing, including phone usage metrics, and an analysis layer extracts stress-related features from these data. Known correlations between stress and signals such as heart rate, skin conductance, and screen time are used throughout the process.

Our next steps are to build the physical prototype and to carry out rigorous user studies that validate the system. Advanced deep learning architectures for multimodal fusion, including hybrid DCNN-LSTM models, will be considered to enhance the accuracy of real-time stress prediction. The study of adaptive intervention strategies, together with principles of ethical AI, will further ensure the responsible and impactful deployment of this technology in real-world settings.

REFERENCES

[1] J. Kwon et al., "Emotion recognition using a glasses-type wearable device via multi-channel facial responses," IEEE Access, vol. 9, pp. 12022-12032, 2021.
[2] K. Ueafuea et al., "Potential applications of mobile and wearable devices for psychological support during the COVID-19 pandemic: A review," IEEE Sensors Journal, vol. 20, no. 20, pp. 12227-12244, 2020.
[3] J.-Y. Wu et al., "Emerging wearable biosensor technologies for stress monitoring and their real-world applications," Sensors (Basel), vol. 22, no. 11, p. 4010, 2022.
[4] E. Lazarou et al., "Predicting stress levels using physiological data: Real-time stress prediction models utilizing wearable devices," AIMS Neuroscience, vol. 11, no. 2, pp. 76-102, 2024.
[5] V. S. Nakshine et al., "Increased screen time as a cause of declining physical, psychological health, and sleep patterns: A literary review," Cureus, vol. 14, no. 10, e30051, 2022.
[6] O. Rahma et al., "Electrodermal activity for measuring cognitive and emotional stress level," J. Med. Signals Sensors, vol. 12, no. 2, pp. 155-162, 2022.