
Section 1: Introduction and Dataset Overview:

This dataset is a publicly available, multimodal, and richly annotated collection of physiological, behavioural, and performance-related data collected from 38 South Asian postgraduate students while they solved Mental Rotation Tasks (MRTs) in a controlled experimental setting. Mental rotation, the process of spatially manipulating and visualising 3D objects, is a critical cognitive function underpinning success in science, technology, engineering, and mathematics (STEM) disciplines. The ability to mentally transform images has been shown to differ based on training, strategy, feedback, and cognitive load.

The experiment was conducted at IIT Bombay, where participants completed mental
rotation tasks under three controlled categories:

Category 1: No feedback, no time limit.
Category 2: Feedback provided, no time limit.
Category 3: No feedback, with a time constraint.
Each condition was designed to simulate different learning or performance environments.
For example, Category 2 replicates a guided practice scenario, whereas Category 3 simulates
high-stakes testing. Each category included mental rotation questions that varied in difficulty (easy, medium, hard) and design features like angular disparity, occlusion, and match/non-match orientations.

To capture a comprehensive view of cognitive and emotional processes, each participant was
instrumented with several synchronised biosensors and software tools, including:

 EEG (Muse headband) for recording brainwave signals (delta, theta, alpha, beta, gamma).
 Eye tracker (Tobii X3-120Hz) for gaze patterns, pupil dilation, and fixations.
 GSR (Shimmer GSR sensor) for measuring skin conductance and resistance.
 Facial emotion recognition (Affectiva via iMotions) for real-time emotion detection from webcam video.
 Manual emotion logging (DLOT) for observer-recorded affect states.
 Task response logs (PsychoPy) for accuracy, response times, feedback interaction, and timing.
 Subjective workload ratings (NASA-TLX) to assess perceived cognitive effort post-task.
All these data sources were aligned using timestamps and session markers, creating a
time-synchronised multimodal dataset of high granularity and fidelity. This enables researchers
to examine how internal states such as engagement, attention, or stress evolve during tasks and
how they relate to observable behaviour and performance outcomes.
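
As a minimal illustration of how this alignment can be used, the sketch below pairs EEG and GSR samples by their UTC timestamps with pandas. The column name "Timestamp" is an assumption made for illustration, not a confirmed field name from the dataset files.

```python
import pandas as pd

# Minimal sketch, assuming each CSV carries a UTC "Timestamp" column
# (the real field names in the files may differ).
eeg = pd.read_csv("1_EEG.csv", parse_dates=["Timestamp"]).sort_values("Timestamp")
gsr = pd.read_csv("1_GSR.csv", parse_dates=["Timestamp"]).sort_values("Timestamp")

# merge_asof pairs every EEG sample with the nearest GSR sample in time,
# tolerating the sensors' different rates (~256 Hz vs ~32 Hz).
aligned = pd.merge_asof(
    eeg, gsr,
    on="Timestamp",
    direction="nearest",
    tolerance=pd.Timedelta("50ms"),
)
print(aligned.head())
```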

A total of ~8 GB of data has been collected, with around 200 MB per participant. Each
participant folder includes 10+ files, covering raw signals, derived features, performance
metrics, and metadata. With diverse modalities, this dataset is ideal for research areas such as:

 Affective computing.
 Learning analytics.
 Cognitive workload modelling.
 Human-Computer Interaction (HCI).
 Emotion-aware AI systems.
 Cross-cultural and gender-based behavioural analysis.

This dataset stands out globally due to its regional focus (South Asia), controlled
experiment design, and synchronised multimodal recordings, offering unprecedented insights
into the real-time mental processes involved in spatial problem solving.

Dataset Attributes Overview:

General Information:

 Number of Participants: 38
 Modalities Covered: EEG, Eye Tracking, GSR, Facial Emotion (TIVA), Manual Logs (DLOT), Task Logs (PSY), NASA-TLX, System Events, Blank Screen Baseline.
 Average Size per Participant: ~200 MB (Total: ~8 GB)
 Total Data Modalities per Participant: 10 (EEG, GSR, Eye Tracking, Emotion Detection, etc.)
 Sampling Rate (Varies):

1. EEG: ~256 Hz
2. Eye Tracking: 120 Hz (Tobii X3)
3. GSR: ~32 Hz
4. Emotion (TIVA): ~10 Hz

 Data Length per Task Category: Varies by participant performance (~5–20 minutes per condition).

All files are timestamp-synchronised using UTC values, enabling cross-modality alignment for in-depth time-series analysis.
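
Because the modalities run at different rates, a common preprocessing step before cross-modality analysis is to resample every stream onto a shared time grid. The sketch below, with assumed file and column names, downsamples a stream to 10 Hz (the approximate TIVA rate).

```python
import pandas as pd

# Minimal sketch, assuming each file has a UTC "Timestamp" column.
# Downsampling to 100 ms bins (10 Hz) puts EEG (~256 Hz), eye tracking
# (120 Hz), and GSR (~32 Hz) on the same grid as TIVA (~10 Hz).
def resample_stream(path: str, rate: str = "100ms") -> pd.DataFrame:
    df = pd.read_csv(path, parse_dates=["Timestamp"])
    return df.set_index("Timestamp").resample(rate).mean(numeric_only=True)

eeg_10hz = resample_stream("1_EEG.csv")
gsr_10hz = resample_stream("1_GSR.csv")
combined = eeg_10hz.join(gsr_10hz, how="inner", rsuffix="_gsr")
```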
Summary of Key Files:

File Name             | Type        | Description                              | Missing Data?
----------------------|-------------|------------------------------------------|------------------------
1_EEG.csv             | Time-series | EEG brainwave values from Muse Headband  | Minor (some NaNs)
1_EYE.csv             | Time-series | Raw eye gaze, pupil size, eye validity   | Sparse in some points
1_IVT.csv             | Derived     | Fixations and saccades                   | Sparse in parts
1_GSR.csv             | Time-series | Skin conductance & resistance            | Some NaNs in early rows
1_TIVA.csv            | Time-series | Facial emotion features (via Affectiva)  | Some zero/null values
1_DLOT.xlsx           | Manual Logs | Engagement recorded by observer          | Complete
1_PSY.csv             | Event Log   | Task responses, category, difficulty     | Minimal missing data
1_NSTLX.csv           | Survey      | Subjective workload ratings              | Complete
1_externalEvents.csv  | Event Log   | System events, input, and stimuli logs   | Sparse in labels
1_BlankScreenData.csv | Baseline    | Eye-tracking during neutral screen       | Some invalid readings
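
Given the per-participant naming pattern above, a loader along the following lines can gather one participant's CSV files into a dictionary. The folder layout is an assumption, and 1_DLOT.xlsx would need pandas.read_excel separately.

```python
from pathlib import Path

import pandas as pd

# Minimal sketch, assuming folders follow the <id>_<MODALITY>.csv pattern
# shown in the table above; 1_DLOT.xlsx is Excel and is not covered here.
def load_participant(folder: str, pid: int) -> dict[str, pd.DataFrame]:
    data = {}
    for path in sorted(Path(folder).glob(f"{pid}_*.csv")):
        modality = path.stem.split("_", 1)[1]  # e.g. "EEG", "GSR", "PSY"
        data[modality] = pd.read_csv(path)
    return data

participant_1 = load_participant("dataset/participant_1", pid=1)
print(sorted(participant_1))  # modalities found for this participant
```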
Visual Overview of the Dataset:

1. EEG Sample Preview (Delta Band - TP9 Electrode):

An EEG (electroencephalogram) sample, whether viewed during or after a recording, displays brainwave activity as waveforms on a graph.
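
A preview of this kind could be produced with a few lines of pandas and matplotlib; the column name "Delta_TP9" is an assumed label for the Muse export.

```python
import matplotlib.pyplot as plt
import pandas as pd

# Minimal sketch; "Delta_TP9" is an assumed column name for the Muse export.
eeg = pd.read_csv("1_EEG.csv", parse_dates=["Timestamp"])
plt.plot(eeg["Timestamp"], eeg["Delta_TP9"], linewidth=0.5)
plt.xlabel("Time (UTC)")
plt.ylabel("Delta band power (TP9)")
plt.title("EEG Sample Preview: Delta Band, TP9 Electrode")
plt.tight_layout()
plt.show()
```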

2. Eye-Tracking Fixation Heatmap (Gaze X vs. Y):


An eye-tracking heatmap built from the left eye's x and y gaze coordinates visualises the areas of focus and attention on the screen during the experiment.
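
One simple way to approximate such a heatmap is a 2D histogram of the gaze coordinates; the column names used below are assumptions.

```python
import matplotlib.pyplot as plt
import pandas as pd

# Minimal sketch; "GazeLeftX"/"GazeLeftY" are assumed column names.
eye = pd.read_csv("1_EYE.csv")
valid = eye.dropna(subset=["GazeLeftX", "GazeLeftY"])

plt.hist2d(valid["GazeLeftX"], valid["GazeLeftY"], bins=80, cmap="hot")
plt.gca().invert_yaxis()  # screen coordinates increase downwards
plt.xlabel("Gaze X (px)")
plt.ylabel("Gaze Y (px)")
plt.title("Eye-Tracking Fixation Heatmap")
plt.colorbar(label="Sample count")
plt.show()
```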
3. Facial Emotion (Engagement vs Time):

Plotting the Affectiva engagement score against time shows how a participant's emotional involvement rises and falls across the session, making it possible to relate affective responses to specific task events and performance outcomes.
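
For instance, a smoothed engagement curve could be drawn as follows, assuming the TIVA export contains "Engagement" and UTC "Timestamp" columns.

```python
import matplotlib.pyplot as plt
import pandas as pd

# Minimal sketch; "Engagement" and "Timestamp" are assumed TIVA columns.
tiva = pd.read_csv("1_TIVA.csv", parse_dates=["Timestamp"])

# Smooth the ~10 Hz signal with a ~5 s rolling mean to expose the trend.
smoothed = tiva["Engagement"].rolling(window=50, min_periods=1).mean()
plt.plot(tiva["Timestamp"], smoothed)
plt.xlabel("Time (UTC)")
plt.ylabel("Engagement score")
plt.title("Facial Emotion: Engagement vs Time")
plt.show()
```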

4. GSR Conductance Trend Over Task:

GSR (Galvanic Skin Response) conductance, also known as Electrodermal Activity (EDA), typically reflects the level of emotional arousal or stress experienced during a task. Increased conductance indicates greater arousal or stress, while decreased conductance suggests relaxation or lower arousal.
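
A task-level conductance trend can be visualised by smoothing the raw signal with a rolling mean of roughly one second (about 32 samples at the sensor's ~32 Hz rate); the conductance column name below is an assumption about the Shimmer export.

```python
import matplotlib.pyplot as plt
import pandas as pd

# Minimal sketch; the conductance column name is an assumption about
# the Shimmer export format.
gsr = pd.read_csv("1_GSR.csv", parse_dates=["Timestamp"])

# A ~1 s rolling mean (32 samples at ~32 Hz) exposes the slow tonic trend.
tonic = gsr["GSR Conductance"].rolling(window=32, min_periods=1).mean()
plt.plot(gsr["Timestamp"], tonic)
plt.xlabel("Time (UTC)")
plt.ylabel("Skin conductance (µS)")
plt.title("GSR Conductance Trend Over Task")
plt.show()
```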
