
Article

Effects of the Uncertainty of Interpersonal Communications on Behavioral Responses of the Participants in an Immersive Virtual Reality Experience: A Usability Study

1 Department of Computer Science and Engineering, University of Bologna, 40126 Bologna, Italy
2 Department of the Arts, University of Bologna, 40126 Bologna, Italy
* Author to whom correspondence should be addressed.
Sensors 2023, 23(4), 2148; https://doi.org/10.3390/s23042148
Submission received: 29 December 2022 / Revised: 2 February 2023 / Accepted: 6 February 2023 / Published: 14 February 2023
(This article belongs to the Collection Sensors and Communications for the Social Good)
Figure 1. The stages of the experimental design.
Figure 2. Screenshots showing some steps of the familiarization phase, from left to right: (a) selecting the "I am ready" button; (b) teleporting to a destination; (c) reading the task instructions from the panel on the user's left hand; (d) removing an interactive object; (e) submitting the current task; (f) selecting the "Move to the office" button to move there.
Figure 3. Screenshots showing two views of the office: (a) the view of the office from the user's perspective at the beginning; (b) another view of the office.
Figure 4. Screenshots showing five sources of information for the user during the experience: (a) Arrow 1 points to a small blackboard that displays the name of the current task; Arrow 2 points to a big blackboard that communicates the current status of the tasks during the experience; (b) Arrow 3 points to a small blackboard attached to the panel that is a closeup of the big one; (c) Arrow 4 points to a phone that blinks and rings when the boss calls; (d) Arrow 5 points to a task board that shows the instructions for each task.
Figure 5. Screenshots showing different parts of the panel for interactions: (a,d) selecting the "I am ready" and "Answering the phone" buttons; (b,e) visualization of the task tab; (c,f) visualization of the item tab.
Figure 6. Screenshots showing the participant: (a) removing interactive objects; (b) using teleportation to move in the environment.
Figure 7. Screenshots from seven different areas for the placement of objects, characterized by the color of their associated rugs.
Figure 8. The steps that the participant experiences.
Figure 9. A comparison between the mean ratings obtained from the participants for their perceived uncertainty of Task 1 (M = 1.88, SD = 1.11) and Task 2 (M = 3.41, SD = 1.00); range of answers = 1–5.
Figure 10. A comparison between the response times of the user to the source of information (the phone call) during Task 1 and Task 2, measured in seconds.
Figure 11. A comparison between the task completion times of the user for Task 1 and Task 2, measured in seconds. Two completion time values were identified as outliers and are shown as white circles in the figure.
Figure 12. A visualization of participants' change of position in Task 1; the 8 task targets are shown in blue.
Figure 13. A visualization of participants' change of position in Task 2; the task targets before the change of the task are shown in red; the task targets after the change of the task are shown in green.
Figure 14. A comparison between the scores obtained from the participants on the System Usability Scale (SUS) questionnaire (M = 76.32, SD = 9.89).
Figure 15. Percentages of ratings for each category from the System Usability Scale (SUS) questionnaire; categories from left to right are: best imaginable (score > 84.1); excellent (72.6 < score < 84.0); good (62.7 < score < 72.5); ok (51.7 < score < 62.6); poor (26 < score < 51.6); worst imaginable (score < 25).
Figure 16. A comparison between the scores obtained from the participants on the Immersive Experience Questionnaire (IEQ) (M = 65.84%, SD = 7.13%).
Figure 17. A comparison between the mean scores for the components of the Immersive Experience Questionnaire (IEQ), rated on a range of 1 (not at all) to 5 (a lot), from left to right: overall IEQ score (M = 3.17); challenge (M = 2.59); cognitive involvement (M = 4.06); control (M = 2.95); emotional involvement (M = 3.20); and real-world dissociation (M = 3.059).
Figure 18. A comparison between the scores obtained from the participants on the Slater–Usoh–Steed presence (SUS) questionnaire (M = 64.37%, SD = 14.50%).
Figure 19. A comparison between the scores obtained from the participants on the Motion Sickness Assessment Questionnaire (MSAQ) (M = 22.18%, SD = 12.84%).
Figure 20. A comparison between the scores obtained from the participants on the Intolerance of Uncertainty Scale (IUS) (M = 59.48%, SD = 11.63%).

Abstract

Two common difficulties people face in their daily lives are managing effective communication with others and dealing with what makes them feel uncertain. Past research highlights that failing to handle these difficulties substantially influences people's performance in the task at hand, especially in the context of a social environment such as a workplace. Perceived uncertainty of information is a key influential factor in this regard, with effects on the quality of the information transfer between sender and receiver. Uncertainty can be induced into a communication system in three ways: when an information deficit makes the target message unclear to the receiver, when requested changes could not be predicted by the receiver, and when the content of the message is so interconnected and complex that it limits understanding. Since uncertainty is an inseparable feature of our lives, it is important to study the effects that different levels of it have on individuals and how individuals nevertheless accomplish the tasks of daily living. Modern technologies such as immersive virtual reality (VR) have been successful in providing effective platforms to support studies of human behavior and social well-being. In this paper, we present the design, development, and evaluation of an immersive VR serious game platform to study behavioral responses to the uncertain features of interpersonal communications. In addition, we report the results of a within-subject user study with 17 participants aged between 20 and 35, measuring their behavioral responses to two levels of uncertainty with subjective and objective measures. The results convey that the application successfully and meaningfully measured several behavioral responses related to exposure to different levels of uncertainty and that, overall, the participants were satisfied with the experience.

1. Introduction

Most of us would agree that uncertainty, in many circumstances, is not something we like to experience. For example, we do not wish to be uncertain about our ability to pay bills at the end of each month, our work and educational prospects, and our health [1]. Some studies in neuroscience also support this claim by providing evidence that the human brain is hardwired to interpret uncertainty as a danger and respond to it with fear and stress [2,3]. A human brain under uncertainty tends to overestimate and dramatize danger [4], jump to conclusions [5], and underestimate its ability to handle it [6,7,8].
Following this approach to uncertainty, the goal has been to reduce it [9,10]. For example, people are encouraged to reduce the uncertainty of loss of income in old age or of possible unemployment with saving money, paying taxes, and buying insurance policies [1]. In education, traditionally, uncertainty is often seen as a threat and removed by exposing students to clearly defined problems, following predefined methods of solving them, to reach expected outcomes [11]. The reality is that we live in an uncertain and complex world [1]. Despite our best efforts, things do not always go as planned, and unexpected events may happen. Hence, one should strive to accept uncertainty, performing tasks aware of its existence instead of amplifying its fear with the risk of arguing with life rather than living it. The recent experience with COVID-19 supports such an idea [12]. This is why many educators have recently sought the best ways to provide a structured and supportive learning environment to prepare young students to respond productively to the challenges originating from dealing with uncertainty [13,14]. As described by Beghetto [15], novel learning environments should structurally offer uncertainty, engaging students with it, teaching them how to sit with its difficulty, how to explore, how to generate and evaluate new possibilities, and, most importantly, take action based on them [16]. In this way, uncertainty may act as a catalyst for creative answers rather than an unbeatable barrier. This approach motivates the idea of designing and implementing platforms to support the study of the behavioral responses that the uncertainty may trigger [12,17,18].
The broad concept of uncertainty is, in fact, closely connected with that of information, which, in turn, is at the core of interpersonal communications [19,20]. Interpersonal communication concerns the study of social interaction between people and tries to understand how verbal and written dialogues, as well as nonverbal actions, are used to achieve communication goals [21]. Studies show that individuals facing different levels of uncertainty have different behavioral responses, from negative to positive [22,23,24]. The ways a human being may deal with an uncertain situation may differ based on individual differences [25], culture [26], and the level of expertise [27]. Hillen et al. [28] presented a conceptualization of an individual's experience of uncertainty based on a categorization of potential responses. In this model, ambiguities and/or complexities generate stimuli to the information system. Uncertainties appear when individuals perceive (consciously become aware of) their existence. Cognitive, emotional, and behavioral responses then follow such a perception.
Virtual Reality (VR) systems may act as feasible platforms to assist in understanding behavioral responses to the uncertainty of interpersonal communications, as they may provide 3D spaces involving the same kind of navigational and communication challenges experienced in the real world [29]. With VR, it is possible to create structured environments where the ability of people to cope with challenges can be observed, behavioral data gathered, eventual achievements and feedback engineered, and strategies for skill improvement applied in a top-down fashion [30,31,32,33]. In VR, people can express their ideas, feel in control, and accomplish tasks and communicate with others [34,35,36,37]. This raises the potential to enjoy and engage in activities in the digital space and then apply them to the real world to improve one’s social well-being [38]. In addition, creating such an experience in the context of a serious game can support situated cognition by contextualizing a player’s experience in an engaging and realistic environment [39]. In addition, it can benefit from those game design techniques that support the idea that uncertainty could potentially maintain a user’s attention and engagement, providing the motivation to continue even in challenging moments [1].
Considering this domain, we propose the design and development of an immersive virtual reality experience whose scope is to support the investigation of how people manage uncertainty while performing tasks in a workplace scenario. This experience, implemented as a serious game, aims at simulating a workplace scenario, a social environment where success in managing effective interpersonal communication appears very important [40,41,42,43,44].
With this work, we aim to contribute to the research community by providing answers to the research questions below:
  • RQ1: How do the participants rate their experience with different tasks in terms of perceived uncertainty?
  • RQ2: How do different degrees of uncertainty affect users’ behavior and performance in this immersive virtual workplace scenario?
  • RQ3: How are the users’ subjective responses to uncertainty related to the objective responses?
  • RQ4: How does the user evaluate the quality of his/her experience?
This paper is structured as follows. Section 2 discusses related work. Section 3 describes the interaction techniques, environment, and task design process of this immersive VR serious game. Section 4 describes the result of a usability study that evaluates the user experience of the proposed VR system as well as reports some behavioral responses. Section 5 discusses the main findings of the experiment. Section 6 concludes the paper and discusses future opportunities for research.

2. Related Work

In this section, we present and discuss the works that fall closest to our contribution. A good body of research has focused on the study of "navigational uncertainty" and its effect on the user's spatial navigation performance and behavior [45,46,47,48]. In this area of research, uncertainty has mostly been introduced into the system by creating a perception of disorientation [49] or cue conflict [50] for the user, resulting in an increase in his/her information-seeking behavior. In their recent review, Keller et al. [51] proposed that collecting and analyzing continuous navigational data obtained from participants in virtual reality experiences that create navigational uncertainty can provide important insight into their information-seeking behavior. For example, in [46] the authors focused on "looking around behavior" as a common type of information-seeking behavior of participants experiencing navigational uncertainty: they continuously recorded the heading direction and tried to relate it to navigational success measures. This body of literature highlights the potential and importance of the data that can be captured from VR experiences to provide insight into people's behavioral responses, especially when studying the effects of a variable, such as uncertainty, on behavior.
Another area in which the study of uncertainty has received a lot of attention is gaming. As Costikyan [1] claims, games could improve by purposefully applying the concept of uncertainty in their designs. Uncertainty can act as a catalyst to hold users' attention and interest; mastering it may help pursue a game's goal in an efficient and non-threatening way [52]. In addition, Costikyan [1] supports these claims by citing the sociologist Roger Caillois [53]: "Play is… uncertain activity. Doubt must remain until the end, and hinges upon the denouement… every game of skill, by definition, involves the risk for the player of missing his stroke and the threat of defeat, without which the game would no longer be pleasing. The game is no longer pleasing to one who, because he is too well trained or skillful, wins effortlessly and infallibly".
In the following, we review some examples of games that exploit uncertainty in their design and present a comparison of their features in Table 1:
  • Gone Home [54] is a first-person exploration game designed to put players in unknown situations, engaging them to stay and accomplish tasks such as uncovering the narrative through non-linear progression while searching the space. The game puts the player in the shoes of a young woman who returns home and finds that her family is absent. As Veale [55] discusses, Gone Home uses effective storytelling to create empathy and a sense of responsibility in users by placing them within a recent historical moment. In this way, it exposes users to positive and negative elements of the past and encourages them to stay in the game and reflect on these elements [56]. While not strong on interactivity, the game, through a careful visual, spatial, and audio design of the environment, leads users to explore the house along a twisting, uncertain path and to find out what happened to the woman's family by analyzing imperfect clues from the memorabilia, journals, and other items left around the various rooms. During the experience, notes, voices, and letters from or to her family motivate and guide the exploration. These clues can be kept in the inventory and reviewed whenever desired [54]. Considering an interest in the study of users' navigational behaviors, Bonnie Ruberg [57] argues that a deeper analysis of the game's interactive elements shows that the player's path is linear rather than meandering, despite what the game seems to encourage: the path is already set, and locked or hidden doors prevent the user from accessing some areas unless an event is triggered or an object is found that unlocks the barrier in a predefined order.
  • Don’t Starve [58] is a survival game that places the user in the role of a scientist who finds himself in a strange and unfamiliar world. The goal is to collect and effectively use survival tools. An uncertain scenario arises from the interaction with the frogs in the game: it is unclear whether they are hostile, and although they can represent food, different outcomes may result from eating them. The game successfully engages the user in accepting this ambiguity until s/he is able to develop higher-level strategies for interacting with them. Farah et al. [59] studied the multiplayer expansion of the game to track cooperative features and teamwork behavioral markers.
  • Wenge Xu et al. [60] developed a motion-based survival game, GestureFit, that involves the user in a fight with a monster. They induced uncertainty in the system through three uncertain game elements: false attacks (creating the perception that the system might trick the player into wasting a defense move by defending against a false attack), misses (creating the perception that an actual hit might be interpreted as a miss), and critical hits (creating the perception that an attack might be a critical attack and produce more damage than a normal one). In this way, they created two different levels of uncertainty, one with these uncertain elements induced into the game and one without. They then conducted a study to measure the effects of the level of uncertainty (certain and uncertain), the display type (VR and LD), and age (young adults and middle-aged adults) on the game experience, performance, and exertion level. Their results showed that, for the kind of game they designed, virtual reality could improve game performance. In addition, they found that the uncertain elements applied in their design might not enhance the overall game experience, but could help increase the user's exertion.
  • RelicVE [61] is a virtual reality (VR) game that places the user in a role similar to that of an archaeologist and engages him/her in the exploration process of an archaeological discovery experience. It exploits uncertainty in the design of its exploration process by placing the user in a situation where s/he does not know the shape and features of the target artifact and can only discover them by gradually and strategically using the available tools and physical movements. In this way, when the user hits specific triggers, a new piece of information about the artifact is uncovered. The authors also controlled the complexity of the game through the complexity of the shape and volume of the artifact. They integrated VR interaction techniques in the design of their virtual system to create an experience close to the real-world experience of archaeologists, thereby increasing the immersion and physical activity of the user during the experience. In addition, they used a timer and a health bar to add an element of time pressure to the experience. To evaluate the experience, the authors conducted a usability study, which found the experience to be innovative in that adding elements of uncertainty to the design can improve players' learning and motivation.
To the best of our knowledge, no previous work has taken full advantage of available technologies, such as virtual reality, to induce structured uncertainty and investigate the influence of uncertainty levels on human behavior with a focus on interpersonal communications. Our study takes this step through the design and development of such an application, applying design techniques inspired by the games reviewed above together with virtual reality techniques that improve the user experience and support the study of behavior.

3. Experimental Setting

In this section, we describe the experiment we conducted to study the effects of different levels of uncertainty on behavioral responses, performance, and quality of the experience of the participants resorting to objective and subjective measures. Figure 1 visualizes the stages of this experimental design.

3.1. Participants

We recruited 17 participants (3 female, 14 male, age: 20–35, M = 25.05, SD = 1.75) affiliated with our university to take part in our study. Detailed demographic information appears in Table 2. Before starting the experiment, we asked them to rate (1 = never, 5 = every day) their previous experience with virtual reality using a head-mounted display (M = 2.88, SD = 1.36) and their level (1 = low, 5 = high) of English proficiency (M = 3.47, SD = 1.01).

3.2. Materials

We now proceed to present the design and implementation choices of the virtual environment.

3.2.1. Setup

In our experiment, participants navigated in a virtual office via an HTC Vive Pro HMD (refresh rate: 90 Hz, resolution: 1440 × 1600 pixels, FoV 110°) connected to a workstation (Intel(R) Core(TM) i7-6850K CPU @ 3.60 GHz). The environment was developed using Unity 3D version 2019.4.35f1, a game engine developed by Unity Technologies (San Francisco, CA, USA) that is widely used by game developers worldwide. The data analysis was performed using R version 4.2.2 and RStudio version 2022.07.2+576. R is a programming language and software environment for statistical computing and graphics, developed by the R Development Core Team and maintained by the R Foundation (Vienna, Austria); RStudio is an integrated development environment (IDE) for R, developed by RStudio, Inc. (Boston, MA, USA).

3.2.2. Experience Design

The experience was designed as a role-playing serious game in which a user, in the role of a new employee, is exposed to two different levels of uncertainty in the context of interpersonal communication in a workplace scenario. To this aim, the story plot that develops within the experience takes inspiration from Amelia Bedelia, the protagonist and title character of the children's book series authored by Peggy Parish [62]. Amelia Bedelia is a housekeeper whose boss cannot be present in the house on her first day of work and who takes the instructions left for her literally; each instruction contains lexical ambiguity. Despite such ambiguity, Amelia stays positive and expresses her excitement to do her job well and make her boss happy, yet she repeatedly misunderstands the guidance. Inspired by this story line, our application implements:
  • An absence of guidance when a user is following and executing given instructions. This design limits the access to sources of information, asking the user to focus and rely on the already provided knowledge or information that may come after from sources such as panels and phone calls;
  • Specific means of communicating instructions, which may be textual with the use of panels (and sometimes verbal, e.g., through phone calls) as a result of the absence of guidance;
  • Specific instruction communication patterns in the form of sequences of sentences, such as what appears in step-by-step construction manuals. At the same time, it enables an experiment designer to purposely reduce the amount of available information and change the complexity of the sentences to control the amount of ambiguity and complexity in the system;
  • The possibility of applying lexical ambiguity, as another potential source of confusion;
  • A friendly environment supporting understanding and empathy in interpersonal communications.
Please see Table 3 for a comparison of these features between the proposed platform and the Amelia Bedelia story.
The application includes two phases: the “Familiarization” phase and the “Main” phase. The goal of the “familiarization” phase is to remove any uncertainty arising from unfamiliarity with VR interfaces and the related context. For this purpose, it provides information and a step-by-step tutorial with feedback to familiarize the user with the context and allow the user to feel confident with the interactions that will then be executed. The user in this phase will get to know the boss, his/her role, the space s/he will be working in, the means of communication, and the way s/he can accomplish the tasks indicated by the boss using the available interfaces. At the end of this phase and when the system confirms the user has successfully executed all steps, s/he will reach the virtual office by pressing the “Move to the office” button from the panel on the left hand (see Figure 2 for some screenshots taken from the familiarization phase).
The main phase starts with the user finding himself/herself inside a virtual office in front of a door. After 10 s, a phone starts ringing, and s/he should answer. The boss is on the phone, welcoming the player, asking him/her to follow some instructions, and explaining the three options that will be available during the experience: submitting the task, suppressing it, and requesting help using the buttons on the panel. The boss also says that he will call again if something important comes up. By pressing the "I am ready" button on the panel, a description of the first task appears. The user can now teleport to move within the office environment, removing objects based on the instructions. The removed objects then become visible in the "Item" tab of the panel, and the user can cancel a previous removal by pressing the close button near each image. After ending the first task by pressing either the "Submit task" button or the "I do not do this task" button, the user is asked to complete a second task. After a specific number of objects have been removed, in the middle of the second task the phone rings again. The boss warns the user that it may be necessary to cancel previous removals to follow a new set of instructions. Task 2 finishes either by pressing the "Submit task" button or the "I will not do this task" button. The user can also decide to exit the game by pressing the "Exit the game" button.
The virtual office is hence furnished with interactive and non-interactive objects as well as two dynamic blackboards as two sources of information (See Figure 3 for some screenshots showing the virtual office environment). As described in Figure 4, there are five possible sources of information in the experience: 1. a small blackboard displays the name of the current task; 2. a big blackboard communicates the current status; 3. a small blackboard attached to the panel is a closeup of the big one; 4. a phone that blinks and rings when the boss calls; and 5. a task board showing the instructions for each task. The different parts of the panel are shown in Figure 5. An example of teleporting and removing interactive objects may be viewed in Figure 6. In addition, to increase the immersion, during the main phase an ambient sound is played, simulating the sounds coming from nearby offices to help reduce the confounding effects of noises coming from the real world.
The tasks amount to sequences of instructions to search and remove objects expressed in written form or verbally at different moments in the experience. The tasks include two levels of uncertainty, as inspired by the definition of Hillen et al. [28]. As explained before, uncertainty appears in terms of ambiguity, probability, and complexity. Ambiguities result from incomplete guides and instructions. Probable situations appear as unexpected task changes, and complexity as a change in the number of causal factors in the instructions. Following this guide, we proposed these two tasks to represent two levels of uncertainty:
Task 1 (or base task): This includes a simple and clear set of instructions. For each step, the number, place, and color of objects that should be removed are clearly expressed.
Instructions for Task 1:
  • Step 1: Go to the blue rug area. Remove the red glasses and green smartphone from the desk.
  • Step 2: Go to the blue rug area. Remove the apple and orange from the tables.
  • Step 3: Go to the brown rug area. Remove the blue cup from the table.
  • Step 4: Go to the red rug area. Remove the green wallet and the sandwich from the table.
  • Step 5: Go to the white rug area. Remove the pink book from the table.
Task 2 (or the task with an intermediate level of uncertainty): This task has a higher degree of uncertainty, with more complex instructions than Task 1. It comprises lexical ambiguity in the instructions, instructions with missing information, and the possibility of instruction changes on the fly (see Figure 7 for the placement of objects in seven different areas of the environment with their associated rugs).
Instructions for Task 2:
  • Step 1: Go to the brown rug area. Remove any food not positioned on a plate from the table.
  • Step 2: Remove the glasses from the brown rug area.
  • Step 3: If you find some mugs in the pink rug area, remove the orange one.
  • Step 4: Go to the green rug area. Remove the pillows that are closest to the hat.
  • Step 5: If you find calculators on the table and a bag under the table in the white rug area, remove the calculators.
Instructions for Task 2 (after change):
  • Step 1: Go to the brown rug area. Remove any food on plates on the table.
  • Step 2: Remove the glasses (if you find them there) from the brown rug area.
  • Step 3: If you find any books in the blue rug area, remove any pencils near them.
  • Step 4: Go to the red rug area. If a bag lies under the table, do not remove the smartphone.
  • Step 5: If you see any mugs on the table in the yellow rug area, remove them.

3.3. Methods

3.3.1. Procedure

After the participants read the consent form and provided their informed consent, we briefly explained that the experience would take place in a virtual office and that they would be asked to perform some tasks there. Next, a short introduction to the HTC Vive Pro headset, controllers, and sensors and their roles in this study was provided. The users then completed the familiarization phase and were guided to the main phase of the experiment. Afterward, they answered the demographic and evaluation questionnaires analyzed below. Figure 8 provides a diagram of the steps that the participant experiences.

3.3.2. Measures

In this section, we describe the objective and subjective measures used to test our hypotheses.
  • Objective measures
The application records behavioral responses that could be inferred from the HTC Vive controllers and the headset log data. The following variables were measured:
  • Variables related to the time:
    Time to submit Task 1;
    Time to submit Task 2;
    Response time to new messages in Task 1;
    Response time to new messages in Task 2.
  • Variables related to the position: Position of the user in each moment.
  • Subjective measures
We utilized multiple questionnaires to evaluate participants’ subjective experience with the application as detailed below.
  • The Demographic questionnaire: This questionnaire asked participants about their nationality, sex, age, and education level.
  • Level of English proficiency questionnaire: A five-point Likert scale was used to rate the level of English proficiency of participants (See Table 4).
  • Previous Experience with immersive VR: A five-point Likert scale was used to rate the previous experience with immersive VR of participants (See Table 4).
  • Perceived uncertainty questionnaire: In this questionnaire, using a five-point Likert scale, participants were asked to rate their level of perceived uncertainty for each task after the experiment (See Table 4).
  • System Usability Scale (SUS) questionnaire: This questionnaire [64] consists of 10 items and utilizes a scale of 1 (Strongly disagree) to 5 (Strongly agree), providing a "quick and dirty" yet reliable tool for measuring usability (a scoring sketch is given after this list).
  • Slater–Usoh–Steed presence questionnaire (SUS): This questionnaire [65] consists of five items and utilizes a scale of one to seven to assess participants’ sense of being there in a virtual office.
  • The immersive experience questionnaire (IEQ): This questionnaire [66] comprises 31 items and utilizes a scale of 1 (Not at all) to 7 (A lot) to measure the subjective experience of being immersed while playing a virtual serious game.
  • Motion Sickness Assessment Questionnaire (MSAQ): This questionnaire [67] comprises 16 items, utilizes a scale of 1 to 9, and is a valid instrument for the assessment of motion sickness.
  • Intolerance of Uncertainty Scale (IUS): This questionnaire [68] consists of 27 items and utilizes a scale of 1 (Not at all characteristic of me) to 5 (Entirely characteristic of me) that assesses emotional, cognitive, and behavioral reactions to ambiguous situations, implications of being uncertain, and attempts to control the future.
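For reference, standard SUS scoring [64] converts the 10 item responses into a single 0–100 score: odd-numbered items contribute (response − 1), even-numbered items contribute (5 − response), and the sum is multiplied by 2.5. The short Python sketch below illustrates this computation; the example responses are hypothetical and not drawn from our data.

```python
# Minimal sketch of the standard SUS scoring procedure (Brooke, 1996 [64]).
# Odd items contribute (response - 1); even items contribute (5 - response);
# the sum is scaled by 2.5 to obtain a 0-100 score.

def sus_score(responses):
    """responses: ten integers in 1..5, ordered as SUS items 1..10."""
    if len(responses) != 10:
        raise ValueError("SUS has exactly 10 items")
    total = sum((r - 1) if i % 2 == 1 else (5 - r)
                for i, r in enumerate(responses, start=1))
    return total * 2.5

# Hypothetical participant (answers are made up, not study data):
print(sus_score([4, 2, 5, 1, 4, 2, 5, 2, 4, 1]))  # -> 85.0
```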

4. Results

In this section, we present the objective and subjective results of our experiment concerning our research questions:
We compared the ratings that the participants gave to the perceived uncertainty of the two tasks with the Wilcoxon signed-rank test. The test revealed a significant difference between them (V = 0, p = 0.0003553 < 0.05), suggesting that, overall, the participants rated Task 2 with higher perceived uncertainty than Task 1 (see Figure 9 for a visual comparison of the ratings).
To find the effects of different degrees of induced uncertainty on the users' behavior, we first confirmed the normality of the data with the Shapiro–Wilk test at the 5% level and then conducted a paired t-test. The results did not yield a significant difference between the response times in the two tasks (t(16) = 1.44, p = 0.084 > 0.05). However, the box plot in Figure 10 visually suggests a lower response time to pick up the phone in Task 2 compared with Task 1.
Since the normality of the task completion time data was rejected by the Shapiro–Wilk test at the 5% level, we used the Wilcoxon signed-rank test (V = 0, p = 0.00001526 < 0.05) and found a significant difference between the task completion times for Task 1 and Task 2 (see Figure 11 for a visual comparison).
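The analysis was carried out in R; the Python sketch below, which uses SciPy and hypothetical timing data rather than the study data, illustrates the same test-selection logic: normality of the paired differences is checked with the Shapiro–Wilk test, and a paired t-test or a Wilcoxon signed-rank test is chosen accordingly.

```python
# Sketch of the test-selection logic described above (hypothetical data,
# not the recorded study measurements).
import numpy as np
from scipy import stats

task1_times = np.array([110.0, 95.0, 130.0, 120.0, 105.0, 98.0])    # seconds (made up)
task2_times = np.array([190.0, 170.0, 220.0, 205.0, 180.0, 160.0])  # seconds (made up)

differences = task2_times - task1_times
_, p_normal = stats.shapiro(differences)      # Shapiro-Wilk normality check

if p_normal > 0.05:
    # Normality not rejected at the 5% level -> paired t-test
    t_stat, p_value = stats.ttest_rel(task1_times, task2_times)
    print(f"paired t-test: t = {t_stat:.2f}, p = {p_value:.4f}")
else:
    # Normality rejected -> non-parametric Wilcoxon signed-rank test
    v_stat, p_value = stats.wilcoxon(task1_times, task2_times)
    print(f"Wilcoxon signed-rank: V = {v_stat:.1f}, p = {p_value:.4f}")
```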
To report the differences in the change of position in Task 1 in comparison to Task 2, Figure 12 and Figure 13 present a visual comparison of participants’ change of position.
We used Pearson's r to measure the strength and direction of the possible linear relationships between the scores on the system usability, immersion, presence, motion sickness, and intolerance of uncertainty questionnaires and the recorded time to answer the second call (i.e., the response time to the source of information in Task 2). We also used this test to look for possible relationships between the scores of these questionnaires and the time spent on the second task. See Table 5 for the results of these tests.
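As an illustration only, the correlation between one questionnaire score and one timing variable could be computed as in the sketch below; the vectors are hypothetical per-participant placeholders, not our recorded data.

```python
# Hypothetical sketch of the correlation analysis (Pearson's r) with SciPy.
from scipy import stats

sus_scores = [77.5, 82.5, 65.0, 90.0, 70.0, 72.5]      # per-participant SUS scores (made up)
response_time_task2 = [4.1, 3.2, 5.6, 2.8, 4.9, 4.3]   # seconds to answer the second call (made up)

r, p = stats.pearsonr(sus_scores, response_time_task2)
print(f"Pearson r = {r:.2f} (p = {p:.3f})")
```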
Table 6 reports the mean and standard deviation of scores obtained from the questionnaires about the quality of the participants’ experience and their intolerance to uncertainty. Figure 14, Figure 15, Figure 16, Figure 17, Figure 18, Figure 19 and Figure 20 present a visual comparison of the data obtained.

5. Discussion

In this section, we present and discuss the main findings of the experiment in more detail.
The main purpose of this study was to suggest the design and implementation of a VR platform that is able to create the experience of uncertainty of interpersonal communications on two levels and to record and report human behavioral responses to this exposition. In this paper, we addressed these research questions:
RQ1: Is there any significant difference between subjective ratings of participants for perceived uncertainty of Task 1 and Task 2?
Our findings from a comparison of the post-experiment ratings of the participants to the perceived uncertainty of two tasks indicate the potential of the proposed design to successfully produce at least two levels of uncertainty in the experience of the system.
RQ2: How do different degrees of induced uncertainty affect the users’ behavior and performance in this immersive virtual workplace scenario?
  • RQ2-1: Does the response time of the user to reach the source of information (the phone call) differ in the two tasks?
  • RQ2-2: Does the task completion time for the user for each task differ?
  • RQ2-3: How does the change of participants’ position in Task 1 differ from Task 2?
In this paper, we targeted the study of the differences between behavioral responses to the experience of two levels of uncertainty. In particular, we focused on studying the real-time records of the time and position of the participants.
Related to time, we were interested in two variables:
  • Response time: In particular, we were interested in the participant’s response time to new information coming from a highly influential source that was directly associated with the boss, a phone call. Our expectation was that by increasing the degree of induced uncertainty, the participant would show a different response time, but we did not find a significant difference in this comparison with the applied statistical test.
  • Task completion time: In particular, we were interested in how long the participants persisted in accomplishing each task, as an objective measure of the user's tolerance to the change of uncertainty in the system. For this reason, we did not consider whether the instructions were followed correctly, and the user was free to end a task at any moment without experiencing time pressure or encouraging or discouraging feedback. In this setting, our expectation was that increasing the level of uncertainty of the task would change the amount of time participants spend accomplishing it. The results of the study were aligned with this expectation, with a significantly higher completion time for Task 2.
Related to position: The positions of the participants in Task 1 and Task 2 were recorded by the application every 4 s. Their distribution along the X and Y axes, in comparison with the targets of each task, is visualized in Figure 12 and Figure 13. From a visual comparison of the two plots, we can see that, overall, participants changed position more in Task 2, the task with the higher uncertainty level. This finding is aligned with our expectation.
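The comparison above is visual; as one possible (hypothetical) way to quantify it, the positions sampled every 4 s could be summarized per participant as a total path length, as in the sketch below with made-up samples.

```python
# Hypothetical sketch: summarize positions logged every 4 s as a total path
# length per task, a simple proxy for the amount of position change.
import numpy as np

def path_length(positions):
    """positions: sequence of (x, y) samples, one every 4 s, in meters."""
    pos = np.asarray(positions, dtype=float)
    steps = np.diff(pos, axis=0)                       # displacement between consecutive samples
    return float(np.linalg.norm(steps, axis=1).sum())  # sum of step lengths

# Made-up samples for one participant:
task1_positions = [(0.0, 0.0), (0.5, 0.2), (1.1, 0.9), (1.0, 1.6)]
print(f"Task 1 path length = {path_length(task1_positions):.2f} m")
```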
In sum, our experiment suggests that adding uncertainty to a task harms task performance in terms of completion time, but not in terms of response time.
RQ3: How are the users’ subjective responses to uncertainty related to the objective responses?
Despite our expectation of finding strong relations between the subjective and objective measures, we found only a small negative linear correlation between the System Usability Scale scores and the time to answer the second call, small positive linear correlations between the presence scores and both the task completion time for Task 2 and the time to answer the second call, and a small positive linear correlation between the motion sickness questionnaire scores and the time to answer the second call. We expect that a larger sample size may reveal stronger correlations between these variables.
RQ4: How does the user evaluate the quality of his/her experience through subjective measures?
Another purpose of the study was to report the results of the participants' evaluation of their experience with the system. The mean System Usability Scale (SUS) score is higher than the benchmark average SUS score of 68, which gives an immediate indication of the overall good usability of the system, with only minor improvements needed in the design [69]. In addition, based on the adjective rating scale introduced by [70], we found that nearly 70% of the participants' usability ratings fall into the "Best imaginable", "Excellent", and "Good" categories (see Figure 15). For the Immersive Experience Questionnaire (IEQ), the Slater–Usoh–Steed presence (SUS) questionnaire, and the Motion Sickness Assessment Questionnaire (MSAQ), the average scores also fall into an acceptable range, representing a good quality of the participants' experiences (see Figure 16, Figure 17, Figure 18 and Figure 19).
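For clarity, the adjective bands used in Figure 15 (after [70]) can be expressed as a simple mapping from a SUS score to its category; the sketch below reproduces the band boundaries reported in the figure caption.

```python
# Sketch of the adjective rating bands used in Figure 15 (after Bangor et al. [70]).
def sus_adjective(score):
    if score > 84.1:
        return "best imaginable"
    if score >= 72.6:
        return "excellent"
    if score >= 62.7:
        return "good"
    if score >= 51.7:
        return "ok"
    if score >= 26.0:
        return "poor"
    return "worst imaginable"

print(sus_adjective(76.32))  # the mean SUS score in this study falls in "excellent"
```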
In sum, we can conclude that the system, with the help of the designed environment and story plot, is able to create a pleasant virtual experience. In addition, by tracing the HTC Vive Pro controllers and headset, we were able to capture in real time the participants' behavioral responses, in terms of action times and user position, to our variable of interest.

6. Conclusions and Future Works

In this paper, we investigated the effects of the uncertainty level in a virtual office on participants' objective and subjective responses through a controlled human-subject study. We designed an experimental scenario inspired by the well-known story Amelia Bedelia by Peggy Parish [62]. For the design of our system, we first investigated and carefully selected the virtual reality interfaces and environments that supported our research needs. In addition, we were inspired by previous games that applied uncertainty in their designs. The goal was to develop a system that supports a pleasant 3D immersive experience with real-world-like interactions and rich data-collection techniques. In our usability study, participants were asked to complete two different tasks inside a virtual office where they were also involved in interpersonal communication with their boss on the first day of work. We measured the participants' objective responses through the log data captured by tracing the HTC Vive Pro controllers and headset, and assessed their subjective experience through questionnaires. We determined that the two proposed versions of the tasks received significantly different post-experiment ratings from the participants for their perceived uncertainty. Our results also showed that the time taken to submit the different tasks differs significantly. Finally, results from the usability, immersion, presence, and motion sickness questionnaires conveyed that, overall, the participants were satisfied with the experience, scoring the usability, presence, and immersion of the experience on average higher than 50% and the motion sickness of the experience lower than 30%.
This paper suggests that our proposed VR system can manipulate levels of uncertainty in order to study it. In designing this system, we drew inspiration from real-life situations. An example workplace scenario is what regularly happens to someone in the role of a manager: s/he may receive multiple unpredictable inputs at once and has to constantly monitor and choose what to do first, stay productive, and manage time allocations to be able to work with everyone involved [71]. To establish how effectively our system replicates such real-life situations under the same conditions, an evaluation of our proposed system against real-life baseline conditions is required. We did not include this evaluation in this paper because of our limited ability to control the confounding factors of real-world settings, which makes it hard to obtain a valid measure of the effects of uncertainty; we leave it for future work. In addition, we plan to investigate more behavioral responses from the user in a future study and to assess the feasibility of a desktop-based version of this application. Finally, a larger sample size will allow us to report and study more statistically powerful behavioral results.

Author Contributions

Conceptualization, S.H.; Methodology, S.H.; Software, S.H.; Validation, S.H.; Investigation, S.H.; Writing—original draft, S.H.; Writing—review & editing, G.M.; Supervision, G.M.; Funding acquisition, G.M. All authors have read and agreed to the published version of the manuscript.

Funding

This work has been funded by the AlmaAttrezzature 2017 grant and by the Italian Ministry for Research and Education (MUR).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

Available upon request.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Costikyan, G. Uncertainty in Games; Mit Press: Cambridge, MA, USA, 2013. [Google Scholar]
  2. Peters, A.; McEwen, B.S.; Friston, K. Uncertainty AND stress: Why it causes diseases and how it is mastered by the brain. Prog. Neurobiol. 2017, 156, 164–188. [Google Scholar] [CrossRef]
  3. Mobbs, D.; Hagan, C.C.; Dalgleish, T.; Silston, B.; Prévost, C. The ecology of human fear: Survival optimization and the nervous system. Front. Neurosci. 2015, 9, 55. [Google Scholar] [CrossRef]
  4. Kessler, O.; Daase, C. From insecurity to uncertainty: Risk and the paradox of security politics. Alternatives 2008, 33, 211–232. [Google Scholar] [CrossRef]
  5. Bensi, L.; Giusberti, F. Trait anxiety and reasoning under uncertainty. Personal. Individ. Differ. 2007, 43, 827–838. [Google Scholar] [CrossRef]
  6. Konstantellou, A.; Sternheim, L.; Hale, L.; Simic, M.; Eisler, I. The experience of intolerance of uncertainty for parents of young people with a restrictive eating disorder. Eat. Weight Disord.-Stud. Anorex. Bulim. Obes. 2022, 27, 1339–1348. [Google Scholar] [CrossRef]
  7. Grupe, D.W.; Nitschke, J.B. Uncertainty and anticipation in anxiety: An integrated neurobiological and psychological perspective. Nat. Rev. Neurosci. 2013, 14, 488–501. [Google Scholar] [CrossRef] [PubMed]
  8. Mason, J.W. A review of psychoendocrine research on the pituitary-adrenal cortical system. Psychosom. Med. 1968, 30, 576–607. [Google Scholar] [CrossRef]
  9. Stirling, A. Keep it complex. Nature 2010, 468, 1029–1031. [Google Scholar] [CrossRef] [PubMed]
  10. Stirling, A. Science, precaution, and the politics of technological risk: Converging implications in evolutionary and social scientific perspectives. Ann. N. Y. Acad. Sci. 2008, 1128, 95–110. [Google Scholar] [CrossRef]
  11. Lee, H.S.; Anderson, J.R. Student learning: What has instruction got to do with it? Annu. Rev. Psychol. 2013, 64, 445–469. [Google Scholar] [CrossRef] [PubMed]
  12. Bacon, A.M.; Krupić, D.; Caki, N.; Corr, P.J. Emotional and Behavioral Responses to COVID-19. Eur. Psychol. 2022, 26, 334–347. [Google Scholar] [CrossRef]
  13. Duch, B.J.; Groh, S.E.; Allen, D.E. The Power of Problem-Based Learning: A Practical “How to” for Teaching Undergraduate Courses in Any Discipline; Stylus Publishing, LLC.: Sterling, VA, USA, 2001. [Google Scholar]
  14. Pigott, J. Less is more: Education for uncertain times. Glob. Soc. Educ. 2022, 20, 251–261. [Google Scholar] [CrossRef]
  15. Beghetto, R.A. What If?: Building Students’ Problem-Solving Skills Through Complex Challenges; ASCD: Alexandria, VA, USA, 2018. [Google Scholar]
  16. Beghetto, R.A. Structured uncertainty: How creativity thrives under constraints and uncertainty. In Creativity under Duress in Education? Springer: Berlin/Heidelberg, Germany, 2019; pp. 27–40. [Google Scholar]
  17. Piccolo, M.; Milos, G.F.; Bluemel, S.; Schumacher, S.; Mueller-Pfeiffer, C.; Fried, M.; Ernst, M.; Martin-Soelch, C. Behavioral responses to uncertainty in weight-restored anorexia nervosa–preliminary results. Front. Psychol. 2019, 10, 2492. [Google Scholar] [CrossRef] [PubMed]
  18. Tanovic, E. Individual Differences in Cognitive, Affective, and Behavioral Responses to Uncertainty. Ph.D. Thesis, Yale University, New Haven, CT, USA, 2020. [Google Scholar]
  19. Klir, G.J. Uncertainty and Information: Foundations of Generalized Information Theory; Wiley: New York, NY, USA, 2006. [Google Scholar]
  20. Berger, C.R.; Calabrese, R.J. Some explorations in initial interaction and beyond: Toward a developmental theory of interpersonal communication. Hum. Commun. Res. 1974, 1, 99–112. [Google Scholar] [CrossRef]
  21. Berger, C.R. Interpersonal communication. In The International Encyclopedia of Communication; Routledge: Abingdon, UK, 2008. [Google Scholar]
  22. Koerner, N.; Dugas, M.J. A cognitive model of generalized anxiety disorder: The role of intolerance of uncertainty. In Worry and Its Psychological Disorders: Theory, Assessment and Treatment; John Wiley & Sons: Hoboken, NJ, USA, 2006; pp. 201–216. [Google Scholar]
  23. Kornilova, T.V.; Chumakova, M.A.; Kornilov, S.A. Tolerance and intolerance for uncertainty as predictors of decision making and risk acceptance in gaming strategies of the Iowa gambling task. Psychol. Russ. 2018, 11, 86. [Google Scholar] [CrossRef]
  24. Boswell, J.F.; Thompson-Hollands, J.; Farchione, T.J.; Barlow, D.H. Intolerance of uncertainty: A common factor in the treatment of emotional disorders. J. Clin. Psychol. 2013, 69, 630–645. [Google Scholar] [CrossRef] [PubMed]
  25. Andersen, S.M.; Schwartz, A.H. Intolerance of ambiguity and depression: A cognitive vulnerability factor linked to hopelessness. Soc. Cogn. 1992, 10, 271–298. [Google Scholar] [CrossRef]
  26. Hofstede, G. Culture’s Consequences; SAGE Publications, Inc.: Beverly Hills, CA, USA, 1980. [Google Scholar]
  27. Kuhlmann, D.O.; Ardichvili, A. Becoming an expert: Developing expertise in an applied discipline. Eur. J. Train. Dev. 2015, 39, 262–276. [Google Scholar] [CrossRef]
  28. Hillen, M.A.; Gutheil, C.M.; Strout, T.D.; Smets, E.M.; Han, P.K. Tolerance of uncertainty: Conceptual analysis, integrative model, and implications for healthcare. Soc. Sci. Med. 2017, 180, 62–75. [Google Scholar] [CrossRef]
  29. Mandal, S. Brief introduction of virtual reality & its challenges. Int. J. Sci. Eng. Res. 2013, 4, 304–309. [Google Scholar]
  30. Dede, C. The evolution of distance education: Emerging technologies and distributed learning. Am. J. Distance Educ. 1996, 10, 4–36. [Google Scholar] [CrossRef]
  31. Brookes, J.; Warburton, M.; Alghadier, M.; Mon-Williams, M.; Mushtaq, F. Studying human behavior with virtual reality: The Unity Experiment Framework. Behav. Res. Methods 2020, 52, 455–463. [Google Scholar] [CrossRef] [PubMed]
  32. Bohné, T.; Heine, I.; Gürerk, Ö.; Rieger, C.; Kemmer, L.; Cao, L.Y. Perception engineering learning with virtual reality. IEEE Trans. Learn. Technol. 2021, 14, 500–514. [Google Scholar] [CrossRef]
  33. Villagrasa, S.; Fonseca, D.; Durán, J. Teaching case: Applying gamification techniques and virtual reality for learning building engineering 3D arts. In Proceedings of the Second International Conference on Technological Ecosystems for Enhancing Multiculturality, Salamanca, Spain, 1–3 October 2014; pp. 171–177. [Google Scholar]
  34. Gugenheimer, J.; Stemasov, E.; Frommel, J.; Rukzio, E. Sharevr: Enabling co-located experiences for virtual reality between hmd and non-hmd users. In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems, Denver, CO, USA, 6–11 May 2017; pp. 4021–4033. [Google Scholar]
  35. Cobbett, S.; Snelgrove-Clarke, E. Virtual versus face-to-face clinical simulation in relation to student knowledge, anxiety, and self-confidence in maternal-newborn nursing: A randomized controlled trial. Nurse Educ. Today 2016, 45, 179–184. [Google Scholar] [CrossRef] [PubMed]
  36. Soto, J.B.; Ocampo, D.T.; Colon, L.B.; Oropesa, A.V. Perceptions of ImmerseMe Virtual Reality Platform to Improve English Communicative Skills in Higher Education. Int. J. Interact. Mob. Technol. 2020, 14, 4–19. [Google Scholar] [CrossRef]
  37. Hoffman, H.; Vu, D. Virtual reality: Teaching tool of the twenty-first century? Acad. Med. J. Assoc. Am. Med Coll. 1997, 72, 1076–1081. [Google Scholar] [CrossRef] [PubMed]
  38. Bossard, C.; Kermarrec, G.; Buche, C.; Tisseau, J. Transfer of learning in virtual environments: A new challenge? Virtual Real. 2008, 12, 151–161. [Google Scholar] [CrossRef]
  39. De Gloria, A.; Bellotti, F.; Berta, R. Serious Games for education and training. Int. J. Serious Games 2014, 1. [Google Scholar] [CrossRef]
  40. Moynihan, D.P. Learning under uncertainty: Networks in crisis management. Public Adm. Rev. 2008, 68, 350–365. [Google Scholar] [CrossRef]
  41. Zeidner, M.; Matthews, G.; Roberts, R.D. Emotional intelligence in the workplace: A critical review. Appl. Psychol. 2004, 53, 371–399. [Google Scholar] [CrossRef]
  42. Barseli, M.; Sembiring, K.; Ifdil, I.; Fitria, L. The concept of student interpersonal communication. JPPI J. Penelit. Pendidik. Indones. 2019, 4, 129–134. [Google Scholar] [CrossRef]
  43. Marris, P. The Politics of Uncertainty: Attachment in Private and Public Life; Routledge: Abingdon, UK, 2003. [Google Scholar]
  44. Gudykunst, W.D. Anxiety/Uncertainty Management (AUM) Theory: Current Status. In Intercultural Communication Theory; SAGE Publications, Inc.: Thousand Oaks, CA, USA, 1995. [Google Scholar]
  45. Stankiewicz, B.J.; Legge, G.E.; Mansfield, J.S.; Schlicht, E.J. Lost in virtual space: Studies in human and ideal spatial navigation. J. Exp. Psychol. Hum. Percept. Perform. 2006, 32, 688. [Google Scholar] [CrossRef] [PubMed]
  46. Brunyé, T.T.; Haga, Z.D.; Houck, L.A.; Taylor, H.A. You look lost: Understanding uncertainty and representational flexibility in navigation. In Representations in Mind and World; Routledge: Abingdon, UK, 2017; pp. 42–56. [Google Scholar]
  47. Brunyé, T.T.; Gagnon, S.A.; Gardony, A.L.; Gopal, N.; Holmes, A.; Taylor, H.A.; Tenbrink, T. Where did it come from, where do you go? Direction sources influence navigation decisions during spatial uncertainty. Q. J. Exp. Psychol. 2015, 68, 585–607. [Google Scholar] [CrossRef] [PubMed]
  48. Herdener, N.; Wickens, C.D.; Clegg, B.A.; Smith, C. Overconfidence in projecting uncertain spatial trajectories. Hum. Factors 2016, 58, 899–914. [Google Scholar] [CrossRef]
  49. Cheng, K.; Huttenlocher, J.; Newcombe, N.S. 25 years of research on the use of geometry in spatial reorientation: A current theoretical perspective. Psychon. Bull. Rev. 2013, 20, 1033–1054. [Google Scholar] [CrossRef]
  50. Hirsh, J.B.; Mar, R.A.; Peterson, J.B. Psychological entropy: A framework for understanding uncertainty-related anxiety. Psychol. Rev. 2012, 119, 304. [Google Scholar] [CrossRef] [PubMed]
  51. Keller, A.M.; Taylor, H.A.; Brunyé, T.T. Uncertainty promotes information-seeking actions, but what information? Cogn. Res. Princ. Implic. 2020, 5, 1–17. [Google Scholar] [CrossRef]
  52. Stanley, D.; Latimer, K. ‘The Ward’: A simulation game for nursing students. Nurse Educ. Pract. 2011, 11, 20–25. [Google Scholar] [CrossRef]
  53. Caillois, R. Man, Play, and Games; University of Illinois Press: Champaign, IL, USA, 2001. [Google Scholar]
  54. Yap, C.M.; Kadobayashi, Y.; Yamaguchi, S. Conceptualizing Player-Side Emergence in Interactive Games: Between Hardcoded Software and the Human Mind in Papers, Please and Gone Home. Int. J. Gaming Comput.-Mediat. Simul. (IJGCMS) 2015, 7, 1–21. [Google Scholar] [CrossRef]
  55. Veale, K. Gone Home, and the power of affective nostalgia. Int. J. Herit. Stud. 2017, 23, 654–666. [Google Scholar] [CrossRef]
  56. Smith, L.; Campbell, G. The elephant in the room: Heritage, affect, and emotion. In A Companion to Heritage Studies; Wiley: New York, NY, USA, 2015; pp. 443–460. [Google Scholar]
  57. Ruberg, B. Straight paths through queer walking simulators: Wandering on rails and speedrunning in Gone Home. Games Cult. 2020, 15, 632–652. [Google Scholar] [CrossRef]
  58. Klei Entertainment. Don’t Starve. Video game. PC, iOS, Android, Xbox, PlayStation, Wii U. Klei Entertainment, Canada. 2013. Available online: https://www.klei.com/games/dont-starve (accessed on 28 December 2022).
  59. Farah, Y.A.; Dorneich, M.C.; Gilbert, S.B. Evaluating Team Metrics in Cooperative Video Games. In Proceedings of the Human Factors and Ergonomics Society Annual Meeting; SAGE Publications: Los Angeles, CA, USA, 2022; Volume 66, pp. 70–74. [Google Scholar]
  60. Xu, W.; Liang, H.N.; Yu, K.; Baghaei, N. Effect of gameplay uncertainty, display type, and age on virtual reality exergames. In Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems, Yokohama, Japan, 8–13 May 2021; pp. 1–14. [Google Scholar]
  61. Liu, Y.; Lin, Y.; Shi, R.; Luo, Y.; Liang, H.-N. RelicVR: A virtual reality game for active exploration of archaeological relics. In Proceedings of the Extended Abstracts of the 2021 Annual Symposium on Computer-Human Interaction in Play, Virtual, Austria, 18–21 October 2021; pp. 326–332. [Google Scholar]
  62. Parish, P.; Siebel, F.; Thomas, B.S.; Canetti, Y. Amelia Bedelia; Harper & Row: New York, NY, USA, 1963. [Google Scholar]
  63. Tabbers, H.; Martens, R.; van Merriënboer, J.J. Multimedia instructions and cognitive load theory: Split-attention and modality effects. In Proceedings of the National Convention of the Association for Educational Communications and Technology, Long Beach, CA, USA, 17 February 2000; Citeseer: Princeton, NJ, USA, 2000. [Google Scholar]
  64. Brooke, J. SUS-A quick and dirty usability scale. Usability Eval. Ind. 1996, 189, 4–7. [Google Scholar]
  65. Usoh, M.; Catena, E.; Arman, S.; Slater, M. Using presence questionnaires in reality. Presence 2000, 9, 497–503. [Google Scholar] [CrossRef]
  66. Rigby, J.M.; Brumby, D.P.; Gould, S.J.; Cox, A.L. Development of a questionnaire to measure immersion in video media: The Film IEQ. In Proceedings of the 2019 ACM International Conference on Interactive Experiences for TV and Online Video, Salford, UK, 5–7 June 2019; pp. 35–46. [Google Scholar]
  67. Gianaros, P.J.; Muth, E.R.; Mordkoff, J.T.; Levine, M.E.; Stern, R.M. A questionnaire for the assessment of the multiple dimensions of motion sickness. Aviat. Space Environ. Med. 2001, 72, 115. [Google Scholar] [PubMed]
  68. Buhr, K.; Dugas, M.J. The intolerance of uncertainty scale: Psychometric properties of the English version. Behav. Res. Ther. 2002, 40, 931–945. [Google Scholar] [CrossRef]
  69. Brooke, J. SUS: A retrospective. J. Usability Stud. 2013, 8, 29–40. [Google Scholar]
  70. Bangor, A.; Kortum, P.; Miller, J. Determining what individual SUS scores mean: Adding an adjective rating scale. J. Usability Stud. 2009, 4, 114–123. [Google Scholar]
  71. Zika-Viktorsson, A.; Sundström, P.; Engwall, M. Project overload: An exploratory study of work and management in multi-project settings. Int. J. Proj. Manag. 2006, 24, 385–394. [Google Scholar] [CrossRef]
Figure 1. The stages of the experimental design.
Figure 2. Screenshots showing some steps of the familiarization phase, from left to right: (a) selecting the “I am ready” button by the user; (b) teleporting to a destination; (c) reading instructions of the task from the panel on the user’s left hand; (d) removing an interactive object; (e) submitting the current task; (f) selecting the “Move to the office” button to move there.
Figure 3. Screenshots showing two views of the office: (a) the view of the office from the perspective of the user at the beginning; (b) Another view of the office.
Figure 4. Screenshots showing five sources of information for the user during the experience: (a) Arrow 1 points to a small blackboard that displays the name of the current task. Arrow 2 points to a big blackboard that communicates the current status of the tasks during the experience; (b) Arrow 3 points to a small blackboard attached to the panel that is a closeup of the big one; (c) Arrow 4 points to a phone that blinks and rings when the boss calls; (d) Arrow 5 points to a task board that shows the instructions for each task.
Figure 5. Screenshots showing different parts of the panel for interactions: (a,d) selecting the “I am ready” and “Answering the phone” buttons by the user; (b,e) visualization of the task tab; (c,f) visualization of the item tab.
Figure 6. Screenshots showing the participant when: (a) removing interactive objects; (b) using the teleportation to move in the environment.
Figure 7. Screenshots from seven different areas for the placements of objects characterized by the color of their associated rugs.
Figure 8. The steps that the participant experiences.
Figure 9. A comparison between the means of ratings obtained from the participants for their perceived uncertainty of Task 1 (M = 1.88, SD = 1.11) and Task 2 (M = 3.41, SD = 1.00); Range of answers = 1–5.
Figure 10. A comparison between the response time of the user to the source of information (the phone call) during Task 1 and Task 2, measured in [s].
Figure 11. A comparison of the participants’ task completion times for Task 1 and Task 2, measured in seconds [s]. Two completion time values were identified as outliers and are shown as white circles in the figure.
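The descriptive statistics reported in Figures 9–11 (means, standard deviations, and the flagged completion-time outliers) can be reproduced with a short script. The following is a minimal sketch, assuming the per-participant ratings and times are available as plain Python lists; the data values are hypothetical placeholders, and the 1.5 × IQR rule used to flag outliers is the usual boxplot convention, offered here as an assumption rather than the authors’ documented procedure.

```python
import statistics


def describe(label, values):
    """Print the mean and sample standard deviation of a list of values."""
    mean = statistics.mean(values)
    sd = statistics.stdev(values)  # sample SD (n - 1 denominator)
    print(f"{label}: M = {mean:.2f}, SD = {sd:.2f}")


def iqr_outliers(values):
    """Return the values lying outside 1.5 * IQR, the usual boxplot whisker rule."""
    q1, _, q3 = statistics.quantiles(values, n=4)  # lower and upper quartiles
    iqr = q3 - q1
    low, high = q1 - 1.5 * iqr, q3 + 1.5 * iqr
    return [v for v in values if v < low or v > high]


# Hypothetical data: perceived-uncertainty ratings (1-5) and completion times in seconds.
task1_ratings = [1, 2, 1, 3, 2, 1, 2, 4, 1, 2, 1, 3, 2, 1, 2, 3, 1]
task2_ratings = [3, 4, 3, 2, 4, 3, 5, 4, 3, 2, 4, 3, 4, 5, 3, 4, 3]
task2_times = [182.0, 195.5, 210.3, 171.8, 402.7, 188.2, 199.9, 460.1]

describe("Perceived uncertainty, Task 1", task1_ratings)
describe("Perceived uncertainty, Task 2", task2_ratings)
print("Completion-time outliers [s]:", iqr_outliers(task2_times))
```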
Figure 12. A visualization of participants’ change of position in Task 1; the 8 task targets are shown in blue.
Figure 13. A visualization of participants’ change of position in Task 2; the task targets related to before the change of the task are shown in red; the task targets related to after the change of the task are shown in green.
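Figures 12 and 13 overlay the participants’ movement traces on the positions of the task targets. A minimal matplotlib sketch of this kind of top-down plot is shown below; the file layout (one CSV of x/z positions per participant under a `positions/` folder) and the target coordinates are assumptions introduced purely for illustration.

```python
import csv
import glob

import matplotlib.pyplot as plt

# Hypothetical target coordinates (metres, top-down x/z plane of the virtual office).
task_targets = [(1.0, 2.5), (3.2, 0.8), (-1.5, 1.1), (2.0, -2.0)]

fig, ax = plt.subplots(figsize=(6, 6))

# One CSV per participant, each row holding an x and a z coordinate (assumed layout).
for path in glob.glob("positions/participant_*.csv"):
    with open(path, newline="") as f:
        rows = [(float(x), float(z)) for x, z in csv.reader(f)]
    xs, zs = zip(*rows)
    ax.plot(xs, zs, linewidth=0.8, alpha=0.6)  # one movement trace per participant

# Task targets drawn on top of the traces, as in Figure 12.
tx, tz = zip(*task_targets)
ax.scatter(tx, tz, color="blue", s=60, zorder=3, label="task targets")

ax.set_xlabel("x [m]")
ax.set_ylabel("z [m]")
ax.set_title("Participants' change of position (top-down view)")
ax.legend()
plt.show()
```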
Figure 14. A comparison of the participants’ scores on the System Usability Scale (SUS) questionnaire (M = 76.32, SD = 9.89).
Figure 15. Percentages of ratings falling into each category of the System Usability Scale (SUS) questionnaire; categories from left to right: best imaginable (score > 84.1); excellent (72.6 < score < 84.0); good (62.7 < score < 72.5); OK (51.7 < score < 62.6); poor (26 < score < 51.6); worst imaginable (score < 25).
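The SUS scores in Figures 14 and 15 follow Brooke’s standard scoring procedure [64] and the adjective categories of Bangor et al. [70]. The sketch below illustrates this computation; the ten raw item responses are hypothetical, and the category boundaries are taken directly from the caption of Figure 15.

```python
def sus_score(item_responses):
    """Standard SUS scoring [64]: 10 items rated 1-5, alternating positive/negative wording."""
    if len(item_responses) != 10:
        raise ValueError("SUS expects exactly 10 item responses")
    total = 0
    for i, r in enumerate(item_responses, start=1):
        total += (r - 1) if i % 2 == 1 else (5 - r)  # odd items positive, even items negative
    return total * 2.5  # rescale the 0-40 raw range to 0-100


def sus_adjective(score):
    """Map a SUS score to the adjective categories listed in Figure 15."""
    if score > 84.1:
        return "best imaginable"
    if score >= 72.6:
        return "excellent"
    if score >= 62.7:
        return "good"
    if score >= 51.7:
        return "OK"
    if score >= 26:
        return "poor"
    return "worst imaginable"


# Hypothetical responses from one participant.
responses = [4, 2, 5, 1, 4, 2, 4, 2, 5, 2]
score = sus_score(responses)
print(score, sus_adjective(score))  # 82.5 -> "excellent"
```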
Figure 16. A comparison of the participants’ scores on the Immersive Experience Questionnaire (IEQ) (M = 65.84%, SD = 7.13%).
Figure 17. Mean scores for the components of the Immersive Experience Questionnaire (IEQ), each rated on a range of 1 (Not at all) to 5 (A lot); from left to right: overall IEQ score (M = 3.17); challenge (M = 2.59); cognitive involvement (M = 4.06); control (M = 2.95); emotional involvement (M = 3.20); and real-world dissociation (M = 3.059).
Figure 18. A comparison of the participants’ scores on the Slater–Usoh–Steed presence (SUS) questionnaire (M = 64.37%, SD = 14.50%).
Figure 19. A comparison of the participants’ scores on the Motion Sickness questionnaire (MASQ) (M = 22.18%, SD = 12.84%).
Figure 20. A comparison of the participants’ scores on the Intolerance of Uncertainty Scale (IUS) (M = 59.48%, SD = 11.63%).
Table 1. Key features of the games reviewed in the related work section.
Name of the Game | Type of the Game | Type of the Experience | Applied Uncertainty Approach | Requested Task
Gone Home | Exploration game; Role-playing game | Video game | Puts players in unknown situations; encourages the user to explore scattered objects along a twisting, uncertain path | To find out what has happened to the character’s sister and family by digging through scattered documents
Don’t Starve | Exploration game; Role-playing game | Video game | Puts the user in a strange and unfamiliar world; puts the user in a situation in which the outcome of some of his/her choices is uncertain | To find, through trial and error, a way to survive the threats coming from the environment
GestureFit | Motion-based survival game | Immersive virtual reality | Induces uncertainty in the system through three uncertain gameplay elements (false attacks, misses, and critical hits) | To stay alive and perform gestures to defeat a monster
RelicVE | Exploration game; Role-playing game | Immersive virtual reality | Places the user in a situation where s/he does not know the shape and features of the target artifact from the beginning | To remove earth from the artifact using the available tools and physical movement without damaging it
Table 2. Participants’ demographic information.
Sex | Number | Percentage (%)
Female | 3 | 17.65
Male | 14 | 82.35
Age | Number | Percentage (%)
20–30 | 17 | 100
30–35 | 0 | 0
Nationality | Number | Percentage (%)
Italian | 17 | 100
Others | 0 | 0
Education level | Number | Percentage (%)
High school graduate, diploma or the equivalent | 1 | 5.9
Bachelor’s degree | 12 | 70.59
Master’s degree | 3 | 17.65
Doctorate degree | 1 | 5.9
Table 3. A comparison of the elements in the proposed platform with those in the Amelia Bedelia story.
Name of the Element | Description of the Element in the Amelia Bedelia Story | Description of the Element in Our Platform | Reason for Use
Absence of guidance | The housekeeper is asked to accomplish tasks based on the given instructions and has no one to whom she can communicate her doubts. | The same situation holds for the user, who can, however, record in the system the type of problem s/he is facing (a problem with the interface and/or a problem with the instruction). | This design choice limits access to sources of information, asking the user to focus on and rely on the knowledge already provided or that may arrive later from sources such as panels and phone calls.
Specific means of communicating instructions | Textual, on printed paper | Both textual and verbal, coming from the panels and voice calls | Communicating instructions in both verbal and textual forms helps manage the cognitive load and the attention of the user during the experience [63].
Specific instruction communication patterns and sources of ambiguity | Instructions are presented as sequences of sentences, as in step-by-step construction manuals, and each sentence includes lexical ambiguity. | The same pattern is followed here. In addition, the complexity of the instructions changes as the degree of interconnectivity among parts of the sentences increases, and the available tasks and information change when unpredictable phone calls arrive from the boss. | This design choice allows the experiment designer to purposely reduce the amount of available information and change the complexity of the sentences, thereby controlling the amount of ambiguity and complexity in the system. It also provides the possibility of applying lexical ambiguity as another potential source of confusion.
A friendly environment supporting understanding and empathy in interpersonal communications | A nice house with friendly relationships between Amelia and the family | A nice virtual office and the friendly voice of the boss | Experiencing this environment helps keep the user interested in staying until the end of the experiment and accepting its challenges.
Table 4. Questionnaires used to rate the level of English proficiency, the previous experience with immersive VR, and the perceived uncertainty of Task 1 and Task 2.
Measuring Item | Question | Range
Level of English proficiency | How do you rate your level of English proficiency? | 1 (Poor)–5 (Very good)
Previous experience with immersive VR | Have you experienced virtual reality with a head-mounted display in the past? | 1 (Never)–5 (Every day)
Rating the perceived uncertainty of Tasks 1 and 2 | According to the definition of Hillen et al. [28], uncertainty in an information system can arise from changes in three sources: the probability, ambiguity, and/or complexity of the information, and it is perceived when you become aware of its existence. Based on this definition and your experience with the virtual office, what score would you give to the perceived uncertainty of Task 1 (same question for Task 2)? | 1–5
Table 5. Correlation values between subjective and objective measures.
Questionnaire Name | Time to Answer the Call in Task 2 | Task Completion Time for Task 2
The System Usability Scale (SUS) | r = −0.38 | r = −0.15
Immersive Experience Questionnaire (IEQ) | r = 0.15 | r = 0.4
Slater–Usoh–Steed presence questionnaire (SUS) | r = 0.24 | r = 0.29
Motion sickness questionnaire (MASQ) | r = 0.21 | r = 0.16
Intolerance of Uncertainty Scale (IUS) | r = −0.09 | r = 0.07
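The coefficients in Table 5 relate each questionnaire’s total score to the two objective measures recorded during Task 2. A minimal sketch of this computation is shown below, using Pearson’s r via NumPy; the arrays are placeholders, and the choice of Pearson (rather than a rank correlation) is an assumption suggested only by the conventional use of the symbol r.

```python
import numpy as np

# Placeholder per-participant values (one entry per participant).
sus_scores = np.array([75.0, 82.5, 70.0, 90.0, 67.5, 80.0, 72.5])
time_to_answer_call_task2 = np.array([4.1, 2.8, 6.3, 2.0, 7.5, 3.2, 5.0])             # [s]
completion_time_task2 = np.array([210.0, 185.0, 260.0, 170.0, 300.0, 200.0, 240.0])   # [s]


def pearson_r(x, y):
    """Pearson correlation coefficient between two 1-D arrays."""
    return float(np.corrcoef(x, y)[0, 1])


print("SUS vs. time to answer the call:", round(pearson_r(sus_scores, time_to_answer_call_task2), 2))
print("SUS vs. task completion time:   ", round(pearson_r(sus_scores, completion_time_task2), 2))
```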
Table 6. Mean and standard deviation of scores received from participants’ answers to the questionnaires.
Questionnaire Name | Mean | SD | Range
The System Usability Scale (SUS) | 76.32 | 9.89 | [1–100]
Immersive Experience Questionnaire (IEQ) | 65.84 | 7.13 | [1–100]
Slater–Usoh–Steed presence questionnaire (SUS) | 64.37 | 14.50 | [1–100]
Motion sickness questionnaire (MASQ) | 22.18 | 12.48 | [1–100]
Intolerance of Uncertainty Scale (IUS) | 59.48 | 11.63 | [1–100]
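Table 6 reports all questionnaire results on a common 1–100 range, which makes instruments with different raw scales directly comparable. One possible normalization is sketched below; the exact scheme used by the authors is not specified here, so the rescaling to a percentage of the attainable range is an assumption, and the example values are illustrative only.

```python
def to_percent(raw_total, n_items, min_per_item, max_per_item):
    """Rescale a questionnaire total to 0-100 as a percentage of its attainable range."""
    lowest = n_items * min_per_item
    highest = n_items * max_per_item
    return 100.0 * (raw_total - lowest) / (highest - lowest)


# Example: the 27-item Intolerance of Uncertainty Scale [68], rated 1-5,
# has raw totals between 27 and 135 (the raw total used here is illustrative).
raw_ius = 91
print(round(to_percent(raw_ius, n_items=27, min_per_item=1, max_per_item=5), 2))  # ~59.26
```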
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
