sensors
Article
Exploring Gaze Dynamics in Virtual Reality through Multiscale
Entropy Analysis
Sahar Zandi * and Gregory Luhan
Department of Architecture, Texas A&M University, College Station, TX 77843, USA;
[email protected]
* Correspondence: [email protected]
Abstract: This study employs Multiscale Entropy (MSE) to analyze 5020 binocular eye movement
recordings from 407 college-aged participants, as part of the GazeBaseVR dataset, across various
virtual reality (VR) tasks to understand the complexity of user interactions. By evaluating the vertical
and horizontal components of eye movements across tasks such as vergence, smooth pursuit, video
viewing, reading, and random saccade, collected at 250 Hz using an ET-enabled VR headset, this
research provides insights into the predictability and complexity of gaze patterns. Participants were
recorded up to six times over a 26-month period, offering a longitudinal perspective on eye movement
behavior in VR. MSE’s application in this context aims to offer a deeper understanding of user behavior in VR, highlighting potential avenues for interface optimization and user experience enhancement.
The results suggest that MSE can be a valuable tool in creating more intuitive and immersive VR
environments by adapting to users’ gaze behaviors. This paper discusses the implications of these
findings for the future of VR technology development, emphasizing the need for intuitive design and
the potential for MSE to contribute to more personalized and comfortable VR experiences.
Keywords: virtual reality; time series analysis; eye movements; multiscale entropy; user experience;
human sensing
1. Introduction
In the changing world of innovation, augmented reality (AR) and virtual reality
(VR) technologies are making a powerful impact, revolutionizing interaction,
visualization, and immersion across industries. These technologies can transform
entertainment through immersive gaming and interactive media, revolutionize education
and training through simulated environments, enhance healthcare through therapeutic
applications, and reshape retail with interactive shopping experiences. At the heart
of these technologies lies eye tracking, a
method that captures a user’s gaze within a virtual environment. By understanding and
responding to where users look, eye tracking technology significantly improves interactivity
and personalization in AR/VR experiences. This leads to engaging and effective user
experiences [1–6].
In the realm of healthcare, VR technologies have shown promise in enhancing surgical
planning and patient outcomes. For instance, Innocente et al. demonstrated the application
of Mixed Reality for Total Hip Arthroplasty Assessment, offering surgeons advanced
visualization tools to evaluate and predict post-operative results [7]. This application
exemplifies the transformative potential of VR and AR in medical procedures, aligning
with our findings on the impact of VR on user experience. Similarly, Su et al. (2022)
discuss the use of mixed-reality technology in total knee arthroplasty, highlighting its
advantages in complex surgical procedures [8]. Furthermore, Zavala-González et al. (2022)
conducted a randomized controlled trial on the effectiveness of adding virtual reality to
physiotherapeutic treatment in patients with total hip arthroplasty, showcasing the potential
of virtual reality in enhancing rehabilitation outcomes [9]. These studies exemplify the
diverse applications of technology in orthopedic surgery and rehabilitation, emphasizing
the role of innovative solutions in improving patient care and outcomes.
However, despite the progress and widespread acceptance of AR/VR technologies,
obstacles remain that must be overcome in order to fully utilize their potential
and ensure sustainable growth. One major concern in the realm of VR is user comfort.
Extended exposure to VR environments can cause discomfort and sensory overload,
commonly referred to as VR sickness. This condition is characterized by symptoms such as
nausea, headaches, and disorientation. Addressing this challenge highlights the
importance of creating VR experiences that are not only immersive but also comfortable
and safe for users [10–16].
Maintaining user engagement poses another challenge in virtual reality (VR) applications. As the initial excitement of VR diminishes, it becomes increasingly crucial to sustain
user interest and captivation over time. This challenge underscores the importance of
creating adaptive content and interface designs that cater to user preferences and behaviors,
ensuring a continuous level of engagement and immersion. Furthermore, optimizing
interface design for seamless navigation and interaction is essential to provide users with
an enjoyable VR experience. Recent works, such as the study by Cannavò et al. [17] on the
automatic generation of affective 3D virtual environments from 2D images, highlight the
significance of design considerations in retaining users and determining the success of VR
applications [18–21].
Industry leaders such as Meta (previously known as Facebook) along with their
Oculus VR systems, Netflix, and Amazon are at the forefront of advancing VR experiences.
One of the challenges in this field is finding a balance between the costs involved in
creating avatars that enable a sense of embodiment and the need for hardware to track and
render natural movements. Meta specifically needs to strike a balance between achieving
convincing motion and visual accuracy to maintain a feeling of presence while ensuring
accessibility and scalability across its VR platforms for consumers. Recent research also
emphasizes the significance of understanding user behavior, particularly eye movement,
in order to create user-centric VR experiences [22–24].
To address these challenges, our study adopts Multiscale Entropy (MSE) analysis, a
computational technique introduced by Costa et al. [30] and applied in contexts such as
gravitational-wave analysis by Javaherian et al. [25]. MSE analysis offers
a unique perspective on time series data by dissecting it into non-overlapping windows,
revealing patterns and correlations at various scales. Applied to the eye movement data
obtained from the GazeBaseVR dataset [26,27], which provides a comprehensive and
longitudinal perspective on eye movement behavior in various VR tasks and stimuli, MSE
analysis enables us to explore the intricate dynamics of eye movement patterns. This
dataset captures the eye movements of participants engaged in a series of diverse and
dynamic tasks, including vergence, smooth pursuit, video viewing, self-paced reading, and
random saccades, collectively referred to as ‘VRG’, ‘PUR’, ‘VID’, ‘TEX’, and ‘RAN’. These
tasks encompass a wide range of stimuli and activities typical in VR environments, making
the dataset an invaluable resource for our study [28,29].
In our research, we specifically focus on the horizontal (θH) and vertical (θV) components of eye movement, quantified in degrees of visual angle (dva). These components are
essential as they provide insights into the intricate dynamics of eye movements in VR. The
utilization of MSE analysis allows us to quantify the complexity of these eye movement
patterns, offering valuable insights into their predictability and dynamics across different
VR tasks and stimuli [26].
By applying Multiscale Entropy (MSE) analysis to this dataset, our study aims to uncover
the characteristics of eye movements across different tasks and stimuli. This will help
us understand how users interact with and respond to virtual reality (VR) environments.
These insights will play a role in addressing the challenges mentioned earlier, such as
user comfort, engagement, and interface design. Moreover, they have the potential to
drive advancements by allowing MSE analysis findings to inform VR interface and content
design, resulting in intuitive, engaging, and comfortable user experiences.
Additionally, our research strives to provide suggestions and insights for leaders in
the tech industry. This will assist them in overcoming challenges related to VR
production and delivery, aligning with the needs and goals of these companies while
contributing to the advancement and refinement of VR technologies.
In conducting this study, the paper enhances understanding of eye movement behavior
in virtual reality environments and provides practical insights that drive innovation and
improvement in the VR industry. The subsequent sections detail the methodology of
Multiscale Entropy (MSE) analysis, present findings from this approach, and discuss
implications for VR design and user experience. Specifically, the Methodology section
outlines the dataset used and the steps taken in the MSE analysis. The Results section
presents the complexity patterns observed in eye movements across different VR tasks. The
Discussion interprets these findings in the context of current challenges faced by leaders in
the tech industry, offering strategic insights and recommendations. Finally, the Conclusion
and Future Work summarize the study’s key contributions and propose directions for
further research, emphasizing the potential for collaborative efforts with the tech industry.
2. Methodology
2.1. Dataset Description
The dataset utilized in this study, referred to as GazeBaseVR, is a large-scale longitudinal binocular eye-tracking dataset collected at a high frequency of 250 Hz using an
eye-tracking enabled virtual reality (VR) headset. For an in-depth description of the dataset,
including the VR application used and the purpose of data collection, readers are referred
to Lohr et al. (2023) [26]. The GazeBaseVR dataset includes 5020 binocular recordings from
a diverse population of 407 college-aged participants, recorded up to six times each over a
26-month period. Each session included a series of five different eye-tracking (ET) tasks
designed to elicit various eye movements and behaviors. These tasks were a vergence task,
horizontal smooth pursuit task, video-viewing task, self-paced reading task, and random
oblique saccade task. These tasks were selected to simulate a wide range of visual experiences in VR, from focusing on specific objects at varying depths to engaging in activities that
mimic everyday visual behaviors, providing a comprehensive assessment of eye movement
patterns. This comprehensive set of tasks was carefully selected to cover a broad spectrum
of eye movements and to provide insights into both the micro-movements and overall
behavioral patterns of the participants’ eye movements in a virtual reality environment.
The dataset’s longitudinal nature, with multiple recordings of the same individuals
over an extended period, provides a unique opportunity to investigate not just the variability and commonalities in eye movements across individuals and tasks but also changes
and trends over time, whether due to learning, adaptation, or other long-term factors.
Furthermore, the diversity of the participant pool in terms of demographics and the
detailed recording of participant details make this dataset particularly valuable for investigating factors such as the impact of demographic variables on eye movement behavior and
the potential for personalized or adaptive VR experiences based on eye movement patterns.
From a technical perspective, the data were captured using SensoMotoric Instruments'
eye-tracking virtual reality head-mounted display (ET-HMD) based on the HTC Vive. This
device tracks both eyes at a nominal sampling rate of 250 Hz with a high spatial accuracy
and precision, ensuring detailed and reliable eye movement data. The dataset provides an
extensive range of metrics, including gaze direction vectors and positional data, which can
be used to derive various eye movement characteristics.
In our study, we specifically focus on the horizontal (x) and vertical (y) components
of the eye rotation, denoted as θH and θV, respectively, in terms of degrees of visual
angle (dva). These components are crucial for understanding the direction and extent
of eye movements, especially in the context of interactive VR environments where users
are constantly engaging with dynamic content. The conversion of the direction vector
v = [x, y, z] into these components is performed using the following equations:
θH = (180/π) · arctan2(x, √(y² + z²))        (1)

θV = (180/π) · arctan2(y, z)        (2)
The “z” component in this vector represents the depth axis in a three-dimensional
coordinate system, indicating the depth at which a user is looking into the screen or virtual
environment. This depth component is essential for calculating the exact direction of gaze in
the 3D space of VR, providing a comprehensive understanding of eye movement behavior.
These equations provide a standardized way of quantifying eye movement in terms of
horizontal and vertical rotation angles, facilitating the analysis of eye movement patterns
and the application of Multiscale Entropy (MSE) analysis to assess the complexity and
predictability of eye movements across different VR tasks [26,27].
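For concreteness, Equations (1) and (2) reduce to a few lines of NumPy. The following is an illustrative sketch only; the function name and the (N, 3) array layout are our own assumptions, not part of the GazeBaseVR tooling:

import numpy as np

def gaze_angles(v):
    # v: array of gaze direction vectors with shape (N, 3), columns x, y, z.
    x, y, z = v[:, 0], v[:, 1], v[:, 2]
    theta_h = np.degrees(np.arctan2(x, np.sqrt(y**2 + z**2)))  # Equation (1)
    theta_v = np.degrees(np.arctan2(y, z))                     # Equation (2)
    return theta_h, theta_v  # both in degrees of visual angle (dva)

Here np.degrees applies the 180/π factor, and arctan2 handles the quadrant bookkeeping that a bare arctangent would miss.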
2.2. Analysis of Multiscale Entropy
A system’s time series data can be affected by various noises that stem from its
interaction with the surrounding environment. These noises can introduce short-term
correlations that falsely appear as long-term effects in the time-series analysis. Therefore, it
is crucial to mitigate the impact of these noises and the associated short-term correlations.
This is achieved through a process known as coarse graining. In this process, a time series
with N data points (y_1, y_2, . . . , y_N) is divided into equal-length, non-overlapping segments
called windows, each of length λ. Within each window, the data points are averaged to
produce a new time series with reduced granularity as follows:
(λ)
yp
=
1
λ
λ
∑ y ( p −1) λ + q
(3)
q =1
This results in a new series composed of the averaged data points (y_1^(λ), . . . , y_{N/λ}^(λ)),
where λ serves as the scaling factor. The restructured data can then be represented by
vectors of m dimensions as shown below:

Y_m^(λ)(p) = (y_p^(λ), . . . , y_{p+m−1}^(λ))        (4)
The subsequent step involves counting the number of vector pairs whose mutual
distance is less than a specified value r. This quantity is denoted as n_m(r, λ). This counting
is also performed for vectors of dimension m + 1, denoted as n_{m+1}(r, λ). The measure of
complexity, known as sample entropy, is then computed with the following formula:
S_e(m, r, λ) = −log[ n_{m+1}(r, λ) / n_m(r, λ) ]        (5)
Given that n_{m+1}(r, λ) ≤ n_m(r, λ), it follows that S_e is non-negative. A graph plotting
sample entropy against the scale factor illustrates the correlation ranges present in the
time series. This comprehensive procedure is referred to as multiscale entropy (MSE)
analysis [25].
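Equations (3)–(5) translate directly into NumPy. The sketch below is a minimal illustration under our own parameter assumptions (m = 2 and a tolerance r expressed as a fraction of the series' standard deviation are common defaults in the MSE literature [30,31], not values prescribed by any specific library):

import numpy as np

def coarse_grain(y, lam):
    # Equation (3): average non-overlapping windows of length lam.
    n = len(y) // lam
    return y[:n * lam].reshape(n, lam).mean(axis=1)

def sample_entropy(y, m=2, r=0.15):
    # Equations (4) and (5). Tolerance is r times the standard deviation.
    tol = r * np.std(y)

    def match_count(dim):
        # Embed the series into vectors of length dim (Equation (4)).
        vecs = np.lib.stride_tricks.sliding_window_view(y, dim)
        # Chebyshev distances between all vector pairs (quadratic in the
        # series length -- adequate for a sketch, not for huge recordings).
        dist = np.max(np.abs(vecs[:, None, :] - vecs[None, :, :]), axis=-1)
        return np.sum(dist < tol) - len(vecs)  # exclude self-matches

    n_m, n_m1 = match_count(m), match_count(m + 1)
    if n_m <= 0 or n_m1 <= 0:
        return np.nan  # entropy is undefined when no matches occur
    return -np.log(n_m1 / n_m)  # Equation (5)

Note two simplifications relative to the canonical definition of Richman and Moorman [31]: all template vectors of each length are counted, and the tolerance is recomputed from each coarse-grained series rather than fixed from the original one; both change the result only marginally for long series.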
2.3. Computational Methodology
The Multiscale Entropy (MSE) analysis of the eye movement data was conducted
using a Python script that leverages libraries such as NumPy for numerical operations
and Matplotlib for visualizing the entropy trends. The script’s core consists of functions
specifically designed to perform coarse graining, calculate sample entropy, and compile
multiscale entropy across varying scales for a comprehensive understanding of complexity
in eye movement data. The choice of Python and these libraries was driven by their
robustness, efficiency, and widespread use in scientific computing, ensuring reliability and
reproducibility of our results.
As illustrated in Figure 1, the computational process of Multiscale Entropy (MSE)
analysis involves several key steps. Starting with the initialization of the process for each
file in the dataset, it proceeds with loading the data specifically from the ‘x’ and ‘y’ columns
indicative of the horizontal and vertical gaze positions. Parameters for the MSE analysis
are set according to standard practices and preliminary analysis to ensure robustness and
relevance of the results. The time series data are then coarse grained at multiple scales, upon
which sample entropy is calculated to quantify the complexity at each scale. Finally, these
complexity measures are compiled across all scales to visualize and analyze the variation of
eye movement complexity. The coarse graining and sample entropy calculations are guided
by specific equations, integral to understanding the underlying mechanics of this analysis.
[Flowchart (Figure 1): for each CSV file, initialize the MSE calculation; set the MSE parameters; load the data from the 'x' and 'y' columns; coarse-grain the series (Equation (3)); calculate sample entropy (Equation (5)); compile MSE across scales; plot scale factor vs. sample entropy.]
Figure 1. Flowchart illustrating the MSE calculation process with included equations.
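The flow in Figure 1 reduces to a short driver loop in code. The sketch below reuses the coarse_grain and sample_entropy helpers sketched in Section 2.2; pandas for CSV loading, the column names 'x' and 'y', and the plotting details are our own illustrative assumptions rather than the exact production script:

import pandas as pd
import matplotlib.pyplot as plt

def mse_curve(series, max_scale=20, m=2, r=0.15):
    # Compile sample entropy across scale factors 1..max_scale (Section 2.4).
    return [sample_entropy(coarse_grain(series, lam), m=m, r=r)
            for lam in range(1, max_scale + 1)]

def process_file(path):
    data = pd.read_csv(path)
    for column in ("x", "y"):  # horizontal and vertical gaze position
        entropies = mse_curve(data[column].dropna().to_numpy())
        plt.plot(range(1, len(entropies) + 1), entropies, label=column)
    plt.xlabel("Scale factor")
    plt.ylabel("Sample entropy")
    plt.legend()
    plt.show()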
2.4. Rationale Behind Parameters and Approach
The selection of parameters for Multiscale Entropy (MSE) analysis, particularly the
max_scale parameter set to 20, is instrumental in dissecting the complexity and predictability of eye movements within virtual reality (VR) environments. This value is chosen based
on a balance between capturing a broad range of temporal scales and maintaining computational efficiency. A max_scale of 20 typically covers the spectrum from rapid, short-term
dynamics to more extended patterns of engagement without overly smoothing the data,
thus preserving the detailed behavior inherent in the eye movement recordings. This careful calibration allows for a nuanced examination of eye movement behavior across these
scales, revealing insights into immediate reactions and prolonged engagement patterns.
The choice is further supported by empirical testing and alignment with standard practices
in time-series analysis, ensuring that the scale range is both practical and effective for
capturing the complexities of eye movement in VR environments.
Interpreting MSE Analysis: Interpretations of MSE plots draw upon complex signal
analyses, aligning with methodologies used in various scientific studies. These
interpretations help us understand the diverse nature of eye movements, from short-term
adjustments to long-term strategies [30–33].
• Rising Trends (Increasing Sample Entropy with Scale Factor): Gradual increases in sample entropy across scales can indicate adaptive or exploratory behavior, reflecting users' continuous engagement and adjustment to the complexities of the VR environment. This pattern may denote a dynamic and enriching interaction, where the gaze behavior becomes progressively complex and unpredictable, suggesting deepening engagement or increasing task difficulty.
• Fluctuating Trends (Variable Sample Entropy): Fluctuations in sample entropy highlight moments of transition in cognitive demand or strategy. Peaks in entropy may indicate periods of heightened complexity or adaptation to new stimuli, whereas valleys suggest more predictable or habitual gaze patterns. Such variability is crucial for understanding how users interact with changing content or complexities within the VR environment.
• Steady Trends (Consistent Sample Entropy across Scale Factors): A consistent level of sample entropy across scales suggests a uniform cognitive load or a continuous level of interaction complexity. This trend may indicate that users have reached an equilibrium in their gaze behavior, efficiently balancing exploration and exploitation of visual information in the VR environment. It might also suggest that the VR task or experience is consistently engaging users at a stable level throughout.
Comprehensive Insights from MSE Analysis: The understanding of these trends
provides deeper insights into the cognitive load, user engagement, and possible avenues
for VR experience enhancement as follows:
• Datasets with rising trends and fluctuations might indicate a constantly adapting and highly engaged user, with significant moments of cognitive demand interspersed with periods of routine interaction.
• Steady trends can suggest a balanced and consistent level of interaction, possibly pointing towards efficient user navigation or well-calibrated task demands within the VR environment.
• The absence of declining trends in our analysis indicates that users do not typically exhibit decreasing complexity or increasing predictability in their gaze patterns across the studied tasks, highlighting the consistent challenge or engagement presented by the VR tasks.
Scale Factor and Sample Entropy: Both the scale factor and sample entropy are pivotal in MSE analysis. The scale factor, a dimensionless number, determines the granularity
of the time series analysis, affecting the temporal resolution and interpretability of complexity patterns. Specifically, it represents the level of coarse-graining applied to the time series
data, signifying the extent to which data points are averaged together. A scale factor of 1
indicates no coarse-graining, thus reflecting the original data, while higher scale factors
imply a progressive averaging of data points, highlighting broader, long-term patterns.
Sample entropy, also a dimensionless measure, quantifies the regularity and unpredictability within the time series across scales. As a statistical tool, it offers insights into
the complexity inherent in the eye movement data. A decrease in entropy at larger scale
factors typically implies increasing predictability and regularity over longer periods, while
an increase might indicate more complex and less predictable patterns, reflecting changes
in cognitive demand or task complexity. Together, these metrics provide a comprehensive
understanding of the intricate dynamics of eye movements in VR environments, reflecting
the cognitive processes and engagement levels of users.
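These interpretations can be sanity-checked on synthetic signals whose structure is known in advance. The sketch below is our own illustration, not part of the study's analysis: white noise is expected to show sample entropy that falls as coarse-graining averages away its uncorrelated fluctuations, whereas 1/f noise, which carries structure across scales, is expected to stay roughly flat, in line with Costa et al. [30]:

import numpy as np

rng = np.random.default_rng(0)

def one_over_f(n):
    # Shape white noise in the frequency domain to an approximately 1/f
    # power spectrum (amplitude scaled by 1/sqrt(f)).
    freqs = np.fft.rfftfreq(n)
    spectrum = rng.normal(size=freqs.size) + 1j * rng.normal(size=freqs.size)
    spectrum[1:] /= np.sqrt(freqs[1:])
    spectrum[0] = 0.0
    return np.fft.irfft(spectrum, n)

for name, sig in [("white", rng.normal(size=2000)), ("1/f", one_over_f(2000))]:
    sig = (sig - sig.mean()) / sig.std()      # normalize before comparing
    print(name, np.round(mse_curve(sig), 2))  # helper from the Section 2.3 sketch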
3. Results and Discussion
Understanding the intricacies of eye movement behavior in virtual reality (VR) environments is crucial for enhancing user experience and interface design. This study applies
Multiscale Entropy (MSE) analysis to dissect the complexity of eye movements, specifically
focusing on the horizontal (x) and vertical (y) components of gaze position. MSE, a robust
method for quantifying the complexity and predictability of time series data, is particularly
well suited for exploring the nuanced patterns of eye movement in various VR tasks.
By analyzing these components across different scales, the study reveals insights into
the dynamic nature of gaze behavior and its implications for VR interaction. The following
subsections present the key findings from the MSE analysis, highlighting the distinct
complexity patterns observed in eye movements across a series of VR tasks. Through these
results, the study aims to shed light on the multifaceted nature of human gaze in virtual
environments and contribute to the optimized design and development of VR systems.
3.1. Multiscale Entropy Analysis of Eye Movements in the VRG Task
Task Context: The vergence (VRG) task involves eye movements where participants
adjust their gaze to focus on objects at varying depths within the virtual environment,
simulating a 3D experience. This task is critical for depth perception and fundamental to a
realistic and immersive virtual environment.
MSE Findings: Multiscale Entropy (MSE) analysis was performed on both the horizontal (x) and vertical (y) components of eye movements during the VRG task. Figure 2
illustrates the sample entropy across different scale factors for the horizontal and vertical
gaze positions, respectively. The sample entropy was plotted across different scale factors
for both gaze positions, illustrating the complexity and predictability of eye movements as
participants engaged in depth-adjusting tasks.
Interpretation:
Horizontal Component (x) Analysis: For the horizontal component, the rising trends
in sample entropy across nearly all datasets indicate an increase in complexity and
unpredictability of gaze patterns as the scale factor increases. The patterns resembling
y = ln(x) suggest an adaptive or exploratory behavior, reflecting continuous engagement
and adjustment by the users to the depth variations in the VR environment. The slight
positive slope observed in the rest of the datasets, similar to y = (1/n)x, although
indicative of a rising trend, suggests a more gradual increase in complexity, possibly
due to a more steady or predictable aspect of the vergence task.
Figure 2. Multiscale Entropy analysis of eye movements in the VRG Task. (a) Horizontal and
(b) Vertical Gaze Position. The VRG task involves vergence eye movements, where participants adjust
their gaze to focus on objects at varying depths, simulating a 3D environment in VR.
Vertical Component (y) Analysis: In the vertical component, the datasets split into
two distinct behaviors: approximately half exhibiting a logarithmic increase, indicating
adaptive behavior as seen in the horizontal component, and the other half displaying
a steady trend, suggesting a uniform cognitive load or a consistent level of interaction
complexity. The consistent sample entropy across scales in these steady trend datasets
might imply that for certain aspects of the VRG task, users maintain a stable gaze behavior,
efficiently balancing exploration and exploitation of depth information presented in the
virtual environment.
Comprehensive Insights from VRG Task Analysis: The observed MSE trends for the
VRG task provide valuable insights into how users interact with and adapt to 3D virtual
environments. Rising trends across most datasets, especially in the horizontal component,
underscore the dynamic and engaging nature of the task, prompting users to continuously
adjust their gaze strategy. The variability in entropy, particularly the distinction between
rising and steady trends in the vertical component, highlights the complexity of depth
perception and the different strategies users might employ to navigate 3D spaces. These
findings emphasize the need for VR content design to consider the varying cognitive loads
and interaction complexities inherent in depth-oriented tasks, aiming to optimize user
comfort and enhance the overall immersive experience.
Implications for VR Content Design and User Experience: Understanding the complexity patterns in eye movements during vergence tasks can inform the design of more
intuitive and comfortable VR experiences. By aligning VR content and interface design
with the natural tendencies of human ocular activity in 3D environments, developers can
create more immersive and engaging applications that cater to the depth perception needs
of users. Furthermore, recognizing the adaptive behaviors and stable interaction trends in
eye movements can help optimize the balance between visual challenge and user comfort,
contributing to the reduction of VR sickness and enhancing long-term user engagement.
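To make the qualitative trend labels used in this and the following subsections (logarithmic, linear, logistic, steady) reproducible, one option is to fit candidate models to each entropy curve and keep the best-scoring one. The sketch below is an illustrative approach assuming SciPy, with model forms of our own choosing, not the exact procedure used to label our figures:

import numpy as np
from scipy.optimize import curve_fit

MODELS = {
    "logarithmic": lambda s, a, b: a * np.log(s) + b,
    "linear":      lambda s, a, b: a * s + b,
    "logistic":    lambda s, h, k, s0: h / (1.0 + np.exp(-k * (s - s0))),
    "steady":      lambda s, c: np.full_like(s, c),
}

def classify_trend(entropies):
    # Return the candidate shape with the smallest mean squared residual.
    s = np.arange(1, len(entropies) + 1, dtype=float)
    e = np.asarray(entropies, dtype=float)
    mask = np.isfinite(e)        # drop scales where entropy was undefined
    s, e = s[mask], e[mask]
    best, best_err = None, np.inf
    for name, model in MODELS.items():
        try:
            params, _ = curve_fit(model, s, e, maxfev=5000)
        except (RuntimeError, ValueError):  # fit failed; skip this shape
            continue
        err = np.mean((model(s, *params) - e) ** 2)
        if err < best_err:
            best, best_err = name, err
    return best

Because the candidate models differ in parameter count, a fuller treatment would penalize complexity (e.g., with AIC) rather than compare raw residuals.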
3.2. Multiscale Entropy Analysis of Eye Movements in the VID Task
Task Context: The Video Viewing (VID) task is a central component of virtual reality
(VR) experiences, involving participants watching diverse video content. This immersive
activity requires users to maintain sustained attention and engagement, as they visually
explore and react to dynamic visual scenes. Understanding how gaze behavior adapts
during video viewing is crucial for enhancing user experience and optimizing content
design in cinematic or narrative VR settings.
MSE Findings: For the VID task, Multiscale Entropy (MSE) analysis was conducted
on the horizontal (x) and vertical (y) components of eye movements. Figure 3 showcases
the sample entropy across different scale factors for horizontal and vertical gaze positions,
respectively. The analysis produced plots showing sample entropy across various scale factors for both gaze positions, reflecting the complexity and predictability of eye movements
as participants interacted with video content.
Figure 3. Multiscale Entropy analysis of eye movements in the VID Task. (a) Horizontal and
(b) Vertical Gaze Position. The VID task involves participants watching videos, a common activity in
VR, focusing on understanding how gaze behavior changes during passive viewing.
Interpretation:
Linearly Rising Trends: Both the horizontal and vertical components exhibit linearly
rising trends in sample entropy with different slopes. This consistent increase across the
scale factors suggests that as the viewing time extends, the complexity of gaze patterns
also increases. The variation in slopes between the horizontal and vertical components
could indicate different levels or types of engagement and visual exploration. For instance,
steeper slopes may represent periods of more intense visual exploration or engagement
with the content, possibly during dynamic or rapidly changing scenes.
Comprehensive Insights from VID Task Analysis: The MSE analysis of eye movements during the VID task underscores the dynamic interaction between viewers and
video content in VR environments. The linearly rising trends across most datasets highlight the ongoing adaptation and engagement of viewers with the content, providing a
quantitative measure of how video narratives and visual stimuli influence gaze behavior.
These insights can drive the development of more captivating and immersive VR video
experiences, ensuring that content remains engaging and aligned with the viewer’s natural
gaze dynamics.
Implications for Content Design and User Experience: These findings have significant implications for VR video content creators. Understanding the nature of gaze
complexity during video viewing can help in identifying key moments for narrative or
visual shifts to maintain or enhance viewer engagement. For example, creators might
introduce changes in content at points where the entropy trend suggests a decrease in
engagement or predictability, thereby reinvigorating viewer attention and interest.
3.3. Multiscale Entropy Analysis of Eye Movements in the PUR Task
Task Context: The pursuit (PUR) task is a crucial component in understanding user
interaction in virtual reality (VR), particularly focusing on smooth pursuit eye movements.
Participants track moving targets, a fundamental activity for simulating realistic scenarios
in interactive and gaming environments. This task is vital for examining the mechanisms
of visual tracking and how individuals maintain focus on dynamic objects within VR.
MSE Findings: Multiscale Entropy (MSE) analysis was performed on both the horizontal (x) and vertical (y) components of eye movements during the PUR task. Figure 4
illustrates the sample entropy across different scale factors for the horizontal and vertical
gaze positions, respectively. The analysis yielded entropy trends across various scale factors, reflecting the complexity and predictability of eye movements as participants engaged
in tracking moving targets.
Interpretation:
Horizontal Component—Logistic Function Trends: The horizontal eye movements
exhibited trends similar to logistic functions with varying heights, indicating a saturation
effect in the complexity of eye movements.
Vertical Component—Logarithmic and Steady Trends: Vertical eye movements
predominantly displayed a logarithmic increase in complexity, suggesting a continuous
adaptation or increasing challenge in tracking the vertical motion of targets. A few datasets
exhibited steady trends, which might indicate consistent tracking behavior or uniform task
difficulty across those particular sessions.
Comprehensive Insights from PUR Task Analysis: The MSE trends in the PUR task
provide a detailed account of how users engage with moving targets in VR. The logistic-like patterns in horizontal movements suggest a bounded complexity, perhaps due to the
capabilities of the human visual system or the specific nature of the task. On the other hand,
the logarithmic trends in vertical movements reflect a more nuanced adaptation process,
with users possibly exerting more effort or strategy in tracking vertical motion.
Implications for VR Content Design and User Experience: Understanding these
intricate patterns of eye movement can inform the design of VR content, particularly
in scenarios requiring object tracking or interaction with moving elements. Designers
and developers might consider the limitations and capabilities reflected in the logistic
function trends when creating moving targets or interactive elements, ensuring that these
are within comfortable tracking ranges for users. Additionally, the varying complexity in
vertical tracking might influence how vertical motion is incorporated into VR experiences,
balancing challenge and comfort to enhance overall engagement and minimize discomfort
or disorientation.
Figure 4. Multiscale Entropy analysis of eye movements in the PUR task. (a) Horizontal and
(b) Vertical Gaze Position. In the PUR task, subjects engage in smooth pursuit movements, following
moving objects with their eyes, which is essential for tracking moving stimuli in VR.
3.4. Multiscale Entropy Analysis of Eye Movements in the TEX Task
Task Context: The text reading (TEX) task simulates the act of reading within a virtual
environment, an essential component in educational and informational VR applications.
This task involves participants reading text presented in the VR space, requiring saccadic
movements and fixations similar to those in real-world reading scenarios. Understanding
the complexity of these eye movements is vital for creating VR systems that support
effective and comfortable reading experiences.
MSE Findings: In the TEX task, Multiscale Entropy analysis was applied to both the
horizontal (x) and vertical (y) components of eye movements associated with reading. The
analysis yields distinct patterns of sample entropy across different scale factors for both gaze
positions. Figure 5 represents the sample entropy across different scale factors for horizontal
and vertical gaze positions, respectively. Horizontally, there is a trend of linearly rising
entropy with minor fluctuations, indicative of the systematic left-to-right eye movement
typical of reading in many languages. Vertically, most datasets show a rising trend with
more pronounced fluctuations, reflecting the varied complexity as eyes move down the
page or screen and possibly adjust to different lengths of text or formatting changes.
Figure 5. Multiscale Entropy analysis of eye movements in the TEX task. (a) Horizontal and
(b) Vertical Gaze Position. The TEX task represents reading text in VR, a scenario that involves
distinct eye movement patterns due to the linear nature of text and frequent line shifts.
Interpretation: The MSE trends observed in the TEX task shed light on the nuanced
complexity of eye movements during VR reading activities. In the horizontal component,
the mostly linearly rising trend with little fluctuations suggests a consistent, though slightly
increasing, complexity in eye movement as the text progresses, likely due to accumulating
cognitive load or adapting strategies for text navigation. The presence of minor fluctuations might also indicate variations in line lengths, word complexity, or punctuation that
momentarily alter the reading rhythm.
For the vertical component, the varying slopes and more pronounced fluctuations in
rising trends suggest a diverse range of strategies or challenges as users navigate between
lines or paragraphs. This variability might reflect differences in text layout, the effort
required to adjust to new lines, or the cognitive processing of section breaks or paragraph
endings. A few datasets showing a more steady trend might represent users with more
uniform reading strategies or perhaps simpler text structures that elicit more consistent eye
movement patterns.
In both cases, the MSE analysis illustrates the complex landscape of reading behavior
in VR, offering valuable insights for designing VR content and interfaces that accommodate
the natural reading process. By aligning text presentation with the observed entropy trends,
developers can enhance readability, reduce cognitive load, and improve user comfort and
engagement with textual content in VR.
Implications for VR Content Design and User Experience: Understanding the entropy patterns in reading-related eye movements enables VR content developers to optimize
text layout, pacing, and interaction in educational and informational applications. The insights from the horizontal and vertical components of eye movement can inform decisions
about font size, line spacing, and the placement of textual elements to align with natural
reading patterns and minimize discomfort or disorientation. Furthermore, recognizing the
variability in individual reading strategies, as suggested by the different slopes and fluctuations in entropy, underscores the need for customizable or adaptive reading interfaces that
can accommodate a range of user preferences and capabilities.
3.5. Multiscale Entropy Analysis of Eye Movements in the RAN Task
Task Context: The random saccades (RAN) task captures the eye movements when
participants are required to swiftly and randomly shift their gaze between various points
within the VR environment. This task simulates the erratic and spontaneous eye movements
that occur as users navigate through and respond to unpredictable or rapidly changing virtual scenarios. Assessing the complexity of these movements is essential for understanding
user navigational strategies and responsiveness in a dynamic VR landscape.
MSE Findings: Multiscale Entropy analysis applied to the RAN task’s eye movements,
both horizontal (x) and vertical (y), shows a pattern of rising entropy, depicted in Figure 6.
The entropy progression in both components, characterized by a generally linear rise with
similar slopes across different datasets, suggests a consistent increase in complexity as
the scale factor increases. This uniformity in the trend across datasets might indicate a
standardized response to the random saccades required in the task, reflecting the users’
adaptation to the unpredictability inherent in the activity.
Interpretation: The MSE findings from the RAN task offer insights into the nature of
eye movements during random, quick shifts of gaze in VR. The consistently rising entropy
across scales in both horizontal and vertical components suggests that participants exhibit a
gradual increase in complexity in their eye movements, possibly as a reflection of ongoing
adaptation to the unpredictability and dynamism of the task. This adaptation might be
indicative of users’ efforts to optimize their gaze strategy to efficiently deal with random
stimuli, maintaining a level of readiness to shift focus swiftly and accurately.
The similar slopes in the linear rise of entropy among datasets imply a commonality in
the approach users take to handle random saccades in VR. This could be due to the nature
of the task, which requires a general alertness and readiness to move the gaze randomly,
leading to a uniform increase in complexity across users.
Comprehensive Insights from RAN Task Analysis: The RAN task’s analysis underscores the importance of understanding spontaneous and random eye movements in VR,
as they are indicative of how users might navigate and respond to unstructured or unpredictable environments. Insights from this task can inform the design of VR experiences
that require users to be alert and ready to change focus quickly, such as in certain gaming
scenarios or training simulations where rapid situational awareness is critical.
Figure 6. Multiscale Entropy analysis of eye movements in the RAN task. (a) Horizontal and
(b) Vertical Gaze Position. In the RAN task, participants perform random saccades, simulating
unpredictable eye movements as they might occur in a dynamic and unstructured VR environment.
Implications for VR Content Design and User Experience: Understanding the consistent increase in complexity in eye movements during random saccades tasks can help VR
designers create environments that are attuned to the naturalistic movements of users. By
recognizing the characteristics of eye movement in unpredictable scenarios, VR experiences
can be optimized to accommodate or even leverage these natural behaviors, enhancing
user navigation, orientation, and overall engagement with the content. Moreover, insights
into the uniform strategies employed by users across different datasets may assist in standardizing certain aspects of VR design, ensuring a coherent and user-friendly experience
even in the most dynamic and unstructured virtual scenarios.
3.6. Deciphering Eye Movement Dynamics
The MSE analysis reveals diverse patterns in eye movements, reflecting users’ cognitive states and adaptability within VR environments. The absence of declining trends
suggests users may be consistently engaged or adapting to the complexities of VR tasks.
Rising, fluctuating, and steady trends indicate different user interactions, each providing
insights into how participants navigate and respond to VR environments. These patterns
highlight the need for adaptive VR systems that are responsive to the intricate behaviors
of users.
3.7. Insights into User Comfort from MSE Analysis
Our study’s Multiscale Entropy (MSE) analysis of eye movements across various VR
tasks provides critical insights that can significantly contribute to enhancing user comfort
in VR environments. By examining the complexity of eye movements, we can identify
potential triggers of discomfort and devise strategies to mitigate these issues. For instance,
tasks that exhibit high complexity in eye movements may signal a higher cognitive load or
visual stress, potentially leading to discomfort or VR sickness.
Specifically, the analysis revealed that certain VR tasks, such as the PUR (pursuit) and
RAN (random saccades) tasks, show pronounced complexity in eye movement patterns.
This complexity could indicate that users are required to constantly adapt their gaze,
potentially leading to eye strain or fatigue over extended periods. To address this, VR
content designers can consider integrating more natural gaze paths and reducing the
frequency of abrupt visual changes in these tasks. Moreover, implementing dynamic
difficulty adjustments based on real-time MSE analysis of eye movements can help maintain
user engagement without overwhelming them, thereby enhancing comfort.
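As an illustration only (no such adaptive system was built or evaluated in this study), such a dynamic adjustment could monitor the short-term entropy of recent gaze samples and ease visual demand when complexity stays high. Every threshold below, and the adjust_difficulty hook, is hypothetical:

from collections import deque
import numpy as np

WINDOW = 2000          # most recent samples, roughly 8 s at 250 Hz
HIGH_COMPLEXITY = 1.5  # hypothetical comfort threshold on sample entropy
RECOMPUTE_EVERY = 250  # re-estimate once per second, not on every sample

buffer, counter = deque(maxlen=WINDOW), 0

def on_gaze_sample(theta_h, adjust_difficulty):
    # Feed one horizontal gaze sample; periodically re-estimate complexity
    # using the sample_entropy helper sketched in Section 2.2.
    global counter
    buffer.append(theta_h)
    counter += 1
    if len(buffer) == WINDOW and counter % RECOMPUTE_EVERY == 0:
        recent = np.asarray(buffer)
        recent = (recent - recent.mean()) / (recent.std() + 1e-9)
        if sample_entropy(recent) > HIGH_COMPLEXITY:
            adjust_difficulty(-1)  # hypothetical hook: reduce visual demand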
Furthermore, our findings suggest that steady trends in eye movement complexity,
observed in tasks like the TEX (text reading) task, could be leveraged to design VR experiences that align with natural eye movement patterns, minimizing the risk of discomfort.
By tailoring content to match these naturalistic patterns, VR experiences can become more
intuitive and less taxing on the user, promoting longer and more comfortable usage.
Incorporating adaptive interfaces that respond to the MSE analysis of a user’s eye
movements can also offer personalized comfort adjustments. For example, interfaces could
automatically adjust text size, brightness, or even the pace of content based on real-time
analysis, ensuring a comfortable viewing experience tailored to individual user needs.
Overall, the application of MSE analysis in understanding and optimizing eye movement dynamics in VR opens up new avenues for creating more comfortable and engaging
virtual environments. By closely aligning VR design with the natural complexities of
human eye movements, we can significantly reduce discomfort and enhance the overall
user experience. Future research should continue to explore the relationship between eye
movement complexity and user comfort, fostering the development of VR technologies
that prioritize user well-being.
3.8. Comparative Analysis with Existing Research
Our study contributes to the burgeoning field of VR and eye movement analysis by
providing insights into the complexity of gaze patterns in VR environments. Previous research,
including Mallari et al.’s systematic review on VR’s impact on chronic pain [34], highlights
VR’s potential in enhancing user experiences. Riva et al. [35] and Chiquet et al. [36] have further elucidated the benefits of VR in various therapeutic and wellness contexts, underscoring
the versatility of VR applications. Wu et al.’s work [37] on VR in clinical settings adds to the
understanding of VR’s applicability across different domains.
Expanding on these foundations, our application of Multiscale Entropy (MSE) analysis to VR gaze data offers a novel perspective on user engagement and cognitive load.
Parsons (2015) [38], Kourtesis et al. (2019) [39], and Tashjian et al. (2017) [40] emphasize
the importance of immersive and interactive VR experiences for cognitive research and
therapeutic interventions. Our findings resonate with these studies by demonstrating how
MSE analysis can be used to assess and enhance the VR experience, contributing to a deeper
understanding of user behavior in VR environments.
Moreover, recent studies by Tang et al. [41] and Sipatchin et al. [42] highlight the critical
role of eye-tracking in VR for cognitive assessments and clinical applications. By integrating
MSE analysis with eye-tracking data, our research aligns with the broader trajectory of
leveraging VR technology to create immersive, beneficial, and user-centric experiences.
4. Conclusions
This study advances our understanding of eye movement behavior in virtual reality
(VR) through Multiscale Entropy (MSE) analysis. By dissecting the complexity of eye
movements across various VR tasks, we highlight the potential of MSE to contribute to
more intuitive and immersive VR environments. The nuanced insights into gaze behavior
provided by this analysis are invaluable for enhancing user experience and interface design
in VR technology.
4.1. Contributions
We conducted a comprehensive investigation into the complexity of eye movement
patterns in VR using MSE analysis. Our meticulous analysis of the horizontal and vertical
components of eye movements across a range of VR tasks—including vergence, smooth
pursuit, video viewing, reading, and random saccades—revealed distinct complexity patterns. These patterns, characterized by logarithmic, logistic, rising, and steady behaviors,
contribute to a deeper understanding of gaze predictability and complexity in VR. Our findings offer valuable insights for optimizing VR interfaces and enhancing user experiences
by adapting to dynamic user interactions.
4.2. Future Work
The continued exploration of MSE and its applications in VR is crucial for unlocking
the full potential of this technology for users worldwide. A collaborative approach between
academia and industry will accelerate innovation and ensure that VR technology evolves
in tandem with our understanding of user needs and preferences. Future research should
focus on the following:
• Investigating a broader range of VR tasks and long-term user interactions to provide a more comprehensive understanding of how entropy trends manifest in different contexts.
• Exploring the implications of these findings for enhancing user comfort and engagement in VR, aiming to support the development of more personalized and immersive experiences.
• Integrating complex signal analysis like MSE into VR research, recognizing the potential for transformative impacts on technology and user interaction.
These directions for future research underscore the importance of a nuanced approach
to studying VR interactions, with the goal of creating more engaging and sustainable
VR environments.
Author Contributions: Conceptualization, S.Z.; methodology, S.Z.; software, S.Z.; validation, S.Z.;
formal analysis, S.Z.; investigation, S.Z.; data curation, S.Z.; writing—original draft preparation,
S.Z.; writing—review and editing, S.Z. and G.L.; visualization, S.Z.; supervision, G.L.; project
administration, G.L. All authors have read and agreed to the published version of the manuscript.
Funding: Sahar Z. would like to express her sincere gratitude to the College of Architecture and
Gregory A. Luhan, Department Head of Architecture at Texas A&M University, for providing financial
support and research assistantship (grant #248012) during the past four years.
Institutional Review Board Statement: Not applicable.
Informed Consent Statement: Not applicable.
Data Availability Statement: Data and Python code for reproducing the experiments reported in this
article are available upon request. Please contact Sahar Zandi at
[email protected] for access.
Acknowledgments: Sahar Z. is grateful to Shahriar Esmaeili for his assistance and guidance in
understanding the Multiscale Entropy Analysis (MSE) concepts, which significantly contributed to
the development of the Python code used in this study.
Conflicts of Interest: The authors declare no conflicts of interest.
Abbreviations
The following abbreviations are used in this manuscript:
MSE    Multiscale Entropy
AR     Augmented Reality
VR     Virtual Reality
VRG    vergence task
VID    video-viewing task
PUR    pursuit task
TEX    text-reading task
RAN    random saccades task
References
1. Billinghurst, M.; Clark, A.; Lee, G. A survey of augmented reality. Found. Trends Hum. Comput. Interact. 2015, 8, 73–272. [CrossRef]
2. Rizzo, A.; Koenig, S.T. Is clinical virtual reality ready for primetime? Neuropsychology 2017, 31, 877. [CrossRef] [PubMed]
3. Clay, V.; König, P.; Koenig, S. Eye tracking in virtual reality. J. Eye Mov. Res. 2019, 12. [CrossRef] [PubMed]
4. Mihelj, M.; Novak, D.; Beguš, S. Virtual Reality Technology and Applications; Springer: Dordrecht, The Netherlands, 2014.
5. Kaewrat, C.; Boonbrahm, P.; Sahoh, B. The Design and Development of a Foot-Detection Approach Based on Seven-Foot Dimensions: A Case Study of a Virtual Try-On Shoe System Using Augmented Reality Techniques. Informatics 2023, 10, 48. [CrossRef]
6. Zandi, S. Revival of the Silk Road using the applications of AR/VR and its role on cultural tourism. arXiv 2023, arXiv:2304.10545.
7. Innocente, C.; Piazzolla, P.; Ulrich, L.; Moos, S.; Tornincasa, S.; Vezzetti, E. Mixed Reality-Based Support for Total Hip Arthroplasty Assessment. In Proceedings of the International Joint Conference on Mechanics, Design Engineering & Advanced Manufacturing, Ischia, Italy, 1–3 June 2022; pp. 159–169.
8. Su, S.; Lei, P.; Wang, C.; Gao, F.; Zhong, D.; Hu, Y. Mixed reality technology in total knee arthroplasty: An updated review with a preliminary case report. Front. Surg. 2022, 9, 804029. [CrossRef] [PubMed]
9. Zavala-González, J.; Martínez, D.; Gutiérrez-Espinoza, H. Effectiveness of adding virtual reality to physiotherapeutic treatment in patients with total hip arthroplasty. A randomized controlled trial. Clin. Rehabil. 2022, 36, 660–668. [CrossRef] [PubMed]
10. Chang, E.; Kim, H.T.; Yoo, B. Virtual reality sickness: A review of causes and measurements. Int. J. Hum. Comput. Interact. 2020, 36, 1658–1682. [CrossRef]
11. Won, J.; Kim, Y.S. A new approach for reducing virtual reality sickness in real time: Design and validation study. JMIR Serious Games 2022, 10, e36397. [CrossRef] [PubMed]
12. Parida, K.; Bark, H.; Lee, P.S. Emerging thermal technology enabled augmented reality. Adv. Funct. Mater. 2021, 31, 2007952. [CrossRef]
13. Zandi, S.; Luhan, G.A. Exploring User Interactions in AR/VR Interfaces: A Simulation-Based Study. In Proceedings of the 2023 International Conference on Electrical, Computer and Energy Technologies (ICECET), Cape Town, South Africa, 16–17 November 2023; pp. 1–6. [CrossRef]
14. Dwivedi, Y.K.; Hughes, L.; Baabdullah, A.M.; Ribeiro-Navarrete, S.; Giannakis, M.; Al-Debei, M.M.; Dennehy, D.; Metri, B.; Buhalis, D.; Cheung, C.M.; et al. Metaverse beyond the hype: Multidisciplinary perspectives on emerging challenges, opportunities, and agenda for research, practice and policy. Int. J. Inf. Manag. 2022, 66, 102542. [CrossRef]
15. Palumbo, A. Microsoft HoloLens 2 in medical and healthcare context: State of the art and future prospects. Sensors 2022, 22, 7709. [CrossRef] [PubMed]
16. Zandi, S. Sustainable and Resilient Systems for Intergenerational Justice. arXiv 2021, arXiv:2102.09122.
17. Cannavo, A.; D'Alessandro, A.; Daniele, M.; Giorgia, M.; Congyi, Z.; Lamberti, F. Automatic generation of affective 3D virtual environments from 2D images. In Proceedings of the 15th International Conference on Computer Graphics Theory and Applications (GRAPP 2020), Valetta, Malta, 27–29 February 2020; pp. 113–124.
18. Buttussi, F.; Chittaro, L. Effects of different types of virtual reality display on presence and learning in a safety training scenario. IEEE Trans. Vis. Comput. Graph. 2017, 24, 1063–1076. [CrossRef] [PubMed]
19. Huang, K.T.; Ball, C.; Francis, J.; Ratan, R.; Boumis, J.; Fordham, J. Augmented versus virtual reality in education: An exploratory study examining science knowledge retention when using augmented reality/virtual reality mobile applications. Cyberpsychol. Behav. Soc. Netw. 2019, 22, 105–110. [CrossRef] [PubMed]
20. Logan, D.E.; Simons, L.E.; Caruso, T.J.; Gold, J.I.; Greenleaf, W.; Griffin, A.; King, C.D.; Menendez, M.; Olbrecht, V.A.; Rodriguez, S.; et al. Leveraging virtual reality and augmented reality to combat chronic pain in youth: Position paper from the interdisciplinary network on virtual and augmented technologies for pain management. J. Med. Internet Res. 2021, 23, e25916. [CrossRef] [PubMed]
21. Demeco, A.; Zola, L.; Frizziero, A.; Martini, C.; Palumbo, A.; Foresti, R.; Buccino, G.; Costantino, C. Immersive virtual reality in post-stroke rehabilitation: A systematic review. Sensors 2023, 23, 1712. [CrossRef] [PubMed]
22. Haley, A.C.; Thorpe, D.; Pelletier, A.; Yarosh, S.; Keefe, D.F. Inward VR: Toward a Qualitative Method for Investigating Interoceptive Awareness in VR. IEEE Trans. Vis. Comput. Graph. 2023, 29, 2557–2566. [CrossRef] [PubMed]
23. Anastasaki, I.; Drosatos, G.; Pavlidis, G.; Rantos, K. User Authentication Mechanisms Based on Immersive Technologies: A Systematic Review. Information 2023, 14, 538. [CrossRef]
24. Jia, F.; Wang, W.; Yang, J.; Li, T.; Song, G.; Xu, Y. Effectiveness of Rectangular Cartogram for Conveying Quantitative Information: An Eye Tracking-Based Evaluation. ISPRS Int. J. Geo-Inf. 2023, 12, 39. [CrossRef]
25. Javaherian, M.; Mollaei, S. Multiscale Entropy Analysis of Gravitational Waves. Adv. High Energy Phys. 2021, 2021, 6643546. [CrossRef]
26. Lohr, D.; Aziz, S.; Friedman, L.; Komogortsev, O.V. GazeBaseVR: A large-scale, longitudinal, binocular eye-tracking dataset collected in virtual reality. Sci. Data 2023, 10, 177. [CrossRef] [PubMed]
27. Griffith, H.; Lohr, D.; Abdulin, E.; Komogortsev, O. GazeBase, a large-scale, multi-stimulus, longitudinal eye movement dataset. Sci. Data 2021, 8, 184. [CrossRef] [PubMed]
28. D'Amelio, A.; Patania, S.; Bursic, S.; Cuculo, V.; Boccignone, G. Using Gaze for Behavioural Biometrics. Sensors 2023, 23, 1262. [CrossRef] [PubMed]
29. Yin, J.; Sun, J.; Li, J.; Liu, K. An Effective Gaze-Based Authentication Method with the Spatiotemporal Feature of Eye Movement. Sensors 2022, 22, 3002. [CrossRef]
30. Costa, M.; Goldberger, A.L.; Peng, C.K. Multiscale entropy analysis of biological signals. Phys. Rev. E 2005, 71, 021906. [CrossRef] [PubMed]
31. Richman, J.S.; Moorman, J.R. Physiological time-series analysis using approximate entropy and sample entropy. Am. J. Physiol. Heart Circ. Physiol. 2000, 278, H2039–H2049. [CrossRef] [PubMed]
32. Ahmed, M.U.; Mandic, D.P. Multivariate multiscale entropy analysis. IEEE Signal Process. Lett. 2011, 19, 91–94. [CrossRef]
33. Humeau-Heurtier, A. The multiscale entropy algorithm and its variants: A review. Entropy 2015, 17, 3110–3123. [CrossRef]
34. Mallari, B.; Spaeth, E.K.; Goh, H.; Boyd, B.S. Virtual reality as an analgesic for acute and chronic pain in adults: A systematic review and meta-analysis. J. Pain Res. 2019, 12, 2053–2085. [CrossRef]
35. Riva, G.; Wiederhold, B.K.; Mantovani, F. Neuroscience of virtual reality: From virtual exposure to embodied medicine. Cyberpsychol. Behav. Soc. Netw. 2019, 22, 82–96. [CrossRef]
36. Chiquet, S.; Martarelli, C.S.; Mast, F.W. Eye movements to absent objects during mental imagery and visual memory in immersive virtual reality. Virtual Real. 2021, 25, 655–667. [CrossRef]
37. Wu, H.Y.; Robert, F.; Fafet, T.; Graulier, B.; Passin-Cauneau, B.; Sassatelli, L.; Winckler, M. Designing Guided User Tasks in VR Embodied Experiences. Proc. ACM Hum. Comput. Interact. 2022, 6, 1–24. [CrossRef]
38. Parsons, T.D. Virtual reality for enhanced ecological validity and experimental control in the clinical, affective and social neurosciences. Front. Hum. Neurosci. 2015, 9, 660. [CrossRef] [PubMed]
39. Kourtesis, P.; Collina, S.; Doumas, L.A.; MacPherson, S.E. Technological competence is a pre-condition for effective implementation of virtual reality head mounted displays in human neuroscience: A technological review and meta-analysis. Front. Hum. Neurosci. 2019, 13, 342. [CrossRef] [PubMed]
40. Tashjian, V.C.; Mosadeghi, S.; Howard, A.R.; Lopez, M.; Dupuy, T.; Reid, M.; Martinez, B.; Ahmed, S.; Dailey, F.; Robbins, K.; et al. Virtual reality for management of pain in hospitalized patients: Results of a controlled trial. JMIR Ment. Health 2017, 4, e7387. [CrossRef] [PubMed]
41. Tang, Z.; Liu, X.; Huo, H.; Tang, M.; Qiao, X.; Chen, D.; Dong, Y.; Fan, L.; Wang, J.; Du, X.; et al. Eye movement characteristics in a mental rotation task presented in virtual reality. Front. Neurosci. 2023, 17, 1143006. [CrossRef] [PubMed]
42. Sipatchin, A.; Wahl, S.; Rifai, K. Eye-tracking for clinical ophthalmology with virtual reality (VR): A case study of the HTC Vive Pro Eye's usability. Healthcare 2021, 9, 180. [CrossRef]
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual
author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to
people or property resulting from any ideas, methods, instructions or products referred to in the content.