Virtual Reality, Journal Year: 2024, Volume and Issue: 28(3), Published: June 26, 2024
Abstract
The personalization of user experiences through recommendation systems has been extensively explored in Internet applications, but it has yet to be fully addressed in Virtual Reality (VR) environments. The complexity of managing geometric 3D data, computational load, and natural interactions poses significant challenges for the real-time adaptation of these immersive experiences. However, tailoring VR environments to individual needs and interests holds promise for enhancing the user experience. In this paper, we present VR Environment Adaptation through Recommendations (VR-EAR), a framework designed to address this challenge. VR-EAR employs customizable object metadata and a hybrid recommendation system modeling implicit user feedback. We utilize optimization techniques to ensure efficient real-time performance. To evaluate our framework, we designed a virtual store where product locations dynamically adjust based on user interactions. Our results demonstrate the effectiveness of VR-EAR in adapting and personalizing VR environments in real time, highlighting its applicability across domains.
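A hybrid scheme of the kind described, blending object metadata with implicit feedback, might be sketched roughly as follows. All field names, the Jaccard content term, the gaze-time implicit term, and the blending weight are illustrative assumptions, not the paper's actual implementation.

```python
# Sketch of a hybrid recommendation score for VR objects:
# a content-based term (metadata tag overlap with the user's interests)
# blended with an implicit-feedback term (normalized gaze time).
# All weights and field names here are hypothetical.

def hybrid_score(obj, user_profile, alpha=0.6):
    # Content term: Jaccard similarity between object tags and user interests.
    tags, interests = set(obj["tags"]), set(user_profile["interests"])
    union = tags | interests
    content = len(tags & interests) / len(union) if union else 0.0
    # Implicit term: gaze time on this object, normalized by the maximum seen.
    times = user_profile["gaze_seconds"]
    max_t = max(times.values()) if times else 0.0
    implicit = times.get(obj["id"], 0.0) / max_t if max_t else 0.0
    return alpha * content + (1 - alpha) * implicit

def rank_objects(objects, user_profile):
    # Higher-scoring objects would be placed more prominently in the scene.
    return sorted(objects, key=lambda o: hybrid_score(o, user_profile), reverse=True)
```

In a store like the one evaluated above, the ranking would drive where products are placed; the weight `alpha` trades off stated interests against observed behavior.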
Computers & Education: X Reality, Journal Year: 2023, Volume and Issue: 2, P. 100023, Published: Jan. 1, 2023
In response to the high demand for digital learning as a surrogate for physical experiences, virtual reality (VR) is positioning itself as a tool for creating educational experiences. However, VR technology faces a number of ethical issues, including the reduction of users' autonomy, health problems, and privacy concerns. The use of realism in education can turn out to be a double-edged sword. While realistic visualizations promote learning in some content domains, they hinder comprehension in others. Furthermore, the effects of realism on learning also depend on learners' spatial abilities. Letting young children and teenagers engage in VR experiences may expose them to manipulation, could lead to health problems, and may infringe on their privacy. In short, VR can severely limit users' autonomy in several ways. Based on a review of the literature and considerations of emerging technologies such as generative artificial intelligence, this paper presents guidelines for the ethically sound utilization of realism. By applying findings and conclusions established in the context of research on the ethics of technology, I develop several suggestions that help avoid negative consequences of VR. These include ability testing, requiring that alternative learning paths be offered, measures to prevent manipulation, as well as using algorithms to deidentify the highly detailed developmental profiles generated through VR use.
Nature Communications, Journal Year: 2025, Volume and Issue: 16(1), Published: April 1, 2025
Eye-tracking plays a crucial role in the development of virtual reality devices, neuroscience research, and psychology. Despite its significance in numerous applications, achieving an accurate, robust, and fast eye-tracking solution remains a considerable challenge for current state-of-the-art methods. While existing reflection-based techniques (e.g., "glint tracking") are considered to be very accurate, their performance is limited by their reliance on sparse 3D surface data acquired solely from the cornea surface. In this paper, we rethink the way specular reflections can be used for eye tracking: we propose a method for accurate evaluation of gaze direction that exploits the teachings of single-shot phase-measuring deflectometry. In contrast to state-of-the-art methods, our approach acquires dense surface information from both cornea and sclera within only one single camera frame (single-shot). For a typical measurement, we acquire >3000× more reflection points ("glints") than conventional methods. We show the feasibility of our approach with experimentally evaluated gaze errors below 0.13° on a realistic eye model. Moreover, we demonstrate quantitative measurements on real human eyes in vivo, reaching accuracy values between 0.46° and 0.97°.
The authors introduce an eye-tracking method with high accuracy. It uses deflectometry to acquire dense surface data from the eye, increasing the number of measured reflection points by factors of >3000× compared to conventional approaches.
IEEE Transactions on Visualization and Computer Graphics, Journal Year: 2024, Volume and Issue: 30(5), P. 2077 - 2086, Published: March 5, 2024
Eye tracking has shown great promise in many scientific fields and daily applications, ranging from the early detection of mental health disorders to foveated rendering in virtual reality (VR). These applications all call for a robust system for high-frequency near-eye movement sensing and analysis at high precision, which cannot be guaranteed by existing eye-tracking solutions based on CCD/CMOS cameras. To bridge the gap, in this paper we propose Swift-Eye, an offline framework for precise pupil estimation and analysis, especially when the pupil region is partially occluded. Swift-Eye is built upon emerging event cameras to capture high-speed eye movements at high temporal resolution. A series of bespoke components are then designed to generate high-quality pupil videos at a frame rate over a kilohertz and to deal with occlusion caused by involuntary blinks. According to our extensive evaluations on EV-Eye, a large-scale public dataset collected with event cameras, Swift-Eye shows strong robustness against significant occlusion: it improves IoU and F1-score by 20% and 12.5% respectively over the second-best competing approach when over 80% of the pupil is occluded by the eyelid. Lastly, it provides continuous and smooth traces of pupils at extremely high temporal resolution, supporting a number of potential applications such as early diagnosis and behaviour-brain association. The implementation details and source codes can be found at https://github.com/ztysdu/Swift-Eye.
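Event cameras report asynchronous per-pixel brightness-change events rather than full frames, and kilohertz-rate videos like those above are typically reconstructed by accumulating events over short time windows. The following is a generic accumulation sketch under an assumed `(t, x, y, polarity)` event layout, not Swift-Eye's actual components:

```python
def accumulate_events(events, width, height, t_start, t_end):
    """Accumulate (t, x, y, polarity) events with t in [t_start, t_end)
    into a 2D count image.

    Polarity is +1 (brightness increase) or -1 (decrease); this tuple
    layout is a common convention, assumed here for illustration.
    """
    frame = [[0] * width for _ in range(height)]
    for t, x, y, p in events:
        if t_start <= t < t_end:
            frame[y][x] += p
    return frame
```

Sliding this window in sub-millisecond steps yields an effective frame rate far above what CCD/CMOS sensors provide, which is the property event-based pupil trackers exploit.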
Journal of the Society for Information Display, Journal Year: 2024, Volume and Issue: 32(8), P. 605 - 646, Published: Aug. 1, 2024
Abstract
Building on several decades of research and development, the recent progress in virtual reality (VR) and augmented reality (AR) devices with spatial computing technologies marks a significant leap in human–computer interaction, with applications ranging from entertainment and education to e‐commerce and healthcare. Advances in these devices promise immersive experiences by simulating or augmenting the real world with computer‐generated digital content. The core objective of VR and AR systems is to create convincing human sensory perceptions, thereby creating interactive experiences that bridge the gap between physical and digital realities. However, achieving true immersion remains a challenging goal, as it necessitates both a comprehensive understanding of the neuroscience of multisensory perception and accurate technical implementations that maintain consistency between natural and synthetic cues. This paper reviews the sensory‐perceptual requirements vital for such immersion, examines their current status and challenges, and discusses potential future advancements.
Scientific Reports, Journal Year: 2024, Volume and Issue: 14(1), Published: Feb. 12, 2024
Abstract
Global interest in applying virtual reality (VR) in research and medicine has grown significantly, with potential benefits for patients suffering from balance disorders, instability, and a high risk of falling. This exploratory study assesses the impact of immersive VR (IVR) delivered through a head-mounted display (HMD) on balance and explores the feasibility of using the HMD unit as a standalone posturography tool. Using a Meta Quest 2 and a mid-range Android smartphone equipped with standard sensors, we employed a VR environment that simulated a ship at sea with thirty-eight healthy participants with no otoneurologic abnormalities. Measurements were conducted in repeated trials, including static assessments on both stable ground and foam, as well as a 3-m walk. The study was conducted in two settings: one within VR at three different intensity levels and the other in non-VR conditions. Statistical analysis and clinical evaluation revealed that IVR influences head-level sway velocity, which correlates with increased visual disturbance, suggesting the feasibility of the HMD as a low-risk posturography tool.
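Head-level sway velocity of the kind measured here can in principle be derived from the HMD's own position samples. The sketch below assumes a simple `(t, x, z)` sample layout and computes mean horizontal path velocity; it is an illustration of the general idea, not the study's pipeline.

```python
import math

def mean_sway_velocity(samples):
    """Mean horizontal sway velocity (m/s) from (t, x, z) HMD samples.

    `samples` is a list of (timestamp_seconds, x_meters, z_meters)
    tuples; vertical motion (y) is ignored since sway is conventionally
    measured in the horizontal plane. The data shape is an assumption
    for illustration.
    """
    if len(samples) < 2:
        return 0.0
    # Sum the horizontal path length between consecutive samples.
    path = 0.0
    for (t0, x0, z0), (t1, x1, z1) in zip(samples, samples[1:]):
        path += math.hypot(x1 - x0, z1 - z0)
    duration = samples[-1][0] - samples[0][0]
    return path / duration if duration > 0 else 0.0
```

Comparing this quantity across VR intensity levels and non-VR baselines is one way a standalone HMD could serve as the posturography instrument the study envisions.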
Frontiers in Human Neuroscience, Journal Year: 2024, Volume and Issue: 18, Published: Feb. 26, 2024
This study explores the synchronization of multimodal physiological data streams, in particular the integration of electroencephalography (EEG) with a virtual reality (VR) headset featuring eye-tracking capabilities. A potential use case for the synchronized data streams is demonstrated by implementing a hybrid steady-state visually evoked potential (SSVEP) based brain-computer interface (BCI) speller within a fully immersive VR environment. The hardware latency analysis reveals an average offset of 36 ms between the EEG and eye-tracking streams and a mean jitter of 5.76 ms. The study further presents a proof of concept for a BCI speller in VR, showcasing its real-world applications. The findings highlight the feasibility of combining commercial technologies for neuroscientific research and open new avenues for studying brain activity in ecologically valid environments. Future work could focus on refining synchronization methods and exploring applications in various contexts, such as learning and social interactions.
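Offset and jitter figures like the 36 ms / 5.76 ms reported above are typically computed from paired event timestamps, such as a shared hardware trigger recorded in both streams. A minimal sketch under that assumption (not the authors' code):

```python
import statistics

def offset_and_jitter(eeg_ts, vr_ts):
    """Mean offset and jitter between paired event timestamps from two
    recording streams, in the same time unit as the inputs.

    Assumes the i-th entry of each list marks the same physical event
    (e.g., a shared trigger pulse), an illustrative simplification.
    Jitter is taken here as the sample standard deviation of the
    per-event offsets.
    """
    offsets = [e - v for e, v in zip(eeg_ts, vr_ts)]
    mean_offset = statistics.fmean(offsets)
    jitter = statistics.stdev(offsets) if len(offsets) > 1 else 0.0
    return mean_offset, jitter
```

A constant mean offset can be corrected by shifting one stream; residual jitter bounds how precisely EEG epochs can be aligned to eye-tracking events.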
Multimodal Technologies and Interaction, Journal Year: 2024, Volume and Issue: 8(11), P. 98, Published: Nov. 6, 2024
This scoping review examines the broad applications, risks, and ethical challenges associated with Extended Reality (XR) technologies, including Virtual Reality (VR), Augmented Reality (AR), and Mixed Reality (MR), within the context of the Metaverse. XR is revolutionizing fields such as immersive learning in education, medical and professional training, neuropsychological assessment, therapeutic interventions, arts, entertainment, retail, e-commerce, remote work, sports, architecture, urban planning, and cultural heritage preservation. The integration of multimodal technologies such as haptics, eye-tracking, face- and body-tracking, and brain-computer interfaces enhances user engagement and interactivity, playing a key role in shaping immersive experiences. However, XR's expansion raises serious concerns, including data privacy and cybersecurity vulnerabilities, cybersickness, addiction, dissociation, harassment, bullying, and misinformation. These psychological, social, and security challenges are further complicated by intense advertising, manipulation of public opinion, and social inequality, which could disproportionately affect vulnerable individuals and groups. The review emphasizes the urgent need for robust ethical frameworks and regulatory guidelines to address these risks while promoting equitable access, privacy, autonomy, and mental well-being. As XR technologies increasingly integrate artificial intelligence, responsible governance is essential to ensure the safe and beneficial development of the Metaverse and the broader application of XR in enhancing human development.
Scientific Reports, Journal Year: 2023, Volume and Issue: 13(1), Published: May 17, 2023
Abstract
Eye-based communication languages such as Blink-To-Speak play a key role in expressing the needs and emotions of patients with motor neuron disorders. Most invented eye-based tracking systems are complex and not affordable in low-income countries. Blink-To-Live is an eye-tracking system based on a modified Blink-To-Speak language and computer vision for patients with speech impairments. A mobile phone camera tracks the patient's eyes by sending real-time video frames to computer vision modules for facial landmark detection, eye identification, and tracking. There are four defined alphabets in the language: Left, Right, Up, and Blink. These gestures encode more than 60 daily life commands, each expressed by a sequence of three eye movement states. Once the encoded sentences are generated, the translation module displays the phrases in the patient's native language on the phone screen, and a synthesized voice can be heard. The system prototype was evaluated using normal cases with different demographic characteristics. Unlike other sensor-based eye-tracking systems, Blink-To-Live is simple, flexible, and cost-efficient, with no dependency on specific software or hardware requirements. The system and its source code are available from the GitHub repository (https://github.com/ZW01f/Blink-To-Live).
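The encoding above (commands as sequences of three eye-movement states drawn from a four-letter alphabet, giving up to 4³ = 64 distinct codes, consistent with "more than 60 commands") can be sketched as a simple lookup. The specific command phrases below are invented for illustration; only the four gesture letters come from the paper.

```python
# Decode a sequence of three eye gestures into a command phrase.
# The four-letter alphabet (Left, Right, Up, Blink) matches the paper;
# the command mappings themselves are hypothetical examples.
GESTURES = {"Left", "Right", "Up", "Blink"}

COMMANDS = {
    ("Left", "Left", "Blink"): "I am hungry",
    ("Right", "Up", "Blink"): "Call the nurse",
    ("Up", "Up", "Up"): "Yes",
}

def decode(sequence):
    seq = tuple(sequence)
    if len(seq) != 3 or any(g not in GESTURES for g in seq):
        raise ValueError("expected a sequence of three valid eye gestures")
    # Unassigned sequences return None rather than raising, so the UI
    # can prompt the user to retry.
    return COMMANDS.get(seq)
```

In the real system the decoded phrase would then be passed to the translation module for display in the patient's native language and speech synthesis.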