Proceedings of the ACM on Human-Computer Interaction, 2024, 8(ETRA), pp. 1-18.
Published: May 20, 2024
Question answering has recently been proposed as a promising means to assess the recallability of information visualisations. However, prior works are yet to study the link between visually encoding a visualisation in memory and recall performance. To fill this gap, we propose VisRecall++ -- a novel 40-participant dataset that contains gaze data on 200 visualisations and 1,000 questions, including identifying the title and retrieving values. We measured recallability by asking participants questions after they observed a visualisation for 10 seconds. Our analyses reveal several insights, such as that saccade amplitude, number of fixations, and fixation duration significantly differ between high- and low-recallability groups. Finally, we propose GazeRecallNet -- a computational method to predict recallability from gaze behaviour that outperforms the state-of-the-art model RecallNet and three other baselines on this task. Taken together, our results shed light on assessing recallability from gaze behaviour and inform future work on recallability-based visualisation optimisation.
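The gaze measures named in the abstract (saccade amplitude, number of fixations, fixation duration) can be computed from a parsed fixation sequence. A minimal sketch, assuming fixations arrive as (x, y, duration) tuples in degrees of visual angle and seconds; these are generic feature definitions, not the paper's exact pipeline:

```python
import math

def gaze_features(fixations):
    """Summarise a scanpath given as (x_deg, y_deg, duration_s) fixations.

    Returns the number of fixations, the mean fixation duration, and the
    mean saccade amplitude, where each saccade amplitude is approximated
    as the Euclidean distance between consecutive fixation centroids.
    """
    n = len(fixations)
    mean_dur = sum(f[2] for f in fixations) / n
    amplitudes = [math.hypot(b[0] - a[0], b[1] - a[1])
                  for a, b in zip(fixations, fixations[1:])]
    mean_amp = sum(amplitudes) / len(amplitudes) if amplitudes else 0.0
    return n, mean_dur, mean_amp

# Example scanpath: three fixations recorded during a 10-second viewing.
features = gaze_features([(0.0, 0.0, 0.3), (3.0, 4.0, 0.5), (3.0, 0.0, 0.3)])
```

Features like these would then feed a group comparison (high vs. low recallability) or a predictive model.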
Behavior Research Methods, 2022, 55(1), pp. 364-416.
Published: April 6, 2022
Abstract
In this paper, we present a review of how the various aspects of any study using an eye tracker (such as the instrument, methodology, environment, participant, etc.) affect the quality of the recorded eye-tracking data and the obtained eye-movement and gaze measures. We take this review to represent the empirical foundation for reporting guidelines for studies involving an eye tracker. We compare five existing guidelines and a database of 207 published studies. We find that existing guidelines vary substantially and do not match actual reporting practices. We end by deriving a minimal, flexible reporting guideline based on this research (Section “An empirically based minimal reporting guideline”).
Virtual Reality, 2023, 27(2), pp. 1481-1505.
Published: Jan. 18, 2023
Eye tracking is becoming increasingly available in head-mounted virtual reality displays, with various headsets with integrated eye trackers already commercially available. The applications of eye tracking in virtual reality are highly diversified and span multiple disciplines. As a result, the number of peer-reviewed publications that study eye tracking in virtual reality has surged in recent years. We performed a broad review, comprehensively searching academic literature databases, with the aim of assessing the extent of published research dealing with eye tracking in virtual reality, and of highlighting challenges, limitations, and areas for future research.
Behavior Research Methods, 2025, 57(1).
Published: Jan. 6, 2025
Abstract
Researchers using eye tracking are heavily dependent on software and hardware tools to perform their studies, from recording eye-tracking data and visualizing it, to processing and analyzing it. This article provides an overview of the tools available for research using eye trackers and discusses the considerations to make when choosing which tools to adopt for one’s study.
IEEE Transactions on Visualization and Computer Graphics, 2021, 27(5), pp. 2577-2586.
Published: March 29, 2021
The cameras in modern gaze-tracking systems suffer from fundamental bandwidth and power limitations, constraining data acquisition speed to 300 Hz realistically. This obstructs the use of mobile eye trackers to perform, e.g., low-latency predictive rendering, or to study quick and subtle eye motions like microsaccades using head-mounted devices in the wild. Here, we propose a hybrid frame-event-based near-eye gaze tracking system offering update rates beyond 10,000 Hz with an accuracy that matches that of high-end desktop-mounted commercial trackers when evaluated in the same conditions. Our system, previewed in Figure 1, builds on emerging event cameras that simultaneously acquire regularly sampled frames and adaptively sampled events. We develop an online 2D pupil fitting method that updates a parametric model every one or a few events. Moreover, we propose a polynomial regressor for estimating the point of gaze in real time. Using the first event-based gaze dataset, we demonstrate that our system achieves accuracies of 0.45°-1.75° for fields of view from 45° to 98°. With this technology, we hope to enable a new generation of ultra-low-latency gaze-contingent rendering and display techniques for virtual and augmented reality.
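The polynomial regressor mentioned in the abstract maps pupil-model parameters to a point of gaze. The paper's exact model is not reproduced here, so this sketch assumes only a 2D pupil-centre estimate and a hypothetical quadratic basis, fit by least squares on calibration pairs:

```python
import numpy as np

def poly_features(centres):
    """Quadratic polynomial features of 2D pupil-centre estimates."""
    x, y = centres[..., 0], centres[..., 1]
    return np.stack([np.ones_like(x), x, y, x * y, x**2, y**2], axis=-1)

def fit_gaze_regressor(pupil_centres, gaze_points):
    """Least-squares fit of a polynomial map from pupil centre to gaze.

    pupil_centres: (N, 2) calibration pupil centres;
    gaze_points:   (N, 2) known on-screen gaze targets.
    Returns a (6, 2) coefficient matrix.
    """
    A = poly_features(pupil_centres)
    coeffs, *_ = np.linalg.lstsq(A, gaze_points, rcond=None)
    return coeffs

def predict_gaze(coeffs, pupil_centres):
    """Apply the fitted regressor to new pupil-centre estimates."""
    return poly_features(pupil_centres) @ coeffs
```

Once fitted at calibration time, prediction is a single small matrix product, which is what makes per-event real-time updates plausible.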
This manuscript presents GazeBase, a large-scale longitudinal dataset containing 12,334 monocular eye-movement recordings captured from 322 college-aged participants. Participants completed a battery of seven tasks in two contiguous sessions during each round of recording, including - (1) a fixation task, (2) a horizontal saccade task, (3) a random oblique saccade task, (4) a reading task, (5/6) free viewing of cinematic video, and (7) a gaze-driven gaming task. Nine rounds of recording were conducted over a 37-month period, with participants in each subsequent round recruited exclusively from prior rounds. All data was collected using an EyeLink 1000 eye tracker at a 1,000 Hz sampling rate, with a calibration and validation protocol performed before each task to ensure data quality. Due to its large number of participants and longitudinal nature, GazeBase is well suited for exploring research hypotheses in eye movement biometrics, along with other applications applying machine learning to eye movement signal analysis. Classification labels produced by the instrument's real-time parser are provided for a subset of the data, along with pupil area.
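The real-time parser referenced above is the EyeLink's proprietary event classifier; as a generic illustration of how samples get labelled, here is a simple velocity-threshold (I-VT) rule with an assumed 30 deg/s threshold, which is not the EyeLink algorithm:

```python
import math

def classify_ivt(x_deg, y_deg, fs=1000.0, threshold=30.0):
    """Label each gaze sample 'fix' or 'sac' with a basic I-VT rule.

    x_deg, y_deg: gaze position traces in degrees of visual angle;
    fs: sampling rate in Hz (1,000 Hz matches the EyeLink 1000);
    threshold: velocity threshold in deg/s (illustrative value).
    """
    labels = []
    for i in range(len(x_deg)):
        if i == 0:
            labels.append('fix')  # no velocity estimate for first sample
            continue
        # Sample-to-sample angular velocity in deg/s.
        v = math.hypot(x_deg[i] - x_deg[i - 1],
                       y_deg[i] - y_deg[i - 1]) * fs
        labels.append('sac' if v > threshold else 'fix')
    return labels
```

Real parsers additionally smooth the velocity signal and merge or discard very short events; this sketch only shows the thresholding idea.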
Journal of Eye Movement Research, 2022, 15(3).
Published: Sep. 7, 2022
A growing number of virtual reality devices now include eye tracking technology, which can facilitate oculomotor and cognitive research in VR and enable use cases like foveated rendering. These applications require different levels of tracking performance, often measured as spatial accuracy and precision. While manufacturers report data quality estimates for their devices, these typically represent ideal performance and may not reflect real-world data quality. Additionally, it is unclear how accuracy and precision change across sessions within the same participant or between participants, or how they are influenced by vision correction. Here, we evaluated the Vive Pro Eye built-in eye tracker across a range of 30 visual degrees horizontally and vertically. Participants completed ten measurement sessions over multiple days, allowing us to evaluate calibration reliability. Accuracy and precision were highest for central gaze and decreased with greater eccentricity along both axes. Calibration was successful in all participants, including those wearing contacts or glasses, but glasses yielded significantly lower performance. We further found differences in accuracy (but not precision) between two headsets, and an effect of participants' estimated inter-pupillary distance. Our metrics suggest high calibration reliability and can serve as a baseline for the performance expected in VR experiments.
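Spatial accuracy and precision, as used throughout this abstract, are typically computed per validation target from raw gaze samples. A sketch under common definitions (mean angular offset for accuracy, RMS of sample-to-sample distances for precision; an SD-based precision is a frequent alternative, and the paper's exact formulas may differ):

```python
import math

def accuracy_precision(gaze, target):
    """Data quality for one validation target.

    gaze:   list of (x_deg, y_deg) samples during target fixation;
    target: (x_deg, y_deg) position of the target.
    Returns (accuracy, precision) in degrees of visual angle.
    """
    # Accuracy: mean angular offset between gaze samples and the target.
    offsets = [math.hypot(x - target[0], y - target[1]) for x, y in gaze]
    accuracy = sum(offsets) / len(offsets)
    # Precision: RMS of sample-to-sample angular distances.
    s2s_sq = [math.hypot(b[0] - a[0], b[1] - a[1]) ** 2
              for a, b in zip(gaze, gaze[1:])]
    precision = math.sqrt(sum(s2s_sq) / len(s2s_sq)) if s2s_sq else 0.0
    return accuracy, precision
```

Reporting both per target and per eccentricity, as the study does, makes the central-vs-peripheral degradation visible.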
Behavior Research Methods, 2023, issue unknown.
Published: Aug. 7, 2023
Abstract
Over the past few decades, there have been significant developments in eye-tracking technology, particularly in the domain of mobile, head-mounted devices. Nevertheless, questions remain regarding the accuracy of these eye-trackers during static and dynamic tasks. In light of this, we evaluated the performance of two widely used devices: the Tobii Pro Glasses 2 and 3. A total of 36 participants engaged in tasks under three dynamicity conditions. In the “seated with a chinrest” trial, only the eyes could be moved; in the “seated without a chinrest” trial, both the head and the eyes were free to move; and in the walking trial, participants walked along a straight path. During the seated trials, participants’ gaze was directed towards dots on a wall by means of audio instructions, whereas in the walking trial participants maintained their gaze on a bullseye while walking towards it. Eye-tracker accuracy was determined using computer vision techniques to identify the target within the scene camera image. The findings showed that the Glasses 3 outperformed the Glasses 2 in terms of accuracy during the walking trials. Moreover, the results suggest that employing a chinrest in this case is counterproductive, as it necessitates larger eye eccentricities for target fixation, thereby compromising accuracy compared to not using a chinrest, which allows head movement. Lastly, we found that participants who reported a higher workload demonstrated poorer accuracy. The current findings may be useful for the design of experiments that involve head-mounted eye-trackers.
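Determining accuracy from the scene camera image, as described above, ultimately reduces to converting the pixel offset between the detected target and the reported gaze point into visual angle. A sketch assuming an idealised pinhole scene camera with a known horizontal field of view; the study's actual computer-vision pipeline (target detection, distortion handling) is not reproduced here:

```python
import math

def pixel_offset_to_deg(gaze_px, target_px, width_px, hfov_deg):
    """Approximate angular gaze error from a scene-camera pixel offset.

    gaze_px, target_px: (x, y) pixel coordinates in the scene image;
    width_px: image width in pixels;
    hfov_deg: horizontal field of view of the scene camera in degrees.
    Assumes a pinhole camera: focal length in pixels follows from the
    field of view, and the offset is converted via the arctangent.
    """
    f_px = (width_px / 2) / math.tan(math.radians(hfov_deg) / 2)
    dx = gaze_px[0] - target_px[0]
    dy = gaze_px[1] - target_px[1]
    return math.degrees(math.atan(math.hypot(dx, dy) / f_px))
```

This small-angle conversion is most accurate near the image centre; real scene cameras additionally require lens-distortion correction.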
Behavior Research Methods, 2023, 56(5), pp. 5002-5022.
Published: Oct. 11, 2023
Abstract
This paper aims to compare a new webcam-based eye-tracking system, integrated into the Labvanced platform for online experiments, to a “gold standard” lab-based eye tracker (EyeLink 1000 - SR Research). Specifically, we simultaneously recorded data with both trackers in five different tasks, analyzing their real-time performance. These tasks were a subset of a standardized test battery for eye trackers, including a Large Grid task, Smooth Pursuit eye movements, free viewing of natural images, and two Head Movements tasks (roll, yaw). The results show that the webcam-based system achieved an overall accuracy of 1.4° and a precision of 1.1° (standard deviation (SD) across subjects), an error about 0.5° larger than that of the EyeLink system. Interestingly, accuracy (1.3°) and precision (0.9°) were slightly better for centrally presented targets, the region of interest in many psychophysical experiments. Remarkably, the correlation of raw gaze samples between the two systems was at 90% for the Large Grid task and 80% for Free View and Smooth Pursuit. Overall, these results put the performance of the webcam-based system roughly on par with that of mobile eye-tracking devices (Ehinger et al., PeerJ, 7, e7086, 2019; Tonsen et al., 2020) and demonstrate a substantial improvement compared to existing webcam solutions (Papoutsaki et al., 2017).
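The sample-level correlation reported above can be computed per axis once both trackers' traces are aligned onto common timestamps. A minimal Pearson-correlation sketch; the resampling and synchronisation step is assumed already done:

```python
import math

def pearson_r(a, b):
    """Pearson correlation between two equally long gaze traces.

    a, b: lists of gaze coordinates along one axis (e.g., x), one per
    tracker, resampled onto the same timestamps.
    """
    n = len(a)
    mean_a, mean_b = sum(a) / n, sum(b) / n
    cov = sum((x - mean_a) * (y - mean_b) for x, y in zip(a, b))
    norm_a = math.sqrt(sum((x - mean_a) ** 2 for x in a))
    norm_b = math.sqrt(sum((y - mean_b) ** 2 for y in b))
    return cov / (norm_a * norm_b)
```

A correlation near 1 means the webcam system tracks the same gaze trajectory as the EyeLink, even if a constant accuracy offset remains.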