AR-DAVID: Augmented Reality Display Artifact Video Dataset
Alexandre Chapiro, Dongyeon Kim, Yuta Asano, et al.

ACM Transactions on Graphics, 2024, 43(6), pp. 1-11

Published: Nov. 19, 2024

The perception of visual content in optical-see-through augmented reality (AR) devices is affected by the light coming from the environment. This additional light interacts with the displayed content in a non-trivial manner because of the illusion of transparency, different focal depths, and motion parallax. To investigate the impact of the environment on display artifact visibility (such as blur or color fringes), we created the first subjective quality dataset targeted toward AR displays. Our study consisted of 6 scenes, each shown with one of 6 distortions at two strength levels, seen against 3 background patterns shown at 2 luminance levels: 432 conditions in total. The collected data shows that the background has a much smaller masking effect than expected. Further, we show that this effect cannot be explained by compositing AR content with the background using optical blending models. As a consequence, we demonstrate that existing video metrics perform worse than expected when predicting the perceived magnitude of degradation in AR displays, motivating further research.
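For context, the 432 conditions follow directly from the factorial design described above, and "optical blending" refers to the fact that an optical-see-through display adds its emitted light to the light transmitted from the environment. The minimal Python sketch below (not the authors' code) checks the condition-count arithmetic and shows a simple additive composite; the function name, the transmittance value, and the toy images are illustrative assumptions.

import numpy as np

# Condition count implied by the study design:
# 6 scenes x 6 distortion types x 2 strength levels x 3 backgrounds x 2 luminance levels
n_conditions = 6 * 6 * 2 * 3 * 2
assert n_conditions == 432

def additive_blend(display_linear, background_linear, transmittance=0.8):
    """Composite AR content over the environment with a simple additive model.

    display_linear, background_linear: linear-light RGB images (float arrays).
    transmittance: assumed fraction of background light passing the optical combiner.
    """
    return display_linear + transmittance * background_linear

# Toy usage with random linear-light "images"
rng = np.random.default_rng(0)
display = rng.uniform(0.0, 1.0, size=(4, 4, 3))
background = rng.uniform(0.0, 1.0, size=(4, 4, 3))
composite = additive_blend(display, background)
print(composite.shape)  # (4, 4, 3)

As the abstract notes, evaluating quality on such blended composites does not explain the weak masking observed in the study, which is why existing video metrics underperform on AR displays.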

Language: English

16-1: A Model for the Appearance of Interocular Colorimetric Differences in Binocular XR Displays
Minqi Wang, Emily A. Cooper, Lorenza Moro, et al.

SID Symposium Digest of Technical Papers, 2024, 55(1), pp. 177-181

Published: June 1, 2024

Many extended reality (XR) devices present different views to the left and right eyes. Unwanted colorimetric differences between these views can cause perceptual artifacts that degrade binocular image quality. We present an image-computable model designed to predict the appearance of such interocular colorimetric differences in XR displays. The model is fitted to data from a recent study in which people provided multidimensional responses about stimuli simulating an optical see-through augmented reality device with interocular intensity differences. This work can be used to create preliminary assessments of artifact appearance and to inform display design.
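As a concrete illustration of what an interocular intensity difference is (this is not the paper's fitted appearance model), the short Python sketch below computes one simple measure: the mean absolute log-luminance difference between the left- and right-eye images. The function names, the choice of Rec. 709 luminance weights, and the 80% toy example are assumptions for illustration.

import numpy as np

def luminance(rgb_linear):
    """Rec. 709 luminance from linear RGB."""
    return rgb_linear @ np.array([0.2126, 0.7152, 0.0722])

def interocular_intensity_difference(left_rgb, right_rgb):
    """Mean absolute log10-luminance difference between the two eyes' images."""
    eps = 1e-6
    log_ratio = np.log10(luminance(left_rgb) + eps) - np.log10(luminance(right_rgb) + eps)
    return float(np.mean(np.abs(log_ratio)))

# Toy usage: right eye shown at 80% of the left eye's intensity
rng = np.random.default_rng(1)
left = rng.uniform(0.0, 1.0, size=(8, 8, 3))
right = 0.8 * left
print(interocular_intensity_difference(left, right))  # ~log10(1/0.8), about 0.097

Predicting how visible or objectionable such a difference actually appears to an observer is what the fitted model described in the paper addresses.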

Language: English

Cited by: 0

AR-in-VR simulator: A toolbox for rapid augmented reality simulation and user research
Jacob Hadnett-Hunter, Benjamin Lundell, Ian Ellison-Taylor, et al.

ACM Symposium on Applied Perception, 2024, vol. unknown, pp. 1-11

Published: Aug. 22, 2024

Language: English

Cited by: 0

Evaluating the effects of colour blending on optical-see-through displays for ubiquitous visualizations
Charles-Olivier Dufresne-Camaro, Yumiko Sakamoto, Pourang Irani, et al.

Graphics Interface, 2024, vol. unknown, pp. 1-13

Published: June 3, 2024

Language: English

Cited by: 0
