000165985 001__ 165985 000165985 005__ 20260116163026.0 000165985 0247_ $$2doi$$a10.1145/3763361 000165985 0248_ $$2sideral$$a147323 000165985 037__ $$aART-2025-147323 000165985 041__ $$aeng 000165985 100__ $$aChu, Victor 000165985 245__ $$aArtifact-resilient real-time holography 000165985 260__ $$c2025 000165985 5060_ $$aAccess copy available to the general public$$fUnrestricted 000165985 5203_ $$aHolographic near-eye displays promise unparalleled depth cues, high-resolution imagery, and realistic three-dimensional parallax in a compact form factor, making them strong candidates for emerging augmented and virtual reality systems. However, existing holographic display methods often assume ideal viewing conditions and overlook real-world factors such as eye floaters and eyelashes—obstructions that can severely degrade perceived image quality. In this work, we propose a new metric that quantifies hologram resilience to artifacts and apply it to computer-generated holography (CGH) optimization. We call this Artifact-Resilient Holography (ARH). We begin by introducing a simulation method that models the effects of pre- and post-pupil obstructions on holographic displays. Our analysis reveals that eyebox regions dominated by low frequencies—produced especially by the smooth-phase holograms broadly adopted in recent holography work—are vulnerable to visual degradation from dynamic obstructions such as floaters and eyelashes. In contrast, random-phase holograms spread energy more uniformly across the eyebox spectrum, enabling them to diffract around obstructions without producing prominent artifacts. By characterizing a random-phase eyebox using the Rayleigh distribution, we derive a differentiable metric in the eyebox domain.
We then apply this metric to train a real-time, neural-network-based phase generator, enabling it to produce artifact-resilient 3D holograms that preserve visual fidelity across a range of practical viewing conditions—enhancing both robustness and user interactivity. 000165985 536__ $$9info:eu-repo/grantAgreement/ES/MCIU/FPU22/02432 000165985 540__ $$9info:eu-repo/semantics/openAccess$$aby$$uhttps://creativecommons.org/licenses/by/4.0/deed.es 000165985 655_4 $$ainfo:eu-repo/semantics/article$$vinfo:eu-repo/semantics/publishedVersion 000165985 700__ $$aPueyo-Ciutad, Oscar$$uUniversidad de Zaragoza 000165985 700__ $$aTseng, Ethan 000165985 700__ $$aSchiffers, Florian 000165985 700__ $$aKuo, Grace 000165985 700__ $$aMatsuda, Nathan 000165985 700__ $$0(orcid)0000-0002-0601-4820$$aRedo-Sanchez, Alberto$$uUniversidad de Zaragoza 000165985 700__ $$aLanman, Douglas 000165985 700__ $$aCossairt, Oliver 000165985 700__ $$aHeide, Felix 000165985 7102_ $$15007$$2570$$aUniversidad de Zaragoza$$bDpto. Informát.Ingenie.Sistms.$$cÁrea Lenguajes y Sistemas Inf. 000165985 773__ $$g44, 6 (2025), 219 [14 pp.]$$pACM trans. graph.$$tACM TRANSACTIONS ON GRAPHICS$$x0730-0301 000165985 8564_ $$s1340980$$uhttps://zaguan.unizar.es/record/165985/files/texto_completo.pdf$$yVersión publicada 000165985 8564_ $$s717875$$uhttps://zaguan.unizar.es/record/165985/files/texto_completo.jpg?subformat=icon$$xicon$$yVersión publicada 000165985 909CO $$ooai:zaguan.unizar.es:165985$$particulos$$pdriver 000165985 951__ $$a2026-01-16-14:54:15 000165985 980__ $$aARTICLE