Acknowledgement
This work was conducted as part of the ICT and Broadcasting R&D Program of the Ministry of Science and ICT (MSIT) and the Institute of Information & Communications Technology Planning & Evaluation (IITP) [No. 2022-0-00026, Development of stereoscopic imaging device technology for hyper-realistic metaverse services].
References
- Computerworld, 2024 Will Be A Big Year For AR/VR, But Mainstream Adoption Will Lag, 2024, https://www.computerworld.com/article/3712164/2024-will-be-a-big-year-for-ar-vr-but-mainstream-adoption-will-lag.html
- T. Shibata et al., "The zone of comfort: Predicting visual discomfort with stereo displays," J. Vis., vol. 11, no. 8, 2011.
- Wikipedia, Vergence-Accommodation Conflict, Sept. 2022, https://en.wikipedia.org/wiki/Vergence-accommodation_conflict
- Y. Ryu and E. Ryu, "Overview of motion-to-photon latency reduction for mitigating VR sickness," KSII TIIS, vol. 15, no. 7, 2021, pp. 2531-2546. https://doi.org/10.3837/tiis.2021.07.013
- Magic Leap 2 Overview, Distance to View Digital Content, https://www.magicleap.com/magic-leap-2
- L. Xiao et al., "DeepFocus: Learned image synthesis for computational displays," ACM Trans. Graph., vol. 37, no. 6, 2018.
- J. March et al., "Impact of correct and simulated focus cues on perceived realism," in Proc. SA, (Daegu, Rep. of Korea), Nov. 2022, pp. 1-9.
- T. Zhan et al., "Multifocal displays: Review and prospect," PhotoniX, vol. 1, no. 10, 2020.
- S. Suyama et al., "Three-dimensional display system with dual-frequency liquid-crystal varifocal lens," Jpn. J. Appl. Phys., vol. 39, 2000, pp. 480-484. https://doi.org/10.1143/JJAP.39.480
- S. Liu, D. Cheng, and H. Hua, "An optical see-through head mounted display with addressable focal planes," in Proc. IEEE Int. Symp. Mixed Augmented Reality, (Cambridge, UK), Sept. 2008, pp. 33-42.
- S. Liu and H. Hua, "Time-multiplexed dual-focal plane head-mounted display with a liquid lens," Opt. Lett., vol. 34, no. 11, 2009, pp. 1642-1644. https://doi.org/10.1364/OL.34.001642
- S. Liu et al., "A novel prototype for an optical see-through head-mounted display with addressable focus cues," IEEE Trans. Vis. Comput. Graph., vol. 16, no. 3, 2009, pp. 381-393. https://doi.org/10.1109/TVCG.2009.95
- P. Llull et al., "Design and optimization of a near-eye multifocal display system for augmented reality," Imaging Appl. Opt., 2015, article no. JTH3A.5.
- J.H.R. Chang et al., "Towards multifocal displays with dense focal stacks," ACM Trans. Graph., vol. 37, no. 6, 2018.
- K. Rathinavel et al., "An extended depth-at-field volumetric near-eye augmented reality display," IEEE Trans. Vis. Comput. Graph., vol. 24, no. 11, 2018, pp. 2857-2866. https://doi.org/10.1109/TVCG.2018.2868570
- S. Lee et al., "Tomographic near-eye displays," Nat. Commun., vol. 10, 2019.
- W. Wu et al., "Content-adaptive focus configuration for near-eye multi-focal displays," in Proc. ICME, (Seattle, WA, USA), Jul. 2016.
- S. Liu and H. Hua, "A systematic method for designing depth-fused multi-focal plane three-dimensional displays," Opt. Express, vol. 18, no. 11, 2010, pp. 11562-11573. https://doi.org/10.1364/OE.18.011562
- S. Ravikumar et al., "Creating effective focus cues in multi-plane 3D displays," Opt. Express, vol. 19, no. 21, 2011, pp. 20940-20952. https://doi.org/10.1364/OE.19.020940
- R. Narain et al., "Optimal presentation of imagery with focus cues on multi-plane displays," ACM Trans. Graph., vol. 34, no. 4, 2015.
- O. Mercier et al., "Fast gaze-contingent optimal decompositions for multifocal displays," ACM Trans. Graph., vol. 36, no. 6, 2017.
- J.M. Boyce et al., "MPEG immersive video coding standard," Proc. IEEE, vol. 109, no. 9, 2021, pp. 1521-1536. https://doi.org/10.1109/JPROC.2021.3062590
- https://www.youtube.com/watch?v=x6AOwDttBsc
- Meta, Reality Labs Research Display Systems at SIGGRAPH 2023: Butterscotch Varifocal and Flamera, https://www.meta.com/ko-kr/blog/quest/reality-labs-research-display-systems-siggraph-2023-butterscotch-varifocal-flamera/
- Y. Zhao et al., "Retinal-resolution varifocal VR," in Proc. SIGGRAPH, (Los Angeles, CA, USA), Aug. 2023, pp. 1-3.
- G. Kuo et al., "Perspective-correct VR passthrough without reprojection," in Proc. SIGGRAPH, (Los Angeles, CA, USA), Aug. 2023, pp. 1-9.
- Y. Qin et al., "Split-Lohmann multifocal displays," ACM Trans. Graph., vol. 42, no. 4, 2023.
- K. Otao et al., "Light field blender: Designing optics and rendering methods for see-through and aerial near-eye display," in Proc. SA, (Bangkok, Thailand), Nov. 2017, pp. 1-4.
- K. Bang et al., "Lenslet VR: Thin, flat and wide-FOV virtual reality display using Fresnel lens and lenslet array," IEEE Trans. Vis. Comput. Graph., vol. 27, no. 5, 2021, pp. 2545-2554. https://doi.org/10.1109/TVCG.2021.3067758
- M.C. Wapler and U. Wallrabe, "Ultra-fast and compact varifocal lens," in Proc. MEMS, (Seoul, Rep. of Korea), Jan. 2019.
- M.C. Wapler, "Ultra-fast, high-quality and highly compact varifocal lens with spherical aberration correction and low power consumption," Opt. Express, vol. 28, no. 4, 2020, pp. 4973-4987. https://doi.org/10.1364/OE.382472
- M.C. Wapler et al., "Aspherical high-speed varifocal mirror for miniature catadioptric objectives," Opt. Express, vol. 26, no. 5, 2018, pp. 6090-6102. https://doi.org/10.1364/OE.26.006090
- D. Iwai et al., "Speeded-up focus control of electrically tunable lens by sparse optimization," Sci. Rep., vol. 9, no. 1, 2019.
- A.G. Lopez-de-Haro et al., "Closed-loop experimental optimization of tunable lenses," Appl. Opt., vol. 61, no. 27, 2022, pp. 8091-8099. https://doi.org/10.1364/AO.467848