Advanced 360-Degree Integral-Floating Display Using a Hidden Point Removal Operator and a Hexagonal Lens Array

  • Erdenebat, Munkh-Uchral (School of Information and Communication Engineering, Chungbuk National University) ;
  • Kwon, Ki-Chul (School of Information and Communication Engineering, Chungbuk National University) ;
  • Dashdavaa, Erkhembaatar (School of Information and Communication Engineering, Chungbuk National University) ;
  • Piao, Yan-Ling (School of Information and Communication Engineering, Chungbuk National University) ;
  • Yoo, Kwan-Hee (Department of Digital Informatics and Convergence, Chungbuk National University) ;
  • Baasantseren, Ganbat (Department of Electronics, National University of Mongolia) ;
  • Kim, Youngmin (Korea Electronics Technology Institute) ;
  • Kim, Nam (School of Information and Communication Engineering, Chungbuk National University)
  • Received : 2014.08.05
  • Accepted : 2014.10.27
  • Published : 2014.12.25

Abstract

An enhanced 360-degree integral-floating three-dimensional display system using a hexagonal lens array and a hidden point removal operator is proposed. Only the visible points of the chosen three-dimensional point cloud model are detected by the hidden point removal operator for each rotating step of the anamorphic optics system, and elemental image arrays are generated for the detected visible points from the corresponding viewpoint. Each elemental image of the elemental image array is generated by a hexagonal grid, due to being captured through a hexagonal lens array. The hidden point removal operator eliminates the overlap problem of points in front and behind, and the hexagonal lens array captures the elemental image arrays with more accurate approximation, so in the end the quality of the displayed image is improved. In an experiment, an anamorphic-optics-system-based 360-degree integral-floating display with improved image quality is demonstrated.

I. INTRODUCTION

Integral imaging is a well-known three-dimensional (3D) display system that generates a two-dimensional (2D) elemental image array (EIA) from a real or virtual 3D object, and displays a full-parallax, continuously viewable 3D image based on the generated EIA. A few drawbacks are present in the displayed 3D image, such as limited viewing angle and resolution, owing to the use of a lens array. Also, an improvement in any viewing characteristic affects the others [1-4]. A lot of research has been conducted on solving the problem of narrow viewing angle; however, the obtained results are still limited [5-11].

Horizontal parallax-only volumetric light field displays present a 3D image over a 360-degree viewing zone by using high-speed projection and a rotating screen [12, 13]. A 360-degree integral-floating display (IFD), which combines integral imaging with a light field display, was proposed to extend the limited viewing angle of an integral imaging display to 360 degrees in the horizontal direction [14]. However, its vertical viewing angle is quite narrow, owing to the function of the double floating lenses, and it presents a low-quality image. These drawbacks persist in other 360-degree IFDs as well [15, 16].

Thereafter, a method to enhance the vertical viewing angle using an anamorphic optics system (AOS) was proposed [17]. Here the AOS, a vertically curved convex mirror, disperses the reflected light rays more widely in the vertical direction, widening the vertical viewing angle enough for comfortable viewing, while its rotation creates a 360-degree viewing zone in the horizontal direction. The theory was verified by optical experiment and yielded an efficient enhancement of the vertical viewing angle. By controlling the radius of the AOS, the vertical viewing angle can be adjusted as desired; however, the low image quality remained unaddressed.

In this paper, we suggest a method to enhance the image quality of a 360-degree IFD using a hidden point removal (HPR) operator and a hexagonal lens array (HLA). With the proposed method, the EIAs are generated using the HPR operator and the HLA together, and the resulting improvement in the quality of the displayed image is appreciable and verified experimentally.

 

II. AOS-BASED 360-DEGREE IFD USING HPR AND HLA

The main process of the 360-degree IFD system is shown in Fig. 1. A collimating lens relays the 2D EIAs, projected via a high-speed digital micromirror device (DMD), to a lens array, which directly reconstructs them as 3D perspectives. The initially reconstructed 3D perspectives are conveyed to the rotating convex mirror through double floating lenses configured in a 4-f structure. The rotating convex mirror (i.e. the AOS) displays the entire 3D image by tailoring the initial 3D perspectives to each other and reflecting them in the corresponding viewing directions, while the AOS rotation is synchronized with the DMD projection. Here the double floating lenses reduce the viewing angle of the initial 3D perspectives to match the AOS angular step [18], allowing nearly seamless tailoring of the 3D perspectives in the horizontal direction.

FIG. 1.General configuration of the proposed AOS-based 360-degree IFD.

The EIA generation process is the most essential part of the proposed display. Since the problem of narrow vertical viewing angle has been solved by using an AOS, we focus on improving the quality of the displayed image. The resolution of the EIAs is fixed at the DMD resolution, so image quality must be raised without altering the EIA resolution. In our analysis, the duplication of object points that lie in front of and behind one another, as seen from each viewpoint, is a major cause of the low quality of the displayed image, so this point-overlap problem in the 360-degree IFD must be eliminated. A recently reported HPR operator that determines the points visible from a given viewpoint [19] can solve the overlapping-points problem and is therefore incorporated into the EIA generation process of the proposed system. First, the HPR operator extracts only the visible points of the chosen point cloud object, eliminating the duplication of front and rear points, for each angular step of the AOS. In other words, when the AOS is in its initial position (not yet rotated), the visible points are detected by the HPR operator and passed to the next stage of the EIA generation process. After the first EIA has been generated, the AOS rotates by a given angular step, and the viewpoint is shifted horizontally by the same angular step. Another set of visible points is then extracted for the shifted viewpoint, and a second EIA is generated based only on the newly detected visible points. Every EIA is generated in this way. Figure 2(a) shows the procedure by which the HPR operator detects the visible points of a point cloud object in the proposed 360-degree IFD system, and Fig. 2(b) shows an example of the visible points extracted from a given viewpoint, compared against the case without the HPR operator, where the viewpoint is fixed toward the center of the object, located at (x = −20, y = 0, z > 0). It can be seen that the HPR operator yields an exact representation of the point cloud model, from which it is easy to verify which side is being observed.

FIG. 2.The principle of the HPR operator: (a) The extraction process for the visible points of the point cloud model for a given viewpoint by HPR operator, and (b) an example of the point cloud model observed through an HPR operator, compared to the case without an HPR operator, from the same viewpoint.
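The visibility test described above follows the HPR operator of Katz et al. [19]: the point cloud is translated so the viewpoint sits at the origin, each point is "spherically flipped" across a large sphere, and the points whose flipped images land on the convex hull of the flipped set (plus the viewpoint) are taken as visible. A minimal sketch in Python, assuming NumPy/SciPy; the function name and the radius factor are illustrative choices, not values from the paper:

```python
import numpy as np
from scipy.spatial import ConvexHull

def hpr_visible_points(points, viewpoint, radius_factor=100.0):
    """Return indices of the points visible from `viewpoint`
    (hidden point removal after Katz, Tal, and Basri [19])."""
    p = points - viewpoint                      # move the viewpoint to the origin
    norms = np.linalg.norm(p, axis=1, keepdims=True)
    radius = radius_factor * norms.max()        # sphere radius R must exceed every |p|
    # Spherical flipping: reflect each point across the sphere of radius R,
    # so along each ray from the viewpoint the nearest point becomes the farthest.
    flipped = p + 2.0 * (radius - norms) * (p / norms)
    # A point is visible iff its flipped image lies on the convex hull of the
    # flipped set together with the viewpoint (origin).
    hull = ConvexHull(np.vstack([flipped, np.zeros((1, 3))]))
    n = len(points)
    return np.array(sorted(i for i in hull.vertices if i < n))
```

In the proposed pipeline this test would be rerun once per angular step of the AOS, with the viewpoint rotated horizontally by the accumulated step angle before each EIA is generated.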

Using these visible points, the reflection from the AOS is calculated from the coordinates of the visible object points for each viewing direction, considering the AOS angular step, and the coordinates of the corresponding points of the initial 3D perspectives are computed through the double floating lenses. When the extracted visible points are relayed to the central depth plane of the lens array system, the elemental images are generated on hexagonal grids for the chosen HLA specifications, via the conventional integral-imaging pickup process. Several researchers have recently reported that an HLA improves the image quality of an integral imaging display, because hexagonal grids have a better fill factor and provide a more accurate approximation than rectangular grids [20-22]; this has been verified by experimental results showing that an HLA successfully improves the quality of a displayed 3D image. Also, the number of point light sources has a fundamental influence on the quality of the displayed image: a greater number of point light sources enhances image quality [23-25], and a hexagonal lattice packs in more lenses than a rectangular lens array of the same aperture, since the distance between point light sources is shorter in the HLA case. Because these issues have already been verified experimentally in prior research, we adopted an HLA for its benefits and applied it in the proposed 360-degree IFD system. In particular, when the 3D image is reconstructed through an HLA, empty spaces between the light-ray bundles sampled by each elemental lens occur much less often, because hexagonal grids tessellate the plane almost without gaps. With a rectangular lens array, these empty spaces are severe and many flipped rays may be captured.

The sampling step for the HLA is much more complex than for a rectangular lens array, as shown in Fig. 3, in which PL is the pitch of an elemental lens and h is its height. In the EIA generation of the proposed 360-degree IFD system, the HLA is located in the z = 0 plane, and the relayed object points are located around the central depth plane, i.e. z = zCDP.

FIG. 3.The sampling step using an HLA, where PL is the pitch of one side and h is the height of an elemental lens.
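The hexagonal sampling geometry of Fig. 3 can be sketched as follows. Assuming regular flat-topped hexagonal lenses with side length PL, the lens height is h = √3·PL (consistent with the regular-hexagon relation between a 0.9906 mm vertex-to-vertex diameter and PL = 0.4953 mm), and the lens centers form a lattice whose nearest-center (Voronoi) cells are exactly the hexagonal apertures, so a relayed object point is sampled by its nearest lens center. The function names and the symmetric indexing are illustrative, not from the paper:

```python
import numpy as np

def hex_lens_centers(pitch, n_cols, n_rows):
    """Centers of a flat-top hexagonal lens lattice, indexed symmetrically
    around the optical axis. `pitch` is the hexagon side length P_L; the
    flat-to-flat lens height is h = sqrt(3) * pitch."""
    h = np.sqrt(3.0) * pitch
    centers = []
    for i in range(-(n_cols // 2), n_cols // 2 + 1):
        x = 1.5 * pitch * i                          # horizontal column spacing
        for j in range(-(n_rows // 2), n_rows // 2 + 1):
            y = h * j + (h / 2.0 if i % 2 else 0.0)  # odd columns shifted by h/2
            centers.append((x, y))
    return np.array(centers)

def lens_index(point_xy, centers):
    """Assign a relayed object point (projected onto the z = 0 lens plane) to
    the elemental lens whose center is nearest; for this lattice the nearest-
    center cell coincides with the hexagonal aperture."""
    d = np.linalg.norm(centers - np.asarray(point_xy), axis=1)
    return int(np.argmin(d))
```

This nearest-center assignment is the hexagonal analogue of the rectangular-grid sampling used in conventional integral-imaging pickup; the denser packing is what shortens the distance between point light sources.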

 

III. EXPERIMENTAL RESULTS

Figure 4(a) shows the experimental configuration. The schematic configuration of the proposed system, illustrated in Fig. 1, is laid out on an optical table. The DMD projector (12-degree tilt, 1024×768 micromirrors), a collimating lens (fCL = 70 mm), double floating lenses (Fresnel lenses with f1 = 110 mm and f2 = 318 mm, respectively), an AOS (size 100×150 mm², fCM = 87.65 mm), and an SM3420 smart motor are utilized in the experiment. The specifications of the HLA are similar to those of the previously used rectangular lens array, where fLA = 3 mm and the diameter is 0.9906 mm (PL = 0.4953 mm). A comparison of the elemental lens sizes of the HLA and the previously used rectangular lens array is shown in Fig. 4(b).

FIG. 4.(a) An experimental configuration on the optical table, and (b) enlarged elemental lenses of the HLA, compared to a rectangular lens array.

The 3D point cloud model “Lying deer,” with 16,000 points, was chosen as the object for the experiment. The proposed system displays and tailors 200 different initial 3D perspectives per revolution of the AOS, which rotates with an angular step of 1.8 degrees. Examples of the generated EIAs for the corresponding viewpoints are shown in Fig. 5. The representations of the point cloud object in Fig. 5(a) contain only the visible object points, determined using an HPR operator from multiple viewpoints, while the EIAs in Fig. 5(b) are generated from the visible points for the corresponding viewing directions. Due to the transformation properties of the AOS, each generated EIA is expanded by a factor of approximately 1.3 in the vertical direction, compared to the original vertical size of the object.

FIG. 5.For the extracted visible points of a point cloud model, the EIAs are generated through HLA: (a) The point sets extracted via an HPR operator, from several viewpoints, and (b) the EIAs generated for the corresponding point sets.
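The capture schedule above can be expressed compactly: one AOS revolution is divided into 200 steps of 1.8 degrees, and for each step the visibility test and EIA pickup use a viewpoint rotated about the vertical axis by the accumulated angle. A small sketch (the y-up axis convention and the viewing radius are assumptions for illustration, not values from the paper):

```python
import numpy as np

def viewpoint_for_step(k, radius, step_deg=1.8):
    """Synthesis viewpoint for the k-th of the 200 angular steps: a camera at
    distance `radius` from the object, rotated about the vertical (y) axis by
    k * step_deg degrees."""
    theta = np.deg2rad(k * step_deg)
    return np.array([radius * np.sin(theta), 0.0, radius * np.cos(theta)])
```

Running the visible-point extraction and pickup once per step k = 0, …, 199 yields the 200 EIAs that the DMD projects during one synchronized revolution of the AOS.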

The 3D image on the AOS from multiple viewpoints is presented in Fig. 6, where the size of the displayed image is approximately 62×65 mm. Compared to the image from the conventional 360-degree IFD, the quality of the image displayed on the AOS is obviously better. The displayed image indicates clearly which side of the point cloud object is presented from the corresponding viewpoint, because the HPR operator has eliminated the overlap of points. Also, the spaces between points are preserved, so the displayed image is closer to the original point cloud object. The features of the previous 360-degree IFD, such as the 360-degree viewing zone and wide vertical viewing angle, are completely retained. To demonstrate the 360-degree viewing zone, we captured images from 0/360, 135, and 315 degrees. The vertical viewing angle of the displayed image was approximately 50 degrees; this is sufficient for comfortable viewing, and it can be enhanced further by controlling the focal length of the AOS.

FIG. 6.Examples of the displayed image on the AOS from multiple viewpoints. It can be observed clearly which side of the object is displayed, and a more comfortable view than in the previous demonstration of the 360-degree IFD is provided. Note that the wide vertical viewing angle and 360-degree viewing zone are maintained in the proposed 360-degree IFD system.

Figure 7 compares the image quality with and without the HPR operator. From Fig. 7(a) it is easy to see exactly which side of the object appears from each viewpoint, owing to the HPR operator. In Fig. 7(b), however, the front and rear points of the point cloud object appear as overlapping spots, because no HPR operator was used, and this degrades the image quality greatly: the displayed image is blurred, washed out, and unfocused overall. This comparison indicates that eliminating the overlapping points is a very important factor in the enhancement of image quality, and that the HPR operator improves the quality of the displayed image significantly.

FIG. 7.The verification of image quality enhancement for the case using an HPR operator: (a) Displayed images based on the EIAs using an HPR, which were also presented in Fig. 6, and (b) all points of the point cloud object appear and are duplicated, due to not using an HPR, from the same viewpoints.

Figure 8 shows a comparison of the cases using an HLA versus a rectangular lens array for the same object: Fig. 8(a) shows the 3D image displayed in the conventional case, while Fig. 8(b) shows the 3D image displayed through an HLA from the HPR-operator-based EIA. Figure 8(b) appears slightly darker than Fig. 8(a), because the HPR operator has extracted only the visible points for the corresponding viewpoint, so the visible points and the empty spaces between them are exactly revealed, while the conventional image in Fig. 8(a) is brighter because more points appear. Also, a clearer 3D image without flipping is displayed, thanks to the HLA. This comparison verifies that the HLA reconstructs more accurate 3D images for the 360-degree IFD system and provides higher display density.

FIG. 8.Comparison of 3D image quality, (a) displayed using the previously reported 360-degree IFD, versus (b) reconstructed by HLA and an HPR operator for the same object.

 

IV. CONCLUSION

In this paper, we have proposed an enhanced AOS-based 360-degree IFD system. Until now, the 360-degree IFD has demonstrated a natural, full-parallax, continuously viewable 3D image with a wide vertical viewing angle in a 360-degree viewing zone, but displayed image quality has been its main problem, since the resolution of the EIA is limited by the DMD resolution. The use of an HPR operator and an HLA improves the displayed image quality considerably. The HPR operator determines only the visible points of the given point cloud model for each angular step of the AOS and thus removes the problem of overlapping points, making it a very important factor in the enhancement of image quality. The EIAs are generated from the determined visible points for each viewpoint and captured through the HLA. The HLA also plays an important role in increasing the displayed image quality, by providing a more accurate approximation and a higher sampling density. The combination of the two techniques displays precise views of the object from every viewpoint, so the image quality of the 360-degree IFD system is visibly enhanced in experiment, while the previously achieved 360-degree horizontal and wide vertical viewing angles are completely maintained.

References

  1. G. Lippmann, "La photographie integrale," C. R. Acad. Sci. 146, 446-451 (1908).
  2. J.-H. Park, K. Hong, and B. Lee, "Recent progress in three-dimensional information processing based on integral imaging," Appl. Opt. 48, H77-H94 (2009). https://doi.org/10.1364/AO.48.000H77
  3. N. Kim, A.-H. Phan, M.-U. Erdenebat, M. A. Alam, K.-C. Kwon, M.-L. Piao, and J.-H. Lee, "3D display technology," Disp. Imag. 1, 73-95 (2014).
  4. N. Kim, M. A. Alam, L. T. Bang, A.-H. Phan, M.-L. Piao, and M.-U. Erdenebat, "Advances in the light field displays based on integral imaging and holographic techniques," Chin. Opt. Lett. 12, 060005-1 (2014). https://doi.org/10.3788/COL201412.060005
  5. J.-S. Jang and B. Javidi, "Improvement of viewing angle in integral imaging by use of moving lenslet arrays with low fill factor," Appl. Opt. 42, 1996-2002 (2003). https://doi.org/10.1364/AO.42.001996
  6. Y. Kim, J.-H. Park, H. Choi, S. Jung, S.-W. Min, and B. Lee, "A wide-viewing-angle integral 3D imaging system by curving a screen and a lens array," Appl. Opt. 44, 546-552 (2005). https://doi.org/10.1364/AO.44.000546
  7. R. Martinez-Cuenca, H. Navarro, G. Saavedro, B. Javidi, and M. Martinez-Corral, "Enhanced viewing-angle integral imaging by multiple-axis telecentric relay system," Opt. Express 15, 16255-16260 (2007). https://doi.org/10.1364/OE.15.016255
  8. H. Kim, J. Hahn, and B. Lee, "The use of negative index planoconcave lens array for wide -viewing angle integral imaging," Opt. Express 16, 21865-21880 (2008). https://doi.org/10.1364/OE.16.021865
  9. Y. Kim, K. Hong, and B. Lee, "Recent researches based on integral imaging display method," 3D Res. 1, 17-27 (2010). https://doi.org/10.1007/3DRes.01(2010)2
  10. G. Baasantseren, J.-H. Park, K.-C. Kwon, and N. Kim, "Viewing angle enhanced integral imaging display using two elemental image masks," Opt. Express 17, 14405-14417 (2009). https://doi.org/10.1364/OE.17.014405
  11. M. A. Alam, M.-L. Piao, L. T. Bang, and N. Kim, "Viewing-zone control of integral imaging display using a directional projection and elemental image resizing method," Appl. Opt. 52, 6969-6978 (2013). https://doi.org/10.1364/AO.52.006969
  12. A. Jones, I. McDowall, H. Yamada, M. Bolas, and P. Debevec, "Rendering for an interactive 360° light field display," Proc. ACM SIGGRAPH 26, 1-10 (2007).
  13. A. Jones, M. Lang, G. Flyffe, X. Yu, J. Busch, I. McDowall, M. Bolas, and P. Debevec, "Achieving eye contact in a one-to-many 3D video teleconferencing system," Proc. ACM SIGGRAPH 28, 64 (2009).
  14. M.-U. Erdenebat, G. Baasantseren, N. Kim, K.-C. Kwon, J. Byeon, K.-H. Yoo, and J.-H. Park, "Integral-floating display with 360 degree horizontal viewing angle," J. Opt. Soc. Korea 16, 365-371 (2012). https://doi.org/10.3807/JOSK.2012.16.4.365
  15. M.-U. Erdenebat, G. Baasantseren, J.-H. Park, N. Kim, K.-C. Kwon, Y.-H. Jang, and K.-H. Yoo, "Full-parallax 360 degrees horizontal viewing integral imaging using anamorphic optics," Proc. SPIE 7863, 7863OU (2011).
  16. D. Miyazaki, N. Akasaka, K. Okoda, Y. Maeda, and T. Mukai, "Floating three-dimensional display viewable from 360 degrees," Proc. SPIE 8288, 82881H (2012).
  17. M.-U. Erdenebat, K.-C. Kwon, K.-H. Yoo, G. Baasantseren, J.-H. Park, E.-S. Kim, and N. Kim, "Vertical viewing angle enhancement for the 360 degree integral-floating display using an anamorphic optic system," Opt. Lett. 39, 2326-2329 (2014). https://doi.org/10.1364/OL.39.002326
  18. G. Baasantseren, J.-H. Park, M.-U. Erdenebat, S.-W. Seo, and N. Kim, "Integral floating-image display using two lenses with reduced distortion and enhanced depth," J. Soc. Inf. Disp. 18, 519-526 (2010). https://doi.org/10.1889/JSID18.7.519
  19. S. Katz, A. Tal, and R. Basri, "Direct visibility of point sets," Proc. ACM SIGGRAPH 26, 24 (2007).
  20. J.-H. Park, D. Han, and N. Kim, "Capture of the three-dimensional information based on integral imaging and its sampling analysis," Proc. SPIE 7848, 1B1-1B9 (2010).
  21. N. Chen, J. Yeom, J.-H. Jung, J.-H. Park, and B. Lee, "Resolution comparison between integral-imaging-based hologram synthesis methods using rectangular and hexagonal lens arrays," Opt. Express 19, 26917-26927 (2011). https://doi.org/10.1364/OE.19.026917
  22. D.-H. Kim, M.-U. Erdenebat, K.-C. Kwon, J.-S. Jeong, J.-W. Lee, K.-A. Kim, N. Kim, and K.-H. Yoo, "Real-time 3D display system based on computer-generated integral imaging technique using enhanced ISPP for hexagonal lens array," Appl. Opt. 52, 8411-8418 (2013). https://doi.org/10.1364/AO.52.008411
  23. Y. Kim, J. Kim, J.-M. Kang, J.-H. Jung, H. Choi, and B. Lee, "Point light source integral imaging with improved resolution and viewing angle by the use of electrically movable pinhole array," Opt. Express 15, 18253-18267 (2007). https://doi.org/10.1364/OE.15.018253
  24. M. A. Alam, G. Baasantseren, M.-U. Erdenebat, N. Kim, and J.-H. Park, "Resolution enhancement of integral imaging three-dimensional display using directional elemental image projection," J. Soc. Inf. Disp. 20, 221-227 (2012). https://doi.org/10.1889/JSID20.4.221
  25. K.-C. Kwon, J.-S. Jeong, M.-U. Erdenebat, Y.-T. Lim, K.-H. Yoo, and N. Kim, "Real-time interactive display for integral imaging microscopy," Appl. Opt. 53, 4450-4459 (2014). https://doi.org/10.1364/AO.53.004450
