Digital Holographic Display System with Large Screen Based on Viewing Window Movement for 3D Video Service
ETRI Journal, vol. 36, no. 2, Feb. 2014, pp. 232-241
Copyright © 2014, Electronics and Telecommunications Research Institute (ETRI)
  • Received : September 22, 2013
  • Accepted : February 10, 2014
  • Published : February 01, 2014
Minsik Park, Byung Gyu Chae, Hyun-Eui Kim, Joonku Hahn, Hwi Kim, Cheong Hee Park, Kyungae Moon, and Jinwoong Kim
Abstract
A holographic display system with a 22-inch LCD panel is developed to provide a wide viewing angle and a large holographic 3D image. It is realized by steering a narrow viewing window, which results from a very large pixel pitch compared to the wavelength of the laser light. Point light sources and a lens array make it possible to arbitrarily control the position of the viewing window for a moving observer. The holographic display provides both eyes of the observer with a holographic 3D image using two vertically placed LCD panels and a beam splitter to support the holographic stereogram.
I. Introduction
- 1. Overview
Digital holographic displays reconstruct the 3D information of an object in free space using spatial light modulators (SLMs), which diffract the light field into a specific direction according to a fringe pattern that may be computer generated using a mathematical diffraction model. For light diffraction, a 2D display device, such as a liquid crystal on silicon device, a digital micromirror device, or an LCD device, can be used as an SLM. Holographic displays using existing 2D display devices have a small space-bandwidth product (SBP), defined as the product of the screen size (in meters) and the maximum spatial frequency (in cycles/meter), because with current display manufacturing techniques it is very difficult both to decrease the pixel pitch and to enlarge the panel size. The pixel pitch of the display limits the maximum viewing angle. The SBP is used as the performance index of a holographic display because it measures the viewing angle and screen size simultaneously, and a holographic video service must provide both a large screen and a wide viewing angle to be commercially viable. There are two main approaches to overcoming the limitations on the viewing angle and screen size. One is to develop a novel SLM, such as an optically addressed SLM [1], electron-beam addressed SLM [2], acousto-optic SLM [3], [4], or magneto-optic SLM [5]. The other is to increase the SBP through spatial and temporal multiplexing [6]-[13] of an existing SLM. SeeReal [14] recently implemented a holographic display that overcomes the SBP limitation of a commercial LCD panel by moving the viewing window, thereby increasing the achievable viewing angle and screen size. We also developed a digital holographic display based on moving the viewing window; it not only provides a wide viewing angle and a large screen simultaneously but also allows easy migration to commercial display products. The proposed digital holographic display supports both vertical and horizontal parallax, whereas SeeReal's display supports only horizontal parallax.
- 2. Viewing Window Movement
The basic building module of our holographic display, based on a moving viewing window, consists of a light source, a convex lens, and an LCD panel, as shown in Fig. 1. The coherent wave field diverging from the light source toward the convex lens and LCD panel generates the viewing window of the hologram. The LCD panel diffracts the wave field to reconstruct a hologram image from the fringe pattern, and the convex lens converges the diffracted waves into the viewing window, through which the user observes the image reconstructed from the hologram. The viewing window needs to stay in line with the eye position so that the viewer can see the reconstructed hologram while moving vertically, horizontally, forward, and backward. The viewing window can easily be shifted by changing the position of the light source along the (x, y, z) axes. The light source position can be steered either mechanically, using motors on which the light source is mounted, or electronically, by switching on and off specific positions of a light shutter placed in front of the light source. The position of the observer's pupil must be measured accurately to determine where the viewing window should be shifted.
Fig. 1. Movement of viewing window according to position of light source.
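Because the viewing window is formed where the lens converges light from the point source, a thin-lens imaging model gives a rough idea of how far the source must be moved to follow the tracked pupil. The sketch below only illustrates that relation; the distances, the helper name, and the assumption that the viewing window is the lens image of the point source are ours, not taken from the paper.

```python
import numpy as np

def source_shift_for_pupil(pupil_xy_mm, a_mm, b_mm):
    """Lateral light-source shift that places the viewing window at the
    tracked pupil position, under a thin-lens imaging model.

    a_mm : distance from the point source to the lens (object side)
    b_mm : distance from the lens to the viewing-window plane (image side)
    pupil_xy_mm : tracked pupil offset from the optical axis
    """
    # The viewing window is treated as the image of the point source, so a
    # lateral source shift dx maps to a window shift of -(b / a) * dx.
    magnification = -b_mm / a_mm
    return np.asarray(pupil_xy_mm, dtype=float) / magnification

# Example: pupil tracked 30 mm to the right and 10 mm above the optical axis.
print(source_shift_for_pupil((30.0, 10.0), a_mm=300.0, b_mm=2600.0))
```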
II. Digital Holographic Display System
Figure 2 shows the basic scheme of the proposed holographic display system, which operates by moving the viewing window. The system consists of SLMs, laser light sources, motorized stages, lens arrays, a beam splitter, and eye tracking cameras. The SLM used is a 22-inch LCD panel with 4K resolution and a 125-μm pixel pitch. The viewing window at a viewing distance of 2,600 mm from the SLM is too small to cover both eyes of the observer, since a pixel pitch of 125 μm results in a viewing window of 11 mm × 11 mm when a green laser (532 nm) is used. Two SLMs are therefore used to provide the viewer with a holographic stereogram so that the entire hologram can be observed with both eyes. A pair of input optical systems, each with a 2 × 2 lens array on the back of an SLM, converges the reconstructed hologram toward the viewer's right and left eyes. The laser light source is collimated into a plane wave and split toward the two SLMs, and motorized stages shift the light sources along the (x, y, z) axes to move the viewing window according to the position of the viewer's pupil, which is found by processing the binocular images captured by stereo cameras located in front of the SLM. A stereo camera module calculates the (x, y, z) coordinates of the viewer's pupil position.
Fig. 2. Schematic 3D model of holographic display system.
The following subsections describe the main components of the holographic display system in detail.
- 1. Light Source Array
The proposed RGB light source array simplifies the complex structure of a general RGB light blending system and saves a tremendous amount of time and effort in aligning the optical axes of a large-scale holographic display system, because it is built from easily configurable fiber lasers. Figure 4 shows a side view of the light source array.
Laser light sources provide a wider color gamut than conventional LCD monitors. The wavelengths of the laser light sources used in the proposed display are selected from the CIE chromaticity chart, from which we can estimate the full range of color representation. The three primary wavelengths of 660 nm, 532 nm, and 473 nm are chosen for color blending so that the proposed display reproduces a color gamut of more than 130% of that defined by the NTSC (National Television System Committee). The inner area of the triangle connecting the three points (660 nm, 532 nm, and 473 nm) in the CIE chart represents the color gamut of the proposed display, as shown in Fig. 3.
Fig. 3. Chromaticity of RGB laser in CIE 1931.
Fig. 4. Structure of light source array.
Each fiber-coupled laser is connected to a 1 × 2 fiber coupler and a 1 × 4 fiber coupler to distribute the light into eight sources, because the proposed display uses eight lenses in total across the two input optical systems, each with a 2 × 2 lens array, as shown in Fig. 4. RGB combiners merge the red, green, and blue light sources of different wavelengths into single light sources for color blending. As a result, eight white light beams are generated and used as coherent light sources for the SLMs.
The eight light beams pass through the input and output optical systems, and each focuses on a different position. The right four beams and left four beams thus converge at the viewing spot positions of the right eye and left eye, respectively, by steering the light source positions along the (x, y, z) axes so that the images reconstructed from the different light beams are combined and visually perceived as one image.
- 2. Motorized Stage
The proposed display uses two motorized stages to handle the right and left input optical systems, as shown in Fig. 2. Figure 5 shows the right motorized stage, which moves the four light sources required to generate the right hologram beam. Each light source is mounted on a motorized mount capable of moving the viewing window along the (x, y, z) axes at a speed of 50 mm per second.
Fig. 5. Motorized stage for four-light-source array.
- 3. Input Optical Systems
The proposed display needs aberration-free optical systems to accurately move the viewing window according to the position of the observer's eye. The proposed display has two input optical systems, one for each of the right and left eyes, as shown in Fig. 2. Figure 6 shows one input optical system, which generates a planar wave entering an SLM. Each input optical system is designed with a 2 × 2 lens array because it is very difficult to manufacture a single lens covering an SLM with a large square frame. Each lens of the array consists of two groups of doublets for reducing aberrations and converging the beams, as shown in Fig. 7.
Fig. 6. Lens array with SLM of one input optical system.
Fig. 7. Structure of one lens array among the four lens arrays of the right input optical system.
Figure 8 shows the design concept of the optical systems used in the proposed display and the ray-tracing simulation result from the light source to the viewing window for one of the four lens arrays of the right input optical system. The beams diverging from the point source converge in the viewing window by means of the lens array. The designed optical systems form the viewing window according to the position of the light source when the viewing window is moved along the x, y, or z axis, following the position of the observer's eye. The spot on the image plane should therefore remain a constant size even when the light source position changes significantly in the object plane. The performance of the input optical system is very important for improving the quality of the image reconstructed by the proposed display, so the input optical system is designed to minimize the astigmatism and chromatic aberration of the output beam. The lens design optimization reduces the spot radius on the image plane to as small as 0.5 mm.
Fig. 8. Ray tracing from light source into viewing window in a lens array, simulated with a lens design tool.
- 4. Output Optical Systems
Owing to the narrow viewing angle, the holographic display system must allow the observer to see binocular holograms, as shown in Fig. 9. This function is realized using an output optical system with a beam splitter specified with a thickness of 3 mm, a reflection/transmission ratio of 50%, and a surface accuracy of 1λ/inch to 2λ/inch.
Fig. 9. Provisioning of binocular holograms using output optical system.
- 5. Eye Tracking
Figure 10 illustrates the relationship between the diffraction angle and the viewing window, assuming that an LCD panel is used as the SLM of the holographic display. We can observe the image reconstructed from the hologram in the viewing window located between the 0th- and ±1st-order diffraction beams.
Fig. 10. Diffraction angle and viewing window.
The ±1st-order diffraction angle is derived from the diffraction grating, as shown in (1).
$$\theta_{\text{diffraction}} = 2\sin^{-1}\!\left(\frac{\lambda}{2\rho}\right) \approx \frac{\lambda}{\rho}, \qquad (1)$$
where λ is the wavelength of the laser light and ρ is the pixel pitch of the diffraction grating. The maximum diffraction angle is 2 sin⁻¹(λ/2ρ) because the maximum period of the diffraction grating is 2ρ owing to the Nyquist sampling limit. The diffraction angle is approximated by λ/ρ when the angle is small.
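For the green wavelength and pixel pitch used in this system, (1) can be evaluated directly; the short sketch below is only a numerical check of the formula.

```python
import numpy as np

wavelength = 532e-9    # green laser (m)
pixel_pitch = 125e-6   # SLM pixel pitch rho (m)

# Maximum diffraction angle from (1): the finest resolvable grating period is 2*rho.
theta_exact = 2 * np.arcsin(wavelength / (2 * pixel_pitch))
theta_small_angle = wavelength / pixel_pitch   # small-angle approximation

print(np.degrees(theta_exact), np.degrees(theta_small_angle))  # both about 0.24 deg
```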
Assuming that the viewing window is defined by the allowable area through which the viewer is able to observe the reconstructed image from the holographic display, its size is calculated from (2) with respect to each RGB wavelength at a distance of 2,600 mm, as shown in Table 1 , in the case of an on-axis hologram.
$$\text{viewing window size} = 2\tan\!\left(\frac{\lambda}{2\rho}\right) d \approx \frac{\lambda d}{\rho}. \qquad (2)$$
Table 1. Viewing window sizes for different wavelengths.

Wavelength        Viewing window size (mm × mm)
473 nm (blue)     10 × 10
532 nm (green)    11 × 11
660 nm (red)      14 × 14
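The entries of Table 1 follow directly from (2) with d = 2,600 mm and ρ = 125 μm; the snippet below reproduces them as a sanity check.

```python
import numpy as np

d = 2.6               # viewing distance (m)
pixel_pitch = 125e-6  # SLM pixel pitch rho (m)
wavelengths = {"blue": 473e-9, "green": 532e-9, "red": 660e-9}

for name, wl in wavelengths.items():
    # Equation (2): window width = 2*d*tan(lambda/(2*rho)) ~ lambda*d/rho.
    width_mm = 2 * d * np.tan(wl / (2 * pixel_pitch)) * 1e3
    print(f"{name}: {width_mm:.1f} mm")   # about 10, 11, and 14 mm, as in Table 1
```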
Figures 11(a) and 11(b) show the error tolerance of the eye tracking accuracy for a mono (red) laser and a color (blue) laser, respectively. If the pupil diameter is 4 mm, the error tolerances (width, height) are (±5 mm, ±5 mm) for the mono (red) laser and (±3 mm, ±3 mm) for the color (blue) laser, because the pupil center must not deviate from the inner dark rectangular area for the hologram to be observed within the viewing window; that is, the tolerance is half the difference between the viewing window size in Table 1 and the pupil diameter.
Fig. 11. Error tolerance of eye tracking accuracy for (a) mono (red) and (b) color (blue) laser.
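Under the reading above (tolerance equals half of the viewing window size minus the pupil diameter), the Fig. 11 tolerances can be reproduced from Table 1; this interpretation is ours and is not stated explicitly by the authors.

```python
pupil_diameter = 4.0  # assumed pupil size (mm)
# Viewing window widths from Table 1: red drives the mono case,
# blue is the limiting wavelength for the color case.
window_width = {"mono (red)": 14.0, "color (blue)": 10.0}

for name, w in window_width.items():
    # The pupil must stay entirely inside the viewing window, so the allowed
    # tracking error is half the difference between window and pupil size.
    tolerance = (w - pupil_diameter) / 2
    print(f"{name}: +/- {tolerance:.0f} mm")   # 5 mm and 3 mm, matching Fig. 11
```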
The proposed eye tracking system uses two wide-angle-lens cameras (left and right) placed 40 cm apart in front of the SLM. Rectification maps for the left and right camera views are first computed through stereo calibration and rectification, and the face of the observer is detected within the captured image. Next, the eyes are identified within the face region, and the pupil center is detected within each eye region. Finally, the results obtained from the left and right images of the stereo camera are combined to determine the 3D position of the eyes, or more precisely, the pupil centers, in (x, y, z) coordinates.
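Once the 2D pupil centers have been located in the rectified left and right images, the final triangulation step can be done with standard stereo geometry. The sketch below, using OpenCV, only illustrates that last step under the assumption that stereo calibration has already produced the two projection matrices; the function name and interfaces are ours.

```python
import cv2
import numpy as np

def triangulate_pupil(P_left, P_right, pupil_left, pupil_right):
    """Recover the 3D pupil-center position from its pixel coordinates
    in the rectified left and right camera images.

    P_left, P_right : 3x4 camera projection matrices from stereo calibration
    pupil_left, pupil_right : (x, y) pupil centers found by the 2D detector
    """
    pts_l = np.asarray(pupil_left, dtype=np.float64).reshape(2, 1)
    pts_r = np.asarray(pupil_right, dtype=np.float64).reshape(2, 1)
    X_h = cv2.triangulatePoints(P_left, P_right, pts_l, pts_r)  # homogeneous 4x1
    return (X_h[:3] / X_h[3]).ravel()   # (x, y, z) in the calibration's units
```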
- 6. Hologram Generation
A computer-generated holography (CGH) pattern enables us to control the diffraction direction of the plane wave illuminating the SLM from the laser so as to reconstruct a 3D image in free space. Figure 12 illustrates how the CGH pattern displayed on the proposed holographic display system is generated. The wave field propagated from the light source through the SLM input plane (x1, y1) and the pupil aperture (u, v) is reconstructed as an object image on the retina output plane (x2, y2) when the observer looks at the SLM. This occurs after the center of the pupil and the viewing window (shown in Fig. 8) are aligned with the z-axis.
Fig. 12. Fresnel transform and inverse Fresnel transform model based on viewing window, considering the observer's eye.
The Fresnel transform can be expressed mathematically as (3) if the eye model with pupil, eye lens, and retina shown in Fig. 12 is considered in generating the CGH pattern. G(x1, y1) is the wave field distribution on the SLM (LCD panel) input plane, and F(x2, y2) is that on the retina output plane. G(x1, y1) is propagated from the wave field distribution of the object in free space and is diffracted into F(x2, y2). The parameters of (3), λ, d1, d2, d_obj, and f, are defined as the wavelength of the light source, the viewing window distance, the distance between the pupil aperture and the retina plane, the distance between the SLM and the object, and the focal length of the eye lens, respectively. The focal length is given by f = (d1 − d_obj)d2 / {(d1 − d_obj) + d2} according to the Gauss lens equation and varies so as to focus the object on the retina according to the distance between the object and the observer's eye.
The circ function circ((u² + v²)/r²) represents the finite aperture of the pupil, which filters out the high-order and DC noise generated by the grating structure of the LCD panel, where r is the pupil radius.
$$
\begin{aligned}
F(x_2, y_2) = {} & \frac{1}{(j\lambda d_1)(j\lambda d_2)}
\exp\!\left( j\frac{\pi}{\lambda d_2}\left(x_2^2 + y_2^2\right) \right) \\
& \times \iint \Bigg[ \exp\!\left( j\frac{\pi}{\lambda}\left(\frac{1}{d_1} + \frac{1}{d_2} - \frac{1}{f}\right)\left(u^2 + v^2\right) \right)
\mathrm{circ}\!\left(\frac{u^2 + v^2}{r^2}\right) \\
& \qquad \times \iint \exp\!\left( j\frac{\pi}{\lambda d_1}\left(x_1^2 + y_1^2\right) \right) G(x_1, y_1)
\exp\!\left( j\frac{2\pi}{\lambda d_1}\left(x_1 u + y_1 v\right) \right) dx_1\, dy_1 \Bigg] \\
& \times \exp\!\left( j\frac{2\pi}{\lambda d_2}\left(u x_2 + v y_2\right) \right) du\, dv. \qquad (3)
\end{aligned}
$$
Equation (4), the inverse Fresnel transform, generates the CGH pattern to be displayed on the LCD panel by propagating the wave field distribution on the retina output plane back to the SLM input plane. F(x2, y2) can be set to the object's wave field distribution because the object is focused on the retina by changing the focal length of the eye lens.
$$
\begin{aligned}
G(x_1, y_1) = {} & \frac{(j)(j)}{(\lambda d_1)(\lambda d_2)}
\exp\!\left( -j\frac{\pi}{\lambda d_1}\left(x_1^2 + y_1^2\right) \right) \\
& \times \iint \Bigg[ \exp\!\left( -j\frac{\pi}{\lambda}\left(\frac{1}{d_1} + \frac{1}{d_2} - \frac{1}{f}\right)\left(u^2 + v^2\right) \right)
\exp\!\left( -j\frac{2\pi}{\lambda d_1}\left(x_1 u + y_1 v\right) \right) \\
& \qquad \times \iint \exp\!\left( -j\frac{\pi}{\lambda d_2}\left(x_2^2 + y_2^2\right) \right) F(x_2, y_2)
\exp\!\left( -j\frac{2\pi}{\lambda d_2}\left(x_2 u + y_2 v\right) \right) dx_2\, dy_2 \Bigg]\, du\, dv. \qquad (4)
\end{aligned}
$$
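Numerically, each of the nested double integrals in (3) and (4) has the form of a Fresnel propagation, so they can be evaluated with FFT-based propagation: SLM to pupil plane over d1, multiplication by the pupil aperture and eye-lens phase, and pupil plane to retina over d2, with the backward chain of (4) using conjugated kernels. The sketch below shows only the single-FFT Fresnel step that such an implementation would reuse, plus the Gauss-lens focal length from this subsection; it is a simplified illustration, not the authors' code, and the sampling and names are assumptions.

```python
import numpy as np

def fresnel_single_fft(u_in, wavelength, z, dx_in):
    """One propagation leg of (3)/(4): a single-FFT Fresnel transform over
    distance z. Constant amplitude/phase factors are omitted; the output
    grid pitch becomes wavelength * z / (n * dx_in)."""
    n = u_in.shape[0]
    k = 2 * np.pi / wavelength

    x_in = (np.arange(n) - n // 2) * dx_in
    X1, Y1 = np.meshgrid(x_in, x_in)

    dx_out = wavelength * z / (n * dx_in)
    x_out = (np.arange(n) - n // 2) * dx_out
    X2, Y2 = np.meshgrid(x_out, x_out)

    # Quadratic pre-chirp in the input plane, FFT, quadratic post-chirp.
    pre = u_in * np.exp(1j * k / (2 * z) * (X1 ** 2 + Y1 ** 2))
    spec = np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(pre)))
    return spec * np.exp(1j * k / (2 * z) * (X2 ** 2 + Y2 ** 2))

def eye_focal_length(d1, d_obj, d2):
    """Focal length of the eye lens, f = (d1 - d_obj) d2 / ((d1 - d_obj) + d2)."""
    s = d1 - d_obj                 # distance from the object to the pupil
    return s * d2 / (s + d2)

# For the backward direction in (4), run the same two legs with conjugated
# chirps (propagate the conjugated field and conjugate the result), and apply
# circ((u^2 + v^2) / r^2) and the eye-lens phase at the pupil plane.
```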
Fig. 13. (a) On-axis viewing window before shifting and (b) off-axis viewing window shifted by phase factor.
DC noise of the n-th order is generated periodically owing to the grating characteristics of the LCD SLM, as shown in Fig. 13(a). The wave field G(x1, y1) calculated from (4) is reconstructed in a viewing window close to the DC noise because (4) is an on-axis inverse Fresnel transform.
The CGH pattern should therefore be generated by multiplying G(x1, y1) by the phase factor term in (5) to keep the DC noise out of the viewing window, because the term exp(j(2π/λ)y1 × phasefactor) is a prism function that shifts the viewing window in the vertical off-axis direction, as shown in Fig. 13(b).
$$G_{\text{off-axis}}(x_1, y_1) = G(x_1, y_1) \times \exp\!\left( j\frac{2\pi}{\lambda}\, y_1 \times \text{phasefactor} \right), \qquad -1 \le \text{phasefactor} \le 1. \qquad (5)$$
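Applying (5) to a computed CGH is a single element-wise multiplication over the SLM's vertical coordinate; the brief sketch below shows this under the assumption of a square field sampled at the SLM pixel pitch.

```python
import numpy as np

def shift_viewing_window(G, wavelength, pixel_pitch, phase_factor):
    """Apply the prism term of (5) to move the viewing window off-axis
    in the vertical direction; |phase_factor| <= 1."""
    n = G.shape[0]
    y1 = ((np.arange(n) - n // 2) * pixel_pitch)[:, None]  # vertical SLM coordinate (m)
    return G * np.exp(1j * 2 * np.pi / wavelength * y1 * phase_factor)
```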
III. Experiment Results
A holographic display system has been developed by integrating the previously described functional modules, as shown in Fig. 14 .
Fig. 14. Holographic display system.
Figure 15 shows the accommodation effect of our holographic display system when reconstructing an amplitude hologram of character objects at different distances from the SLM. The character objects, “ET” and “RI,” are located 1,200 mm and 0 mm from the SLM, respectively, and the SLM is illuminated by a collimated green laser beam. We took photographs of the reconstructed hologram at distances of 0 mm and 1,200 mm by changing the focal plane of the camera. The accommodation effect is apparent from the results, in which the captured grid image of the characters “ET” is blurred at 0 mm and sharp at 1,200 mm.
Fig. 15. Reconstruction of mono amplitude hologram at distances of (a) 0 mm and (b) 1,200 mm from the SLM.
We experimented with a color hologram as well as a mono hologram. Figure 16 shows the experimental configuration used to observe the accommodation effect of the hologram visually. A checkerboard and two rulers, serving as reference planes, are located at 0 mm, 1,900 mm, and 2,200 mm from the SLM, respectively.
Fig. 16. Reference objects at 0 mm, 1,900 mm, and 2,200 mm.
The grid characters used as the input image for generating a color CGH consist of a green “E” at 2,200 mm, a yellow “R” at 1,900 mm, a blue “T” at 0 mm, and a red “I” at 0 mm, as shown in Fig. 17.
Fig. 17. Input images of characters located at 0 mm, 1,900 mm, and 2,200 mm.
The color image of Fig. 17 is converted through (5) into an off-axis amplitude hologram for each of the RGB wavelengths (red, 660 nm; green, 532 nm; and blue, 473 nm) and then loaded onto the SLM for reconstruction of the hologram. Figure 18 shows the reconstruction of the color characters located at distances of 0 mm, 1,900 mm, and 2,200 mm. The grid pattern of the blue “T” and red “I” at 0 mm is sharper against the reference object, the checkerboard, than that of any other character, as shown in Fig. 18(a); in other words, the grid patterns of the green “E” and the yellow “R” are blurred at a distance of 0 mm. However, the grid patterns of the yellow “R” and green “E” are quite clearly recognizable in Figs. 18(b) and 18(c), which show photos taken with camera focal planes at 1,900 mm and 2,200 mm.
Fig. 18. Reconstruction of color amplitude hologram at distances of (a) 0 mm, (b) 1,900 mm, and (c) 2,200 mm from SLM.
IV. Conclusion
We developed a holographic display system that can reconstruct a large hologram in free space for a 3D video service. To provide an observer with a hologram with a wide angle and large screen under a limited space-bandwidth product, the system converges the wave fields of the hologram in the viewing window that is moved horizontally and vertically in the space of the viewing zone by shifting the point light sources in the direction of the (x, y, z) axes according to the pupil position of the observer. The drawbacks of the proposed system are that only one user can utilize it at a time due to the limited performance of the eye tracking system and input optical system and that the aberration of the input and output optical systems results in the distortion of the image reconstructed from the hologram.
In the future, we plan to develop a tabletop-style holographic display supporting more than four observers and to improve the quality of the reconstructed image by numerically compensating for the distorted wave field of the reconstructed hologram.
This work was supported by the GigaKOREA project (GK13D0100, Development of Telecommunications Terminal with Digital Holographic Table-top Display).
BIO
pms@etri.re.kr
Minsik Park received his BS degree in electrical engineering from Kwangwoon University, Seoul, Rep. of Korea, in 1997 and his MS degree in mechatronics engineering from Gwangju Institute of Science and Technology, Gwangju, Rep. of Korea, in 1999. He is currently a PhD candidate in computer science and engineering, Chungnam National University, Daejeon, Rep. of Korea. Since 1999, he has been a senior member of the research staff in ETRI, Daejeon, Rep. of Korea. His research interests include signal and image processing, machine learning, and holography.
bgchae@etri.re.kr
Byung Gyu Chae received his BS and MS degrees in physics from Pusan National University, Busan, Rep. of Korea, in 1993 and 1995, respectively, and his PhD degree in solid state physics from Pusan National University, Busan, Rep. of Korea, in 1999. He joined ETRI in 2000 and has been working on many projects including display development, photonic devices, and electro-optic modulators. He currently studies digital holography and its application to biomedical imaging technology.
turnn@etri.re.kr
Hyun-Eui Kim received his BS and MS degrees in information and communication engineering from Chungbuk National University, Cheongju, Rep. of Korea, in 2010 and 2012, respectively. Since 2012, he has been working with the ETRI Digital Holography Research Team. His research interests include digital holography, integral imaging, and multi-view 3D display systems.
jhahn@knu.ac.kr
Joonku Hahn received his PhD degree from the School of Electrical Engineering, Seoul National University, Seoul, Rep. of Korea, in 2009. After receiving the degree, he worked as a postdoctoral associate in electrical and computer engineering, Duke University, Durham, NC, USA. In March 2011, he joined Kyungpook National University, Daegu, Rep. of Korea, as a faculty member.
hwikim@korea.ac.kr
Hwi Kim received his BS, MS, and PhD degrees from the School of Electrical Engineering at Seoul National University, Seoul, Rep. of Korea, in 2001, 2003, and 2007, respectively. From 2008 to 2009, he worked as a senior engineer for Samsung Display. He is currently an associate professor in the Department of Electronics and Information Engineering, Korea University, Sejong Campus. His primary research interests are in the areas of diffraction, holographic optics, plasmonics, and metamaterials.
cheonghee@cnu.ac.kr
Cheong Hee Park received her PhD in mathematics from Yonsei University, Rep. of Korea, in 1998. She received her MS and PhD degrees in computer science from the Department of Computer Science and Engineering, University of Minnesota, St. Paul, MN, USA, in 2002 and 2004, respectively. She is currently with the Department of Computer Science and Engineering, Chungnam National University, Daejeon, Rep. of Korea, as an associate professor. Her research interests include pattern recognition, data mining, bioinformatics, and machine learning.
kamoon@etri.re.kr
Kyungae Moon received her BS, MS, and PhD degrees in computer science from Chungnam National University, Daejeon, Rep. of Korea, in 1985, 1988, and 1997, respectively. She joined ETRI, Daejeon, Rep. of Korea, in 1991 and has participated in many projects in the digital content and digital broadcasting areas. Currently, she participates in the development of digital holographic 3D systems. Her research interests include realistic media, digital multimedia broadcasting systems, and digital holography technology.
jwkim@etri.re.kr
Jinwoong Kim received his BS and MS degrees in electronics engineering from Seoul National University, Seoul, Rep. of Korea, in 1981 and 1983, respectively. He received his PhD degree in electrical engineering from Texas A&M University, College Station, TX, USA, in 1993. He has been working at ETRI since 1983, leading many projects in the telecommunications and digital broadcasting areas, such as the development of an MPEG-2 video encoding chipset and a real-time HDTV encoder system, as well as innovative technologies for data and viewer-customized broadcasting. He also carried out projects on multimedia search, retrieval, and adaptation technologies related to the MPEG-7 and MPEG-21 standards. Currently, his major research focus is on 3D and realistic media technologies and systems. He was the leader of the 3D DMB and multiview 3DTV system development projects and is now actively working on building practical digital holographic 3D systems. He was a keynote speaker at 3DTV-CON 2010 and has been an invited speaker at a number of international workshops, including 3D Fair 2008, DHIP 2012, and WIO 2013.
References
[1] Stanley, M., "3D Electronic Holography Display System Using a 100-Megapixel Spatial Light Modulator," Proc. SPIE-IS&T Electronic Imaging, vol. 5005, 2003, pp. 247-258. doi:10.1117/12.516540
[2] Product Information Sheet for EBSLM Model X3636, Hamamatsu Photonics K.K., Japan, 1989.
[3] Smithwick, Q.Y.J., "Progress in Holographic Video Displays Based on Guided-Wave Acousto-Optic Devices," Proc. SPIE, vol. 6912, Feb. 7, 2008. doi:10.1117/12.761310
[4] Smalley, D.E., "Anisotropic Leaky-Mode Modulator for Holographic Video Displays," Nature, vol. 498, 2013, pp. 313-317. doi:10.1038/nature12217
[5] Machida, K., "Spin Transfer Switching and MR Properties of Co/Pt Multilayered Free Layers for Submicron Sized Magneto-Optical Light Modulation Device," IEEE Trans. Magn., vol. 46, no. 6, 2010, pp. 2171-2174. doi:10.1109/TMAG.2010.2042931
[6] Hahn, J., "Wide Viewing Angle Dynamic Holographic Stereogram with a Curved Array of Spatial Light Modulators," Opt. Express, vol. 16, no. 16, 2008, pp. 12372-12386. doi:10.1364/OE.16.012372
[7] Kozacki, T., "Wide Angle Holographic Display System with Spatiotemporal Multiplexing," Opt. Express, vol. 20, no. 25, 2012, pp. 27473-27481. doi:10.1364/OE.20.027473
[8] Yaras, F., Kang, H.J., and Onural, L., "Circular Holographic Video Display System," Opt. Express, vol. 19, no. 10, 2011, pp. 9147-9156. doi:10.1364/OE.19.009147
[9] Takaki, Y. and Okada, N., "Hologram Generation by Horizontal Scanning of a High-Speed Spatial Light Modulator," Appl. Opt., vol. 48, no. 17, 2009, pp. 3255-3260. doi:10.1364/AO.48.003255
[10] Son, J.-Y., "A Holographic Display Based on Spatial Multiplexing," Proc. SPIE 8738, 3D Imaging, Visualization, and Display, May 17, 2013. doi:10.1117/12.2018439
[11] Martinez-Cuenca, R., "Progress in 3-D Multiperspective Display by Integral Imaging," Proc. IEEE, vol. 97, no. 6, 2009, pp. 1067-1077. doi:10.1109/JPROC.2009.2016816
[12] Xiao, X., "Advances in Three-Dimensional Integral Imaging: Sensing, Display, and Applications," Appl. Opt., vol. 52, no. 4, 2013, pp. 546-560. doi:10.1364/AO.52.000546
[13] Wakunami, K., Yamaguchi, M., and Javidi, B., "High-Resolution Three-Dimensional Holographic Display Using Dense Ray Sampling from Integral Imaging," Opt. Lett., vol. 37, no. 24, 2012, pp. 5103-5105. doi:10.1364/OL.37.005103
[14] Haussler, R., Schwerdtner, A., and Leister, N., "Large Holographic Displays as an Alternative to Stereoscopic Displays," Proc. SPIE, vol. 6803, Feb. 29, 2008.