Regular Articles

Simple wave-field rendering for photorealistic reconstruction in polygon-based high-definition computer holography

Author Affiliations
Kyoji Matsushima

Kansai University, Department of Electrical and Electronic Engineering, 3-3-35 Yamate-cho, Suita, Osaka 564-8680, Japan

Hirohito Nishi

Kansai University, Department of Electrical and Electronic Engineering, 3-3-35 Yamate-cho, Suita, Osaka 564-8680, Japan

Sumio Nakahara

Kansai University, Department of Mechanical Engineering, 3-3-35 Yamate-cho, Suita, Osaka 564-8680, Japan

J. Electron. Imaging. 21(2), 023002 (Apr 26, 2012). doi:10.1117/1.JEI.21.2.023002
History: Received September 30, 2011; Revised March 6, 2012; Accepted March 12, 2012

Open Access

Abstract. A simple and practical technique is presented for creating fine three-dimensional (3D) images with polygon-based computer-generated holograms. The polygon-based method is a technique for computing the optical wave-field of a virtual 3D scene given by a numerical model. It takes less computation time than common point-source methods and produces fine spatial 3D images of deep 3D scenes that convey a strong sensation of depth, unlike conventional 3D systems providing only binocular disparity. However, smooth curved surfaces cannot be reconstructed by the polygon-based method itself, because the surfaces are approximated by planar polygons. This problem is resolved by introducing a simple rendering technique that is almost the same as that used in common computer graphics, exploiting the similarity between the polygon-based method and rendering techniques in computer graphics. Two actual computer holograms are presented to verify and demonstrate the proposed technique. One is a hologram of a live face whose shape is measured using a 3D laser scanner that outputs polygon-mesh data. The other is for a scene including the moon. Both are created employing the proposed rendering techniques: texture mapping of real photographs and smooth shading.


In classical holography, the object wave of a real object is recorded on light-sensitive films employing optical interference with a reference wave. The object wave is optically reconstructed through diffraction by the fringe pattern. Therefore, real existing objects are required to create a three-dimensional (3D) image in classical holography. For a long time, it was not possible to create fine synthetic holograms for virtual 3D scenes such as those in modern computer graphics (CG).

Recently, the development of computer technology and new algorithms has made it possible to create brilliant synthetic holograms.1–6 These holograms, whose size can exceed 4 billion pixels, optically reconstruct true spatial images that give continuous motion parallax in both the horizontal and vertical directions without any additional equipment such as polarizing eyeglasses. The reconstructed spatial images provide almost all depth cues, such as disparity, accommodation, occlusion, and convergence. Thus, these computer holograms give viewers a strong sensation of depth that has not been possible with conventional 3D systems providing only binocular disparity. Unfortunately, such computer holograms cannot be reconstructed by current video devices such as liquid crystal displays because of their extremely high definition. However, high-definition holograms presage a future of holographic 3D displays beyond Super Hi-Vision.

The synthetic fringe of a high-definition computer hologram is computed using a new algorithm referred to as the polygon-based method7 instead of conventional point-based methods.8,9 In point-based methods, object surfaces are regarded as being covered with many point sources of light, whereas in the polygon-based method object surfaces are approximated by planar polygons regarded as surface sources of light. Point-based methods are simple but commonly time-consuming; it is almost impossible to create full-parallax high-definition computer holograms of occluded 3D scenes using them, even though many techniques have been proposed to accelerate the computation.10–15 The polygon-based method remarkably speeds up the computation of the synthetic fringe because far fewer polygons than point sources are needed to form a surface. Accordingly, several variations of the polygon-based method have been proposed to compute the fringe pattern even more quickly.16,17 In polygon-based computer holography, the silhouette method is also used to shield light coming from behind an object.1,18,19

A disadvantage of the polygon-based method is that techniques have not been established for photorealistic reconstruction in high-definition holography. Some early high-definition holograms were created employing a basic diffuser model that simply corresponds to the flat shading of CG. As a result, the reconstructed surface is not a smooth curved surface but an angular, faceted surface, and the borders of polygons are clearly perceived in the reconstruction. This problem is peculiar to polygon-based methods. In the early development of CG, the rendering of polygon-mesh objects suffered the same problem, but it was resolved with simple techniques.

Photorealistic reconstruction of computer holograms has been discussed using a generic theoretical model in the literature.20 However, although the discussion is based on a generic model, the fringe patterns are eventually computed with the point-based method. In addition, the surface patches used in that study are assumed to be parallel to the hologram. Thus, it is difficult to apply this model to our polygon-based holograms, in which the objects are composed of slanted patches. In this paper, we present a simple technique for the photorealistic reconstruction of diffuse surfaces in high-definition computer holography. The technique makes use of the similarity of the polygon-based method to conventional CG; owing to this similarity, the proposed technique is simple, fast, and practical. Because of this resemblance to CG, we use the term “rendering” to refer to the computation of the fringe pattern and the creation of 3D images by computer holograms. Since the polygon-based method is a wave-oriented method, unlike the point-based method, the rendering technique is referred to as “wave-field rendering” in this paper.

Two actual high-definition holograms are created to verify the techniques proposed in this paper. One is a 3D portrait; i.e., a hologram that reconstructs the live face of a girl. The polygon mesh of the live face is measured using a 3D laser scanner, and a photograph of the face is texture-mapped onto the polygon mesh with smooth shading. The other is for a scene of the moon floating in a starry sky, in which a real astrophotograph of the moon is mapped onto a polygon-mesh sphere. Both holograms comprise more than 8 billion pixels and give viewing angles of more than 45° horizontally and 36° vertically, and thus reconstruct almost all depth cues. As a result, these holograms reconstruct fine, true spatial 3D images that give a strong sensation of depth to viewers.

The polygon-based method is briefly summarized in this section for convenience of explanation. In the point-based method, spherical waves emitted from point sources are computed and superposed in the hologram plane, as illustrated in Fig. 1(a). A considerably high surface density of point sources, e.g., 10³ to 10⁴ points/mm², is required to create a smooth surface with this method. The computation time is proportional to the product of the number of point sources and the number of pixels in the hologram. Since this product is a gigantic number for high-definition holograms, the computation commonly takes a prohibitively long time. The polygon-based method is illustrated in Fig. 1(b). Object surfaces are composed of many polygons in this method, and each polygon is regarded as a surface source of light whose shape is polygonal. The wave-fields emitted from the slanted polygons are computed by numerical methods based on wave optics. Even though the computation time for an individual polygon is longer than that for a point source, the total computation time of the polygon-based method is remarkably shorter than that of point-based methods, because far fewer polygons than point sources are required to form a surface.
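
For reference, the following is a minimal sketch of the point-based superposition described above, assuming a monochromatic wavelength and hypothetical sampling parameters (none of these values are taken from the paper). Each point source contributes a spherical wave to every hologram pixel, which is why the cost scales with the product of the number of point sources and the number of pixels.

```python
import numpy as np

# Hypothetical sampling parameters (not those of the actual holograms).
wavelength = 632.8e-9        # He-Ne wavelength [m]
pitch = 1.0e-6               # hologram pixel pitch [m]
Nx, Ny = 1024, 1024          # tiny compared with real high-definition holograms
k = 2.0 * np.pi / wavelength

# Hologram pixel coordinates in the z = 0 plane.
x = (np.arange(Nx) - Nx / 2) * pitch
y = (np.arange(Ny) - Ny / 2) * pitch
X, Y = np.meshgrid(x, y)

def point_based_field(points, amplitudes):
    """Superpose spherical waves from point sources onto the hologram plane.

    The cost is O(len(points) * Nx * Ny), which illustrates why the point-based
    approach becomes impractical for billions of pixels.
    """
    field = np.zeros((Ny, Nx), dtype=np.complex128)
    for (px, py, pz), a in zip(points, amplitudes):
        r = np.sqrt((X - px) ** 2 + (Y - py) ** 2 + pz ** 2)
        field += a * np.exp(1j * k * r) / r   # spherical wave from one point source
    return field

# Example: two point sources placed roughly 100 mm behind the hologram.
obj_field = point_based_field([(0.0, 0.0, 0.1), (1e-3, 0.5e-3, 0.12)], [1.0, 0.8])
```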

Fig. 1: Schematic comparison of the point-based method (a) and the polygon-based method (b).

Theoretical Model of Surface Sources of Light

We can see real objects illuminated by a light source because the object surfaces scatter the light, as shown in Fig. 2(a). If we suppose that the surface is composed of polygons and focus on one of them, the polygon can be regarded as a planar distribution of optical intensity in 3D space. This distribution of light is similar to that in a slanted aperture irradiated by a plane wave, as in Fig. 2(b). The aperture has the same shape and slant as the polygon. However, a simple polygonal aperture may not behave as a surface source of light, because the aperture is usually too large to appreciably diffract the incident light; as a result, the light passing through the aperture does not diffuse sufficiently to spread over the whole viewing zone of the hologram. The polygonal surface source should therefore be imitated by a diffuser mounted in the aperture, with the shape and tilt angle of the corresponding polygon. Figure 2(b) shows this theoretical model of a surface source of light for wave-field rendering.

Fig. 2: Theoretical model of polygonal surface sources of light (b) that imitate the surface of an object (a).

Surface Function

To compute the wave-field diffracted by the polygon-shaped diffuser, a surface function is defined for each individual polygon in a local coordinate system that is also specific to that polygon. An example of the surface function is shown in Fig. 3, where the surface function of polygon 2 of a cubic object in (a) is shown in (b). The surface function h_n(x_n, y_n) for polygon n is generally given in the form

$$h_n(x_n, y_n) = a_n(x_n, y_n)\exp[i\phi(x_n, y_n)], \qquad (1)$$

where a_n(x_n, y_n) and φ(x_n, y_n) are the real-valued amplitude and phase distributions defined in the local coordinates (x_n, y_n, 0). The phase pattern φ(x_n, y_n) is in principle not visible, because all image sensors, including the human retina, detect only the intensity of light, whereas the amplitude pattern a_n(x_n, y_n) directly determines the appearance of the polygon. Therefore, the diffuser should be implemented in the phase pattern, while the shape of the polygon should be provided by the amplitude pattern.
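
As an illustration, here is a minimal sketch of constructing a surface function of Eq. (1) on a polygon's local sampling grid; the use of a uniformly random phase as the diffuser is an assumption consistent with the wideband phase pattern described below, not a prescription from the paper.

```python
import numpy as np

def surface_function(amplitude, rng=None):
    """Return h_n = a_n * exp(i * phi) sampled on the polygon's local grid.

    `amplitude` is the real-valued pattern a_n(x_n, y_n) that determines the
    appearance of the polygon; `phi` is a pseudo-random diffuser phase with a
    wideband spectrum so the polygon field spreads over the viewing zone.
    For diffuse rendering the same phase pattern may be reused for all polygons.
    """
    rng = np.random.default_rng() if rng is None else rng
    phi = rng.uniform(0.0, 2.0 * np.pi, size=amplitude.shape)
    return amplitude * np.exp(1j * phi)
```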

Fig. 3: An example of the surface function (b) for polygon 2 of a cubic object (a). The amplitude image of the polygon field (c) after the rotational transform agrees with the shape of the original polygon.

Since this paper discusses the rendering of diffuse surfaces, the phase pattern used for rendering should have a wideband spectrum. In this case, a given single phase pattern can be used for all polygons. Note that if a specular surface is rendered, the spectral bandwidth should be restricted to limit the direction of reflection. Furthermore, the center of the spectrum should be shifted depending on the direction of the polygon.5,21,22

Computation of Polygon Fields

The numerical procedure for computing polygon fields is shown in Fig. 4. The surface function of a polygon, generated from the vertex data of that polygon, can be regarded as a distribution of complex amplitudes; i.e., the wave-field of the surface source of light. However, the surface function is usually given in a plane that is not parallel to the hologram. This means that the polygon field cannot be computed in the hologram plane using conventional techniques for field diffraction. Therefore, the rotational transform of light23,24 is employed to calculate the polygon field in a plane parallel to the hologram. The resultant wave-field is shown in Fig. 3(c). The polygon field after the rotational transform is then propagated over a short distance employing the angular spectrum method25 (AS) or the band-limited angular spectrum method26 (BL-AS), so that all polygon fields composing the object are gathered and integrated in a single plane. This plane is called the object plane; note that the object plane is not the hologram plane. The object plane should not be placed far from the object, because polygon fields spread more the farther they propagate, and thus the computation takes longer. Therefore, the best choice is usually to place the object plane so that it crosses the object.1
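
The short-distance propagation from the rotated polygon plane to the object plane can be written compactly with the angular spectrum method. Below is a minimal sketch; the band limitation of the BL-AS method (Ref. 26) is omitted, and the sampling parameters are placeholders.

```python
import numpy as np

def angular_spectrum_propagate(field, wavelength, pitch, distance):
    """Propagate a sampled complex field over `distance` [m] using the
    angular spectrum method.

    The transfer function is exp(i*kz*d) with kz = 2*pi*sqrt(1/lambda^2 - fx^2 - fy^2);
    evanescent components are suppressed. Band limitation (BL-AS) is omitted here.
    """
    ny, nx = field.shape
    fx = np.fft.fftfreq(nx, d=pitch)
    fy = np.fft.fftfreq(ny, d=pitch)
    FX, FY = np.meshgrid(fx, fy)
    arg = 1.0 / wavelength ** 2 - FX ** 2 - FY ** 2
    kz = 2.0 * np.pi * np.sqrt(np.maximum(arg, 0.0))
    H = np.exp(1j * kz * distance) * (arg > 0)   # propagating components only
    return np.fft.ifft2(np.fft.fft2(field) * H)
```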

Fig. 4: Numerical procedure for computing entire object fields.

The polygon fields gathered in the object plane are then propagated to the hologram, or to the next object plane closer to the hologram, where light coming from behind the object is shielded using the silhouette method.18,19 However, the frame buffer for the whole field in the object plane is commonly too large to be stored at once in computer memory. Therefore, the field is segmented and propagated segment by segment, using off-axis numerical propagation such as the shifted Fresnel method1,27 (Shift-FR) or the shifted angular spectrum method2,28 (Shift-AS).
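
The sketch below illustrates the core of such an off-axis propagation in the spirit of the shifted angular spectrum method: the lateral offset of the destination window appears as a linear phase applied to the spectrum. This is only a schematic version; the band limitation derived in Ref. 28, which is essential in practice to avoid aliasing for large shifts, is omitted, and in an actual segment-by-segment computation the contribution of every source segment would be accumulated into every destination segment.

```python
import numpy as np

def shifted_as_propagate(field, wavelength, pitch, distance, shift_x, shift_y):
    """Propagate `field` over `distance` [m] to a destination window whose center
    is laterally offset by (shift_x, shift_y) [m] from the source window.

    The offset is realized by the linear spectral phase
    exp(i*2*pi*(fx*shift_x + fy*shift_y)). Without the band limitation of the
    shifted angular spectrum method, large shifts cause aliasing; this sketch
    ignores that issue.
    """
    ny, nx = field.shape
    fx = np.fft.fftfreq(nx, d=pitch)
    fy = np.fft.fftfreq(ny, d=pitch)
    FX, FY = np.meshgrid(fx, fy)
    arg = 1.0 / wavelength ** 2 - FX ** 2 - FY ** 2
    kz = 2.0 * np.pi * np.sqrt(np.maximum(arg, 0.0))
    H = np.exp(1j * (kz * distance + 2.0 * np.pi * (FX * shift_x + FY * shift_y)))
    return np.fft.ifft2(np.fft.fft2(field) * H * (arg > 0))
```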

Basic Formulation

In the polygon-based method, polygon fields are emitted toward the hologram along the optical axis. The brightness of the surface observed by viewers is estimated by radiometric analysis7:

$$L_n(x_n, y_n) \simeq \frac{\sigma\, a_n^2(x_n, y_n)}{\pi \tan^2\psi_d \cos\theta_n}, \qquad (2)$$

where a_n(x_n, y_n) is again the amplitude distribution of the surface function, whose sampling density is given by σ. The normal vector of polygon n forms the angle θ_n with the optical axis, as shown in Fig. 5. Here, we assume that the polygon field is spread approximately over the solid angle π tan²ψ_d by the wideband phase distribution φ(x_n, y_n).

Fig. 5: Radiometric model of reconstructed polygon surfaces.

The surface brightness given by relation (2) becomes infinite in the limit θ_n → π/2; i.e., the brightness diverges, because the analysis assumes that the hologram can reconstruct light over an unlimited dynamic range. In real holograms, however, the dynamic range of the reconstructed light is limited, because the fringe patterns are never printed at full contrast of transmittance or reflectance and are commonly quantized; e.g., the fringe pattern is binarized in our case. Therefore, we adopt the following simplified, non-divergent formulas to estimate the brightness:

$$I_n(x_n, y_n) = L_0\,\frac{1 + \gamma}{\cos\theta_n + \gamma}\, a_n^2(x_n, y_n), \qquad (3)$$
$$L_0 = \frac{\sigma}{\pi \tan^2\psi_d}, \qquad (4)$$

where both L_0 and γ are constants. The constant γ is introduced a priori to avoid the divergence of brightness. It should be determined according to the printing process of the fringe pattern; in practice, it is fitted to the optical reconstruction of the fabricated hologram.

As a result, the wave-field rendering of diffuse surfaces is given by

$$a_n(x_n, y_n) = \sqrt{\frac{\cos\theta_n + \gamma}{1 + \gamma}\, I_{\mathrm{shape},n}(x_n, y_n)\, I_{\mathrm{shade},n}(x_n, y_n)\, I_{\mathrm{tex},n}(x_n, y_n)}, \qquad (5)$$

where L_0 ≅ 1, and the brightness I_n(x_n, y_n) of Eq. (3) is replaced by the product of three distributions, I_shape,n(x_n, y_n), I_shade,n(x_n, y_n), and I_tex,n(x_n, y_n), which are given in the local coordinates of each polygon and play the roles of shaping the polygon, and shading and texture-mapping the surface, respectively.
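
A minimal sketch of Eq. (5): given the shaping, shading, and texture distributions sampled on a polygon's local grid, the amplitude of the surface function follows directly. The array shapes and the way cosθ_n and γ are supplied are assumptions for illustration.

```python
import numpy as np

def render_amplitude(i_shape, i_shade, i_tex, cos_theta, gamma):
    """Amplitude a_n(x_n, y_n) of Eq. (5) with L_0 taken as 1.

    i_shape   : 1 inside the polygon, 0 outside (Eq. (6))
    i_shade   : shading distribution of Eq. (7)
    i_tex     : texture distribution mapped onto the polygon
    cos_theta : cosine of the angle between the polygon normal and the optical axis
    gamma     : constant that keeps the brightness finite near grazing angles
    """
    return np.sqrt((cos_theta + gamma) / (1.0 + gamma) * i_shape * i_shade * i_tex)
```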

Shaping and Shading

The shape of a polygon is given by the amplitude pattern of the surface function of Eq. (1). Thus, the distribution that gives the polygon its shape is

$$I_{\mathrm{shape},n}(x_n, y_n) = \begin{cases} 1 & \text{inside the polygon} \\ 0 & \text{outside the polygon}. \end{cases} \qquad (6)$$

In the polygon-based method, the shading technique for the object surface is essentially the same as that of CG. The distribution for shading is given by

$$I_{\mathrm{shade},n}(x_n, y_n) = I_{s,n}(x_n, y_n) + I_{\mathrm{env}}, \qquad (7)$$

where I_env gives the degree of ambient light. When flat shading is used, the distribution I_s,n(x_n, y_n) is a constant given by Lambert's cosine law:

$$I_{s,n}(x_n, y_n) \propto \mathbf{N}_n \cdot \mathbf{L}_i, \qquad (8)$$

where N_n and L_i are the normal vector of polygon n and a unit vector pointing from the surface to the virtual light source of the 3D scene, respectively.
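
For flat shading, Eqs. (7) and (8) reduce to a single value per polygon: a Lambertian term plus the ambient term. A minimal sketch follows; clamping back-facing polygons to zero is a common CG convention assumed here, not a detail specified by the paper.

```python
import numpy as np

def flat_shade(normal, light_dir, i_env):
    """Constant shading value I_shade,n for one polygon (Eqs. (7) and (8)).

    `normal` and `light_dir` are 3-vectors; the Lambert term N.L is clamped
    to zero for polygons facing away from the light, and `i_env` is the
    ambient contribution.
    """
    n = normal / np.linalg.norm(normal)
    l = light_dir / np.linalg.norm(light_dir)
    return max(float(np.dot(n, l)), 0.0) + i_env
```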

Our previous computer holograms such as “The Venus”1 or “Moai I/II”2 were created by flat shading; i.e., using the amplitude pattern given by Eqs. (5) to (8). As a result, the borders of the polygons are visible in the optical reconstruction as shown in Fig. 6. This problem is attributed to the shading technique used.

Fig. 6: Photographs of the optical reconstruction of a polygon-based high-definition computer hologram named “Moai II”.2 The camera is focused on the near moai (a) and the far moai (b).

Well-known smooth-shading techniques of CG, such as Gouraud and Phong shading, are also applicable to wave-field rendering. In these cases, the distribution I_s,n(x_n, y_n) is not a constant but a function of the local coordinates (x_n, y_n). For example, Gouraud shading determines the local brightness within a polygon through linear interpolation of the brightness values at the vertexes of the polygon. Thus, the local brightness is the same on either side of the border between two polygons.
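
A minimal sketch of how such a Gouraud-style distribution I_s,n(x_n, y_n) can be produced: brightness values computed at the three vertexes of a triangle are interpolated linearly over the polygon's local sampling grid using barycentric coordinates. The grid construction and vertex data are hypothetical; only the interpolation itself follows standard Gouraud shading.

```python
import numpy as np

def gouraud_distribution(v0, v1, v2, b0, b1, b2, xg, yg):
    """Linearly interpolate vertex brightnesses b0, b1, b2 over a triangle
    with local-coordinate vertices v0, v1, v2, sampled on the grid (xg, yg).

    Points outside the triangle get 0; they are masked by I_shape,n anyway.
    """
    (x0, y0), (x1, y1), (x2, y2) = v0, v1, v2
    det = (y1 - y2) * (x0 - x2) + (x2 - x1) * (y0 - y2)
    w0 = ((y1 - y2) * (xg - x2) + (x2 - x1) * (yg - y2)) / det
    w1 = ((y2 - y0) * (xg - x2) + (x0 - x2) * (yg - y2)) / det
    w2 = 1.0 - w0 - w1
    inside = (w0 >= 0) & (w1 >= 0) & (w2 >= 0)
    return np.where(inside, w0 * b0 + w1 * b1 + w2 * b2, 0.0)
```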

Figure 7 compares surface functions for flat and smooth shading. The phase distribution for smooth shading in (b) is the same as that for flat shading in (a), but the amplitude distribution in (b) is given by the same technique as used in Gouraud shading in CG. Figure 8 shows the simulated reconstruction of a computer hologram of twin semi-spheres created with flat and Gouraud shading. Here, each semi-sphere is composed of 200 polygons and is 23 mm in diameter, and the hologram dimensions are 65,536×32,768 pixels. The technique of numerical image formation29 based on wave optics was used for the simulated reconstruction. It is verified that the seams of polygons are no longer visible for the left sphere (a), created using the same technique as used in Gouraud shading.

Fig. 7: Examples of the surface function for flat shading (a) and Gouraud shading (b).

Fig. 8: Simulated reconstruction of spheres rendered by Gouraud shading (a) and flat shading (b).

Texture Mapping

In the polygon-based method, texture mapping simply provides the distribution I_tex,n(x_n, y_n) through some projection of the texture image onto the polygon. An example of the surface function for texture mapping is shown in Fig. 9(a). Here, the object is a sphere, and the mapping image is an astrophotograph of the real moon shown in (b). The distribution I_tex,n(x_n, y_n) of polygon n is obtained by simple orthogonal projection and interpolation of the astrophotographic image, as in (c).
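
A minimal sketch of such a texture lookup: the polygon's local sampling points are mapped to texture pixel coordinates and the image is sampled with bilinear interpolation. The mapping function `local_to_texture` is a hypothetical placeholder; its form depends on how the mesh is parameterized (for the moon sphere it would implement the orthogonal projection mentioned above).

```python
import numpy as np

def sample_texture(texture, u, v):
    """Bilinear lookup of a grayscale texture at fractional pixel coordinates (u, v).
    Coordinates are clamped to the image border."""
    h, w = texture.shape
    u = np.clip(u, 0.0, w - 1.001)
    v = np.clip(v, 0.0, h - 1.001)
    u0, v0 = np.floor(u).astype(int), np.floor(v).astype(int)
    du, dv = u - u0, v - v0
    return ((1 - du) * (1 - dv) * texture[v0, u0]
            + du * (1 - dv) * texture[v0, u0 + 1]
            + (1 - du) * dv * texture[v0 + 1, u0]
            + du * dv * texture[v0 + 1, u0 + 1])

def texture_distribution(texture, local_to_texture, xg, yg):
    """I_tex,n(x_n, y_n): project the local sampling grid onto the texture image.

    `local_to_texture` is a hypothetical mapping from local coordinates to
    texture coordinates (u, v), e.g., an orthogonal projection.
    """
    u, v = local_to_texture(xg, yg)
    return sample_texture(texture, u, v)
```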

Fig. 9: (a) Example of the surface function for texture mapping. An astrophotograph (b) is mapped onto the polygon-mesh sphere by orthogonal projection of the mapping image (c).

Two high-definition computer holograms named “The Moon” and “Shion” were created using the proposed techniques. The two holograms have the same number of pixels; that is, approximately nine billion pixels. The parameters used in creating these holograms are summarized in Table 1.

Table 1: Summary of parameters used in creating the computer holograms “The Moon” and “Shion”.

The fringe patterns of the holograms were printed on photoresist coated on ordinary photomask blanks using a DWL-66 laser lithography system (Heidelberg Instruments GmbH). After the photoresist was developed, the chromium thin film was etched using the ordinary photomask fabrication process to form the binary transmittance pattern. As a result, the fabricated holograms have fringes of binary amplitude.

The Moon

The Moon is a computer hologram created using the techniques of texture mapping and flat shading. The 3D scene is shown in Fig. 10. The main object is a sphere composed of 1600 polygons and 55 mm in diameter. The mapping image is again the astrophotograph of the real moon shown in Fig. 9(b). The background of this hologram is not a 2D image but 300 point sources of light. Since this background is intended to appear as stars in space, the positions and amplitudes of these point sources are generated by a random-number generator.
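
A minimal sketch of generating such a background, assuming hypothetical ranges for the positions and amplitudes (the actual values used for the hologram are not given in the text):

```python
import numpy as np

def random_stars(n_stars=300, extent=0.1, depth_range=(0.2, 0.5), seed=None):
    """Return (x, y, z) positions [m] and amplitudes for background star point sources."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(-extent, extent, n_stars)
    y = rng.uniform(-extent, extent, n_stars)
    z = rng.uniform(depth_range[0], depth_range[1], n_stars)
    amp = rng.uniform(0.2, 1.0, n_stars)
    return np.stack([x, y, z], axis=1), amp
```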

The optical reconstruction of The Moon is shown in Fig. 11 and Video 1. Since the light of the background stars is shielded by the silhouette of the moon object, the hologram produces a strong sense of perspective. The seams of polygons are slightly noticeable because of the flat shading of the sphere.

Fig. 11: Optical reconstruction of “The Moon” using transmitted illumination from a He-Ne laser (a) and reflected illumination from an ordinary red LED (b) (Video 1, WMV, 7.8 MB) [URL: http://dx.doi.org/10.1117/1.JEI.21.2.023002.1]. The sphere object is rendered using texture mapping and flat shading.

Shion

Shion is a hologram that reconstructs the live face of a girl. However, the light emitted from the live face was not recorded as in classical holography; instead of recording the wave field of the face, the 3D shape of the face was measured using a 3D laser scanner. The polygon mesh measured with a Konica Minolta Vivid 910 is shown in Fig. 12(a). The photograph taken simultaneously by the 3D scanner was texture-mapped onto the polygon-mesh surface, as shown in (b). The object is shaded using Gouraud shading and placed 10 cm behind the hologram. In addition, a digital illustration is arranged 12 cm behind the face object to form the background.

Fig. 12: Polygon mesh of a live face whose shape is measured using a 3D laser scanner (a), and its rendering with CG using texture mapping (b).

The optical reconstruction of Shion is shown in Fig. 13 (Video 2) and Fig. 14 (Video 3). The seams of polygons are no longer perceived owing to the smooth shading. However, there is an occlusion error at the edge of the face object. This is most likely attributable to the use of a silhouette to mask the field behind the object: since the object shape is complicated, the simple silhouette does not work well for light shielding.

Fig. 13: Optical reconstruction of the polygon-based high-definition computer hologram named “Shion” using transmitted illumination from a He-Ne laser (Video 2, WMV, 8.8 MB) [URL: http://dx.doi.org/10.1117/1.JEI.21.2.023002.2]. The polygon-modeled object is rendered by texture mapping and Gouraud shading. Photographs (a) and (b) are taken from different viewpoints.

Fig. 14: Optical reconstruction of “Shion” using an ordinary red LED (Video 3, WMV, 10.8 MB) [URL: http://dx.doi.org/10.1117/1.JEI.21.2.023002.3].

Simple rendering techniques were proposed for photorealistic reconstruction in polygon-based high-definition computer holography. The polygon-based method has strong similarities to common techniques used in CG. Exploiting this similarity, smooth shading and texture mapping can be applied to rendering surface objects in almost the same manner as in CG. The created high-definition holograms are composed of billions of pixels and reconstruct fine, true 3D images that convey a strong sensation of depth. At this stage these 3D images are produced only as still images, because current video display devices do not have sufficient resolution for optical reconstruction. However, the results presented here indicate what 3D images may be realized beyond Super Hi-Vision.

The authors thank Prof. Kanaya of Osaka University for his assistance in the 3D scan of live faces. The mesh data for the moai objects were provided courtesy of Yutaka Ohtake via the AIM@SHAPE Shape Repository. This work was supported in part by research grants from the JSPS (KAKENHI, 21500114) and Kansai University (Grant-in-aid for Joint Research 2011–2012).

References

1. Matsushima K., Nakahara S., “Extremely high-definition full-parallax computer-generated hologram created by the polygon-based method,” Appl. Opt. 48(34), H54–H63 (2009).
2. Matsushima K., Nakahara S., “High-definition full-parallax CGHs created by using the polygon-based method and the shifted angular spectrum method,” Proc. SPIE 7619, 761913 (2010).
3. Matsushima K., Nakamura M., Nakahara S., “Novel techniques introduced into polygon-based high-definition CGHs,” in OSA Topical Meeting on Digital Holography and Three-Dimensional Imaging, Optical Society of America, JMA10 (2010).
4. Matsushima K. et al., “Computational holography: real 3D by fast wave-field rendering in ultra-high resolution,” in Proc. SIGGRAPH Posters 2010, Association for Computing Machinery (2010).
5. Nishi H. et al., “New techniques for wave-field rendering of polygon-based high-definition CGHs,” Proc. SPIE 7957, 79571A (2011).
6. DigInfo, “Computer-synthesized holograms—the ultimate in 3D images,” http://www.diginfo.tv/2010/07/22/10-0130-r-en.php.
7. Matsushima K., “Computer-generated holograms for three-dimensional surface objects with shade and texture,” Appl. Opt. 44(22), 4607–4614 (2005).
8. Waters J. P., “Holographic image synthesis utilizing theoretical methods,” Appl. Phys. Lett. 9(11), 405–407 (1966).
9. Stein A. D., Wang Z., Leigh J. S., “Computer-generated holograms: a simplified ray-tracing approach,” Comput. Phys. 6(4), 389–392 (1992).
10. Lucente M., “Interactive computation of holograms using a look-up table,” J. Electron. Imaging 2(1), 28–34 (1993).
11. Ritter A. et al., “Hardware-based rendering of full-parallax synthetic holograms,” Appl. Opt. 38(8), 1364–1369 (1999).
12. Matsushima K., Takai M., “Recurrence formulas for fast creation of synthetic three-dimensional holograms,” Appl. Opt. 39(35), 6587–6594 (2000).
13. Yoshikawa H., Iwase S., Oneda T., “Fast computation of Fresnel holograms employing difference,” Proc. SPIE 3956, 48–55 (2000).
14. Masuda N. et al., “Computer generated holography using a graphics processing unit,” Opt. Express 14(2), 603–608 (2006).
15. Ichihashi Y. et al., “HORN-6 special-purpose clustered computing system for electroholography,” Opt. Express 17(16), 13895–13903 (2009).
16. Ahrenberg L. et al., “Computer generated holograms from three dimensional meshes using an analytic light transport model,” Appl. Opt. 47(10), 1567–1574 (2008).
17. Kim H., Hahn J., Lee B., “Mathematical modeling of triangle-mesh-modeled three-dimensional surface objects for digital holography,” Appl. Opt. 47(19), D117–D127 (2008).
18. Matsushima K., Kondoh A., “A wave-optical algorithm for hidden-surface removal in digitally synthetic full-parallax holograms for three-dimensional objects,” Proc. SPIE 5290, 90–97 (2004).
19. Kondoh A., Matsushima K., “Hidden surface removal in full-parallax CGHs by silhouette approximation,” Syst. Comput. Jpn. 38(6), 53–61 (2007).
20. Janda M., Hanák I., Onural L., “Hologram synthesis for photorealistic reconstruction,” J. Opt. Soc. Am. A 25(12), 3083–3096 (2008).
21. Nishi H., Matsushima K., Nakahara S., “A novel method for rendering specular and smooth surfaces in polygon-based high-definition CGH,” in OSA Topical Meeting on Digital Holography and Three-Dimensional Imaging 2011, Optical Society of America, JDWC29 (2011).
22. Nishi H., Matsushima K., Nakahara S., “Rendering of specular surfaces in polygon-based computer-generated holograms,” Appl. Opt. 50(34), H245–H252 (2011).
23. Matsushima K., Schimmel H., Wyrowski F., “Fast calculation method for optical diffraction on tilted planes by use of the angular spectrum of plane waves,” J. Opt. Soc. Am. A 20(9), 1755–1762 (2003).
24. Matsushima K., “Formulation of the rotational transformation of wave fields and their application to digital holography,” Appl. Opt. 47(19), D110–D116 (2008).
25. Goodman J. W., Introduction to Fourier Optics, 2nd ed., Chap. 3.10, McGraw-Hill, New York (1996).
26. Matsushima K., Shimobaba T., “Band-limited angular spectrum method for numerical simulation of free-space propagation in far and near fields,” Opt. Express 17(22), 19662–19673 (2009).
27. Muffoletto R. P., Tyler J. M., Tohline J. E., “Shifted Fresnel diffraction for computational holography,” Opt. Express 15(9), 5631–5640 (2007).
28. Matsushima K., “Shifted angular spectrum method for off-axis numerical propagation,” Opt. Express 18(17), 18453–18463 (2010).
29. Matsushima K., Murakami K., “Numerical image formation and their application to digital holography and computer holography,” to be submitted to Opt. Express.


Kyoji Matsushima received his BE, ME, and PhD degrees in applied physics from Osaka City University, Japan. He joined the Department of Electrical Engineering and Computer Science at Kansai University as a research assistant in 1990. He is currently a professor in the Department of Electrical and Electronic Engineering at the same university. His research interests include 3D imaging based on computer-generated holograms and digital holography, and numerical simulations in wave optics.


Hirohito Nishi received his BE in electrical engineering and computer science and his ME in electrical and electronic engineering from Kansai University, where he is currently a graduate student. His current interests include 3D imaging based on computer-generated holograms.


Sumio Nakahara is an associate professor in the Department of Mechanical Engineering at Kansai University, Japan. He received his PhD from Osaka University in 1987. He joined the Department of Mechanical Engineering at Kansai University as a research assistant in 1974 and held an adjunct professor position at Washington State University, Pullman, Washington, in 1993–1994. His current research interests include the development of laser direct-write lithography technology for computer-generated holograms, laser micro-processing, and MEMS technology.

© 2012 SPIE and IS&T





