Article

Determining Surface Shape of Translucent Objects with the Combination of Laser-Beam-Based Structured Light and Polarization Technique

1 Research Center for Quantum Optics and Quantum Communication, School of Science, Qingdao University of Technology, Qingdao 266525, China
2 Office of Laboratory Management, Qingdao Agricultural University, Qingdao 266109, China
* Author to whom correspondence should be addressed.
These authors contributed equally to this work.
Sensors 2021, 21(19), 6587; https://doi.org/10.3390/s21196587
Submission received: 28 July 2021 / Revised: 23 September 2021 / Accepted: 28 September 2021 / Published: 1 October 2021

Abstract

In this study, we focus on 3D surface measurement and reconstruction of translucent objects. The proposed approach to surface-shape determination combines a projected laser-beam-based sinusoidal structured-light pattern with a polarization technique. Rigorous theoretical analyses are presented, covering the formation, propagation, and physical features of the sinusoidal signal generated by the designed optical system; the reflection and transmission of the projected monochromatic fringe pattern at the surface of the translucent object; and the formation and separation of the direct-reflection and global components of the surface radiance of the observed object. Experiments designed in accordance with these analyses confirm that accurate reconstructions can be obtained from a single one-shot measurement processed with Fourier transform profilometry, because the monochromaticity and linear polarization of the projected sinusoidal signal allow a polarizer and an optical filter to be used simultaneously to remove the global component, i.e., the noise contributed by multiply scattered photons and by background illuminance. Moreover, the study shows that the developed method yields accurate measurements and reconstructions of translucent objects even in the presence of background illumination, which has long been a challenging issue for 3D surface measurement and reconstruction of translucent objects.

1. Introduction

In situ measurement and accurate reconstruction of translucent objects is attractive but extremely challenging. Generally speaking, only the direct surface reflection can be used for determining the 3D shape of a translucent object. However, the physical nature of translucent materials means that the direct-reflection signal is usually quite weak, while most of the illumination passes through the interface and undergoes multiple scattering. The surface reflectivity of a translucent object can be lower than 5%, e.g., for an object made of silicone rubber with the real part of the refractive index $n_r = 1.5$. Optical triangulation has been considered a popular and promising technique in this area. However, as discussed by Holroyd and Lawrence [1], it is difficult to carry out 3D shape measurement of translucent objects using optical triangulation, i.e., structured-light-illumination methods, because subsurface scattering introduces uncertainties into the reflection measured at the object surface. Optical triangulation rests on the assumption that the direct reflection takes place at the object surface, and this assumption may be violated when measuring translucent objects.
The basic strategy of previous investigations that measure translucent objects with optical triangulation is to isolate the direct reflectance at the object surface [1,2,3,4]. The separation of the direct and global components was initially investigated by Nayar et al. [5] and has been cited, discussed, and adopted in later works [6,7]. In the approach of Nayar et al., the global lighting effect is treated as constant when a high-frequency pattern is projected, which allows the observed light intensities to be separated efficiently into direct and global parts [5]. In general, however, separating the direct and global components is difficult, and the technique developed by Nayar et al. requires many images and may also be affected by noise accumulated over the whole process [5]. To address this, Lockerman and co-authors proposed a method using three directions of projection [8].
Polarization-difference imaging [9,10] and the method combining phase-shifting and polarization filtering [11] are considered promising approaches for improving 3D surface measurement of translucent objects. Recently, Xu and co-authors proposed an approach for 3D shape measurement of translucent objects based on phase-shifting fringe projection profilometry [12]. Although the phase-shifting method can perform the separation of the direct and global components and the 3D surface measurement simultaneously [5,11], phase-shifting measurements impose strict requirements on the surrounding environment, such as the absence of background illumination and environmental vibration. To reduce the measuring error in 3D scans of translucent objects using projected fringe patterns [13], a number of novel investigations have been carried out, including approaches based on multiple scattering [14], the photometric stereo method for optically thick translucent objects [15], Monte Carlo simulation [16], and the Fourier single-pixel imaging technique [17]. For binary boundary methods, exclusively high- or low-frequency pattern schemes are considered robust against different global illumination effects [18]. Note that edges in images of translucent objects are very different from edges in images of opaque objects [19], since internal scattering within the translucent object can create a variety of image effects that depend mainly on the shape and material of the observed object as well as the optical configuration of the measurement.
Existing approaches to measuring translucent objects with optical triangulation are essentially based on the phase-shifting technique and on visible-light fringe patterns generated by a DMD-chip-based digital-light-processing (DLP) projector. However, the phase-shifting technique imposes rigorous requirements on the observed object and the environment, i.e., there must be no environmental vibration and the object must remain still during the measurement. As for the structured light generated by a DLP projector, the intensity of the projected fringe pattern is affected by the gamma effect and by defocusing, which can lead to inaccuracies in image processing and reconstruction. Moreover, with projected visible-light fringe patterns it is essentially impossible to carry out measurements under background illumination and still obtain accurate reconstructions. It is therefore difficult to perform outdoor and/or in situ measurements of translucent objects using optical triangulation together with the phase-shifting technique and DLP-based visible-light fringe projection, owing to environmental vibration and background illumination.
Therefore, to obtain an improved solution to the challenge of separating the direct and global components when measuring translucent objects, and to overcome the drawbacks of the phase-shifting technique with DLP-based fringe patterns, we propose and investigate a new method that combines monochromatic structured light, optical filtering and polarization techniques, and an image-processing algorithm based on one-shot Fourier transform profilometry.

2. Theoretical Analysis and Mathematical Description of the Proposed Method

This section discusses the physical process of surface-shape measurement and reconstruction of a translucent object using the sinusoidal fringe pattern generated by the developed optical system. It contains the theoretical analyses and mathematical descriptions of (i) the generation and propagation of the sinusoidal fringe pattern in air, (ii) the projection and reflection of the fringe pattern on the surface of the observed object, and (iii) the multiple-scattering process of the transmitted sinusoidal optical signal beneath the surface of the translucent object and its contribution to the radiance measured by the CCD camera.

2.1. Generation and Propagation of the Sinusoidal Fringe Signal

As discussed in the Introduction, the proposed method of surface-shape measurement and reconstruction of a translucent object combines the projection of monochromatic structured light, optical filtering and polarization techniques, and an image-processing algorithm based on Fourier transform profilometry. Profilometry based on either the Fourier-transform approach or the phase-shifting method relies on the projection of sinusoidal fringe patterns. The widely used sinusoidal fringe pattern generated by a DLP projector is a non-monochromatic structured signal, which makes it impossible to employ optical filtering and polarization techniques with such a pattern. To generate monochromatic, high-contrast, truly sinusoidal fringe patterns, we have designed and developed the laser-beam-based optical system shown in Figure 1a. It consists of the following parts: a CW laser source S with wavelength λ = 532 nm, a rectangular grating G, a Fourier-transform positive lens L with focal length f, an adjustable spatial-frequency filter F, the observation plane P on which the target T is placed, and a CCD camera C connected to a computer. As sketched in Figure 1b, the fundamental components of the sinusoidal optical signal generator are the CW laser source S, the grating at plane $P_0$, the Fourier-transform positive lens at plane $P_1$, the adjustable spatial-frequency filter at plane $P_2$, and the observation plane at $P_3$.
To analyze the generation and propagation of the sinusoidal fringe signal, we start with the field from a CW point laser source S, $U(r)$, in the form [20]
$$U(r) = \frac{A}{r}\, e^{ikr}, \qquad (1)$$
where k is the wave number in the homogeneous background medium (air in this study), A is a complex constant related to the laser power, and $r = \sqrt{x^2 + y^2 + z^2}$. Taking $A = 1$ for simplicity and using the paraxial approximation, the field $U_0(x_0, y_0)$ behind the grating G at plane $P_0$ is given by [21]
$$U_0(x_0, y_0) = \frac{1}{Z_0}\, e^{ikZ_0}\, e^{\frac{ik}{2Z_0}(x_0^2 + y_0^2)}\, t_0(x_0, y_0), \qquad (2)$$
where $x_0$ and $y_0$ are the coordinate variables of $P_0$, $Z_0$ is the distance between the laser source S and the grating G, and $t_0(x_0, y_0)$ is the transmission function of G. Since G is one-dimensional, $t_0(x_0, y_0)$ can be described by [22]
$$t_0(x_0, y_0) = t_0(x_0) = \left[\mathrm{rect}\!\left(\frac{x_0}{a}\right) * \frac{1}{d}\,\mathrm{comb}\!\left(\frac{x_0}{d}\right)\right] \mathrm{circ}\!\left(\frac{2|x_0|}{H}\right), \qquad (3)$$
where a (open-slit width) and d (period) are the optical parameters of grating G, and H is the grating width, which is assumed, for simplicity, to equal the diameter of the laser illumination spot.
Since the propagation of the field $U_0(x_0, y_0)$ from plane $P_0$ to plane $P_1$ is within the Fresnel region, the field $U_1(x_1, y_1)$ in front of the lens L at plane $P_1$ can be written as
$$U_1(x_1, y_1) = C_1 \iint_{\Sigma_0} U_0(x_0, y_0)\, e^{\frac{ik}{2Z_1}\left[(x_1 - x_0)^2 + (y_1 - y_0)^2\right]}\, dx_0\, dy_0, \qquad (4)$$
where $C_1 = \frac{1}{i\lambda Z_1}\, e^{ikZ_1}$ is a complex constant and $Z_1$ is the distance between planes $P_0$ and $P_1$. The field $U_1'(x_1, y_1)$ just behind the lens L at plane $P_1$ can be represented as
$$U_1'(x_1, y_1) = U_1(x_1, y_1)\, t_L(x_1, y_1) = U_1(x_1, y_1)\, e^{-\frac{ik}{2f}(x_1^2 + y_1^2)}, \qquad (5)$$
where $t_L(x_1, y_1) = e^{-\frac{ik}{2f}(x_1^2 + y_1^2)}$ denotes the transmission function of the lens L.
The propagation of the field $U_1'(x_1, y_1)$ at plane $P_1$ to $U_2(x_2, y_2)$ at plane $P_2$ can be computed accurately using Fresnel diffraction. $U_2(x_2, y_2)$ denotes the field in front of the spatial-frequency filter F and is given by
$$U_2(x_2, y_2) = C_2 \iint_{\Sigma_1} U_1'(x_1, y_1)\, e^{\frac{ik}{2Z_2}\left[(x_2 - x_1)^2 + (y_2 - y_1)^2\right]}\, dx_1\, dy_1, \qquad (6)$$
where $C_2 = \frac{1}{i\lambda Z_2}\, e^{ikZ_2}$ is a complex constant and $Z_2$ is the distance between planes $P_1$ and $P_2$. Combining Equations (2)–(6) together with the expression for $t_L(x_1, y_1)$, we obtain
$$U_2(\omega) = \frac{aC_3}{d} \sum_{m=-\infty}^{\infty} \mathrm{sinc}\!\left(\frac{am}{d}\right) \frac{J_1\!\left[\pi H (\omega - m/d)\right]}{\omega - m/d}, \qquad (7)$$
where $J_1(\omega)$ is the Bessel function of the first kind, $\omega = 2\pi f_x$ with the spatial frequency $f_x$ at plane $P_2$ defined by $f_x = \frac{x_2}{\lambda f}$, and $C_3$ is a complex constant given by $C_3 = \frac{1}{\lambda f Z_0}\, e^{i\left[k(Z_0 + Z_1 + Z_2) - \pi/2\right]}$.
The role of the adjustable spatial-frequency filter F at plane $P_2$ is to select the $\pm m$-th order components of the spectrum described in Equation (7) and let only them pass; here we take $m = 1$. The field $U_2'(x_2, y_2)$ just behind the filter F then takes the form
$$U_2'(x_2, y_2) = U_2'(\omega) = U_2(\omega)\big|_{m = \pm 1} = \frac{aC_3}{d}\, \mathrm{sinc}\!\left(\frac{a}{d}\right) \left\{ \frac{J_1\!\left[\pi H (\omega - 1/d)\right]}{\omega - 1/d} + \frac{J_1\!\left[\pi H (\omega + 1/d)\right]}{\omega + 1/d} \right\}. \qquad (8)$$
The propagation of the field $U_2'(x_2, y_2)$ from plane $P_2$ to $P_3$ can also be analyzed using Fresnel diffraction, so the field $U_3(x_3, y_3)$ at plane $P_3$ is given by
$$U_3(x_3, y_3) = C_4 \iint_{\Sigma_2} U_2'(x_2, y_2)\, e^{\frac{ik}{2Z_3}\left[(x_3 - x_2)^2 + (y_3 - y_2)^2\right]}\, dx_2\, dy_2, \qquad (9)$$
where $C_4 = \frac{1}{i\lambda Z_3}\, e^{ikZ_3}$ is another complex constant and $Z_3$ is the distance between planes $P_2$ and $P_3$. Since the size of $\Sigma_2(x_2, y_2)$ (≤4 mm in diameter) is much smaller than that of $\Sigma_3(x_3, y_3)$, i.e., the spot size of the sinusoidal fringe pattern (≥100 mm in diameter), we use the approximation $\lambda f (f_x^2 + f_y^2) \ll 2(f_x x_3 + f_y y_3)$ in the further derivation of Equation (9). Equation (9) thus becomes
$$U_3(x_3, y_3) = C_5 \iint_{\Sigma_2} U_2'(f_x, f_y)\, e^{i 2\pi \left(\frac{f_x f x_3}{Z_3} + \frac{f_y f y_3}{Z_3}\right)}\, df_x\, df_y = C_5\, \mathcal{F}^{-1}\!\left\{U_2'(f_x, f_y)\right\}, \qquad (10)$$
where the complex parameter $C_5$ is given by $C_5 = \frac{\lambda f^2}{Z_3}\, e^{i(kZ_3 - \pi/2)}\, e^{\frac{ik}{2Z_3}(x_3^2 + y_3^2)}$.
Equation (10) indicates that the field at the observation plane $P_3$ is an inverse Fourier transform of the field output from the spatial-frequency filter F. Combining Equations (8) and (10), we have
$$U_3(x_3) = C_6 \cdot \mathrm{circ}\!\left(\frac{2f|x_3|}{H Z_3}\right) \cdot \cos\!\left(\frac{2\pi f x_3}{d Z_3}\right), \qquad (11)$$
$$C_6 = \frac{2f}{Z_0 Z_3 \pi}\, \sin\!\left(\frac{a\pi}{d}\right)\, e^{i\left[k(Z_0 + Z_1 + Z_2 + Z_3) - \pi\right]}\, e^{\frac{ik}{2Z_3}(x_3^2 + y_3^2)}. \qquad (12)$$
As indicated in Equation (12), $C_6$ is a complex constant for fixed distances $Z_0$ and $Z_3$, and the modulus of $C_6$ represents the amplitude of the generated sinusoidal signal. Since f and $Z_0$ are generally fixed parameters, the fringe intensity decreases as $Z_3$ increases. The factor $\mathrm{circ}\!\left(\frac{2f|x_3|}{H Z_3}\right)$ in Equation (11) restricts the fringe pattern to a circle of radius $\frac{H Z_3}{2f}$. The term $\cos\!\left(\frac{2\pi f x_3}{d Z_3}\right)$ is the key part of the output of the designed optical system, i.e., the expression of a sinusoidal curve.
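To make the result of Equations (11) and (12) concrete, the short numerical sketch below (our illustration; the parameter values of f, a, d, H, $Z_0$, and $Z_3$ are assumed, not the authors' exact configuration) evaluates the fringe intensity $|U_3(x_3)|^2$ on the observation plane. It exhibits the circular extent of radius $HZ_3/(2f)$ and an intensity-fringe frequency of $2f/(dZ_3)$, which falls in the range of a few line pairs per millimetre used later in the experiments.

import numpy as np

# Illustrative parameters (assumed values, not the authors' exact setup)
f   = 0.15        # focal length of lens L (m)
a   = 50e-6       # open-slit width of grating G (m)
d   = 100e-6      # period of grating G (m)
H   = 4e-3        # illuminated width of the grating (m)
Z0  = 0.20        # source-to-grating distance (m)
Z3  = 1.00        # filter-to-observation-plane distance (m)

x3 = np.linspace(-0.05, 0.05, 4001)                    # coordinate on plane P3 (m)

radius    = H * Z3 / (2.0 * f)                         # extent set by circ(2 f |x3| / (H Z3))
envelope  = (np.abs(x3) <= radius).astype(float)
carrier   = np.cos(2.0 * np.pi * f * x3 / (d * Z3))    # sinusoidal term of Eq. (11)
amplitude = (2.0 * f / (Z0 * Z3 * np.pi)) * np.sin(a * np.pi / d)   # |C6| from Eq. (12)

intensity = (amplitude * envelope * carrier) ** 2      # recorded fringe intensity ~ |U3|^2

print("fringe-pattern radius on P3: %.1f mm" % (radius * 1e3))
print("intensity fringe frequency : %.1f lp/mm" % (2.0 * f / (d * Z3) / 1e3))

In this sketch, doubling $Z_3$ halves the fringe frequency and reduces the peak intensity by a factor of four, consistent with the discussion following Equation (12) and in Section 3.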

2.2. Reflection and Transmission of the Projected Sinusoidal Fringe Signal on the Surface of the Translucent Object

The proposed method of surface-shape measurement of translucent objects relies on the projection of laser-beam-based structured light, i.e., the monochromatic, high-contrast, truly sinusoidal optical signal described by Equation (11). Note that the projected fringe pattern of Equation (11) is not affected by defocusing, whereas the widely used DLP projector does suffer from this problem. $U_3(x_3)$ given in Equation (11) is coherent and linearly polarized before it reaches the surface of the translucent object. As shown in Figure 2, the projected sinusoidal fringe signal undergoes reflection and transmission at the surface of a translucent object and multiple scattering beneath the surface.
We see from Figure 2 that the total radiance measured by the detector consists of $I_d$, the directly reflected radiance; $I_i$, the radiance of inter-reflection; $I_s$, the contribution from multiply scattered photons; and $I_b$, the contribution of background illuminance. In the proposed method, $I_d$ is the only component used for surface-shape reconstruction of the observed translucent object, which raises the challenging issue of separating $I_d$ from the remaining parts $I_i$, $I_s$, and $I_b$ of the total measured radiance. The physical characteristics of $I_d$, $I_i$, $I_s$, and $I_b$ can be summarized as follows.
(i) $I_d$: The directly reflected radiance $I_d$ is the reflected portion of the projected fringe pattern and is diffuse in nature. Since the surface properties of the observed object vary, some fraction of the direct reflection might be depolarized. For a fixed projection energy, $I_d$ depends mainly on the real part of the refractive index of the observed object, in accordance with the Fresnel equations for reflection and transmission. For the translucent objects used in this work, the real part of the refractive index is $n_r$ = 1.4–1.5, which indicates that $I_d$ is only about 3–4% of $I_p$, the intensity of the projected fringe pattern. Nevertheless, $I_d$ is the only part that is picked up and used for reconstruction of the observed translucent objects.
(ii) $I_i$: Based on the Fresnel equations, two points follow. First, the polarization direction of $I_i$ differs from that of $I_d$. Second, as shown in Figure 2, the dominant part of $I_i$ consists of inter-reflections of the incident signal; however, the magnitude of $I_i$ is about two orders lower than the incident signal, because the surface reflectivity of the observed translucent object is R = 2.8–5.3% under the assumption that the real part of its refractive index lies in the range $n_r$ = 1.4–1.6 [23,24] (a normal-incidence check of these values is sketched after this list).
(iii) $I_s$: The contribution to the measured radiance from multiply scattered photons, $I_s$, is unpolarized. Apart from $I_d$, the remainder of the projected signal is transmitted through the surface, undergoes multiple scattering within the translucent object, and exits from its surface. $I_s$ can be estimated using the diffusion approximation [25,26]; the approach validated for calculating $I_s$ is a one-dimensional model based on radiative transfer theory [26]. We therefore believe it would be difficult to complete a 3D reconstruction based on a one-dimensional model of $I_s$.
(iv) $I_b$: The background illuminance $I_b$ is unpolarized and polychromatic. Note that if the projected signal is also polychromatic, the surface measurement of the translucent object should be carried out in a dark environment to keep $I_b = 0$; otherwise it would be very difficult, or extremely time-consuming, to separate $I_d$ from $I_b$.
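As a minimal numerical check of the reflectivity figures quoted in items (i) and (ii), the normal-incidence Fresnel reflectance $R = \left(\frac{n_r - 1}{n_r + 1}\right)^2$ of a non-absorbing dielectric in air (an idealization that neglects oblique incidence and absorption) reproduces the quoted 2.8–5.3% range:

def fresnel_reflectance_normal(n_r):
    # Normal-incidence Fresnel reflectance of a non-absorbing dielectric in air
    return ((n_r - 1.0) / (n_r + 1.0)) ** 2

for n_r in (1.4, 1.5, 1.6):
    print("n_r = %.1f -> R = %.1f %%" % (n_r, 100.0 * fresnel_reflectance_normal(n_r)))
# n_r = 1.4 -> R = 2.8 %,  n_r = 1.5 -> R = 4.0 %,  n_r = 1.6 -> R = 5.3 %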
It should be noted that, for a steady projection of the sinusoidal fringe pattern on a static translucent object, $I_i$ and $I_s$ are time-independent, while $I_b$ is time-dependent since the background illumination is always varying. For a dynamic translucent object, all of $I_i$, $I_s$, and $I_b$ become time-dependent. However, such time dependence has no effect on the developed method, which is one of the reasons we expect the method to be applicable to the measurement of dynamic translucent objects.

2.3. Separating $I_d$ from $I_g$

Referring to the general definition of the global component previously suggested by Nayar and co-workers [5], the total radiance measured by the detector is given by
$$I = I_d + I_g, \qquad (13)$$
where $I_d$ and $I_g$ stand for the direct and the global components, respectively. In the model discussed above, the global component $I_g$ contains the following three parts:
$$I_g = I_b + I_i + I_s, \qquad (14)$$
where $I_b$ is the background illumination, $I_i$ represents the radiance of inter-reflection, and $I_s$ stands for the contribution from multiply scattered photons.
Therefore, if the proposed method can remove the three components of $I_g$, i.e., $I_b$, $I_i$, and $I_s$, the desired signal $I_d$ can be measured successfully, which is critically important for the 3D surface measurement and reconstruction of translucent objects.

2.3.1. Removing the Effect of $I_b$ Using an Optical Filter

The component $I_b$ related to background illumination is an important issue for in situ profilometry based on either the Fourier-transform or the phase-shifting method, and it is the main reason that most previous investigations carried out their experiments in a dark environment to keep $I_b = 0$.
In the proposed approach, we use laser-beam-based monochromatic structured light at 532 nm as the projected fringe pattern, which makes it possible to employ optical filtering during the measurement. In practice, an optical bandpass filter centered at 532 nm is mounted in the experimental setup to remove the effect of background illumination.
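A rough estimate illustrates why such a filter is effective. Assuming, for illustration only, a spectrally flat background over the 400–700 nm visible band and a Gaussian filter transmission with 10 nm FWHM centered at 532 nm, only a few percent of the broadband background reaches the sensor, while the 532 nm laser line is transmitted essentially unattenuated:

import numpy as np

wl = np.linspace(400e-9, 700e-9, 3001)            # visible band, uniform grid (m)
center, fwhm = 532e-9, 10e-9                      # filter parameters used in this work
sigma = fwhm / (2.0 * np.sqrt(2.0 * np.log(2.0)))
T = np.exp(-0.5 * ((wl - center) / sigma) ** 2)   # idealized Gaussian filter transmission

# Fraction of a flat (white) background spectrum transmitted by the filter
background_fraction = float(np.mean(T))
print("broadband background transmitted: %.1f %%" % (100.0 * background_fraction))  # ~3.5 %
print("laser line at 532 nm transmitted: ~100 %")

In this idealized model the background is suppressed by roughly a factor of 30 relative to the laser signal.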

2.3.2. Eliminating the Effect of $I_i$ and $I_s$ Using a Polarizer

There have been efforts to base 3D reconstruction on the multiply scattered photons, i.e., $I_s$, such as the work by Ohtani et al. [14]. However, as discussed above, theoretical modeling and computation based on radiative transfer and diffusion theory for reconstructing the observed object is difficult, since only a one-dimensional model has been theoretically developed and experimentally validated [26], and it is practically impossible to describe the observed objects with a one-dimensional model. In the work of Umeyama and Godin, a method for separating the diffuse and specular components of surface reflection using polarization and statistical analysis of images was proposed, but it was validated only for opaque objects [27].
In the developed method, the desired signal is $I_d$, and $I_s$ is treated as noise to be eliminated. The component $I_s$ is eliminated using the polarization technique, which works because the incident monochromatic structured light and the signal reflected from the surface of the observed translucent object are linearly polarized, whereas the multiply scattered photons are completely unpolarized. In practice, a polarizer is mounted on the camera, and the measurement is carried out with the fringe patterns generated by our laser-beam-based optical system, as sketched in Figure 1.
As for $I_i$, since its magnitude is much lower than that of $I_d$ and its polarization direction differs from that of $I_d$, as discussed above, $I_i$ can also be suppressed when a polarizer is employed.
The rotation angle of the polarizer needs to be carefully set before imaging the observed object. Note that, for the same object and the same direction of fringe projection, re-rotating the polarizer is basically unnecessary, because the polarization direction of $I_d$ remains unchanged. When measuring another object, or projecting the fringes from a new direction, the polarization direction of $I_d$ will be different, so rotating the polarizer becomes a necessary step of the measurement. A quantitative sketch of the polarizer's effect is given below.
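The effect of the polarizer on the different radiance components can be summarized with Malus's law: the linearly polarized $I_d$ is transmitted as $I_d \cos^2\theta$, where θ is the angle between the polarizer axis and the polarization direction of $I_d$, while ideally unpolarized light ($I_s$ and any residual $I_b$) is attenuated to one half at every angle. The sketch below is an idealized illustration with hypothetical relative intensities, not a model fitted to our data:

import numpy as np

def transmitted(I_d, I_unpol, theta_rad):
    # Malus's law for the polarized direct component plus 1/2 of the unpolarized part
    return I_d * np.cos(theta_rad) ** 2 + 0.5 * I_unpol

I_d, I_unpol = 1.0, 3.0                      # hypothetical: scattered light dominates
theta = np.deg2rad(np.arange(0, 181, 15))
direct_to_global = (I_d * np.cos(theta) ** 2) / (0.5 * I_unpol)

best = np.rad2deg(theta[np.argmax(direct_to_global)])
print("best polarizer angle w.r.t. polarization of I_d: %.0f deg" % best)      # 0 deg
print("direct-to-global ratio: no polarizer %.2f, aligned polarizer %.2f"
      % (I_d / I_unpol, direct_to_global.max()))                                # 0.33 -> 0.67

Aligning the polarizer with the polarization direction of $I_d$ therefore doubles the weight of the direct component relative to the unpolarized background, which is why the rotation angle must be re-adjusted whenever the object or the projection direction changes.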

3. Technical Analysis of Measurement and Reconstruction

In this investigation, the experimental setup for surface measurements of translucent objects is based on Figure 1a, in which the parts S, G, L, and F are the main components of the sinusoidal optical signal generator. The laser-beam-based optical system employs a CW laser source S with wavelength λ = 532 nm, a maximum output power of 200 mW, and a small divergence angle (≤1.5°). A rectangular grating G illuminated by the laser beam is placed at the front focal point of a positive lens L. An adjustable spatial-frequency filter F is set on the conjugate plane of S; this part plays a critical role in the system. The filter contains a V-shaped aperture with a width of 0.5 mm carved with high precision, so that the selected order of the spatial frequency can be chosen and allowed to pass through. The output sinusoidal fringe pattern can then be observed on the observation plane P, as indicated in Figure 1a. The design of this adjustable spatial-frequency filter F, containing three accurately cut slits, is based on our previous work on measuring the modulation transfer function [28] and on other recent studies of 3D surface measurement [29,30].
Note that the position of the observation plane P is arbitrary, e.g., from a few tens of centimeters to a few tens of meters, since the projected fringe pattern is not affected by defocusing, as discussed above. At any value of $Z_3$ in Figure 1b, we obtain the monochromatic, high-contrast, truly sinusoidal optical signal described by Equation (11); the only differences are the spatial frequency of the projected fringe pattern and the maximum fringe intensity, which is inversely proportional to the square of $Z_3$.
With the proposed approach, reconstructions of translucent objects were obtained for background illuminance up to E = 3000 Lux, which is made possible by an optical bandpass filter centered at 532 nm with a FWHM of 10 nm that removes the effect of background illumination. The background illuminance E was adjusted using LED lights and measured with a digital luxmeter placed beside the observation plane P. Meanwhile, as discussed above, a polarizer is employed to eliminate the inter-reflected photons at the surface of the translucent object and the multiply scattered photons exiting from beneath its surface. Both the optical bandpass filter and the polarizer are mounted on the measuring camera.
Reconstructions of the observed translucent objects are processed from these measurements using Fourier transform profilometry (FTP). The FTP method was introduced by Takeda and Mutoh [31] and has been applied and improved in many investigations of 3D surface measurement [32,33,34,35]. However, applications of FTP have been limited to nearly ideal conditions: the surface of the measured object must be opaque, highly reflective, and diffuse, and there should be no background illumination. The quality of the surface-reflectance measurement therefore provides the necessary basis for the image-processing algorithm that retrieves the phase and the 3D surface height of the observed translucent object. The image-processing algorithm of the proposed approach is based on FTP with a typical triangulation framework of 3D surface measurement; the general framework can be found in the works of Takeda and Mutoh [31] and of Maurel and co-workers [32].
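Since the reconstruction follows standard one-shot FTP [31,32], the core phase-retrieval step can be sketched as follows. This is our illustrative implementation under common simplifying assumptions (the function names, the rectangular band-pass window, and the calibration constants L, D, and f0 are placeholders, not the authors' code): the fringe image is Fourier transformed, one sideband around the carrier frequency is isolated, the wrapped phase is taken from the inverse transform, and the height follows from the usual phase-to-height triangulation relation.

import numpy as np

def ftp_wrapped_phase(fringe_img, carrier_col, half_width):
    # One-shot FTP core step: isolate one spectral sideband around the carrier
    # (fringes assumed to vary along the column axis) and return the wrapped phase.
    spectrum = np.fft.fftshift(np.fft.fft2(fringe_img))
    window = np.zeros_like(spectrum)
    window[:, carrier_col - half_width:carrier_col + half_width + 1] = 1.0
    analytic = np.fft.ifft2(np.fft.ifftshift(spectrum * window))
    return np.angle(analytic)                      # wrapped phase in (-pi, pi]

# Typical usage (object and flat-reference images, carrier position c, window half-width w):
# delta_phi = np.unwrap(ftp_wrapped_phase(obj_img, c, w)
#                       - ftp_wrapped_phase(ref_img, c, w), axis=1)
# For a crossed-optical-axes triangulation geometry with camera distance L, baseline D,
# and reference-plane fringe frequency f0 (all calibration constants):
# height = L * delta_phi / (delta_phi - 2.0 * np.pi * f0 * D)

In practice a smoother (e.g., Hanning) window and a robust 2D phase-unwrapping algorithm are normally preferred over the rectangular window and the row-wise np.unwrap shown here.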

4. Experimental Results and Discussion

Two objects were measured in this study: (i) a heptahedron-shaped target, similar to an elongated triangular dipyramid, made in our lab of translucent room-temperature-vulcanized silicone rubber via two-component addition molding, as shown in Figure 3a,b; and (ii) a translucent plastic cap made of polypropylene, as shown in Figure 3c. The real part of the refractive index of vulcanized silicone rubber at 532 nm is about $n_r$ = 1.41 [23], and that of the plastic cap is about $n_r$ = 1.47–1.49 [24]. The designed heptahedron-shaped target has a set of given geometrical parameters, which is necessary for the validation study and for determining the accuracy of the proposed approach.

4.1. Measurement and Reconstruction of a 3D Translucent Object Using Optical Filter

As discussed above, the approach proposed in this work uses projected laser-beam-based monochromatic fringe patterns, which allows an optical bandpass filter to be used for removing the effect of background illumination. Figure 4 shows the measurement and reconstruction of a translucent heptahedron-shaped object made of vulcanized silicone rubber under a background illuminance of E = 202 Lux, with the spatial frequency of the projected fringe pattern set to 2.2 lp/mm.
All other conditions and technical parameters, such as the laser output power, the background illuminance E, the spatial frequency of the projected fringe pattern, and the camera settings, were the same for the measurements in Figure 4a,b, except that the optical filter was used in Figure 4b but not in Figure 4a. Comparing Figure 4a,b, the background illuminance of E = 202 Lux lowered the fringe contrast and increased the noise, and the distorted fringes on the surface of the measured target were almost submerged by the background illumination in Figure 4a, which led to the failed reconstruction shown in Figure 4c. Under the same environmental and experimental conditions, the measurement of Figure 4b using the optical filtering technique showed greatly improved fringe contrast with a much lower noise level, which ensured the accurate reconstruction of the target shown in Figure 4d. Clearly, combining the monochromaticity of the projected structured signal with optical filtering preserves the high contrast of the projected fringe pattern, which is critically important for in situ profilometry under background illumination. The fringes appear invisible in Figure 4b because of the relatively high spatial frequency (2.2 lp/mm); they can be seen clearly when the image is enlarged by a factor of 5, as shown in Figure 4f.

4.2. Measurement and Reconstruction of a 3D Translucent Object Using Optical Polarizer

To verify the polarization of the signal reflected from the surface of the translucent object, we carried out the measurements shown in Figure 5a,b under the following conditions: (i) the background illuminance was E = 0, so the optical filter was not needed; (ii) the spatial frequency of the projected fringe pattern was set to 4.0 lp/mm. The optical polarizer was not used in Figure 5a, while the measurement of Figure 5b was obtained using the optical polarizer with its polarization direction suitably adjusted.
Keeping the background illuminance at zero allowed us to study and assess the effect of multiply scattered photons, whose intensity depends mainly on the thickness of the silicone rubber and the reflectivity of its bottom surface. Because of these photons, the image of Figure 5a looks brighter and more blurred than that of Figure 5b; consequently, the measurement of Figure 5b shows higher fringe contrast and yields a more accurate reconstruction of the observed object, as can be seen by comparing Figure 5d with Figure 5c.
Thus, the experimental results confirm the theoretical prediction discussed above that the effect of multiply scattered photons can be removed with an optical polarizer, based on the fact that the projected monochromatic structured light and the signal reflected from the surface of the translucent object are linearly polarized, whereas the multiply scattered photons are completely unpolarized.

4.3. Measurement and Reconstruction of 3D Translucent Objects under Different Background Illumination

Building on the results of Section 4.1 and Section 4.2, we now present two sets of measurements and reconstructions of the two selected translucent objects under different background illumination, with both the optical filter and the polarizer used simultaneously.
As shown in Figure 6, the first object was a translucent heptahedron-shape object made of vulcanized silicone, and the spatial frequency of the projected fringe pattern used in this part of the experiment was 4.0 lp/mm. We see from Figure 6a–d that the effects of background illumination and multiply-scattered photons can be mostly removed, but not completely.
The relatively high brightness of the image in Figure 6d, measured with a background illuminance of E = 3000 Lux, indicates that the fringe contrast decreases as the background illuminance increases. The results of Figure 6 indicate that the reconstruction accuracy at E = 196 Lux is the same as in the E = 0 Lux case, and the accuracy at E = 1000 Lux is basically satisfactory. For the reconstruction at E = 3000 Lux shown in Figure 6h, the inaccuracy is mainly due to the decrease in fringe contrast caused by the very high background illuminance. Note that the fringe contrast of Figure 6h can be improved by increasing the output power of the laser source, which has been validated by preliminary experimental results.
For the second selected object, a translucent plastic cap made of polypropylene, the measurement and reconstruction were carried out with the spatial frequency of the projected fringe pattern being 1.7 lp/mm as shown in Figure 7.
The purpose of the experiment with the polypropylene plastic cap is that it differs from the first object in transparency, refractive index, shape, and surface smoothness. The spatial frequency of the projected fringe pattern used in Figure 7 was 1.7 lp/mm, and the intensity of the projected fringe pattern on the surface of the object was lower than that used in Figure 6.
Figure 7a,c,e show the measurements for background illuminance E = 0, 50, and 180 Lux, respectively, while Figure 7b,d,f show the corresponding reconstructions. Generally speaking, the reconstructions were accurate, as judged by comparing the reconstructed 3D shape with the geometrical parameters of the original object. Note that we were unable to obtain an accurate reconstruction when the background illuminance was increased to E = 1000 Lux, as was possible in Figure 6c, most likely because the lower intensity of the projected fringe pattern on the object surface yielded lower fringe contrast at E = 1000 Lux.

4.4. Analysis and Discussion of the Accuracy of Reconstruction

The analysis of the reconstruction accuracy of the proposed method is based on the measurement and reconstruction of the designed heptahedron-shaped target with given geometrical parameters. Figure 8a,b show the measurement and the reconstruction, respectively, of the heptahedron-shaped vulcanized silicone rubber under the following experimental conditions: (i) the target was illuminated by the monochromatic sinusoidal fringe pattern at 532 nm; (ii) the spatial frequency of the projected fringe pattern was 2.2 lp/mm; (iii) the background illuminance was E = 202 Lux; (iv) both the optical filter and the polarizer were used for the measurement of Figure 8a.
Figure 8c compares the retrieved height distribution on the cross-section perpendicular to the ridge with the actual values, and Figure 8d gives the corresponding absolute-error distribution of the retrieved surface height. The reconstruction accuracy is obtained from this comparison between the retrieved height distribution and the actual geometrical parameters of the observed target. We should point out that the individual point of largest deviation in Figure 8d is due to a small hump or hole on the object surface, since the translucent heptahedron-shaped silicone-rubber object made in our lab does not have a perfect geometrical shape as designed.
To quantitatively evaluate the difference between the known and the retrieved geometry of the observed object, we employ the root mean square error (RMSE) as the accuracy measure; the lower the RMSE, the better the reconstruction. The RMSE values calculated from the data of Figure 4e, Figure 5e, Figure 6i, and Figure 8d are listed in Table 1 below. From Table 1, we see that RMSE = 0.09–0.11 mm can be obtained under usual indoor background illumination when both the optical filter and the polarizer are used in the developed method. When the background illumination is increased, as in the cases of Figure 6c,d, RMSE = 0.16–0.23 mm may still be acceptable for many applications.
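For reference, the RMSE values in Table 1 follow the usual definition over the compared cross-section; a minimal snippet (array names are placeholders) is:

import numpy as np

def rmse(h_retrieved, h_reference):
    # Root mean square error between retrieved and designed surface heights (in mm here)
    h_retrieved = np.asarray(h_retrieved, dtype=float)
    h_reference = np.asarray(h_reference, dtype=float)
    return float(np.sqrt(np.mean((h_retrieved - h_reference) ** 2)))

# Example: print(rmse(retrieved_cross_section, designed_profile))  # ~0.09-0.11 mm indoors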
The results of Figure 8 and Table 1 indicate that an accurate reconstruction can be obtained from a 3D surface-shape measurement of a translucent object using the combination of laser-beam-based structured light and the polarization technique, with the optical filtering technique also included, and working well, when background illumination is present.
The success of the proposed method for 3D measurement and reconstruction of translucent objects rests on the following items: (i) high-contrast, high-quality sinusoidal fringe projection; (ii) the polarization technique; and (iii) the optical filtering technique. Note that the structured light projected from the laser source is monochromatic and, in general, polarized, so both optical filtering and polarization can be used for measurement and image processing as proposed in this investigation. By contrast, since the optical signal generated by a DLP-based projector is non-monochromatic and unpolarized, our approach cannot be extended to methods and applications based on structured light from a DLP projector.
It should be noted that the accuracy of 3D measurement and reconstruction of translucent objects is determined mainly by the intensity, contrast, and spatial frequency of the projected fringes and by the background illumination. The optical parameters, including the spatial frequency of the projected fringes, should be optimized according to the measuring conditions. The intensity and contrast of the projected fringes are related to the output power of the laser beam and to the background illumination, and a higher laser output power may be necessary when the background illuminance is higher.

5. Conclusions

Based on the theoretical analyses and experimental results of this study, we have demonstrated that reliable measurements and accurate reconstructions of the 3D surface shapes of translucent objects can be obtained with the combination of laser-beam-based fringe projection, the polarization technique, and the optical filtering technique. The rigorous theoretical analyses presented here describe the formation, propagation, and physical features of the sinusoidal signal generated by the designed optical system; the reflection and transmission of the projected monochromatic fringe pattern at the surface of the translucent object; and the formation and separation of the direct-reflection and global components of the surface radiance of the observed object. The designed sinusoidal optical signal generator produces monochromatic, high-contrast, truly sinusoidal, and linearly polarized fringe patterns, which is the critical basis of this study. The polarization technique is essential in the developed method for eliminating the effect of multiply scattered photons, provided the projected fringe pattern is generated by the designed sinusoidal optical signal generator. The optical filtering technique is important and necessary for removing the effect of background illumination, since the projected fringe pattern is a monochromatic signal. It should be noted that the optical system used in this work is portable and stable, and can be used for outdoor or in situ measurement and reconstruction of translucent objects. The reliable measurements and accurate reconstructions of the heptahedron-shaped vulcanized silicone rubber illuminated by the monochromatic sinusoidal fringe pattern at 532 nm under background illuminance E = 0–1000 Lux shown in Figure 6, together with the error analysis presented in Figure 8, demonstrate the success of the developed approach. Expected applications of the developed method include accurate in situ shape determination of 3D translucent objects and related target recognition under general environmental conditions.

Author Contributions

Conceptualization, B.C. and P.S.; methodology, B.C. and P.S.; software, P.S. and H.M.; investigation, Y.W. and Y.X.; validation, Y.W. and R.W.; data curation, Y.W. and C.Z.; formal analysis, H.M. and P.C.; preparing figures, P.S.; writing—original draft preparation, B.C.; writing—review and editing, B.C. and P.S. All authors have read and agreed to the published version of the manuscript.

Funding

This work was partially supported by the Shandong Province Higher Educational Science and Technology Program under grant number J18KZ012, the National Natural Science Foundation of China under grant number 11975132, the Natural Science Foundation of Shandong Province under grant number ZR2016FB09, and the Natural Science Foundation of Shandong Province for Excellent Young Scholars under grant number ZR2019YQ01.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The data presented in this study are available on request from the corresponding author.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Holroyd, M.; Lawrence, J. An analysis of using high-frequency sinusoidal illumination to measure the 3D shape of translucent objects. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Colorado Springs, CO, USA, 20–25 June 2011; pp. 2985–2991. [Google Scholar] [CrossRef]
  2. Ihrke, I.; Kutulakos, K.N.; Lensch, H.P.A.; Magnor, M.; Heidrich, W. Transparent and specular object reconstruction. Comput. Graph. Forum 2010, 29, 2400–2426. [Google Scholar] [CrossRef]
  3. Park, J.; Sabharwal, A.; Veeraraghavan, A. Direct-global separation for improved imaging photoplethysmography. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition Workshops, Salt Lake City, UT, USA, 18–22 June 2018; pp. 1375–1384. [Google Scholar] [CrossRef]
  4. Subpa-Asa, A.; Fu, Y.; Zheng, Y.; Amano, T.; Sato, T. Separating the direct and global components of a single image. J. Inf. Process. 2018, 26, 755–767. [Google Scholar] [CrossRef] [Green Version]
  5. Nayar, S.K.; Krishnan, G.; Grossberg, M.D.; Raskar, R. Fast separation of direct and global components of a scene using high frequency illumination. ACM Trans. Graph. 2006, 25, 935–944. [Google Scholar] [CrossRef]
  6. Rao, L.; Da, F. Local blur analysis and phase error correction method for fringe projection profilometry systems. Appl. Opt. 2018, 57, 4267–4276. [Google Scholar] [CrossRef]
  7. Torii, M.; Okabe, T.; Amano, T. Multispectral direct-global separation of dynamic scenes. In Proceedings of the 19th IEEE Winter Conference on Applications of Computer Vision, Waikoloa Village, HI, USA, 8–10 January 2019; pp. 1923–1931. [Google Scholar] [CrossRef]
  8. Lockerman, Y.D.; Brenner, S.; Lanzone, J.; Doronin, A.; Rushmeier, H. Testing spatial patterns for acquiring shape and subsurface scattering properties. Electron. Imaging 2016, 2016, 1–7. [Google Scholar] [CrossRef]
  9. Rowe, M.P.; Pugh, E.N.; Tyo, J.S.; Engheta, N. Polarization-difference imaging: A biologically inspired technique for observation through scattering media. Opt. Lett. 1995, 20, 608–610. [Google Scholar] [CrossRef]
  10. Tyo, J.S.; Rowe, M.P.; Pugh, E.N., Jr.; Engheta, N. Target detection in optically scattering media by polarization-difference imaging. Appl. Opt. 1996, 35, 1855–1870. [Google Scholar] [CrossRef] [PubMed]
  11. Chen, T.; Lensch, H.P.A.; Fuchs, C.; Seidel, H.-P. Polarization and phase-shifting for 3D scanning of translucent objects. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Minneapolis, MN, USA, 17–22 June 2007; pp. 1829–1836. [Google Scholar] [CrossRef] [Green Version]
  12. Xu, Y.; Zhao, H.; Jiang, H.; Li, X. High-accuracy 3D shape measurement of translucent objects by fringe projection profilometry. Opt. Express 2019, 27, 18421–18434. [Google Scholar] [CrossRef]
  13. Lutzke, P.; Kuhmstedt, P.; Notni, G. Measuring error compensation on three-dimensional scans of translucent objects. Opt. Eng. 2011, 50, 063601. [Google Scholar] [CrossRef]
  14. Ohtani, K.; Li, L.; Baba, M. Determining surface shape of translucent objects by using laser rangefinder. In Proceedings of the 2013 IEEE/SICE International Symposium on System Integration, Kobe, Japan, 15–17 December 2013; pp. 454–459. [Google Scholar] [CrossRef]
  15. Inoshita, C.; Mukaigawa, Y.; Matsushita, Y.; Yagi, Y. Surface normal deconvolution: Photometric stereo for optically thick translucent objects. Lect. Notes Comput. Sci. 2014, 8690, 346–359. [Google Scholar] [CrossRef]
  16. Lutzke, P.; Heist, S.; Kuhmstedt, P.; Kowarschik, R.; Notni, G. Monte Carlo simulation of three-dimensional measurements of translucent objects. Opt. Eng. 2015, 54, 084111. [Google Scholar] [CrossRef]
  17. Jiang, H.; Zhai, H.; Xu, Y.; Li, X.; Zhao, H. 3D shape measurement of translucent objects based on Fourier single-pixel imaging in projector-camera system. Opt. Express 2019, 27, 33564–33574. [Google Scholar] [CrossRef] [PubMed]
  18. Gupta, M.; Agrawal, A.; Veeraraghavan, A.; Narasimhan, S.G. A practical approach to 3D scanning in the presence of inter-reflections, subsurface scattering and defocus. Int. J. Comput. Vis. 2013, 102, 33–55. [Google Scholar] [CrossRef]
  19. Gkioulekas, I.; Walter, B.; Adelson, E.H.; Bala, K.; Zickler, T. On the appearance of translucent edges. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Boston, MA, USA, 7–12 June 2015; pp. 5528–5536. [Google Scholar] [CrossRef] [Green Version]
  20. Goodman, J.W. Introduction to Fourier Optics, 2nd ed.; McGraw-Hill: New York, NY, USA, 1996; Chapter 3. [Google Scholar]
  21. Reichelt, A.; Storck, E.; Wolff, U. Near field diffraction pattern behind a sinusoidal phase grating. Opt. Commun. 1971, 3, 169–172. [Google Scholar] [CrossRef]
  22. Harvey, J.E.; Pfisterer, R.N. Understanding diffraction grating behavior: Including conical diffraction and Rayleigh anomalies from transmission gratings. Opt. Eng. 2019, 58, 087105. [Google Scholar] [CrossRef]
  23. Smolka, F.M.; Hill, H.A. Room temperature vulcanizing silicone rubber as an optical element. Appl. Opt. 1977, 16, 292–293. [Google Scholar] [CrossRef] [PubMed]
  24. Shackelford, J.F. Introduction to Materials Science for Engineers; McGraw-Hill: New York, NY, USA, 2000. [Google Scholar]
  25. Ishimaru, A. Wave Propagation and Scattering in Random Media; Academic: New York, NY, USA, 1978. [Google Scholar]
  26. Chen, B.; Stamnes, K.; Stamnes, J.J. Validity of the diffusion approximation in bio-optical imaging. Appl. Opt. 2001, 40, 6356–6366. [Google Scholar] [CrossRef]
  27. Umeyama, S.; Godin, G. Separation of diffuse and specular components of surface reflection by use of polarization and statistical analysis of images. IEEE Trans. Pattern Anal. Mach. Intell. 2004, 26, 639–647. [Google Scholar] [CrossRef]
  28. Li, H.; Chen, B.; Feng, K.; Ma, H. Modulation transfer function measurement method for fiber optic imaging bundles. Opt. Laser Technol. 2008, 40, 415–519. [Google Scholar] [CrossRef]
  29. Chen, B.; Li, H.; Yue, J.; Shi, P. Fourier-transform-based surface measurement and reconstruction of human face using the projection of monochromatic structured light. Sensors 2021, 21, 2529. [Google Scholar] [CrossRef]
  30. Chen, B.; Gao, H.; Li, H.; Ma, H.; Gao, P.; Chu, P.; Shi, P. Indoor and outdoor surface measurement of 3D objects under different background illuminations and wind conditions using laser-beam-based sinusoidal fringe projections. Photonics 2021, 8, 178. [Google Scholar] [CrossRef]
  31. Takeda, M.; Mutoh, K. Fourier transform profilometry for the automatic measurement of 3-D object shapes. Appl. Opt. 1983, 22, 3977–3982. [Google Scholar] [CrossRef] [PubMed]
  32. Maurel, A.; Cobelli, P.; Pagneux, V.; Petitjeans, P. Experimental and theoretical inspection of the phase-to-height relation in Fourier transform profilometry. Appl. Opt. 2009, 48, 380–392. [Google Scholar] [CrossRef] [PubMed]
  33. Zappa, E.; Busca, G. Static and dynamic features of Fourier transform profilometry: A review. Opt. Laser Eng. 2012, 50, 1140–1151. [Google Scholar] [CrossRef]
  34. Yun, H.; Li, B.; Zhang, S. Pixel-by-pixel absolute three-dimensional shape meausrement with Fourier transform profilometry. Appl. Opt. 2017, 56, 1472–1480. [Google Scholar] [CrossRef]
  35. Wang, Z.; Zhang, Z.; Gao, N.; Xiao, Y.; Gao, F.; Jiang, X. Single-shot 3D shape measurement of discontinuous objects based on a coaxial fringe projection system. Appl. Opt. 2019, 58, A169–A178. [Google Scholar] [CrossRef]
Figure 1. The developed laser-beam-based optical system. (a) Experimental setup. S: CW laser source; G: grating; L: Fourier-transform positive lens; F: spatial-frequency filter; P: observation plane; T: target on the observation plane; C: CCD camera connected to a computer; LED: LED light for accurately adjusting the background illuminance. (b) Sketch of the sinusoidal optical signal generator.
Figure 2. Diagram of the projection, reflection, and transmission of the sinusoidal optical signal on the surface of a translucent object. $I_p$ is the intensity of the projected fringe pattern; $I_d$ stands for the directly reflected radiance; $I_i$ represents the radiance of inter-reflection; $I_s$ denotes the contribution from multiply scattered photons.
Figure 3. Original photos of selected translucent objects being measured. (a) Heptahedron-shape vulcanized silicone rubber. (b) The same object shown in (a) on white paper with printed letters. (c) Translucent plastic cap.
Figure 4. Measurement and reconstruction of the heptahedron-shape vulcanized silicone rubber illuminated by monochromatic sinusoidal fringe pattern at 532 nm under background illuminance E = 202 Lux. (a) Image without optical filtering. (b) Image with an optical bandpass filter centered at 532 nm with FWHM being 10 nm. (c) Reconstruction using the measurement of (a). (d) Reconstruction using the measurement of (b). (e) The absolute-error distribution of retrieved surface height on the cross section perpendicular to the ridge with actual values in accordance with data of (d). (f) The enlarged figure of the marked blue-rectangle region in (b).
Figure 5. Measurement and reconstruction of the heptahedron-shape vulcanized silicone rubber illuminated by monochromatic sinusoidal fringe pattern at 532 nm without background illumination. (a) Image without an optical polarizer. (b) Image with an optical polarizer. (c) Reconstruction using the measurement of (a). (d) Reconstruction using the measurement of (b). (e) Comparing the absolute-error distribution of retrieved surface height on the cross section perpendicular to the ridge with actual values based on the data in (c,d).
Figure 6. Measurement and reconstruction of the heptahedron-shape vulcanized silicone rubber illuminated by monochromatic sinusoidal fringe pattern at 532 nm under different background illuminance E. (a) Measurement of E = 0. (b) Measurement of E = 196 Lux. (c) Measurement of E = 1000 Lux. (d) Measurement of E = 3000 Lux. (e) Reconstruction using (a). (f) Reconstruction using (b). (g) Reconstruction using (c). (h) Reconstruction using (d). (i) Comparing the absolute-error distribution of retrieved surface height on the cross section perpendicular to the ridge with actual values based on the data in (e–h).
Figure 7. Measurement and reconstruction of the translucent plastic cap with different background illuminance E. (a) Measurement of E = 0 . (b) Reconstruction using (a). (c) Measurement of E = 50 Lux. (d) Reconstruction using (c). (e) Measurement of E = 180 Lux. (f) Reconstruction using (e).
Figure 8. Measurement and reconstruction of the heptahedron-shape vulcanized silicone rubber illuminated by monochromatic sinusoidal fringe pattern at 532 nm under background illuminance E = 202 Lux. (a) Image of the illuminated target. (b) Reconstruction using the measurement of (a). (c) Comparing the retrieved height distribution on the cross section perpendicular to the ridge with actual values. (d) The absolute-error distribution of retrieved surface height in accordance with (c).
Table 1. Comparison of RMSE of different reconstructions.

Figure      Color of Data    RMSE (mm)
Figure 4e   red              0.18
Figure 5e   blue             0.09
Figure 5e   red              0.28
Figure 6i   black            0.10
Figure 6i   green            0.11
Figure 6i   red              0.16
Figure 6i   blue             0.23
Figure 8d   red              0.11
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
