In laser triangulation measurement, high reflectivity and complex exposure environments can cause defects such as highlights, artifacts, and noise in the captured laser stripes, degrading the accuracy of stripe center extraction. To address this problem, a U-Net structure integrated with additive attention gates is proposed. The network uses contextual semantics to learn weights for the effective regions of the stripe, implicitly suppressing irrelevant areas in the stripe image. In addition, an improved Steger algorithm is proposed that uses the grayscale centroid method to filter out invalid center points along the stripe direction. Experimental results show that, compared with a traditional small-sample fully convolutional network, the attention U-Net extracts the stripe centerline more accurately, with a mean squared error (MSE) of only 5.3544 pixels, a peak signal-to-noise ratio (PSNR) of 40.8437 dB, and a structural similarity index (SSIM) of 0.9801. The improved Steger algorithm also effectively corrects extraction deviations at the stripe edges.
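The grayscale centroid idea referenced above can be sketched as a per-column intensity-weighted mean of row positions; the column-wise scan and the threshold value below are illustrative assumptions, not details from the paper:

```python
import numpy as np

def grayscale_centroid_centers(stripe, threshold=30.0):
    """Estimate the laser-stripe center in each image column as the
    grayscale centroid (intensity-weighted mean of row indices).

    `stripe` is a 2D grayscale array. Columns whose peak intensity falls
    below `threshold` carry no reliable stripe signal and are skipped,
    which is one simple way to filter out invalid center points.
    """
    rows = np.arange(stripe.shape[0], dtype=float)
    centers = {}
    for col in range(stripe.shape[1]):
        profile = stripe[:, col].astype(float)
        if profile.max() < threshold:
            continue  # no stripe in this column: treat as invalid
        # keep only pixels above the threshold so background noise
        # does not bias the centroid
        weights = np.where(profile >= threshold, profile, 0.0)
        centers[col] = float(weights @ rows / weights.sum())
    return centers
```

The centroid gives sub-pixel center estimates; in the paper's pipeline such estimates serve to reject Steger center points that deviate from the stripe direction, rather than to replace the Steger extraction itself.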
With the development of three-dimensional (3D) shape measurement technology, fringe projection has become an effective and reliable measurement method owing to its robustness against environmental disturbance and its ease of operation. The traditional fringe projection setup, with one projector and one camera, must project multiple fringe patterns to obtain absolute phases, whereas online measurement demands high measuring speed. In this work, a second camera is introduced to form a binocular structured light system and extend the field of view. In addition, to improve measurement speed, a triangular wave is superposed on the sinusoidal fringe to form a composite pattern that assists phase unwrapping with only three projected images. During phase unwrapping, geometric constraints are used to find candidate corresponding points in the images captured by the left and right cameras; the unique corresponding point is then selected using the embedded triangular wave, and the period order is determined. Several experiments show that the proposed method takes only one third of the time of the conventional measurement and can thus realize fast 3D measurement by structured light.
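One plausible reading of the superposed pattern is sketched below: a triangular wave of the stripe period is added identically to each of three phase-shifted sinusoids, so it cancels in the standard three-step phase calculation while remaining available (e.g., from the image sum) to label periods. The amplitudes, period count, and image width are assumed values, not the paper's:

```python
import numpy as np

def superposed_fringes(width=1024, periods=16, n_shifts=3,
                       a=128.0, b=60.0, c=30.0):
    """Generate n_shifts phase-shifted sinusoidal fringes with a
    triangular wave of the same period added to each (a sketch with
    assumed amplitudes a, b, c)."""
    x = np.arange(width)
    phase = 2 * np.pi * periods * x / width      # carrier phase
    frac = (periods * x / width) % 1.0           # position within a period
    tri = c * (1.0 - 2.0 * np.abs(frac - 0.5))   # triangular wave in [0, c]
    shifts = 2 * np.pi * np.arange(n_shifts) / n_shifts
    return [a + b * np.cos(phase - s) + tri for s in shifts]
```

Because the triangular term is identical in all three images, it drops out of both the numerator and denominator of the three-step wrapped-phase formula phi = atan2(sqrt(3) * (I2 - I3), 2*I1 - I2 - I3), so the sinusoidal phase is recovered exactly while the extra wave still encodes per-period information.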
The traditional method of line laser stripe extraction binarizes the camera image with a fixed threshold, but the apparent width of the laser line varies markedly across regions of different color, because the line laser is absorbed to different degrees by differently colored surfaces of the part under measurement. In regions whose color is similar to the laser's, the traditional method may fail to identify the laser line at all, leaving serious gaps in the point cloud obtained by line structured light scanning. This paper proposes an adaptive threshold method for laser stripe extraction. First, a modified HSV space is introduced, and the characteristics of the laser stripes on cardboard of different colors are analyzed in this space. Next, a new high-contrast image is synthesized from the quantized stripe characteristics. Finally, neighborhood filtering is applied and the Steger method is employed to extract the center of the laser stripe. Experimental results show that the proposed method segments the laser stripe from the background better than the traditional algorithm, laying a good foundation for computing the laser stripe center and hence improving the measuring capability of laser structured light.
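The adaptive thresholding step can be illustrated with a mean-based local threshold over an integral image; this is a generic sketch of the idea of adapting the threshold to local brightness, and does not reproduce the paper's modified HSV construction (block size and offset are assumed):

```python
import numpy as np

def adaptive_threshold(gray, block=15, offset=10.0):
    """Binarize `gray` by comparing each pixel against the mean of its
    block x block neighborhood plus `offset`. The neighborhood sums are
    computed with an integral image so the cost is independent of the
    block size. A sketch of adaptive thresholding in general, not the
    paper's specific method."""
    g = gray.astype(float)
    pad = block // 2
    padded = np.pad(g, pad, mode="edge")
    # integral image with a leading zero row/column so that any block
    # sum is a four-corner difference
    ii = np.zeros((padded.shape[0] + 1, padded.shape[1] + 1))
    ii[1:, 1:] = padded.cumsum(0).cumsum(1)
    h, w = g.shape
    block_sum = (ii[block:block + h, block:block + w]
                 - ii[:h, block:block + w]
                 - ii[block:block + h, :w]
                 + ii[:h, :w])
    local_mean = block_sum / (block * block)
    return g > local_mean + offset
```

Unlike a single global threshold, the local mean tracks the background brightness of each color region, so a stripe that is dim on dark surfaces can still stand out against its own neighborhood.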