In this contribution we introduce an imaging-based measurement setup that can measure 3D relative positions between moving light sources with very high accuracy. The system consists of two high-speed cameras, one equipped with a telecentric lens and the other with an endocentric lens. To improve the accuracy of image-based position detection, each lens is augmented with a computer-generated hologram (CGH) that replicates a single object point into a predefined pattern of spots on the image plane. By averaging the centers of all replications, noise and other error contributions are reduced. We show how to apply image processing using two different approaches: the first is a tracking algorithm running on a CPU at 330 fps; the second is an FPGA implementation that processes whole images at 390 fps. Furthermore, we demonstrate how three-dimensional calibration can be performed using the Nanomeasurement and Nanopositioning Machine NPMM-200. For the calibration, a three-dimensional multivariate polynomial is used. The standard deviations of the residual errors in object space for a calibration volume of 100 mm × 100 mm × 24 mm are σx = 0.367 μm, σy = 0.373 μm and σz = 0.437 μm (polynomial order = 9).
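The calibration step described above, fitting a three-dimensional multivariate polynomial that maps measured coordinates to reference coordinates, can be sketched as a linear least-squares problem over a trivariate monomial basis. The following is a minimal illustration only, not the authors' implementation: all function names, the order-2 demo (the paper uses order 9), and the synthetic quadratic distortion are assumptions made here for brevity.

```python
import numpy as np

def monomial_exponents(order):
    """All exponent triples (i, j, k) with i + j + k <= order,
    one per monomial x^i * y^j * z^k of the trivariate basis."""
    return [(i, j, k)
            for i in range(order + 1)
            for j in range(order + 1 - i)
            for k in range(order + 1 - i - j)]

def design_matrix(pts, order):
    """Evaluate every basis monomial at each 3D point (rows of pts)."""
    x, y, z = pts[:, 0], pts[:, 1], pts[:, 2]
    return np.column_stack([x**i * y**j * z**k
                            for i, j, k in monomial_exponents(order)])

def fit_calibration(measured, reference, order):
    """Least-squares polynomial coefficients mapping measured -> reference."""
    A = design_matrix(measured, order)
    coeffs, *_ = np.linalg.lstsq(A, reference, rcond=None)
    return coeffs

def apply_calibration(coeffs, pts, order):
    """Apply a fitted polynomial calibration to new measured points."""
    return design_matrix(pts, order) @ coeffs

# Synthetic demo (invented data): a smooth quadratic distortion,
# recovered here by an order-2 fit on points in a unit volume.
rng = np.random.default_rng(seed=0)
measured = rng.uniform(0.0, 1.0, size=(500, 3))
reference = measured + 0.01 * measured**2   # assumed distortion model
coeffs = fit_calibration(measured, reference, order=2)
residuals = apply_calibration(coeffs, measured, order=2) - reference
```

In practice the residual standard deviations per axis (as quoted in the abstract) would be computed from `residuals` on a held-out set of reference positions; in this synthetic demo the distortion lies exactly in the basis, so the fit residuals are at numerical precision.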