Imagery acquired with modern imaging systems is susceptible to a variety of degradations, including blur from the point
spread function (PSF) of the imaging system, aliasing from undersampling, blur and warping from atmospheric
turbulence, and noise. A variety of image restoration methods have been proposed that estimate an improved image by
processing a sequence of these degraded images. In particular, multi-frame image restoration has proven to be a powerful tool for atmospheric turbulence mitigation (TM) and super-resolution (SR). However, these
degradations are rarely addressed simultaneously using a common algorithm architecture, and few TM or SR solutions
are capable of performing robustly in the presence of true scene motion, such as moving dismounts. Still fewer TM or
SR algorithms have found their way into practical real-time implementations. In this paper, we describe a new L-3 joint
TM and SR (TMSR) real-time processing solution and demonstrate its capabilities. The system employs a recently
developed versatile multi-frame joint TMSR algorithm that has been implemented using a real-time, low-power FPGA
processor system. The L-3 TMSR solution can accommodate a wide spectrum of atmospheric conditions and can
robustly handle moving vehicles and dismounts. This novel approach unites previous work in TM and SR and also
incorporates robust moving object detection. To demonstrate the capabilities of the TMSR solution, we present results using field test data captured across a range of turbulence levels, optical configurations, and applications. We also report the performance of the hardware implementation and identify specific insertion paths into tactical sensor systems.
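The abstract does not disclose the L-3 TMSR algorithm itself, so the following is only a minimal, generic sketch of the multi-frame fusion idea such a system builds on: register a short burst of frames to a reference, temporally fuse the static background to suppress turbulence-induced warping and noise, and exclude pixels flagged as true scene motion so that movers are not smeared. The phase-correlation registration, the difference-based mover mask, and all function names below are illustrative assumptions, not the fielded implementation.

```python
import numpy as np

def register_translation(ref, frame):
    """Estimate a whole-frame translational shift via phase correlation.
    (A fielded TM/SR system estimates much richer local warps; this global
    shift is only a stand-in to make the fusion loop below concrete.)"""
    R = np.fft.fft2(ref) * np.conj(np.fft.fft2(frame))
    corr = np.fft.ifft2(R / (np.abs(R) + 1e-12)).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    if dy > ref.shape[0] // 2:
        dy -= ref.shape[0]
    if dx > ref.shape[1] // 2:
        dx -= ref.shape[1]
    return int(dy), int(dx)

def fuse_frames(frames, motion_thresh=15.0):
    """Temporally average registered frames, excluding pixels flagged as
    true scene motion so that moving objects are not smeared by the fusion."""
    ref = np.asarray(frames[0], dtype=float)
    accum = np.zeros_like(ref)
    count = np.zeros_like(ref)
    for f in frames:
        aligned = np.asarray(f, dtype=float)
        dy, dx = register_translation(ref, aligned)
        aligned = np.roll(np.roll(aligned, dy, axis=0), dx, axis=1)
        static = np.abs(aligned - ref) <= motion_thresh   # crude mover mask
        accum[static] += aligned[static]
        count[static] += 1
    fused = ref.copy()                 # moving pixels keep the reference value
    ok = count > 0
    fused[ok] = accum[ok] / count[ok]
    return fused
```

A production chain would replace the global shift with local warp estimation, follow the fusion with a super-resolution/deconvolution stage, and pipeline the whole process for FPGA throughput; those details are not given in the abstract and are outside the scope of this sketch.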
This paper presents a novel approach to target tracking using a measurement process based on spatio-temporal fractal
error. Moving targets are automatically detected using one-dimensional temporal fractal error. A template derived from
the two-dimensional spatial fractal error is then extracted for a designated target to allow for correlation-based template
matching in subsequent frames. The outputs of both the spatial and temporal fractal error components are combined and
presented as input to a kinematic tracking filter. It is shown that combining the two outputs improves tracking performance in the presence of noise, occlusion, and other moving objects, as well as when the target of interest stops moving.
Furthermore, reconciliation of the spatial and temporal components also provides a useful mechanism for detecting
occlusion and avoiding template drift, a problem typically present in correlation-based trackers. Results are
demonstrated using airborne MWIR sequences from the DARPA VIVID dataset.
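The abstract does not spell out the template extraction or the fusion logic, so the sketch below only illustrates, under assumed details, how a correlation-based template match can be gated and fed to a kinematic filter (here a constant-velocity Kalman filter): a low match score is treated as possible occlusion, and the track coasts on the prediction rather than updating with a bad match and drifting onto the background. The NCC search, the filter parameters, and the 0.5 gating threshold are illustrative choices, not values from the paper.

```python
import numpy as np

def ncc(patch, template):
    """Zero-mean normalized cross-correlation between equal-size patches."""
    p = patch - patch.mean()
    t = template - template.mean()
    denom = np.sqrt((p ** 2).sum() * (t ** 2).sum()) + 1e-12
    return float((p * t).sum() / denom)

def best_match(frame, template, guess, radius):
    """Exhaustive NCC search for the template's top-left corner in a
    window of +/- radius pixels around the predicted position `guess`."""
    th, tw = template.shape
    gy, gx = guess
    best_score, best_pos = -1.0, (gy, gx)
    for y in range(gy - radius, gy + radius + 1):
        for x in range(gx - radius, gx + radius + 1):
            if y < 0 or x < 0:
                continue                      # search window fell off the frame
            patch = frame[y:y + th, x:x + tw]
            if patch.shape != template.shape:
                continue
            score = ncc(patch, template)
            if score > best_score:
                best_score, best_pos = score, (y, x)
    return best_pos, best_score

class ConstantVelocityKF:
    """Minimal 2-D constant-velocity Kalman filter for the track state."""
    def __init__(self, y, x, q=1.0, r=4.0):
        self.x = np.array([y, x, 0.0, 0.0], dtype=float)   # [y, x, vy, vx]
        self.P = np.eye(4) * 10.0
        self.F = np.array([[1., 0., 1., 0.],
                           [0., 1., 0., 1.],
                           [0., 0., 1., 0.],
                           [0., 0., 0., 1.]])
        self.H = np.array([[1., 0., 0., 0.],
                           [0., 1., 0., 0.]])
        self.Q = np.eye(4) * q
        self.R = np.eye(2) * r

    def predict(self):
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        return self.x[:2]

    def update(self, z):
        innov = np.asarray(z, dtype=float) - self.H @ self.x
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + K @ innov
        self.P = (np.eye(4) - K @ self.H) @ self.P

# Per-frame track maintenance (illustrative gating threshold):
# pred = kf.predict()
# pos, score = best_match(frame, template, (int(pred[0]), int(pred[1])), 12)
# if score > 0.5:   # strong match: fuse the measurement
#     kf.update(pos)
# else:             # weak match: possible occlusion, coast on the prediction
#     pass          # and leave the template unchanged so it cannot drift
```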
A novel approach to motion detection in an image sequence is presented. The approach computes a one-dimensional version of the fractal error metric applied temporally at each pixel. The original fractal error algorithm was developed by Cooper et al. as a two-dimensional metric for detecting man-made features in a single image using only spatial information. The fractal error metric is based on the observed propensity of natural image features to fit a fractional Brownian motion (fBm) model well, thus producing a small fractal error. On the other hand, man-made features do not fit the fBm model well and therefore produce a larger fractal error. Jansing et al. showed that spatial edges typically do not fit the fBm model due to their irregularity. The one-dimensional implementation of the algorithm presented in this paper exploits the irregularity of edges in a temporal signal, which are typically caused by moving objects. Emphasis is placed on moving target detection in the presence of noise and clutter-induced motion. Results are demonstrated using mid-wave infrared (MWIR) image sequences.
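As a rough illustration of the fractal error concept (not the authors' exact formulation), the sketch below fits the mean absolute temporal increments of each pixel's intensity history to the fBm power law E[|I(t+d) - I(t)|] = c * d^H in log-log space and reports the RMS residual of that fit; histories disturbed by moving-object edges fit the model poorly and therefore score high. The lag set and the least-squares fit details are assumptions.

```python
import numpy as np

def temporal_fractal_error(signal, lags=(1, 2, 3, 4)):
    """1-D fractal error of a single pixel's intensity history.

    The fBm model predicts E[|I(t+d) - I(t)|] = c * d**H, a straight line
    in log-log space. The fractal error is the RMS residual of a
    least-squares line fit to the measured increments: natural temporal
    variation fits well (small error), while the irregular edges produced
    by a moving object fit poorly (large error).
    """
    signal = np.asarray(signal, dtype=float)
    log_d, log_m = [], []
    for d in lags:
        mean_inc = np.abs(signal[d:] - signal[:-d]).mean() + 1e-12
        log_d.append(np.log(d))
        log_m.append(np.log(mean_inc))
    log_d, log_m = np.array(log_d), np.array(log_m)
    H, log_c = np.polyfit(log_d, log_m, 1)          # slope = Hurst exponent
    residual = log_m - (H * log_d + log_c)
    return float(np.sqrt(np.mean(residual ** 2)))

def temporal_fractal_error_map(frames, lags=(1, 2, 3, 4)):
    """Apply the 1-D metric at every pixel of a (T, H, W) frame stack."""
    T, H, W = frames.shape
    error_map = np.empty((H, W))
    for r in range(H):
        for c in range(W):
            error_map[r, c] = temporal_fractal_error(frames[:, r, c], lags)
    return error_map
```

Thresholding the resulting error map would yield candidate moving-object detections; the threshold, lag set, and temporal window length would need to be tuned to the sensor and scene.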