Image intensified systems are compact, low-power devices that convert visible through near-infrared illumination into visible imagery. These devices provide usable imagery across a wide range of ambient illumination levels, and they are a preferred means for night imaging. Although the device consists of objective or relay optics and an image intensified tube, critical performance measurements have traditionally required disassembling the device so that the tube alone can be tested. This is a non-trivial process that requires the hardware to be re-aligned and re-purged during re-assembly. Using proper sources, reference cameras, and image processing techniques, it is possible to fully characterize an image intensified device for its relevant measurable parameters (signal-to-noise ratio, tube gain, and limiting resolution) without disassembly. This paper outlines the classic component-level image intensified measurement methodology, the performance assumptions that support those measurement techniques, and the new measurement procedure. A comparison of measurement results using both methods demonstrates the validity of this new measurement approach.
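As a hedged illustration of the non-destructive approach described above, a signal-to-noise estimate can be formed from the per-pixel temporal statistics of a frame stack captured through the assembled device; the frame-axis convention and the region-of-interest averaging below are our assumptions, not the paper's exact procedure.

    import numpy as np

    def temporal_snr(frames):
        """Estimate SNR from a stack of frames (time x rows x cols)
        captured through the intact device under a stable, uniform
        source: mean signal over temporal standard deviation,
        averaged across the region of interest."""
        mean = frames.mean(axis=0)
        std = frames.std(axis=0, ddof=1)
        return (mean / np.maximum(std, 1e-12)).mean()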
An accurate prediction of the number of pixels on a target is critical in modeling a camera's ability to perform a task. This requires accurate knowledge of the angle subtended by a pixel of interest, which can be calculated from a specification sheet or lens prescription. When such information is not available, it can be recovered by measuring a target of known size at a known distance. In this correspondence, we utilize canonical images (ideal simple functions) together with non-linear optimization to provide sub-pixel target localization. This allows for accurate and repeatable measurement of the angular sampling of a camera. Additionally, the use of well-defined shapes and accurate location determination can be used to determine blur, rotation, motion, contrast, distortion, and other camera metrics.
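A minimal sketch of the canonical-function fitting idea, assuming a Gaussian spot as the canonical shape (the paper's actual canonical functions may differ) and using a standard non-linear least-squares solver:

    import numpy as np
    from scipy.optimize import least_squares

    def locate_target(img, x0, y0):
        """Fit a canonical 2D Gaussian spot to the image and return
        its sub-pixel center. The starting guess (x0, y0) comes from
        a coarse peak search; amplitude, offset, and width are fit
        along with the center."""
        yy, xx = np.mgrid[0:img.shape[0], 0:img.shape[1]]

        def residuals(p):
            a, b, cx, cy, s = p
            model = b + a * np.exp(-((xx - cx)**2 + (yy - cy)**2) / (2 * s**2))
            return (model - img).ravel()

        p0 = [img.max() - img.min(), img.min(), x0, y0, 2.0]
        fit = least_squares(residuals, p0)
        return fit.x[2], fit.x[3]   # sub-pixel (x, y) center

With two such centers recovered from fiducials of known physical separation at a known range, the per-pixel angular subtense follows directly from the measured pixel separation.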
Time and resource constraints often limit the number of cameras available to establish statistical confidence that
a device meets a desired range performance requirement. For thermal cameras, measurements of sampling/resolution,
sensitivity and temporal response are combined through the Targeting Task Performance (TTP) metric to predict range.
To accommodate a large volume of cameras, we utilized a rotation stage to iterate across the required measurements, with
only a single connection instance to the camera. Automation in collection, processing, and device communication reduces
opportunities for human error, further improving confidence in the results. To accommodate variations in mounting,
cameras were automatically registered to the measurement setup, ensuring accurate analysis and facilitating automatic
processing. Additional efficiency was accomplished through processing the measurements in parallel with data collection,
reducing the time for full analysis of a single camera from 30 minutes down to 4 minutes. From this work, a statistically
relevant sampling of range was accumulated, along with other metrics, to gain insight into manufacturing repeatability,
correlated metrics, and datasets for device emulation. In support of the reproducible research effort, many of the analysis
scripts used in this work are available for download at [1].
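The released scripts [1] are the authoritative reference; as a hedged sketch of the collect-while-processing idea, under our own naming and with stand-in computations, already-collected data cubes can be analyzed in worker processes while the stage moves to the next measurement:

    from concurrent.futures import ProcessPoolExecutor

    def process(measurement):
        """Placeholder for one analysis step (e.g., SiTF fit, 3D noise,
        MTF extraction) run on an already-collected data cube."""
        name, data = measurement
        return name, sum(data) / len(data)   # stand-in computation

    def collect():
        """Stand-in generator: in the real setup each yield would follow
        a rotation-stage move and a frame-stack capture."""
        for name in ("sitf", "noise", "mtf"):
            yield name, [1.0, 2.0, 3.0]

    if __name__ == "__main__":
        # Each measurement is submitted for analysis as soon as it is
        # collected, so processing overlaps the next stage move.
        with ProcessPoolExecutor() as pool:
            futures = [pool.submit(process, m) for m in collect()]
            results = dict(f.result() for f in futures)
        print(results)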
At NVESD, the targeting task performance (TTP) metric applies a weighting of different system specifications, determined from the scene geometry, to calculate a probability of task performance. In this correspondence we detail how to utilize an imaging system specification document to obtain a baseline performance estimate using the Night Vision Integrated Performance Model (NV-IPM), the corresponding input requirements, and potential assumptions. We then discuss how measurements can be performed to update the model to provide a more accurate prediction of performance, detailing the procedures taken at the NVESD Advanced Sensor Evaluation Facility (ASEF) using the Night Vision Laboratory Capture (NVLabCap) software. Finally, we show how the outputs of the measurement can be compared to those of the initial specification-sheet-based model and evaluated against a requirements document. The modeling components and data set produced for this work are available upon request, and will serve as a means to benchmark performance for both modeling and measurement methods.
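For orientation, the commonly published form of the TTP metric weights the apparent target contrast against the system contrast threshold function over the band of usable spatial frequencies, then converts the result to resolvable cycles on target; this is offered as background, not as a restatement of the paper's exact implementation:

    \mathrm{TTP} = \int_{\xi_{\mathrm{low}}}^{\xi_{\mathrm{cut}}}
        \sqrt{\frac{C_{\mathrm{tgt}}}{\mathrm{CTF}_{\mathrm{sys}}(\xi)}}\,\mathrm{d}\xi,
    \qquad
    V = \frac{\sqrt{A_{\mathrm{tgt}}}}{R}\,\mathrm{TTP}

where C_tgt is the apparent target contrast, CTF_sys the system contrast threshold function, A_tgt the target area, and R the range; V then feeds a target transfer probability function to produce the probability of task performance.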
Typical thermal system performance characterizations are based upon measurements from a sensor's digital or analog output, with the performance of the display characterized separately. This can be an improper assumption because additional signal processing could occur between the sensor test port and the display. Recent research has focused on the characterization of thermal system displays for better model fidelity. The next evolution in this research is to introduce a means for characterizing thermal system signal intensity transfer function (SITF) and three-dimensional noise (3DN) performance for systems that have a display as well as a known digital output. This correspondence presents an attempt at characterizing the SITF and 3DN performance of a thermal system using only its display as the output.
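As a sketch of the SITF end of such a measurement, assuming the display output has already been reduced to one mean signal value per blackbody setting (for example, luminance read by a calibrated reference camera, which is our assumption rather than the paper's stated setup), the slope over the linear region gives the SITF:

    import numpy as np

    def sitf_slope(delta_T, mean_signal):
        """Least-squares slope of mean output signal versus blackbody
        delta-T over the linear region: the SITF in counts per kelvin,
        or display luminance per kelvin when measured at the display."""
        slope, _ = np.polyfit(delta_T, mean_signal, 1)
        return slope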
Typically, a system level characterization of a thermal imaging device includes characterizing the objective optics, detector and readout electronics. Ultimately, the thermal imagery is converted to an 8-bit signal and presented on a display for human visual consumption. In some situations, direct characterization of the pre-sample imaging system is not possible, and measurements must be performed by analyzing the output from its display. Additionally, the performance of the display and display optics are significant contributors to the performance of the imaging system, yet both are often assumed to be ideal. In this paper, we describe how the underlying imaging system non-uniformity relates to the additional display contributions in the total system non-uniformity. This paper is divided into three parts: the technique and considerations needed to properly measure a system through its display, how this information can be used in the NV-IPM performance model, and a comparison of performance from measurements at the pre-sample readout versus measurements taken only at the display.
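A minimal sketch of the variance bookkeeping implied above, under two assumptions of ours (the sensor and display non-uniformity contributions are independent, and both measurements have been normalized to a common signal scale):

    import numpy as np

    def display_nonuniformity(total_frames, sensor_frames):
        """Estimate the display-added spatial non-uniformity by variance
        subtraction: sigma_display^2 = sigma_total^2 - sigma_sensor^2.
        Each input is a (time, rows, cols) stack; temporal averaging
        first isolates the fixed-pattern (spatial) component."""
        var_total = total_frames.mean(axis=0).var(ddof=1)
        var_sensor = sensor_frames.mean(axis=0).var(ddof=1)
        return np.sqrt(max(var_total - var_sensor, 0.0))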
The quality of an imaging system can be assessed through controlled laboratory objective measurements. Currently, all imaging measurements require some form of digitization in order to evaluate a metric. Depending on the device, the number of bits available, relative to a fixed dynamic range, determines the severity of quantization artifacts. From a measurement standpoint, it is desirable to perform measurements at the highest bit depth available. In this correspondence, we describe the relationship between higher and lower bit-depth measurements. The limits to which quantization alters the observed measurements are presented. Specifically, we address dynamic range, MTF, SiTF, and noise. Our results provide guidelines for how systems of lower bit depth should be characterized and the corresponding experimental methods.
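The familiar uniform-quantizer model adds a variance of q^2/12 for an LSB of size q; the snippet below is our illustration of that model, not the paper's analysis, and assumes a unit dynamic range and analog noise that is large relative to one LSB:

    import numpy as np

    rng = np.random.default_rng(0)
    signal = rng.normal(0.5, 0.02, size=100_000)   # high-bit-depth "truth"

    for bits in (14, 10, 8):
        q = 1.0 / (2**bits - 1)                    # LSB size, unit dynamic range
        quantized = np.round(signal / q) * q
        # Measured noise inflates by the q^2 / 12 quantization variance
        # once the LSB is no longer small relative to the analog noise.
        print(f"{bits}-bit: measured={quantized.std(ddof=1):.5f} "
              f"model={np.hypot(signal.std(ddof=1), q / np.sqrt(12)):.5f}")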
When new and unique task difficulties must be determined, it is important to use methodologies that are consistent with previous research. Unfortunately, some new tasks break the paradigm of past research and require new techniques in order to properly determine their difficulty. This paper describes the process of determining the difficulty of tasks that are unique both because they have a null case (where no object or motion is present) and because they have been requested to be quantified in environments that potentially contain high amounts of atmospheric turbulence. Because each of the calculated V50s was based upon an assumption, a secondary field collection was necessary in order to validate which model assumptions correlated properly with field performance data.
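For context, the V50s referenced here enter the standard target transfer probability function (TTPF); a minimal sketch of its commonly published form, with the empirical beta relation, is:

    def ttpf(v, v50):
        """Target transfer probability function: probability of task
        completion given v resolved cycles on target and the task
        difficulty v50 (the value of v at which P = 0.5)."""
        beta = 1.51 + 0.24 * (v / v50)
        return (v / v50) ** beta / (1.0 + (v / v50) ** beta)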
KEYWORDS: 3D metrology, 3D modeling, 3D image processing, Imaging systems, Sensors, Nonuniformity corrections, Data modeling, Convolution, Performance modeling, Image processing
When evaluated with a spatially uniform irradiance, an imaging sensor exhibits both spatial and temporal variations,
which can be described as a three-dimensional (3D) random process considered as noise. In the 1990s, NVESD
engineers developed an approximation to the 3D power spectral density (PSD) for noise in imaging systems known as
3D noise. In this correspondence, we describe how the confidence intervals for the 3D noise measurement allow for
determination of the sampling necessary to reach a desired precision. We then apply that knowledge to create a smaller
cube that can be evaluated spatially across the 2D image, giving the noise as a function of position. The method
presented here both identifies defective pixels and implements the finite sampling correction matrix. In
support of the reproducible research effort, the MATLAB functions associated with this work can be found on the
MathWorks File Exchange [1].
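The decomposition itself is standard; a minimal sketch of the seven-component 3D noise calculation on a (time, rows, cols) cube, without the finite-sampling correction matrix or the confidence intervals developed in the paper, might look like:

    import numpy as np

    def noise_3d(cube):
        """Classic NVESD 3D noise decomposition of a data cube into its
        seven directional components via successive mean subtraction."""
        m = cube.mean()
        t = cube.mean(axis=(1, 2)) - m                    # temporal only
        v = cube.mean(axis=(0, 2)) - m                    # row fixed pattern
        h = cube.mean(axis=(0, 1)) - m                    # column fixed pattern
        tv = cube.mean(axis=2) - m - t[:, None] - v[None, :]
        th = cube.mean(axis=1) - m - t[:, None] - h[None, :]
        vh = cube.mean(axis=0) - m - v[:, None] - h[None, :]
        tvh = (cube - m
               - t[:, None, None] - v[None, :, None] - h[None, None, :]
               - tv[:, :, None] - th[:, None, :] - vh[None, :, :])
        return {k: x.std(ddof=1) for k, x in
                dict(t=t, v=v, h=h, tv=tv, th=th, vh=vh, tvh=tvh).items()}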
Laboratory measurements on thermal imaging systems are critical to understanding their performance in a field
environment. However, it is rarely a straightforward process to directly inject thermal measurements into thermal
performance modeling software to acquire meaningful results. Some of the sources of discrepancies between
laboratory and field measurements are sensor gain and level, dynamic range, sensor display and display brightness,
and the environment where the sensor is operating. If measurements of the aforementioned parameters can
be performed, a more accurate description of sensor performance in a particular environment is possible. This
research also includes the procedure for turning both laboratory and field measurements into a system model.
Thermal systems with a narrow spectral bandpass and mid-wave thermal imagers are useful for a variety of imaging
applications. Additionally, the sensitivity of these classes of systems is increasing, along with the performance
requirements they must meet when evaluated in a lab. Unfortunately, uncertainty in the blackbody temperature,
along with the temporal instability of the blackbody, could lead to uncontrolled laboratory environmental effects
which increase the measured noise. If the temporal uncertainty and accuracy of a particular blackbody
are known, then confidence intervals can be adjusted for source accuracy and instability. Additionally, because
thermal currents may be a large source of temporal noise in narrow band systems, a means to mitigate them is
presented and results are discussed.
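As a sketch of the adjustment suggested above, assuming (our assumption) that the blackbody's temporal instability is independent of the sensor's own noise and couples through the SiTF:

    import numpy as np

    def sensor_noise(measured_noise, sitf, bb_sigma):
        """Remove the blackbody's own temporal instability from a measured
        noise value under an independence assumption:
        sigma_sensor^2 = sigma_meas^2 - (SiTF * sigma_bb)^2."""
        return np.sqrt(max(measured_noise**2 - (sitf * bb_sigma)**2, 0.0))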
KEYWORDS: Sensors, Imaging systems, Black bodies, Modulation transfer functions, Machine vision, Temperature metrology, Contrast transfer function, Cameras, Systems modeling, Eye
Researchers at the US Army Night Vision and Electronic Sensors Directorate have added the functionality of Machine Vision MRT (MV-MRT) to the NVLabCap software package. While the original calculations of MV-MRT were compared to human observer performance using digital imagery in a previous effort [1], the technical approach was not tested on 8-bit imagery using a variety of sensors across a variety of gain and level settings. Now that it is simpler to determine the MV-MRT for a sensor in multiple gain settings, it is prudent to compare the results of MV-MRT in multiple gain settings to the performance of human observers for thermal imaging systems that are linear and shift-invariant. Here, a comparison of the results for a LWIR system to trained human observers is presented.
KEYWORDS: Sensors, Modulation transfer functions, Systems modeling, Data modeling, Performance modeling, Video, Image sensors, Cameras, Software development, Imaging systems
Engineers at the US Army Night Vision and Electronic Sensors Directorate have recently developed a software package called NVLabCap. This software not only captures sequential frames from thermal and visible sensors, but also performs measurements of signal intensity transfer function, 3-dimensional noise, field of view, super-resolved modulation transfer function, and image boresight. Additionally, this software package, along with a set of commonly known inputs for a given thermal imaging sensor, can be used to automatically create an NV-IPM element for that measured system. This model data can be used to determine whether a sensor under test is within certain tolerances, and the model can be used to objectively quantify measured versus specified system performance.
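NVLabCap's super-resolved MTF implementation is not public; as a hedged sketch of one common approach, the MTF can be computed from an oversampled edge spread function, assuming that ESF has already been assembled (for example, from a slanted-edge target):

    import numpy as np

    def mtf_from_edge(esf):
        """Estimate MTF from an oversampled edge spread function (ESF):
        differentiate to get the line spread function (LSF), window it,
        and take the magnitude of the FFT, normalized to DC."""
        lsf = np.gradient(np.asarray(esf, dtype=float))
        lsf *= np.hanning(lsf.size)       # suppress edge-truncation ripple
        mtf = np.abs(np.fft.rfft(lsf))
        return mtf / mtf[0]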
KEYWORDS: Eye, Image segmentation, Sensors, Imaging systems, Thermography, Black bodies, Video, Minimum resolvable temperature difference, Image processing, Human vision and color perception
The GStreamer architecture allows for simple modularized processing. Individual GStreamer elements have been
developed that allow for control, measurement, and ramping of a blackbody, for capturing continuous imagery
from a sensor, for segmenting out a MRTD target, for applying a blur equivalent to that of a human eye and a
display, and for thresholding a processed target contrast for "calling" it. A discussion of each of the components
is followed by an analysis of the system's performance relative to that of human observers.
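A toy pipeline in the same spirit, using only stock elements: videotestsrc replaces the camera capture element, and gaussianblur (from gst-plugins-bad) stands in for the eye/display blur stage; the blackbody ramp, MRTD target segmentation, and contrast "calling" elements described above are custom NVESD plugins with no public equivalents.

    import gi
    gi.require_version("Gst", "1.0")
    from gi.repository import Gst, GLib

    Gst.init(None)
    # Stock-element stand-ins for the custom measurement chain.
    pipeline = Gst.parse_launch(
        "videotestsrc is-live=true ! videoconvert ! "
        "gaussianblur sigma=1.5 ! videoconvert ! autovideosink"
    )
    pipeline.set_state(Gst.State.PLAYING)
    GLib.MainLoop().run()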
Multi-band image fusion can be a multi-stage process consisting of several intermediate image processing steps.
Typically, each of these steps must be arranged in a particular order to produce a unique output image.
GStreamer is an open source, cross-platform multimedia framework; using
this framework, engineers at NVESD have produced a software package that allows for real time manipulation
of processing steps for rapid prototyping in image fusion.
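A runnable toy example of the rapid-prototyping idea using only stock elements (our construction, not the NVESD package): two test sources stand in for co-registered source bands, and the compositor's per-pad alpha provides a simple weighted blend that can be retuned while the pipeline runs. Setting pad properties inside parse_launch requires a reasonably recent GStreamer release.

    import gi
    gi.require_version("Gst", "1.0")
    from gi.repository import Gst, GLib

    Gst.init(None)
    # Weighted-average "fusion" of two stand-in bands via compositor.
    pipeline = Gst.parse_launch(
        "videotestsrc pattern=ball ! video/x-raw,width=320,height=240 ! comp.sink_0 "
        "videotestsrc pattern=snow ! video/x-raw,width=320,height=240 ! comp.sink_1 "
        "compositor name=comp sink_1::alpha=0.5 ! videoconvert ! autovideosink"
    )
    pipeline.set_state(Gst.State.PLAYING)
    GLib.MainLoop().run()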
Recent developments in image fusion give the user community many options for presenting imagery to
an end-user. Individuals at the US Army RDECOM CERDEC Night Vision and Electronic Sensors Directorate
have developed an electronic system that allows users to quickly and efficiently determine optimal image fusion
algorithms and color parameters based upon collected imagery and videos from environments that are typical
to observers in a military environment. After performing multiple multi-band data collections in a variety
of military-like scenarios, different waveband, fusion algorithm, image post-processing, and color choices are
presented to observers as an output of the fusion system. The observer preferences can give guidelines as to how
specific scenarios should affect the presentation of fused imagery.
The US Army RDECOM CERDEC Night Vision and Electronic Sensors Directorate (NVESD) sensor performance
models predict the ability of soldiers to perform a specified military discrimination task using an EO/IR sensor system.
Increasingly, EO/IR systems are being used on manned and unmanned aircraft for surveillance and target acquisition
tasks. In response to this emerging requirement, the NVESD Modeling and Simulation division has been tasked to
compare target identification performance between ground-to-ground and air-to-ground platforms for both IR and visible
spectra for a set of wheeled utility vehicles. To measure performance, several forced-choice experiments were designed,
administered, and analyzed. This paper describes these experiments and reports the results as well as the
NVTherm model calibration factors derived for the infrared imagery.
Real MWIR Persistent Surveillance (PS) data were collected with a single human walking from a known point to different tents in the PS sensor field of view. The spatial resolution (ground sample distance) and revisit rate were varied from 0.5 to 2 meters and from 1/8 to 4 Hz, respectively. A perception experiment was conducted in which the observer was tasked to track the human to the terminal (end-of-route) tent. The probability of track is provided as a function of ground sample distance and revisit rate. These results can help determine PS design requirements for tracking and back-tracking humans on the ground. This paper begins with a summary of two previous simulation experiments: one for human tracking and one for vehicle tracking.