A survey of monochrome and color imaging systems at daylight and low light levels is conducted using the latest models from the U.S. Army Night Vision Electronic Sensors Directorate (NVESD). Each camera system is evaluated and compared under several different assumptions: equivalent field of view with equal and with variable f/#, common lens focal length and aperture, high dynamic range comparisons, and several light levels. The modeling uses the Targeting Task Performance (TTP) metric as implemented in the latest version of the Night Vision Integrated Performance Model (NV-IPM). The comparison is performed over the V parameter, the main output of the TTP metric; probability of identification (PID) versus range predictions are a direct non-linear mapping of the V parameter as a function of range. Finally, the performance of a Bayer-filtered color camera, the same color camera with the IR blocking filter removed, and a monochrome version of the same camera is also compared.
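Published descriptions of the TTP metric map V to PID through an empirical target transfer probability function (TTPF). A minimal sketch of that mapping is below; the function name is illustrative, `v50` is the task-difficulty value (the V at which PID reaches 50%), and the coefficients are the commonly published TTPF values, not taken from this paper:

```python
def pid_from_v(v, v50):
    """Target transfer probability function (TTPF) sketch: maps the TTP
    metric's V parameter to probability of identification (PID).
    v50 is the task-difficulty calibration (V giving PID = 0.5)."""
    ratio = v / v50
    # Empirical exponent from the commonly published TTPF form.
    e = 1.51 + 0.24 * ratio
    return ratio ** e / (1.0 + ratio ** e)
```

Evaluating V as a function of range and passing each value through this mapping yields the PID-versus-range curve the abstract describes.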
KEYWORDS: Performance modeling, Visual process modeling, Eye models, Systems modeling, Eye, NVThermIP, Image quality, Modulation transfer functions, Contrast transfer function, Imaging systems
NVESD's new integrated sensor performance model, NV-IPM, replaces the discrete spectral band models that preceded it (NVTherm, SSCamIP, etc.). Many advanced modeling functions are now more readily available, easier to implement, and integrated within a single model architecture. For the legacy model user with ongoing modeling duties, however, the conversion of legacy decks to NV-IPM is of more immediate concern than mastering the many “power features” now available. This paper addresses the processes for the legacy model user to make a smooth transition to NV-IPM, including the conversion of legacy sensor decks to NV-IPM format decks, differences in parameters entered in the new versus old model, and a comparison of the predicted performance differences between NV-IPM and legacy models. Examples are presented to demonstrate the ease of sensor deck conversion from legacy models and to highlight enhanced model capabilities available with minimal transition effort.
Noise in an imaging infrared (IR) sensor is one of the major limitations on its performance. As such, noise estimation is one of the major components of imaging IR sensor performance models and modeling programs. When computing noise, current models assume that the target and background are at or near a temperature of 300 K. This paper examines how the temperature of the scene impacts the noise in IR sensors and their performance. It presents a strategy by which a model built on the 300 K assumption can be made to compute the correct noise, and it reports measurements of the signatures of a cold target against a cold background. Range performance of a notional 3rd Gen sensor (midwave IR and long wave IR) is then modeled as a function of scene background temperature.
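The physical reason scene temperature matters follows directly from Planck's law: both the band-integrated radiance and its thermal derivative (the signal produced per kelvin of target–background difference) fall off as the scene cools. A minimal numerical sketch for a notional 8–12 µm LWIR band (function names and the finite-difference approach are illustrative, not this paper's method):

```python
import math

H = 6.626e-34   # Planck constant, J s
C = 2.998e8     # speed of light, m/s
KB = 1.381e-23  # Boltzmann constant, J/K

def planck_radiance(wavelength_m, temp_k):
    """Spectral radiance from Planck's law, W / (m^2 sr m)."""
    a = 2.0 * H * C**2 / wavelength_m**5
    b = H * C / (wavelength_m * KB * temp_k)
    return a / (math.exp(b) - 1.0)

def band_radiance(temp_k, lo_um=8.0, hi_um=12.0, steps=200):
    """Trapezoidal band-integrated radiance over [lo_um, hi_um]."""
    dlam = (hi_um - lo_um) * 1e-6 / steps
    total = 0.0
    for i in range(steps + 1):
        lam = lo_um * 1e-6 + i * dlam
        weight = 0.5 if i in (0, steps) else 1.0
        total += weight * planck_radiance(lam, temp_k) * dlam
    return total

def thermal_derivative(temp_k, dt=0.1):
    """dL/dT by central finite difference: signal per kelvin of contrast."""
    return (band_radiance(temp_k + dt) - band_radiance(temp_k - dt)) / (2.0 * dt)
```

Comparing `thermal_derivative(240.0)` against `thermal_derivative(300.0)` shows the reduced signal available from a cold scene, which is why a fixed 300 K assumption misestimates the noise-limited performance.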
The Night Vision and Electronics Sensors Directorate Electro-optics Simulation Toolkit (NVEOST), follow-on to Paint-The-Night, produces real-time simulation of IR scenes and sequences using modeled backgrounds and targets with physics- and empirically based IR signatures. Range-dependent atmospheric effects are incorporated, realistically degrading the infrared scene impinging on an infrared imaging device. The current sensor-effects implementation for Paint the Night (PTN) and the Night Vision Image Generator (NVIG) is a three-step process. First, the scene energy is further attenuated by the sensor optic. Second, a prefilter kernel, developed off-line, is applied to scenes or frames to effect the sensor modulation transfer function (MTF) "blurring" of scene elements. Third, sensor noise is overlaid on scenes, or more often on frames of scenes. NVESD is improving the PTN functionality, now entitled NVEOST, in several ways. In the near future, a sensor effects tool will directly read an NVTHERM input data file, extract the data it can utilize, and then automatically generate the sensor "world view" of an NVEOST scenario. The extracted data will include the elements currently employed: optical transmission, parameters used to calculate prefilter MTF (telescope, detector geometry), and temporal-spatial random noise (σTVH). Important improvements will include treatment of sampling effects (undersampling and super-resolution), certain significant postfilters (signal processing including boost and frame integration), and spatial noise. The sensor effects implementation will require minimal interaction; only a well-developed NVTHERM input parameter set will be required. The developments described below will enhance NVEOST's utility not only as a virtual simulator but also as a formidable sensor design tool.
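The three-step sensor-effects process above can be sketched on an image array. This is a toy illustration under stated assumptions: a Gaussian stands in for the off-line prefilter MTF kernel, the noise is white Gaussian, and all parameter names and default values are hypothetical rather than PTN/NVEOST values:

```python
import numpy as np

def apply_sensor_effects(scene, tau_optics=0.85, blur_sigma=1.2,
                         noise_sigma=0.5, rng=None):
    """Three-step sensor-effects sketch:
    1) attenuate scene energy by the optical transmission,
    2) blur with a separable Gaussian prefilter kernel (stand-in for
       the sensor MTF "blurring"),
    3) overlay zero-mean temporal-spatial random noise."""
    rng = np.random.default_rng(0) if rng is None else rng
    # Step 1: optical transmission of the sensor optic.
    out = tau_optics * scene.astype(float)
    # Step 2: normalized separable Gaussian kernel applied row- and
    # column-wise (equivalent to a 2-D convolution).
    radius = int(3 * blur_sigma)
    x = np.arange(-radius, radius + 1)
    kernel = np.exp(-0.5 * (x / blur_sigma) ** 2)
    kernel /= kernel.sum()
    out = np.apply_along_axis(lambda r: np.convolve(r, kernel, mode="same"), 1, out)
    out = np.apply_along_axis(lambda c: np.convolve(c, kernel, mode="same"), 0, out)
    # Step 3: additive random noise overlay.
    return out + rng.normal(0.0, noise_sigma, out.shape)
```

The improvements the abstract describes (sampling effects, postfilters, spatial noise) would extend this pipeline rather than replace it.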
This research describes a comparison of target identification performance among the longwave infrared, illuminated shortwave infrared, and visible spectral bands. Increasing levels of Gaussian blur were applied to eight aspects of twelve targets in each of the three bands. A double-blind experiment was conducted in which the first group of observers was trained to identify all the targets using longwave infrared imagery and the second group was trained using visible imagery. The first group's visible identification scores and the second group's longwave infrared identification scores were then compared against each group's illuminated shortwave infrared identification scores. In both cases, the illuminated shortwave infrared scores fell below the untrained visible or longwave infrared counterpart.
The Night Vision ACQUIRE model predicts range performance when provided with parameters describing the atmosphere, a 2-D MRT curve describing the sensor, and three additional parameters. Two of the additional parameters (characteristic dimension and target-background contrast) describe the target. The third additional parameter, a cycle criterion (N50), relates to task difficulty. Characteristic dimension and target-background contrast are measured directly in the field; N50 is empirically determined from the measured range performance associated with the task. The purpose of this communication is to define terms and protocols and, where possible, to give recommended values for parameters used with the ACQUIRE model. The methodology and recommended parameter values given here represent Night Vision's best estimates based on years of laboratory and field experience.
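The core ACQUIRE calculation can be sketched from the parameters the abstract lists: the MRT curve and apparent contrast fix the highest resolvable spatial frequency, the characteristic dimension and range convert that frequency into cycles on target, and N50 calibrates the empirical probability curve. The sketch below uses the commonly published Johnson-criteria probability function; the tabulated-curve lookup, function names, and the omission of range-dependent atmospheric contrast loss are simplifying assumptions:

```python
def resolvable_frequency(contrast, mrt_curve):
    """Highest spatial frequency (cycles/mrad) in a tabulated MRT curve
    [(freq, mrt), ...] at which the apparent target contrast still
    exceeds the MRT."""
    best = 0.0
    for freq, mrt in mrt_curve:
        if contrast >= mrt and freq > best:
            best = freq
    return best

def acquire_probability(range_km, d_c_m, contrast, mrt_curve, n50):
    """ACQUIRE-style sketch: resolvable cycles across the target's
    characteristic dimension, then the empirical target transfer
    probability function calibrated by the cycle criterion N50."""
    f = resolvable_frequency(contrast, mrt_curve)   # cycles/mrad
    n = f * d_c_m / range_km                        # (m/km = mrad) -> cycles on target
    if n <= 0.0:
        return 0.0
    ratio = n / n50
    e = 2.7 + 0.7 * ratio   # commonly published empirical exponent
    return ratio ** e / (1.0 + ratio ** e)
```

At the range where the resolved cycles equal N50, the predicted probability is exactly 0.5, which is what makes N50 a natural task-difficulty calibration.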
The C2NVEO FLIR90 thermal imaging systems model released in June 1990 has become the community standard model for evaluating tactical FLIR systems. This model, which successfully predicts MRTD performance for first- and second-generation thermal imagers, differs from the 1975 NVEOL thermal model in three critical areas: sampling, noise, and two-dimensional MRTD. This paper explains how each of these effects is modeled in FLIR90. First, the paper discusses how the model incorporates sampling effects by imposing Nyquist frequency limits and using pre- and postsampling MTFs. Second, the treatment of directional noise and modifications to the NETD prediction are discussed. Third, the paper discusses the two-dimensional MRTD methodology and the adjustments it imposes on the Johnson range performance prediction methodology modeled in ACQUIRE. Additionally, changes and additions in forthcoming upgrades to both FLIR90 and the ACQUIRE range performance model are described.
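The sampling treatment described above rests on two standard ingredients: a pre-sample MTF cascade (optics blur times the detector's spatial-integration MTF) and a Nyquist limit set by the detector pitch. A minimal sketch follows; the Gaussian optics-blur form, the parameter names, and the two-term cascade are illustrative stand-ins, not FLIR90's actual MTF set:

```python
import math

def sinc_mtf(f_cyc_per_mrad, width_mrad):
    """|sinc| MTF of a rectangular aperture, e.g. the detector
    footprint (instantaneous field of view)."""
    x = math.pi * f_cyc_per_mrad * width_mrad
    return 1.0 if x == 0.0 else abs(math.sin(x) / x)

def presample_mtf(f_cyc_per_mrad, optics_blur_mrad, ifov_mrad):
    """Pre-sample MTF sketch: Gaussian optics blur cascaded
    (multiplied) with the detector spatial-integration MTF."""
    optics = math.exp(-2.0 * (math.pi * optics_blur_mrad * f_cyc_per_mrad) ** 2)
    return optics * sinc_mtf(f_cyc_per_mrad, ifov_mrad)

def nyquist_limit(pitch_mrad):
    """Half the sampling frequency set by detector pitch, cycles/mrad.
    Frequencies above this alias rather than resolve."""
    return 0.5 / pitch_mrad
```

Imposing the Nyquist limit means the model credits the sensor with no resolved spatial frequencies above `nyquist_limit(pitch)`, regardless of how much MTF survives there.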