Regular Articles

Laying the foundation to use Raspberry Pi 3 V2 camera module imagery for scientific and engineering purposes

[+] Author Affiliations
Mary Pagnutti, Robert E. Ryan, George Cazenavette, Maxwell Gold, Ryan Harlan, Edward Leggett, James Pagnutti

Innovative Imaging and Research, Stennis Space Center, United States

J. Electron. Imaging. 26(1), 013014 (Feb 11, 2017). doi:10.1117/1.JEI.26.1.013014
History: Received October 21, 2016; Accepted January 19, 2017

Open Access

Abstract.  A comprehensive radiometric characterization of raw-data format imagery acquired with the Raspberry Pi 3 and V2.1 camera module is presented. The Raspberry Pi is a high-performance single-board computer designed to educate and solve real-world problems. This small computer supports a camera module that uses a Sony IMX219 8 megapixel CMOS sensor. This paper shows that scientific and engineering-grade imagery can be produced with the Raspberry Pi 3 and its V2.1 camera module. Raw imagery is shown to be linear with exposure and gain (ISO), which is essential for scientific and engineering applications. Dark frame, noise, and exposure stability assessments along with flat fielding results, spectral response measurements, and absolute radiometric calibration results are described. This low-cost imaging sensor, when calibrated to produce scientific quality data, can be used in computer vision, biophotonics, remote sensing, astronomy, high dynamic range imaging, and security applications, to name a few.

The Raspberry Pi Foundation provides low-cost, high-performance single-board Raspberry Pi computers to educate and solve real-world problems. As of early 2016, over 8 million Raspberry Pis had been sold, making it one of the most popular single-board computers on the market.1 These small single-board computers are quickly moving from the do-it-yourself (DIY) community into mainstream technology development. Many are being used to acquire a wide range of measurements and are being incorporated into instruments for a multitude of applications, including medical support and e-health,2–7 robotics,8 surveillance monitoring,9 and food production optical sorting.10 The advent of open-source software (and some hardware) has only quickened this trend. The Raspberry Pi credit-card-sized computer supports several accessories, including a camera module containing the Sony IMX219 sensor. This computer and camera configuration is of particular interest because it can provide raw-data format imagery for a multitude of applications, including computer vision, biophotonics, medical testing, remote sensing, astronomy, improved image quality, high dynamic range (HDR) imaging, and security monitoring. This paper evaluates the characteristics of the Raspberry Pi V2.1 camera, based on the Sony IMX219 sensor, and the radiometric performance of its raw-data format imagery so that the system can be used effectively for scientific imaging and engineering purposes.

The Raspberry Pi 3 is the third-generation single-board Raspberry Pi computer and became available to consumers in February 2016. Some of the more significant Raspberry Pi attributes, including interfaces, are described in Table 1. At the time of this writing, a Raspberry Pi 3 sold for about $35 USD and the V2.1 camera module sold for approximately $25 USD.1,11 The Raspberry Pi Foundation provides several operating systems for the Raspberry Pi 3, including Raspbian, a Debian-based Linux distribution, as well as third-party Ubuntu, Windows 10 IoT Core, RISC OS, and specialized distributions for download.

Table 1. Raspberry Pi 3 computer attributes.

To understand the scientific and engineering potential of these versatile imaging sensors, a comprehensive laboratory-based radiometric characterization was performed on a small number of Raspberry Pi V2.1 camera modules. The camera is based on the Sony IMX219 back-illuminated silicon CMOS sensor and produces 8 megapixel images that are 3280×2464 pixels in size. The IMX219 sensor operates in the visible spectral range (400 to 700 nm) and uses a Bayer array with a BGGR pattern. Sensor specifications are detailed in Table 2.12 The Raspberry Pi Foundation also provides a visible and near-infrared version of the Sony IMX219 camera, called the NoIR camera, which has no infrared (NoIR) filter and therefore allows imaging beyond the visible range. The NoIR version was not considered in this paper.

Table 2. Sony IMX219 sensor chip specifications.

The V2 camera module operates at a fixed focal length (3.04 mm) and a single f-number (F2.0), typically focused from the near field to infinity. Images can be captured at ISO settings between 100 and 800 in manually set increments of 100 (although not verified above 600 in this investigation) and at exposure times between 9 μs and 6 s (although not verified above 1 s in this investigation) using a rolling shutter. Some of the more significant camera specifications are shown in Table 3. In addition to still photos, the Raspberry Pi Sony IMX219 sensor supports cropped 1080p video at 30 frames per second (fps) and full-frame video at up to 15 fps, but not in raw-data format. The entire camera board is small (25 mm × 25 mm × 9 mm) and weighs about 3 g. It connects directly to the Raspberry Pi 3 through a 15-pin mobile industry processor interface (MIPI) camera serial interface and is shown alongside a Raspberry Pi 3 in Fig. 1.

Table 3. Raspberry Pi camera specifications.

Fig. 1. Raspberry Pi 3 and camera module V2.1.

Several scientific and engineering applications require raw-data format imagery with known and calibrated radiometric properties. A camera’s radiometric characterization typically includes dark frame assessments, linearity, image noise assessments, exposure or electronic shutter stability assessments, flat fielding, spectral response measurements, and an absolute radiometric calibration. Dark frame knowledge and flat fielding improve image quality by correcting for fixed pattern noise (FPN) and other spatial effects such as vignetting. Linearity characterization is essential for scientific and engineering applications. Understanding noise as a function of signal level is important for properly exposing imagery, determining the number of samples required for a particular application, and optimizing denoising algorithms. Spectral response information is used in traditional photographic color balancing13 and for spectroscopy,14,15 remote sensing,16 astronomy,17,18 and many other science and engineering applications.19–21 Absolute calibration relates image acquisition conditions (including illumination and viewing geometry), exposure time, ISO, and pixel digital number (DN) value to spectral radiance.

To perform the radiometric characterizations described in this paper, the camera was accessed and controlled from within the Python programming language using the PiCamera application programming interface (API). While finer-grained control of the camera can be achieved through low-level C libraries such as OpenMAX IL, all of the functionality necessary for the activities in this paper is available from the PiCamera API. Raw-data format images were preprocessed on the Raspberry Pi with a Python script utilizing the NumPy library and saved in the NumPy file format. The preprocessed raw images were then transferred to a separate computer and read into MATLAB with a NumPy data format reader. All further processing was accomplished using MATLAB.
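The raw Bayer preprocessing step can be illustrated with a short sketch. The helper below is hypothetical (the paper does not publish its script) and uses a synthetic frame in place of an actual capture; it splits a BGGR mosaic at the sensor's native 3280×2464 size into per-channel planes with NumPy, the kind of array that would then be saved in the NumPy file format:

```python
import numpy as np

def split_bggr(raw):
    """Split a BGGR Bayer mosaic into B, G1, G2, and R planes."""
    b  = raw[0::2, 0::2]   # blue pixels: even rows, even columns
    g1 = raw[0::2, 1::2]   # green pixels on the blue rows
    g2 = raw[1::2, 0::2]   # green pixels on the red rows
    r  = raw[1::2, 1::2]   # red pixels: odd rows, odd columns
    return b, g1, g2, r

# Synthetic 10-bit frame at the sensor's native size (stand-in for a capture)
raw = np.random.randint(0, 1024, size=(2464, 3280), dtype=np.uint16)
b, g1, g2, r = split_bggr(raw)
print(b.shape)  # each plane is (1232, 1640)
```

On a Raspberry Pi, the raw mosaic itself would come from a PiCamera capture with Bayer data enabled rather than from a random array, and the planes would be saved with `np.save` as described above.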

The radiometric characterizations described in this investigation include dark frame assessments at multiple ISO and exposure settings, camera linearity assessments as a function of ISO setting and exposure time, sensor image noise as a function of ISO setting, exposure stability assessments, spectral band specific flat fielding function measurements, camera spectral response measurements, and an absolute radiometric calibration to tie measured camera DN values to NIST-traceable SI radiance units. Further information on the techniques that were utilized is described in Ref. 22.

A camera dark frame assessment was performed to quantify and correct for the camera’s fixed-pattern noise bias. Camera ISO settings were varied from 100 to 600 in steps of 100 at two different exposure times, 5 and 50 ms. After camera warm-up, defined in this investigation as 200 exposures, groups of 250 images were acquired in a dark room at 25°C with black cloth covering the camera aperture for each camera setting. The dark frame statistical properties were analyzed and are shown in Tables 4 and 5. The entire image frame was used in this assessment. As expected, the dark images became noisier with increasing ISO setting. Data taken at the higher exposure setting are also slightly noisier. In all cases, the mean and median values were essentially identical.

Table 4. Raspberry Pi camera V2 dark frame statistics at 5 ms (250 frame mean).
Table 5. Raspberry Pi camera V2 dark frame statistics at 50 ms (250 frame mean).

A 250-frame mean dark image was generated at each ISO setting. Since dark frames are temperature dependent, they were acquired under the same experimental conditions as the bright frames. Histogram plots generated for the mean dark images at the lowest and highest ISO settings further describe the noise variation with ISO and are shown in Fig. 2 for the 50-ms dataset. The ISO 600 histogram is slightly broader, and, while not shown, its tails are significantly longer. A 500×500 pixel subset of the corresponding 250-frame mean dark images is presented in Fig. 3 to show the fine-scale spatial structure. As the tables indicate, the histograms at each ISO setting are nearly identical between imagery acquired at 5 and 50 ms.

Fig. 2. 250-frame mean dark image histograms at a 50 ms exposure time and an ISO setting of (a) 100 and (b) 600.

Fig. 3. 250-frame mean dark images at a 50 ms exposure time and an ISO setting of (a) 100 and (b) 600.

Raspberry Pi camera linearity was evaluated as a function of both exposure time and ISO setting. These measurements were obtained by imaging an in-house developed 1.5-m diameter large integrating sphere lamped with Luxeon Rebel 4000 K white-light LED sources mounted on relatively large 40-mm diameter heat sinks to maintain temperature stability.23 In an integrating sphere, light rays from a source (input) are uniformly scattered by highly reflective diffuse inner walls, as shown in Fig. 4, to produce uniform illumination across the camera field of view (placed at the output). The sphere’s spectral radiance was monitored with a NIST-traceable spectrometer using a bare fiber. The LED sources were powered using a stable power supply. LED current was set so that the measured DN value at the center of the image in the green band was 80% of the maximum DN value at the longest exposure time or highest ISO setting depending on the test sequence. Since the product of the light source spectral shape and sensor response peaks in the green spectral region,23 green band pixels have larger DN values than red or blue band pixels. Reducing the exposure time or ISO setting (depending on the test sequence) from this set point enabled the camera to be tested over an extended portion of its dynamic range. For this test, the camera was positioned in front of the sphere, as shown in Fig. 5. Since the focal length of the lens is 3.04 mm, the camera was effectively focused at infinity in this position. In this assessment, the data were normalized using the mean of a 200×200  pixel region in the center of the image.

Fig. 4. Integrating sphere schematic.

Fig. 5. Raspberry Pi camera acquiring imagery of an I2R integrating sphere lamped with white-light LED sources.

Linearity with Exposure Time

Camera linearity with exposure time was determined at an ISO setting of 100. In this set of measurements, five images were taken at each exposure time setting. The bright images were temporally and spatially averaged to establish a mean DN value within the center 200×200 pixel region. The raw data were found to be linear with respect to exposure time for the green band, as shown in Fig. 6. Table 6 summarizes the linear fit through the data. In this table and subsequent tables, root mean square error (RMSE) is defined as

\mathrm{RMSE} = \sqrt{\frac{1}{n}\sum_{i=1}^{n} e_i^2}, \quad (1)

where n is the number of data points and e_i is the residual, or difference between the model (a straight-line fit in this case) and the measured data, at each point.
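The straight-line fit and the RMSE of Eq. (1) are simple to reproduce; this sketch uses synthetic exposure-sweep data (arbitrary slope and noise, not the measured values) with NumPy's polyfit:

```python
import numpy as np

# Synthetic mean DN versus exposure time (ms); slope and noise are arbitrary
t = np.linspace(1.0, 50.0, 10)
dn = 15.0 * t + 3.0 + np.random.default_rng(1).normal(0.0, 0.5, t.size)

slope, intercept = np.polyfit(t, dn, 1)      # straight-line fit
residuals = dn - (slope * t + intercept)     # the e_i of Eq. (1)
rmse = float(np.sqrt(np.mean(residuals ** 2)))
print(slope, rmse)
```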

Fig. 6. Green band camera response (a) as a function of exposure time and (b) residuals when compared to a linear fit. Data acquired at an ISO setting of 100.

Table 6. Linearity with exposure setting linear fit parameters.
Linearity with ISO

Camera linearity with ISO setting was determined at an exposure time setting of 10 ms. In this set of measurements, five images were taken at each ISO setting. As with the previous linearity assessment, the bright images were temporally and spatially averaged within a 200×200  pixel region in the center of each image to establish a mean DN value. The raw data were found to be linear with respect to ISO setting for the green band raw data, as shown in Fig. 7. Table 7 summarizes the linear fit through the data.

Fig. 7. Green band camera response (a) as a function of ISO setting and (b) residuals when compared to a linear fit. Data acquired at an exposure time of 10 ms.

Table 7. Linearity with ISO setting linear fit parameters.

Total sensor image noise (S_Total) can be expressed in terms of photon shot noise (S_Shot), read noise (S_Read), and FPN (S_FPN), as described in Eq. (2):24

S_{\mathrm{Total}}^2 = S_{\mathrm{Shot}}^2 + S_{\mathrm{Read}}^2 + S_{\mathrm{FPN}}^2. \quad (2)

In this investigation, the team used a mean-variance method to characterize noise as a function of signal. This method, which plots pixel variance against the mean signal on a linear plot, yields results that are relatively simple to interpret. A more detailed description of various methods, including the photon transfer method, is described in Ref. 25.

Sensor noise characterization is usually performed on single frames or pairs of frames by acquiring imagery within an integrating sphere without a lens or optic in place. The nearly perfect illumination field produces a near-uniform mean signal (DN) across the focal plane array (FPA) that is independent of position, with the exception of FPN. Using this technique, a mean signal and variance are calculated for each frame of data acquired. Sphere illumination (radiance level) is varied to generate means and variances across the dynamic range of the sensor.

While some third-party camera boards give users the ability to change lenses, the camera module provided by the Raspberry Pi Foundation has a fixed (glued), not easily removable lens in front of the IMX219 sensor, which introduces signal roll-off with field angle (see Sec. 7). This spatially varying roll-off effect prevents one from obtaining a near-uniform mean signal within a single frame. In this investigation, temporal mean signal and pixel variance values were instead determined by analyzing N frames of data (all pixels) acquired at a fixed set of conditions (ISO, exposure time, and sphere illumination), as described in Ref. 24. While a large amount of data is needed, this technique removes FPN from the assessment and derives the lowest possible sensor image noise value, comprised solely of shot and read noise.

After warm-up, 250 frames of data were acquired at five different illumination levels (including dark frames) spanning the dynamic range of the sensor. Pixel locations were then sampled across the FPA every 1000 pixels so that each 8 megapixel image produced 8000 data points of mixed RGB. Data were acquired at ISO values of 100, 200, and 400 at a 5-ms exposure time. The resulting mean-variance plots are shown in Fig. 8.
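The temporal mean-variance procedure can be sketched as follows. Poisson-distributed synthetic stacks stand in for the real frame sets; because statistics are taken along the frame axis, fixed-pattern noise drops out, and for shot-noise-limited data the variance-versus-mean slope is the system gain (close to 1 for pure Poisson counts):

```python
import numpy as np

rng = np.random.default_rng(2)

def mean_variance_point(stack):
    """Per-pixel temporal mean and variance over N frames, pooled over pixels.

    Computing statistics along the frame axis removes fixed-pattern noise,
    leaving only shot and read noise in the variance estimate."""
    mu = stack.mean(axis=0)
    var = stack.var(axis=0, ddof=1)
    return float(mu.mean()), float(var.mean())

# Synthetic shot-noise-limited stacks at three illumination levels
points = [mean_variance_point(rng.poisson(level, size=(250, 64, 64)).astype(float))
          for level in (100.0, 400.0, 800.0)]

means, variances = zip(*points)
gain_slope = float(np.polyfit(means, variances, 1)[0])  # slope of mean-variance line
print(gain_slope)
```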

Fig. 8. Raspberry Pi camera mean-variance curves for ISO 100, 200, and 400.

Linear fits were made through the data, as shown in Table 8. As expected, the slope scales with ISO setting.

Table 8. Mean-variance linear fit parameters.

Although one would expect the electronic shutter to be very stable after the camera is turned on, the team saw some unexpected variation in camera output and decided to measure camera stability. Raspberry Pi camera exposure stability was tested at an exposure setting of 5 ms and an ISO setting of 100. Frames were acquired every 2.0 s. Illumination to the sphere was set such that the center green pixels measured approximately 800 DN. Four hundred frames were acquired, spatially averaged, and normalized to the steady-state temporal mean (the mean of the last 150 data points). These values, shown as a percentage of the steady-state temporal mean, are plotted as a function of time in Fig. 9 (dark frames) and Fig. 10 (bright frames). Turning the camera on and taking images can cause changes in output due to sensor warming. The plots show that after approximately a 200-frame warm-up period, data values reach steady state.

Fig. 9. Camera dark frame exposure stability.

Fig. 10. Camera bright frame exposure stability.

The data were modeled as the solution to a thermal lumped circuit with a step function due to the initiation of data acquisition:26

\mathrm{Signal}\,\% = A\left[1 - \exp\left(-\frac{F}{F_c}\right)\right] + A_0, \quad (3)

where A is a scale factor, F is the frame number, F_c is a frame constant (analogous to a time constant), and A_0 is an offset constant. A time constant can be calculated by multiplying F_c by the frame interval (the time between frames). The team expects that results will change slightly if the rate at which data are taken is changed. For the dark frame data, the fitted values for A, F_c, and A_0 are 0.008, 53.1, and 100.0, respectively. The data show that dark frames change on the order of 0.01%, which is negligible for almost any potential application. For the bright frame data, the fitted values for A, F_c, and A_0 are 0.309, 58.9, and 99.697, respectively. Although the bright frame transient behavior is small compared to photon noise, the data do show that one should allow the camera to come to equilibrium for some applications.
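Eq. (3) can be fit without special tooling: for a fixed F_c the model is linear in A and A_0, so a scan over candidate F_c values with a linear solve at each step recovers all three parameters. The sketch below applies this to synthetic stability data generated from the bright-frame fit values quoted above:

```python
import numpy as np

def fit_warmup(F, y, fc_grid):
    """Fit Signal% = A*[1 - exp(-F/Fc)] + A0 (Eq. 3) by scanning Fc and
    solving for A and A0 linearly at each candidate Fc."""
    best = None
    for fc in fc_grid:
        X = np.column_stack([1.0 - np.exp(-F / fc), np.ones_like(F)])
        coef, *_ = np.linalg.lstsq(X, y, rcond=None)
        rss = float(np.sum((X @ coef - y) ** 2))   # residual sum of squares
        if best is None or rss < best[0]:
            best = (rss, coef[0], fc, coef[1])
    _, A, Fc, A0 = best
    return A, Fc, A0

# Synthetic bright-frame stability data resembling the reported fit values
F = np.arange(400, dtype=float)
rng = np.random.default_rng(3)
y = 0.309 * (1.0 - np.exp(-F / 58.9)) + 99.697 + rng.normal(0.0, 0.005, F.size)

A, Fc, A0 = fit_warmup(F, y, fc_grid=np.arange(30.0, 90.0, 0.5))
print(A, Fc, A0)
```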

As part of this investigation, flat fielding surfaces were developed for the Raspberry Pi camera for each demosaicked RGB band. These measurements were acquired at an ISO setting of 100 and an exposure time of 20 ms. To reduce the influence of local integrating sphere surface defects, the sphere was imaged at four different azimuthal positions and three different view angles. To reduce image noise in the flat fielding surface, three images were acquired at each azimuthal/view angle position. Median images were then generated from these 36 images (4 azimuthal positions × 3 view angles × 3 images per position). To eliminate the influence of one band on another, a simple bilinear demosaicking algorithm was used.27 Since lens roll-off is the dominant feature in the flat fielding surface, a new flat fielding surface would need to be acquired each time the lens is changed.

The resulting RGB demosaicked flat fielding surfaces are shown as images and three-dimensional surfaces in Figs. 11–13. All surfaces were peak normalized to one. For visualization purposes, the mesh plot (three-dimensional surface) sampling was reduced by displaying the mean value of each 16×16 pixel block.
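The construction and use of a flat fielding surface can be sketched with synthetic data (a cosine-squared roll-off standing in for a measured band; the 36-image median and peak normalization follow the procedure above):

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic roll-off surface standing in for one demosaicked band
y, x = np.mgrid[0:200, 0:200]
rolloff = np.cos(0.008 * np.hypot(x - 100, y - 100)) ** 2

# 36 noisy sphere images; the per-pixel median suppresses local sphere defects
images = rolloff + rng.normal(0.0, 0.01, size=(36, 200, 200))
flat = np.median(images, axis=0)
flat /= flat.max()                      # peak-normalize the surface to one

corrected = images[0] / flat            # flat-field correction of a raw band
print(float(corrected.mean()))          # near-uniform result, mean close to 1
```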

Fig. 11. Camera flat fielding surface at F2.0, an ISO setting of 100, and a 20-ms exposure time, red band.

Fig. 12. Camera flat fielding surface at F2.0, an ISO setting of 100, and a 20-ms exposure time, green band.

Fig. 13. Camera flat fielding surface at F2.0, an ISO setting of 100, and a 20-ms exposure time, blue band.

Diagonal transects were taken across each of the three flat fielding surfaces, from top right to bottom left and from bottom right to top left. These diagonal transects overlay each other, showing optical symmetry, as seen in Figs. 14–16. Each figure contains transects through a single image alongside transects through the 36-image median image. The red band transect, while similar, is not identical to the blue and green transects, as shown in Fig. 17. This may be caused by the red band filter attenuating the signal as a function of field angle and warrants additional study.

Fig. 14. Diagonal transects across the red band flat fielding surface for (a) a single image and (b) a 36-image median image.

Fig. 15. Diagonal transects across the green band flat fielding surface for (a) a single image and (b) a 36-image median image.

Fig. 16. Diagonal transects across the blue band flat fielding surface for (a) a single image and (b) a 36-image median image.

Fig. 17. Diagonal transects across the red, green, and blue band flat fielding surfaces for a 36-image median image.

The RGB flat fielding surfaces shown in Figs. 11–13 were fit using the functional form shown in Eq. (4). While higher-order terms were considered within this Fourier series expansion, they began to fit surface noise in addition to the general shape of the surface, and the overall fit did not improve:

f(x) = a_0 + a_1\cos(wx) + b_1\sin(wx) + a_2\cos(2wx) + b_2\sin(2wx). \quad (4)

The coefficients that were obtained when fitting these functions are shown in Table 9. Parameters that measure the goodness of fit are also included in the table. Note that the coefficients for the green and blue bands are nearly identical and are consistent with the curves shown in Fig. 17.
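For a fixed angular frequency w, Eq. (4) is linear in its five coefficients, so the fit reduces to ordinary least squares; w itself would be scanned or fit nonlinearly in practice. The sketch below uses synthetic transect data with illustrative coefficients, not those of Table 9:

```python
import numpy as np

def fourier2_design(x, w):
    """Design matrix for Eq. (4) at a fixed angular frequency w."""
    return np.column_stack([np.ones_like(x),
                            np.cos(w * x), np.sin(w * x),
                            np.cos(2 * w * x), np.sin(2 * w * x)])

# Synthetic transect resembling a normalized roll-off curve
x = np.linspace(0.0, np.pi, 500)
w = 1.0
truth = (1.0, -0.4, 0.05, 0.06, -0.01)           # illustrative (a0, a1, b1, a2, b2)
yt = fourier2_design(x, w) @ np.array(truth)
yt += np.random.default_rng(5).normal(0.0, 0.002, x.size)

coeffs, *_ = np.linalg.lstsq(fourier2_design(x, w), yt, rcond=None)
print(coeffs)  # recovered (a0, a1, b1, a2, b2)
```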

Table 9. Flat fielding surface functional fit parameters.

A camera’s spectral response is a measure of how each detector responds to a given input illumination as a function of wavelength. The Raspberry Pi camera’s spectral response was determined by imaging a quartz tungsten halogen lamp filtered by a monochromator, as shown in Fig. 18, and then comparing those measurements to those obtained with a calibrated power meter. Illumination wavelength was varied from 350 to 800 nm. At each wavelength step, the monochromator provided 1.5- to 2.0-nm spectrally wide illumination. Illumination from the monochromator exit slit was centered on the FPA to remove lens roll-off (vignetting) variability from the assessment. The light beam exiting the monochromator was also diffused using a few small sheets of lens paper. The acquired spectral response was peak normalized and is shown in Fig. 19 in arbitrary units. The measured spectral responses are broad and significantly overlap each other.

Fig. 18. Raspberry Pi V2 camera spectral response measurement using a quartz tungsten halogen lamp filtered by a monochromator.

Fig. 19. Raspberry Pi camera V2 spectral response.

Spectral response measurements of two different Raspberry Pi cameras were taken. These measurements showed very similar results, as displayed in Figs. 20–22.

Fig. 20. Spectral response measurements of two separate Raspberry Pi V2 cameras show excellent camera-to-camera repeatability in the red band.

Fig. 21. Spectral response measurements of two separate Raspberry Pi V2 cameras show excellent camera-to-camera repeatability in the green band.

Fig. 22. Spectral response measurements of two separate Raspberry Pi V2 cameras show excellent camera-to-camera repeatability in the blue band.

An absolute radiometric calibration was performed on a single Raspberry Pi V2.1 camera, which enables one to convert camera acquired DN values into engineering units of radiance. An absolute radiometric calibration can be used to quantify the brightness of objects in a scene and enables a user to preset and optimize camera parameters, such as exposure time, ISO, and f-number, before image acquisition.

The absolute radiometric calibration is based on a general radiometric equation for a well-behaved (or correctable) pixel at a fixed ISO or gain setting within a linearly behaved (or correctable) sensor.22,28 Since a pixel’s DN (count) is proportional to the number of signal electrons N_e within a pixel,28 the generalized radiometric equation for a dark frame subtracted image, where the bias has been removed and the electronic gain is unity, can be written as

\mathrm{DN} = \frac{N_e}{\mathrm{QSE}} \approx \frac{\pi\,\tau\,A_d}{4(f\#)^2\,\mathrm{QSE}\,hc}\int_0^\infty L(\lambda)\,T(\lambda)\,\eta(\lambda)\,\lambda\,d\lambda, \quad (5)

where QSE is the quantum scale equivalence,28 which relates counts to electrons, τ is the exposure time, A_d is the detector area, f# is the camera’s f-number, h is Planck’s constant, c is the speed of light, λ is the wavelength of light, L(λ) is the spectral radiance, T(λ) is the optical transmission, and η(λ) is the quantum efficiency. In this equation, the solid angle is approximated by π/[4(f#)²], which yields a 4% error from the exact expression at F2.0. This error is corrected as part of the calibration process.

To simplify the above equation, one can define the camera’s spectral response S(λ), which is related to amps per watt, as

S(\lambda) = T(\lambda)\,\eta(\lambda)\,\lambda. \quad (6)

In many cases, one does not know the exact quantum efficiency or optical transmission of a camera, and what is measured (in DN) is actually a signal that is proportional to S(λ). If one peak normalizes S(λ) to unity, the integral of S(λ) over wavelength is the effective spectral width of the spectral response.29 This allows one to define the average spectral radiance as

\bar{L} = \frac{\int_0^\infty L(\lambda)\,S(\lambda)\,d\lambda}{\int_0^\infty S(\lambda)\,d\lambda}. \quad (7)
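The effective spectral width and the band-average radiance of Eq. (7) are straightforward numerical integrals. The sketch below uses a hypothetical Gaussian stand-in for a peak-normalized S(λ) and a spectrally flat L(λ), for which the band-average radiance must equal the flat radiance value:

```python
import numpy as np

def trapezoid(y, x):
    """Trapezoidal integration (written out to avoid NumPy version differences)."""
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))

# Hypothetical peak-normalized spectral response and a flat sphere radiance;
# a real S(lambda) would come from the monochromator measurement.
wl = np.linspace(400.0, 700.0, 301)                 # wavelength, nm
S = np.exp(-0.5 * ((wl - 550.0) / 40.0) ** 2)       # stand-in green-band response
L = 1.0e-3 * np.ones_like(wl)                       # flat spectral radiance

width = trapezoid(S, wl)                            # effective spectral width
L_bar = trapezoid(L * S, wl) / trapezoid(S, wl)     # Eq. (7)
print(width, L_bar)
```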

Using a parameter like the QSE,28 which relates the number of electrons to counts, one can rewrite Eq. (5) as

\mathrm{DN} = \frac{\pi\,\tau\,A_d\,\mathrm{ISO}\,\bar{L}}{4(f\#)^2 \cdot 100 \cdot \mathrm{QSE}\,hc}. \quad (8)

The QSE can be defined as

\mathrm{QSE} = \frac{N_{\mathrm{well}}}{N_{\mathrm{DR}}}, \quad (9)

where N_well is a pixel’s well capacity in electrons and N_DR is the digital count range (1024 for a 10-bit system, minus the dark frame offset). Usually, QSE is defined for cases where the electronic gain is unity. When ISO is used, this assumption does not always hold, but, for simplicity, we have used the ratio of ISO to QSE as a generalization that includes electronic gain.

To perform an absolute camera calibration, the I2R 1.5-m diameter integrating sphere was illuminated with white-light Luxeon Rebel 4000 K LEDs (as before) and imaged by the Raspberry Pi camera nearly simultaneously with a NIST-traceable spectrometer, calibrated to better than 5% absolute accuracy, which measured the sphere’s spectral radiance.

When acquiring imagery for the calibration, camera exposure was set to 10 ms and ISO was incrementally set to 300, 400, and 500. As mentioned earlier, current to the LEDs illuminating the 1.5-m sphere was set to maximize camera DN in the green band without causing saturation.

Five dark images and 60 bright images (4 azimuthal positions × 3 view angles × 5 images per position) of the sphere were acquired. As with the linearity measurements, these multiple bright images were acquired to reduce local integrating sphere surface defects and image noise. The entire image was used in this assessment. The bright images were dark frame subtracted, flat field corrected, and then temporally and spatially averaged to establish a mean DN value.

If we define a calibration coefficient C as

C = \frac{400\,\mathrm{QSE}\,hc}{A_d\,\pi}, \quad (10)

we can rewrite Eq. (8) as

\mathrm{DN} = \frac{\tau\,\mathrm{ISO}\,\bar{L}}{C\,(f\#)^2}. \quad (11)

The calibration coefficient can then be determined for each RGB band such that

\bar{L} = C\left[\frac{(f\#)^2}{\tau\,\mathrm{ISO}}\right]\mathrm{DN}. \quad (12)

Using F2.0, the resulting three-point mean calibration coefficients, determined at three different ISO values, are shown in Table 10.

Table 10. Raspberry Pi camera V2 absolute radiometric calibration coefficients.
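Once a band's coefficient C is known, Eq. (12) is a one-line conversion from a corrected mean DN to average spectral radiance. The sketch below uses an illustrative placeholder value for C, not a measured coefficient from Table 10:

```python
def dn_to_radiance(dn, C, f_number, exposure_s, iso):
    """Eq. (12): L_bar = C * [(f#)^2 / (tau * ISO)] * DN.

    C here is a hypothetical band coefficient, not a value from Table 10."""
    return C * (f_number ** 2) / (exposure_s * iso) * dn

# Hypothetical dark-subtracted, flat-fielded green-band mean DN at F2.0,
# a 10-ms exposure, and ISO 100
L_bar = dn_to_radiance(dn=600.0, C=1.0e-6, f_number=2.0, exposure_s=0.010, iso=100)
print(L_bar)  # 1e-6 * 4 / (0.010 * 100) * 600, approximately 2.4e-3
```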

To keep the Raspberry Pi cameras radiometrically calibrated, this type of assessment would have to be performed periodically. The frequency of this calibration would depend on the radiometric accuracy required, camera operation, and operating conditions.

A comprehensive radiometric characterization was performed on the Raspberry Pi V2.1 camera module. The camera was found to be stable over short periods, measured in days, and performance was repeatable between multiple cameras. Camera exposure was extremely stable (<0.1% variation) after warm-up. Raw-data format DN values were linear with ISO and exposure time over the ranges investigated. Flat fielding surfaces were symmetric, indicating that the optical center of the camera was well aligned to the geometric center of the FPA. Without flat fielding corrections, raw-data format image brightness decreased by approximately 75% from the center to the edge of the image.

To qualitatively evaluate the overall effect of applying dark frame subtraction, flat fielding, and absolute radiometric calibration, a “typical” raw image was acquired at an ISO setting of 100 and an exposure time of 20 ms. The raw-data format image was demosaicked using a simple bilinear algorithm and displayed in RGB, as shown in Fig. 23. The image was then dark frame subtracted and flat fielded using the functions described above (Fig. 24) and finally radiometrically calibrated using the calibration coefficients provided in this paper (Fig. 25). The final image is a radiometrically correct image that can be converted to units of radiance. Note the improvement in color quality and brightness uniformity when all the corrections are applied.

Fig. 23. Demosaicked raw-data format image displayed in RGB.

Fig. 24. Raw-data format image corrected through dark frame subtraction and flat fielding.

Fig. 25. Raw-data format image corrected through dark frame subtraction, flat fielding, and radiometric calibration.

The Raspberry Pi V2.1 camera module, operated using the Raspberry Pi 3 single-board computer, has been radiometrically calibrated to produce high-quality imagery appropriate for scientific and engineering use. The radiometric calibration coefficients determined in this investigation were applied to imagery acquired with the V2.1 camera module to recover information in SI units of radiance. This finding opens up a wide range of scientific applications associated with computer vision, biophotonics, remote sensing, HDR imaging, and astronomy, to name a few. While the camera modules appeared stable after warm-up over the few-month investigation, the camera’s value to the scientific community will be determined in part by its longer-term stability.

The small number of camera modules that were investigated produced consistent, repeatable results. A larger scale investigation involving many more cameras will need to be performed before the community can feel confident that the results of this investigation can be applied to other Raspberry Pi V2.1 camera modules. It should be noted that each camera module will be slightly different, and, for some applications, each individual camera module will have to be characterized.


Mary Pagnutti is president and cofounder of Innovative Imaging and Research. She received her BE and ME degrees in mechanical engineering from Stony Brook University and has worked in the field of remote sensing and sensor calibration for over 18 years.

Robert E. Ryan is vice president and cofounder of Innovative Imaging and Research. He received his BS in physics from Hofstra University, his MS in electrophysics from Polytechnic Institute of New York, and his PhD in physics from Stony Brook University. He has worked in the field of remote sensing and sensors for over 30 years.

George Cazenavette was a 2016 summer intern at Innovative Imaging and Research. He is currently attending Louisiana Tech University in Ruston, Louisiana, studying cyber engineering and computer science.

Maxwell Gold was a 2016 summer intern at Innovative Imaging and Research. He is currently attending Washington and Lee University in Lexington, Virginia, studying mathematics.

Ryan Harlan is an imaging intern at Innovative Imaging and Research. He received a BS in chemical engineering from Washington University in Saint Louis, Missouri.

Edward Leggett is a senior research scientist at Innovative Imaging and Research. He received his BS and MS degrees in physics from Mississippi State University and is currently completing a PhD in applied physics. He has 10 years of experience in computational particle physics.

James Pagnutti was a 2016 summer intern at Innovative Imaging and Research. He is currently attending Louisiana Tech University in Ruston, Louisiana, studying computer science.

© The Authors. Published by SPIE under a Creative Commons Attribution 3.0 Unported License. Distribution or reproduction of this work in whole or in part requires full attribution of the original publication, including its DOI.

Citation

Mary Pagnutti, Robert E. Ryan, George Cazenavette, Maxwell Gold, Ryan Harlan, et al., "Laying the foundation to use Raspberry Pi 3 V2 camera module imagery for scientific and engineering purposes," J. Electron. Imaging 26(1), 013014 (Feb 11, 2017). http://dx.doi.org/10.1117/1.JEI.26.1.013014


Figures

Fig. 1: Raspberry Pi 3 and camera module V2.1.

Fig. 2: 250-frame mean dark image histograms at a 50-ms exposure time and an ISO setting of (a) 100 and (b) 600.

Fig. 3: 250-frame mean dark images at a 50-ms exposure time and an ISO setting of (a) 100 and (b) 600.

Fig. 4: Integrating sphere schematic.

Fig. 5: Raspberry Pi camera acquiring imagery of an I2R integrating sphere lamped with white-light LED sources.

Fig. 6: Green band camera response (a) as a function of exposure time and (b) residuals compared to a linear fit. Data acquired at an ISO setting of 100.

Fig. 7: Green band camera response (a) as a function of ISO setting and (b) residuals compared to a linear fit. Data acquired at an exposure time of 10 ms.

Fig. 8: Raspberry Pi camera mean-variance curves for ISO 100, 200, and 400.

Fig. 9: Camera dark frame exposure stability.

Fig. 10: Camera bright frame exposure stability.

Fig. 11: Camera flat fielding surface at F2.0, an ISO setting of 100, and a 20-ms exposure time, red band.

Fig. 12: Camera flat fielding surface at F2.0, an ISO setting of 100, and a 20-ms exposure time, green band.

Fig. 13: Camera flat fielding surface at F2.0, an ISO setting of 100, and a 20-ms exposure time, blue band.

Fig. 14: Diagonal transects across the red band flat fielding surface for (a) a single image and (b) a 36-image median image.

Fig. 15: Diagonal transects across the green band flat fielding surface for (a) a single image and (b) a 36-image median image.

Fig. 16: Diagonal transects across the blue band flat fielding surface for (a) a single image and (b) a 36-image median image.

Fig. 17: Diagonal transects across the red, green, and blue band flat fielding surfaces for a 36-image median image.

Fig. 18: Raspberry Pi V2 camera spectral response measurement using a quartz tungsten halogen lamp filtered by a monochromator.

Fig. 19: Raspberry Pi camera V2 spectral response.

Fig. 20: Spectral response measurements of two separate Raspberry Pi V2 cameras show excellent camera-to-camera repeatability in the red band.

Fig. 21: Spectral response measurements of two separate Raspberry Pi V2 cameras show excellent camera-to-camera repeatability in the green band.

Fig. 22: Spectral response measurements of two separate Raspberry Pi V2 cameras show excellent camera-to-camera repeatability in the blue band.

Fig. 23: Demosaicked raw-data format image displayed in RGB.

Fig. 24: Raw-data format image corrected through dark frame subtraction and flat fielding.

Fig. 25: Raw-data format image corrected through dark frame subtraction, flat fielding, and radiometric calibration.

Tables

Table 1: Raspberry Pi 3 computer attributes.

Table 2: Sony IMX219 sensor chip specifications.

Table 3: Raspberry Pi camera specifications.

Table 4: Raspberry Pi camera V2 dark frame statistics at 5 ms (250-frame mean).

Table 5: Raspberry Pi camera V2 dark frame statistics at 50 ms (250-frame mean).

Table 6: Linearity with exposure setting: linear fit parameters.

Table 7: Linearity with ISO setting: linear fit parameters.

Table 8: Mean-variance linear fit parameters.

Table 9: Flat fielding surface functional fit parameters.

Table 10: Raspberry Pi camera V2 absolute radiometric calibration coefficients.

References

1. Upton E., "Raspberry Pi 3 on sale now at $35," Raspberry Pi blog, 2016, https://www.raspberrypi.org/blog/raspberry-pi-3-on-sale (3 January 2017).
2. Bravo F. R. P., Support System to Help Parkinson's Patients Read Books, Máster en Ingeniería Informática, Facultad de Informática, Departamento Arquitectura de Computadores y Automática, curso 2014–2015, Universidad Complutense de Madrid, Madrid (2015).
3. Almeida E., Ferruzca M., and Tlapanco M., "Design of a system for early detection and treatment of depression in elderly case study," in Int. Symp. on Pervasive Computing Paradigms for Mental Health, pp. 115–124, Springer International Publishing (2014).
4. Yoon W. et al., "6Lo Bluetooth low energy for patient-centric healthcare service on the Internet of Things," in Proc. of the Int. Conf. on the Internet of Things (2014).
5. Zainee N. M., Norhayati M., and Chellappan K., "Emergency clinic multi-sensor continuous monitoring prototype using e-health platform," in IEEE Conf. on Biomedical Engineering and Sciences (IECBES '14), pp. 32–37 (2014).
6. Fuicu S. et al., "Real time e-health system for continuous care," in Proc. of the 8th Int. Conf. on Pervasive Computing Technologies for Healthcare, pp. 436–439, ICST (2014).
7. Hacks C., "e-Health sensor platform V2.0 for Arduino and Raspberry Pi," Cooking Hacks, 2015, http://www.cooking-hacks.com/documentation/tutorials/ehealth-biometric-sensor-platform-arduino-raspberry-pi-medical (3 January 2017).
8. Chapman L., Gray C., and Headleand C., "A sense-think-act architecture for low-cost mobile robotics," in Research and Development in Intelligent Systems XXXII, Bramer M. and Petridis M., Eds., pp. 405–410, Springer International Publishing, Switzerland (2015).
9. Prasad S. et al., "Smart surveillance monitoring system using Raspberry Pi and PIR sensor," Int. J. Comput. Sci. Inf. Technol. 5, 7107–7109 (2014).
10. Desai V. and Bavarva A., "Image processing method for embedded optical peanut sorting," Int. J. Image Graphics Signal Process. 8(2), 20–27 (2016).
11. Upton E., "New 8-megapixel camera board on sale at $25," Raspberry Pi blog, 2016, https://www.raspberrypi.org/blog/new-8-megapixel-camera-board-sale-25/ (3 January 2017).
12. Sony, "IMX219 product brief version 1.0," Electronicsdatasheets, 2016, https://www.electronicsdatasheets.com/manufacturers/raspberry-pi/parts/imx219 (3 January 2017).
13. Schanda J., Ed., Colorimetry: Understanding the CIE System, John Wiley & Sons, New York (2007).
14. Yu H., Tang Y., and Cunningham B. T., "Smartphone fluorescence spectroscopy," Anal. Chem. 86(17), 8805–8813 (2014).
15. Spigulis J., Oshina I., and Rupenheits Z., "Smartphone single-snapshot mapping of skin chromophores," in Optical Tomography and Spectroscopy, JTu3A–46, Optical Society of America (2016).
16. Jensen J., Remote Sensing of the Environment: An Earth Resource Perspective, Pearson Prentice Hall, Upper Saddle River, New Jersey (2007).
17. Hoot J., "Photometry with DSLR cameras," in Society for Astronomical Sciences Annual Symp., Vol. 26, p. 67 (2007).
18. Mobberley M., Lunar and Planetary Webcam User's Guide, Springer Science & Business Media, London (2006).
19. Mizoguchi T., "Evaluation of image sensors," Chapter 6 in Image Sensors and Signal Processing for Digital Still Cameras, pp. 179–203, CRC Press, Boca Raton, Florida (2006).
20. Jiang J. et al., "What is the space of spectral sensitivity functions for digital color cameras?" in IEEE Workshop on Applications of Computer Vision (WACV), pp. 168–179, IEEE (2013).
21. Verhoeven G. J. et al., "Spectral characterization of a digital still camera's NIR modification to enhance archaeological observation," IEEE Trans. Geosci. Remote Sens. 47(10), 3456–3468 (2009).
22. Ryan R. and Pagnutti M., "Enhanced absolute and relative radiometric calibration for digital aerial cameras," in Photogrammetric Week, pp. 81–90 (2009).
23. Lumileds, "DS107 LUXEON Rebel PLUS product datasheet 20140930," Lumileds, 2015, http://www.lumileds.com/uploads/380/DS107-pdf (3 January 2017).
24. Jacquot B. C., Bolla B. M., and Maguire S., "Hybrid approach to mean-variance and photon transfer measurement," Proc. SPIE 9481, 94810D (2015).
25. Janesick J. R., Photon Transfer, SPIE Press, Bellingham, Washington (2007).
26. Holman J., Heat Transfer, 9th ed., McGraw-Hill, New York, Boston (2002).
27. Maschal R. A. Jr. et al., "Review of Bayer pattern color filter array (CFA) demosaicing with new quality assessment algorithms," Technical Report, DTIC Document (2010).
28. Fiete R. D., Modeling the Imaging Chain of Digital Cameras, SPIE Press, Bellingham, Washington (2010).
29. Schott J. R., Remote Sensing: The Image Chain Approach, Oxford University Press, New York (2007).
