Combining multiple satellite remote sensing sources provides a far richer, more frequent view of the Earth than any single source can; the challenge lies in distilling these petabytes of heterogeneous sensor imagery into meaningful characterizations of the imaged areas. Meeting this challenge requires effective algorithms for combining multi-modal imagery over time to identify subtle but real changes amid the intrinsic data variation. Here, we implement a joint-distribution framework for multi-sensor anomalous change detection (MSACD) that can effectively account for these differences in modality and does not require any signal resampling of the pixel measurements. This flexibility enables the use of satellite imagery from different sensor platforms and modalities. We use the multi-year construction of the SoFi Stadium in California as our testbed, and exploit synthetic aperture radar imagery from Sentinel-1 and multispectral imagery from both Sentinel-2 and Landsat 8. We show results for MSACD using real imagery with implanted, measurable changes, as well as real imagery with real, observable changes, including scaling our analysis over multiple years.
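To make the joint-distribution idea concrete, the sketch below illustrates one simple instance of that family of detectors: a Gaussian anomalous change detector that scores each pixel pair by how unusual it is under the joint distribution of the two images relative to their marginals. This is an illustrative assumption, not the paper's exact MSACD formulation; the function name `gaussian_acd` and its interface are hypothetical, and the only requirement it reflects from the abstract is that the two inputs may come from different sensors with different numbers of bands (e.g., SAR backscatter versus multispectral reflectance) without resampling the pixel measurements.

```python
# Minimal sketch of a Gaussian joint-distribution anomalous change detector.
# Assumption: this is a generic illustration of the joint-distribution idea,
# not the specific MSACD algorithm described in the article.
import numpy as np

def gaussian_acd(img_x, img_y, eps=1e-6):
    """Per-pixel anomalousness of changes between two co-registered images.

    img_x : (rows, cols, dx) array from sensor/modality X
    img_y : (rows, cols, dy) array from sensor/modality Y
    Returns an (rows, cols) map; large values flag pixel pairs that are
    unusual under the joint distribution, i.e., candidate anomalous changes.
    """
    rows, cols, dx = img_x.shape
    dy = img_y.shape[2]
    x = img_x.reshape(-1, dx).astype(float)
    y = img_y.reshape(-1, dy).astype(float)
    z = np.hstack([x, y])  # joint pixel vectors, dx + dy bands

    def mahalanobis(data):
        # Squared Mahalanobis distance of each sample from the data mean,
        # with a small ridge (eps) to keep the covariance invertible.
        mu = data.mean(axis=0)
        cov = np.cov(data, rowvar=False) + eps * np.eye(data.shape[1])
        d = data - mu
        return np.einsum('ij,jk,ik->i', d, np.linalg.inv(cov), d)

    # Anomalousness = joint distance minus the two marginal distances:
    # pixels that are individually ordinary but jointly unusual score high,
    # separating rare "anomalous" changes from pervasive scene-wide change.
    a = mahalanobis(z) - mahalanobis(x) - mahalanobis(y)
    return a.reshape(rows, cols)
```

Because the detector operates on the stacked per-pixel vectors rather than on radiometrically matched bands, the two inputs only need to be co-registered, which is what makes this style of framework attractive for mixing SAR and multispectral sources.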
Keywords: Multispectral imaging, Sensors, Remote sensing, Synthetic aperture radar, Earth observing sensors, Landsat, Satellites