In this work, we use deep learning to reproduce and expand Synthetic Aperture Radar (SAR) based deforestation detections generated by a probabilistic Bayesian model. The Bayesian-updating detector combines SAR backscatter and InSAR coherence to perform change detection over forested areas and flag deforestation regardless of cloud cover. However, this model does not capture all deforestation events and is better suited to near-real-time alerting than to accurate forest-loss acreage estimates. Here, we use the SAR-based probabilistic detections as deforestation labels and Sentinel-2 optical composites as input features to train a neural network to distinguish deforested patches at various stages of regrowth from native forest. The deep learning model's predictions show excellent recall of the original Bayesian labels; their precision against those labels is low, largely because the model covers deforestation more completely and detects deforested patches missing from the imperfect Bayesian labels. These results provide an avenue to improve existing deforestation models, specifically their ability to quantify deforested acreage.
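To make the training setup concrete, the sketch below trains a small patch classifier on optical composite patches with SAR-derived binary labels. The architecture, band count, patch size, and hyperparameters are illustrative assumptions rather than the model described above, and PyTorch is used here only for exposition.

```python
import torch
import torch.nn as nn

# Hypothetical patch classifier: input is a multi-band Sentinel-2 composite patch,
# output is the probability that the SAR-based detector flagged the patch as deforested.
class PatchClassifier(nn.Module):
    def __init__(self, in_bands=4):  # assumed band count (e.g. B, G, R, NIR)
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(in_bands, 16, 3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(32, 1)

    def forward(self, x):
        return self.head(self.features(x).flatten(1))  # raw logits

model = PatchClassifier()
criterion = nn.BCEWithLogitsLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

# One toy training step on random tensors standing in for (optical patch, SAR-derived label) pairs.
patches = torch.randn(8, 4, 64, 64)           # 8 Sentinel-2 patches, 4 bands, 64x64 pixels
labels = torch.randint(0, 2, (8, 1)).float()  # 1 = Bayesian detector flagged deforestation
loss = criterion(model(patches), labels)
optimizer.zero_grad()
loss.backward()
optimizer.step()
```

Because the labels come from an imperfect detector, disagreements between the network and the labels (the "false positives" that lower precision) can include genuine deforestation the Bayesian model missed.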
In this work, we demonstrate that generative adversarial networks (GANs) can generate realistic pervasive changes in RGB remote sensing imagery, even in an unpaired training setting. We investigate transformation quality metrics based on deep embeddings of the generated and real images; these enable visualization and understanding of the GAN's training dynamics and quantify how distinguishable the generated images are from real images. We also identify artifacts introduced by the GAN in the generated images, which likely contribute to the differences seen between real and generated samples in the deep embedding feature space, even when the two appear perceptually similar.
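As one illustration of a deep-embedding based distinguishability measure, the sketch below computes the Fréchet distance between Gaussians fit to real and generated image embeddings. The choice of Fréchet distance, and the random stand-in embeddings, are illustrative assumptions; the metrics studied above may differ.

```python
import numpy as np
from scipy.linalg import sqrtm

def frechet_distance(real_emb, fake_emb):
    """Frechet distance between Gaussians fit to real and generated image embeddings.

    real_emb, fake_emb : (n_samples, n_features) arrays of deep features, e.g. from a
    pretrained CNN applied to real and GAN-generated images.
    """
    mu_r, mu_f = real_emb.mean(0), fake_emb.mean(0)
    cov_r = np.cov(real_emb, rowvar=False)
    cov_f = np.cov(fake_emb, rowvar=False)
    covmean = sqrtm(cov_r @ cov_f)
    if np.iscomplexobj(covmean):  # numerical noise can introduce tiny imaginary parts
        covmean = covmean.real
    diff = mu_r - mu_f
    return float(diff @ diff + np.trace(cov_r + cov_f - 2.0 * covmean))

# Toy usage with random stand-ins for embeddings of real vs. generated images.
rng = np.random.default_rng(0)
real = rng.normal(0.0, 1.0, size=(200, 64))
fake = rng.normal(0.3, 1.1, size=(200, 64))  # slightly shifted distribution
print(f"Frechet distance: {frechet_distance(real, fake):.3f}")
```

A distance near zero indicates the generated samples are hard to distinguish from real ones in the embedding space; GAN-introduced artifacts tend to push this value up even when images look perceptually similar.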
Although monitoring forest disturbance is crucial to understanding atmospheric carbon accumulation and biodiversity loss, persistent cloud cover, especially in tropical areas, makes detecting forest disturbances with optical remotely sensed imagery difficult. In Sentinel-1 synthetic aperture radar (SAR) images, forest clearings exhibit reduced backscatter as well as increased interferometric coherence. We combined SAR and interferometric SAR metrics from Sentinel-1 data collected over Borneo in 2017 and 2018 and applied unsupervised change detection methods to the time series. The results show that a simple log-ratio based detector performs similarly to a more sophisticated anomalous change detection algorithm. The log-ratio detector was deployed to compare a 2017 mean Sentinel-1 composite with a 2018 mean composite. Approximately 20,000 newly deforested areas were identified in 2018, for a total of 3,000 km². The findings suggest that leveraging SAR data to monitor deforestation has the potential to outperform Global Forest Watch, the current Landsat-based gold standard. Future work will leverage the short revisit time (6-12 days) of Sentinel-1 for continuous monitoring of deforestation. The improved time resolution of SAR observations in cloudy regions might enable identification of areas at risk of deforestation early enough in the clearing process for preventive actions to be taken.
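The log-ratio detector itself reduces to a per-pixel comparison of two backscatter composites. A minimal sketch follows; the -3 dB threshold and the synthetic composites are purely illustrative assumptions, not values from the study above.

```python
import numpy as np

def log_ratio_detector(before, after, threshold_db=-3.0):
    """Flag pixels whose backscatter dropped sharply between two composites.

    before, after : 2-D arrays of linear-scale SAR backscatter (e.g. annual mean composites).
    threshold_db  : change (in dB) below which a pixel is marked as a candidate clearing.
    """
    eps = 1e-6  # avoid division by zero and log of zero
    ratio_db = 10.0 * np.log10((after + eps) / (before + eps))
    return ratio_db < threshold_db  # True where backscatter decreased strongly

# Toy usage: a synthetic "2017" composite and a "2018" composite with a cleared patch.
rng = np.random.default_rng(0)
before = rng.gamma(shape=4.0, scale=0.05, size=(100, 100))
after = before.copy()
after[40:60, 40:60] *= 0.3  # simulate reduced backscatter over a cleared area
mask = log_ratio_detector(before, after)
print("flagged pixels:", int(mask.sum()))
```

In practice the thresholded mask would be cleaned (e.g. by removing small connected components) before counting newly deforested areas and summing their extent.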