Large-Scale Multi-Source Satellite Data for Wildfire Detection and Assessment Using Deep Learning

Time: Thu 2022-06-02 09:00

Location: Kollegiesalen, Brinellvägen 8, Stockholm

Video link: https://kth-se.zoom.us/j/64389993343

Language: English

Subject area: Geodesy and Geoinformatics, Geoinformatics

Doctoral student: Xikun Hu, Geoinformatics

Opponent: Professor Martin Wooster, King's College London, Department of Geography

Supervisor: Professor Yifang Ban, Geoinformatics

Abstract

Earth Observation (EO) satellites have great potential for wildfire detection and assessment at fine spatial, temporal, and spectral resolutions. Satellite data have long been employed to systematically monitor wildfire dynamics and assess wildfire impacts, including (i) detecting the location of actively burning spots, (ii) mapping the spatial extent of burn scars, and (iii) assessing wildfire damage levels. Active fire detection plays an important role in wildfire early warning systems. Accurate and timely burned area mapping is critical for delineating the fire perimeter and enables analysis of fire suppression efforts and potential drivers of fire spread. Subsequently, burn severity assessment aims to infer the degree of environmental change caused by fire. Recent advances in deep learning (DL) enable automatic interpretation of vast amounts of remote sensing data. The objective of this thesis is to employ large-scale, publicly available multi-source satellite data, e.g., Landsat, Sentinel-1, and Sentinel-2, to detect active fires, delineate burned areas, and analyze fire impacts using DL-based approaches.

A biome-based multi-criteria approach is developed to extract unambiguous active fire pixels from the reflectance of Sentinel-2 MultiSpectral Instrument (MSI) data at 20-m resolution. Adaptive thresholds are statistically determined from 11 million observation samples acquired during summertime across broad geographic regions and fire regimes. The primary criterion exploits the pronounced increase in fire reflectance in Sentinel-2 band 12 (2.20 μm) relative to band 4 (0.66 μm) over representative biomes, and proves effective for detecting cool smoldering fires. Multiple conditional constraints that threshold the reflectance of band 11 (1.61 μm) and band 12 reduce the commission errors caused by extremely bright flames around the hot cores. The overall commission and omission errors remain relatively low, around 0.14 and 0.04, respectively. The proposed algorithm is suitable for rapid active fire detection from uni-temporal imagery, without requiring a time series.
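
To make the detection logic concrete, the following is a minimal sketch of such a multi-criteria test, assuming per-pixel reflectance arrays for bands 4, 11, and 12. The threshold values alpha, t11, and t12 are hypothetical placeholders for the biome-specific values derived statistically in the thesis, not the published ones.

import numpy as np

def detect_active_fire(r4, r11, r12, alpha=1.4, t11=0.5, t12=0.4):
    """Flag candidate fire pixels from Sentinel-2 reflectance arrays.

    r4, r11, r12 : reflectance of bands 4 (0.66 um), 11 (1.61 um),
    and 12 (2.20 um). Thresholds are illustrative placeholders for
    the biome-adaptive values in the thesis.
    """
    # Primary criterion: strong SWIR (B12) response relative to red (B4),
    # which also captures cooler smoldering fires.
    primary = (r12 / np.maximum(r4, 1e-6)) > alpha
    # Conditional constraints on absolute B11/B12 reflectance to reject
    # false alarms from bright non-fire surfaces.
    bright = (r12 > t12) | (r11 > t11)
    return primary & bright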

Burned area mapping algorithms are developed based on Landsat series, Sentinel-2 MSI, and Sentinel-1 SAR data. On one hand, the thesis demonstrates the capability of DL-based methods to automatically map burn scars from uni-temporal (post-fire) Sentinel-2 imagery. The validation results show that deep semantic segmentation algorithms outperform traditional machine learning (ML) methods (e.g., random forest) and threshold-based methods (empirical and automatic) in detecting compact burn scars. When transferred directly to corresponding Landsat-8 test data, HRNet preserves high accuracy. On the other hand, a large-scale annotated dataset for wildfire analysis (SAR-OPT-Wildfire) is proposed, comprising bi-temporal Sentinel-1 SAR imagery, Sentinel-2 MSI imagery, and rasterized fire perimeters for over 300 large wildfire events in Canada. These multi-source data are used for burned area mapping under three UNet-based change detection architectures, i.e., Early Fusion (EF) and two Siamese (Siam) variants with weight-sharing encoders, as sketched below. UNet-EF achieves the highest IoU score on Sentinel-1 data, while UNet-Siam-Difference performs best on Sentinel-2 data. Bi-temporal scenes significantly boost the IoU score, to 0.86 for Sentinel-2 and 0.80 for Sentinel-1. Fusing bi-temporal Sentinel-1 backscatter with Sentinel-2 data yields no improvement over standalone optical-based results. Nonetheless, this multi-source integration may provide new opportunities for near real-time wildfire progression mapping and could reduce the impact of cloud cover.
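
The two bi-temporal input strategies can be contrasted with a minimal PyTorch sketch, assuming generic encoder and decoder modules (UNet skip connections omitted for brevity); this is an illustrative reconstruction under those assumptions, not the thesis's exact implementation.

import torch
import torch.nn as nn

class EarlyFusionUNet(nn.Module):
    """Early Fusion: stack pre- and post-fire bands channel-wise,
    then run a single encoder-decoder."""
    def __init__(self, encoder, decoder):
        super().__init__()
        self.encoder, self.decoder = encoder, decoder

    def forward(self, pre, post):
        x = torch.cat([pre, post], dim=1)  # fuse at the input level
        return self.decoder(self.encoder(x))

class SiamDiffUNet(nn.Module):
    """Siamese Difference: encode both dates with one weight-sharing
    encoder and decode the feature difference."""
    def __init__(self, encoder, decoder):
        super().__init__()
        self.encoder, self.decoder = encoder, decoder  # shared encoder

    def forward(self, pre, post):
        f_pre, f_post = self.encoder(pre), self.encoder(post)
        return self.decoder(torch.abs(f_post - f_pre))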

Mapping burn severity with multispectral satellite data is typically performed by classifying bi-temporal indices (e.g., dNBR and RdNBR) using thresholds derived from parametric models that incorporate field measurements. The thesis re-organizes a large-scale Landsat-based bi-temporal burn severity dataset (Landsat-BSA) through visual data cleaning based on annotated MTBS data (around 1,000 large fire events across the United States). The study emphasizes that multi-class semantic segmentation architectures can approximate the thresholding techniques used extensively for burn severity assessment. Specifically, UNet-like models substantially outperform other region-based CNN and Transformer-based models. Combined with online hard example mining, Attention UNet achieves the highest mIoU (0.7832) and a Kappa coefficient close to 0.9. Bi-temporal inputs with multispectral bands and ancillary spectral indices perform much better than uni-temporal inputs. When transferred to Sentinel-2 data, Attention UNet maintains a Kappa value over 0.815 with high overall accuracy after a scaling operation.
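
For reference, these bi-temporal indices follow their standard definitions from the burn severity literature, sketched here for NIR and SWIR-2 reflectance arrays (e.g., Sentinel-2 bands 8 and 12, or Landsat-8 bands 5 and 7); the Relativized Burn Ratio (RBR), used in the SAR-to-optical study below, is included as well.

import numpy as np

def nbr(nir, swir2):
    """Normalized Burn Ratio from NIR and SWIR-2 reflectance."""
    return (nir - swir2) / np.maximum(nir + swir2, 1e-6)

def burn_indices(nir_pre, swir2_pre, nir_post, swir2_post):
    """Bi-temporal burn indices from pre- and post-fire reflectance."""
    nbr_pre = nbr(nir_pre, swir2_pre)
    nbr_post = nbr(nir_post, swir2_post)
    dnbr = nbr_pre - nbr_post
    # Relativized dNBR (Miller & Thode, 2007)
    rdnbr = dnbr / np.sqrt(np.maximum(np.abs(nbr_pre), 1e-3))
    # Relativized Burn Ratio (Parks et al., 2014)
    rbr = dnbr / (nbr_pre + 1.001)
    return dnbr, rdnbr, rbr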

Considering that SAR can effectively penetrate clouds and image under all-weather conditions, day and night, the complementary use of optical and SAR data is investigated for precise interpretation of fire-affected sites. Nevertheless, the widely used burn-sensitive spectral indices cannot be applied to SAR data because of the inherent difference in physical imaging mechanisms between optical and SAR sensors. The thesis proposes a new wildfire mapping framework that transforms SAR and optical data into a common domain using Generative Adversarial Networks. Experiments are conducted on paired Sentinel-1 and Sentinel-2 images (SAR-OPT-Wildfire) using a ResNet-based Pix2Pix model, sketched below. Optical images translated from SAR preserve similar spectral characteristics, and the corresponding generated spectral indices (i.e., dNBR, RdNBR, and the Relativized Burn Ratio) show good agreement with those from real optical data. For burned area detection using the generated indices, the median areas under the receiver operating characteristic curves exceed 0.85, which is competitive with the true optical-based indices and significantly outperforms the SAR-based ones. Furthermore, burn severity maps derived from the multi-source data reach high accuracy (Kappa coefficient: 0.77). This study validates the feasibility and effectiveness of SAR-to-optical translation for wildfire impact assessment and may promote the multi-source fusion of optical and SAR data.
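
A minimal sketch of one Pix2Pix-style training step illustrates the translation objective, assuming hypothetical generator G (ResNet-based, SAR in, optical out) and patch discriminator D modules with their optimizers; the loss weighting follows the common Pix2Pix default (lambda_l1 = 100) and may differ from the thesis's configuration.

import torch
import torch.nn.functional as F

def pix2pix_step(G, D, opt_g, opt_d, sar, optical, lambda_l1=100.0):
    """One adversarial training step for SAR-to-optical translation."""
    fake = G(sar)  # translated "optical" image from SAR input

    # Discriminator: real (SAR, optical) pairs -> 1, fake pairs -> 0.
    d_real = D(torch.cat([sar, optical], dim=1))
    d_fake = D(torch.cat([sar, fake.detach()], dim=1))
    loss_d = 0.5 * (
        F.binary_cross_entropy_with_logits(d_real, torch.ones_like(d_real))
        + F.binary_cross_entropy_with_logits(d_fake, torch.zeros_like(d_fake)))
    opt_d.zero_grad(); loss_d.backward(); opt_d.step()

    # Generator: fool D while staying close to the real optical image (L1).
    d_fake = D(torch.cat([sar, fake], dim=1))
    loss_g = (F.binary_cross_entropy_with_logits(d_fake, torch.ones_like(d_fake))
              + lambda_l1 * F.l1_loss(fake, optical))
    opt_g.zero_grad(); loss_g.backward(); opt_g.step()
    return loss_d.item(), loss_g.item()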

This thesis contributes to the development of approaches for detecting, mapping, and assessing wildfires using large-scale, publicly available EO data across fire-prone regions around the world. The research compiled in this thesis demonstrates that open-access medium-resolution EO data are convenient and efficient for monitoring wildfires and assessing fire damage. The frameworks developed in this thesis can be easily adapted to other SAR or optical data. The thesis mainly demonstrates that DL models can make full use of contextual information and capture spatial details at multiple scales from fire-sensitive spectral bands to map burned areas or burn severity. Combining multi-source data will substantially increase the temporal observation frequency. Future work will focus on improving the generalization capability of DL models in wildfire applications by exploiting more diverse and complex study areas and by using multi-frequency SAR (Sentinel-1 in C-band, PALSAR-2 in L-band, and the future Biomass mission in P-band) together with multispectral data (Landsat-8, Landsat-9, and Sentinel-2).

urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-312283