This paper investigates the applicability of deep learning models for predicting the severity of forest wildfires, using a novel benchmark dataset called EO4WildFires. EO4WildFires integrates multispectral imagery from Sentinel-2, SAR data from Sentinel-1, and meteorological data from NASA Power, annotated with EFFIS data for forest fire detection and size estimation. The dataset covers 45 countries (Figure \ref{Figure1-events}) with a total of 31,730 wildfire events from 2018 to 2022. These data sources are archived into data cubes, with the intention of assessing wildfire severity by considering both current and historical forest conditions, drawing on a broad range of variables including temperature, precipitation, and soil moisture.
The experimental setup was arranged to test the effectiveness of different deep learning architectures in predicting the size and shape of wildfire burned areas. The study incorporates both Image Segmentation networks and Visual Transformers, employing a consistent experimental design across models to ensure comparability of results. Adjustments were made to the training data, such as excluding empty labels and very small events, to focus on more significant wildfire events and potentially improve prediction accuracy.
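The data-cleaning step described above can be sketched as a simple filter over labeled events. The threshold value and the `(cube, mask)` pair representation below are illustrative assumptions, not the paper's actual implementation:

```python
import numpy as np

def filter_events(events, min_burned_pixels=16):
    """Drop events whose label mask is empty or below a size threshold.

    events: iterable of (data_cube, label_mask) pairs, where label_mask
    is a binary array marking burned pixels. The default threshold of
    16 pixels is a placeholder, not the paper's actual cutoff.
    """
    kept = []
    for cube, mask in events:
        burned = int(np.count_nonzero(mask))
        if burned == 0:
            continue  # empty label: no burned area annotated
        if burned < min_burned_pixels:
            continue  # very small event, excluded from training
        kept.append((cube, mask))
    return kept
```

Filtering out empty and near-empty labels keeps the training signal focused on events large enough for a segmentation model to learn a meaningful burned-area shape.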
The models’ performance was evaluated using the F1 Score, the IoU Score, and the Average Percentage Difference (aPD). These metrics offer a multi-faceted view of model performance, capturing precision, sensitivity, and the accuracy of the burned-area estimate. Through extensive testing with Image Segmentation networks and Visual Transformers, the research not only aims to improve the accuracy of burned-area estimation, but also underscores the importance of high-quality training data and the comparative effectiveness of traditional segmentation methods over transformer-based models for this specific application.
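For concreteness, the three evaluation metrics can be computed from a pair of binary burned-area masks as sketched below. The aPD formula used here (relative error of the predicted burned-area size, as a percentage) is an assumption; the paper's exact definition may differ:

```python
import numpy as np

def f1_iou_apd(pred, target, eps=1e-8):
    """Compute F1, IoU, and average percentage difference (aPD)
    between a predicted and a reference binary burned-area mask.

    pred, target: arrays of the same shape, nonzero = burned pixel.
    The aPD definition below is illustrative, not the paper's.
    """
    pred = np.asarray(pred).astype(bool)
    target = np.asarray(target).astype(bool)

    tp = np.logical_and(pred, target).sum()   # burned in both masks
    fp = np.logical_and(pred, ~target).sum()  # predicted but not burned
    fn = np.logical_and(~pred, target).sum()  # burned but missed

    f1 = 2 * tp / (2 * tp + fp + fn + eps)
    iou = tp / (tp + fp + fn + eps)
    # aPD: size error of the predicted burned area, in percent
    apd = 100.0 * abs(int(pred.sum()) - int(target.sum())) / (target.sum() + eps)
    return f1, iou, apd
```

Note that F1 and IoU reward spatial overlap, while aPD only compares total burned area, so a model can score well on aPD while misplacing the fire perimeter entirely.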
The dataset is published under an Open Access license and is hosted at \url{https://doi.org/10.5281/zenodo.7762564}.