Preprint Article, Version 1 (this version is not peer-reviewed)

Forest Fire Discrimination Based on Angle Slope Index and Himawari-8

Version 1 : Received: 30 September 2024 / Approved: 30 September 2024 / Online: 1 October 2024 (11:55:15 CEST)

How to cite: Liu, P.; Zhang, G. Forest Fire Discrimination Based on Angle Slope Index and Himawari-8. Preprints 2024, 2024100009. https://doi.org/10.20944/preprints202410.0009.v1

Abstract

With forest fires expected to grow in frequency and intensity under a warming, drying climate, early detection and effective control are vital to reducing losses. Satellite remote sensing, with its broad observational coverage and strong repeat-observation capability, has become an important means of rapid forest fire monitoring, supplanting traditional methods such as manual measurement and aerial photography. To address the misjudgments caused by relying solely on changes in infrared-band brightness values and on single-band separation, this paper constructs three angle slope indices: ANIR, based on the red, near-infrared, and short-wave infrared bands; AMIR, based on the short-wave infrared, mid-infrared, and far-infrared bands; and AMNIR, based on the difference between AMIR and ANIR. These indices exploit the strong inter-band correlations and the reflectance characteristics of the visible and short-wave infrared bands to monitor smoke and fuel-biomass changes in forest fires simultaneously. To address the omissions of single-image fire discrimination algorithms, the paper constructs the time-series angle slope difference indices ∆ANIR and ∆AMIR. To address the variation in fire-point discrimination thresholds caused by background differences across time and space, the paper applies the decomposed three-dimensional OTSU (maximum inter-class variance) algorithm to compute segmentation thresholds for sub-regions constructed from the AMNIR data. The methods were validated with data from the high-temporal-resolution Himawari-8 meteorological satellite and ground-truth fire records from Hunan Province, China, for 2018-2019.
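The abstract does not give the exact formulas for ANIR or AMIR, so the following is only a generic illustration of an angle-at-vertex spectral index over three bands: the angle formed at the middle band between line segments to the two flanking bands in wavelength-reflectance space. The function name, band ordering, and use of degrees are illustrative assumptions, not the paper's definition.

```python
import math

def vertex_angle_index(wl, refl):
    """Generic angle-at-vertex index for three spectral bands.

    wl   : wavelengths (lam_a, lam_b, lam_c) of the three bands
    refl : reflectances (r_a, r_b, r_c); the angle is measured at the
           middle band B between the segments B->A and B->C.
    NOTE: illustrative only -- not the paper's ANIR/AMIR definition.
    """
    ax, ay = wl[0] - wl[1], refl[0] - refl[1]   # vector from B to A
    cx, cy = wl[2] - wl[1], refl[2] - refl[1]   # vector from B to C
    dot = ax * cx + ay * cy
    na = math.hypot(ax, ay)                     # |B->A|
    nc = math.hypot(cx, cy)                     # |B->C|
    cosang = max(-1.0, min(1.0, dot / (na * nc)))  # clamp rounding error
    return math.degrees(math.acos(cosang))

# A flat spectrum gives a straight line, i.e. a 180-degree vertex; a peak
# or trough at the middle band sharpens the angle below 180 degrees.
```

Under this formulation, a fire-induced change in the middle band's reflectance relative to its neighbors shows up directly as a change in the vertex angle.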
The results show that the discrimination method based on angle slope index thresholds (ASITR) achieves an accuracy of 0.88, a missed detection rate of 0.17, and an overall evaluation value of 0.85, i.e., 0.09 higher accuracy and 0.03 higher overall evaluation than the infrared-band brightness-value method proposed by Feng et al. [1] (accuracy 0.79, missed detection rate 0.14, overall evaluation value 0.82). The FAMN_OTSU_ASITR method (a fusion of the angle slope difference index (false) data, the decomposed three-dimensional OTSU adaptive threshold segmentation algorithm, and ASITR) achieves an accuracy of 0.85, a missed detection rate of 0.08, and an overall evaluation value of 0.88, i.e., a 0.09 lower missed detection rate and a 0.03 higher overall evaluation than ASITR. These results demonstrate that forest fire discrimination based on angle slope indices, Himawari-8 satellite data, and the decomposed OTSU adaptive threshold segmentation algorithm can improve discrimination accuracy and reduce missed detections.
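The paper's adaptive thresholding uses a decomposed three-dimensional OTSU, whose details are not given in this abstract. As a minimal sketch of the underlying principle only, the classic one-dimensional OTSU picks the histogram threshold that maximizes the between-class variance; the function below is an illustrative toy, not the decomposed 3-D variant applied to AMNIR sub-regions.

```python
def otsu_threshold(values, bins=256):
    """Classic 1-D OTSU: return the cut that maximizes between-class
    variance of a histogram. Illustrative sketch only -- the paper uses
    a decomposed three-dimensional OTSU on AMNIR sub-regions."""
    lo, hi = min(values), max(values)
    width = (hi - lo) / bins or 1.0          # guard against a flat input
    hist = [0] * bins
    for v in values:                          # build the histogram
        hist[min(int((v - lo) / width), bins - 1)] += 1
    total = len(values)
    sum_all = sum(i * h for i, h in enumerate(hist))
    best_t, best_var = 0, -1.0
    w0 = sum0 = 0
    for t in range(bins):                     # sweep candidate cuts
        w0 += hist[t]                         # class-0 weight
        if w0 == 0:
            continue
        w1 = total - w0                       # class-1 weight
        if w1 == 0:
            break
        sum0 += t * hist[t]
        mu0, mu1 = sum0 / w0, (sum_all - sum0) / w1   # class means
        var = w0 * w1 * (mu0 - mu1) ** 2      # between-class variance
        if var > best_var:
            best_var, best_t = var, t
    return lo + (best_t + 1) * width          # upper edge of best bin
```

For a clearly bimodal input such as a mix of background and fire-affected index values, the returned cut falls between the two modes, which is the behavior the adaptive sub-region segmentation relies on.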

Keywords

Forest fire discrimination; Angle slope index; Himawari-8; decomposed three-dimensional OTSU

Subject

Environmental and Earth Sciences, Remote Sensing
