TY - JOUR
T1 - Deep learning-based intelligent post-bushfire detection using UAVs
AU - Qadir, Zakria
AU - Le, Khoa
AU - Bao, V. N. Q.
AU - Tam, Vivian W. Y.
PY - 2024
N2 - Bushfires across remote areas can easily spread regionally due to extreme weather conditions. Unmanned aerial vehicles (UAVs) can play a significant role in postdisaster assessment as they have low-cost and flexible deployment characteristics. Detecting bushfires in their initial phase is important to save the inhabitants, infrastructure, and ecosystem. In the context of smart cities, we have applied state-of-the-art deep learning (DL) algorithms to detect and differentiate between fire and nonfire regions. However, certain factors, such as reachability to forest fires and timely detection of the region of interest (ROI), are quite challenging. Therefore, we incorporate UAVs to capture real-time images and feed these images into our proposed YOLOv5-s (you only look once-small) model, which achieves fast and accurate detection of the affected region. We have proposed a lightweight single-stage network with an improved bottleneck cross-stage partial (CSP) module and pyramid attention network (PAN) layers to enhance precise feature extraction and reduce computation time in fire detection. Notably, the HardSwish activation function outperformed ReLU in a specific fire detection scenario. Based on the provided dataset, simulation results demonstrate that the optimized model can effectively detect and differentiate between fire and nonfire regions, which may be challenging to discern with the naked eye. The results indicate that the proposed model surpasses existing models, achieving an accuracy of 97.4%, a low false-positive rate of 1.258%, and a nonmaximum suppression (NMS) time of 3 ms. Our model can provide real-time applications for fire and rescue relief teams.
UR - https://hdl.handle.net/1959.7/uws:76715
DO - 10.1109/LGRS.2023.3329509
M3 - Article
SN - 1545-598X
VL - 21
JO - IEEE Geoscience and Remote Sensing Letters
JF - IEEE Geoscience and Remote Sensing Letters
M1 - 5000605
ER -