Evaluating YOLOv4’s Performance in Real-Time Flood Object Detection
DOI: https://doi.org/10.62647/IJITCE2025V13I3PP235-240
Keywords: Flood detection, YOLOv4, real-time object detection, disaster management, search and rescue
Abstract
Floods pose a serious threat to human life and infrastructure, making efficient, real-time flood detection systems essential for effective disaster response. Traditional flood monitoring methods, such as satellite imaging and sensor-based detection, suffer from long delays and high costs and are poorly suited to real-time operation. This study investigates how YOLOv4, a state-of-the-art deep learning object detection model, can be used to detect flood environments in real time and to identify humans during search and rescue (SAR) operations. The methodology trains and tests YOLOv4 on flood-related image datasets, relying on its CSPDarknet-53 backbone and PANet neck for feature extraction and aggregation. Model performance was evaluated using precision, recall, mean average precision (mAP), and frames per second (FPS). Experimental results show that YOLOv4 reaches a mAP of 79.46% at an inference speed of 13.26 FPS, which is suitable for real-time UAV-assisted flood detection. However, environmental challenges such as low visibility, water reflections, and occlusions reduce detection accuracy. The results indicate that YOLOv4 can help speed up and automate flood monitoring in disaster response applications. Future enhancements, such as incorporating segmentation models, thermal imaging, and multimodal sensor fusion, would further improve detection accuracy and operational efficiency, making YOLOv4 a promising real-time tool for flood disaster management and emergency response.
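
As an illustration of the evaluation setup summarized above, the sketch below shows how YOLOv4 detections and inference speed (FPS) might be measured with OpenCV's DNN module. This is a minimal sketch under stated assumptions, not the authors' implementation: the file names yolov4.cfg, yolov4.weights, flood.names, and flood_scene.jpg, the 416x416 input size, and the confidence/NMS thresholds are illustrative choices only.

```python
# Minimal sketch (not the authors' code): YOLOv4 inference and FPS measurement
# with OpenCV's DNN module, assuming Darknet-format config/weights and a
# class-names file trained for flood/person detection are available locally.
import time
import cv2
import numpy as np

# Load the YOLOv4 network from Darknet files (assumed file names).
net = cv2.dnn.readNetFromDarknet("yolov4.cfg", "yolov4.weights")
model = cv2.dnn_DetectionModel(net)
model.setInputParams(size=(416, 416), scale=1 / 255.0, swapRB=True)

class_names = open("flood.names").read().strip().splitlines()
image = cv2.imread("flood_scene.jpg")  # hypothetical test image

# Warm-up run, then time repeated inferences to estimate FPS.
model.detect(image, confThreshold=0.25, nmsThreshold=0.45)
n_runs = 20
start = time.time()
for _ in range(n_runs):
    classes, scores, boxes = model.detect(image, confThreshold=0.25, nmsThreshold=0.45)
fps = n_runs / (time.time() - start)

# Report detected classes, confidence scores, and bounding boxes.
for cls_id, score, box in zip(np.array(classes).flatten(),
                              np.array(scores).flatten(), boxes):
    print(f"{class_names[int(cls_id)]}: {float(score):.2f} at {box}")
print(f"Approximate inference speed: {fps:.2f} FPS")
```

In practice, the reported 13.26 FPS would depend on hardware, input resolution, and batch settings, so timing over many frames (as above) gives a more stable estimate than a single inference.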
License
Copyright (c) 2025 Authors

This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.