TY - GEN
T1 - Wildland Fire Detection and Monitoring using a Drone-collected RGB/IR Image Dataset
AU - Chen, Xiwen
AU - Hopkins, Bryce
AU - Wang, Hao
AU - O'Neill, Leo
AU - Afghah, Fatemeh
AU - Razi, Abolfazl
AU - Fule, Peter
AU - Coen, Janice
AU - Rowell, Eric
AU - Watts, Adam
N1 - Publisher Copyright:
© 2022 IEEE.
PY - 2022
Y1 - 2022
N2 - Drone-based Unmanned Aerial Systems (UAS) provide an efficient means for early detection and monitoring of remote wildland fires due to their rapid deployment, low flight altitudes, high 3D maneuverability, and ever-expanding sensor capabilities. Recent sensor advancements have made side-by-side RGB/IR sensing feasible for UASs. The aggregation of optical and thermal images enables robust environmental observation, as the thermal feed provides information that would otherwise be obscured in a purely RGB setup, effectively "seeing through" thick smoke and tree occlusion. In this work, we present Fire detection and modeling: Aerial Multi-spectral image dataset (FLAME 2) [1], the first labeled collection of UAS-collected side-by-side RGB/IR aerial imagery of prescribed burns. Using FLAME 2, we then present two image-processing methodologies with Multi-modal Learning on our new dataset: (1) Deep Learning (DL)-based benchmarks for detecting fire and smoke frames with Transfer Learning and Feature Fusion, and (2) an exemplary image-processing system cascaded with the DL-based classifier to perform fire localization. We show these two techniques achieve notable gains over either single-domain video inputs or training models from scratch in the fire detection task.
AB - Drone-based Unmanned Aerial Systems (UAS) provide an efficient means for early detection and monitoring of remote wildland fires due to their rapid deployment, low flight altitudes, high 3D maneuverability, and ever-expanding sensor capabilities. Recent sensor advancements have made side-by-side RGB/IR sensing feasible for UASs. The aggregation of optical and thermal images enables robust environmental observation, as the thermal feed provides information that would otherwise be obscured in a purely RGB setup, effectively "seeing through" thick smoke and tree occlusion. In this work, we present Fire detection and modeling: Aerial Multi-spectral image dataset (FLAME 2) [1], the first labeled collection of UAS-collected side-by-side RGB/IR aerial imagery of prescribed burns. Using FLAME 2, we then present two image-processing methodologies with Multi-modal Learning on our new dataset: (1) Deep Learning (DL)-based benchmarks for detecting fire and smoke frames with Transfer Learning and Feature Fusion, and (2) an exemplary image-processing system cascaded with the DL-based classifier to perform fire localization. We show these two techniques achieve notable gains over either single-domain video inputs or training models from scratch in the fire detection task.
UR - http://www.scopus.com/inward/record.url?scp=85153714939&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85153714939&partnerID=8YFLogxK
U2 - 10.1109/AIPR57179.2022.10092208
DO - 10.1109/AIPR57179.2022.10092208
M3 - Conference contribution
AN - SCOPUS:85153714939
T3 - Proceedings - Applied Imagery Pattern Recognition Workshop
BT - 2022 IEEE Applied Imagery Pattern Recognition Workshop, AIPR 2022
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2022 IEEE Applied Imagery Pattern Recognition Workshop, AIPR 2022
Y2 - 11 October 2022 through 13 October 2022
ER -