Natural disasters have a major impact on the economy, the environment, and people's quality of life. Events such as earthquakes, tsunamis, floods, cyclones, and forest fires can cause serious damage to countries and communities: they destroy buildings, spread disease, and disrupt natural ecosystems. Earthquakes, for example, can cause buildings to collapse under strong ground motion. Such disasters not only damage homes and critical infrastructure but can also cause long-term changes to the environment.
To reduce the damage caused by natural disasters, many researchers apply deep learning techniques to detect and study them. However, identifying disasters from images is difficult because the images are complex and the classes are imbalanced. To address this problem, we developed a multilayered deep convolutional neural network (CNN) that classifies different types of natural disasters and indicates the severity of the damage. The performance of this model is compared with that of pretrained models to improve disaster detection.
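To make the idea of a multilayered CNN classifier concrete, the sketch below builds one convolution–ReLU–pooling stage followed by a softmax classifier from scratch in NumPy. The four class names, the 3×3 kernel size, the 16×16 input, and the random (untrained) weights are all illustrative assumptions for demonstration; they are not the architecture or parameters of the paper's model.

```python
import numpy as np

def conv2d(x, kernels):
    """Valid cross-correlation of a single-channel image with a bank of
    square kernels; returns an array of shape (n_kernels, H', W')."""
    n_k, kh, kw = kernels.shape
    H, W = x.shape
    out = np.zeros((n_k, H - kh + 1, W - kw + 1))
    for k in range(n_k):
        for i in range(H - kh + 1):
            for j in range(W - kw + 1):
                out[k, i, j] = np.sum(x[i:i + kh, j:j + kw] * kernels[k])
    return out

def relu(x):
    return np.maximum(0, x)

def max_pool(x, size=2):
    """Non-overlapping max pooling over each feature map."""
    n, H, W = x.shape
    H2, W2 = H // size, W // size
    return x[:, :H2 * size, :W2 * size].reshape(n, H2, size, W2, size).max(axis=(2, 4))

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

# Assumed disaster categories, matching the paper's four-class setting.
CLASSES = ["earthquake", "flood", "cyclone", "wildfire"]

def classify(image, kernels, w, b):
    """Forward pass: conv -> ReLU -> max-pool -> flatten -> dense -> softmax."""
    feat = max_pool(relu(conv2d(image, kernels)))
    return softmax(feat.reshape(-1) @ w + b)

rng = np.random.default_rng(0)
img = rng.random((16, 16))                 # toy grayscale "disaster image"
kernels = rng.standard_normal((4, 3, 3))   # 4 learned filters (random here)
w = rng.standard_normal((4 * 7 * 7, 4))    # dense weights: 4 maps of 7x7 -> 4 classes
b = np.zeros(4)
probs = classify(img, kernels, w, b)       # one probability per class
```

A real system would learn the kernels and dense weights by backpropagation over labeled disaster images; this forward pass only shows how the layers compose.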
The proposed system successfully integrates deep learning with a web-based platform to classify natural disasters and estimate their severity levels. By employing a tailored CNN architecture, the model achieved high accuracy (95%) and outperformed pretrained models such as VGG16, ResNet50, and MobileNet. The introduction of a color-coded severity classification (Green, Yellow, Red) provides an intuitive and practical representation of disaster intensity, enabling faster decision-making for emergency responders. Furthermore, the integration of the CNN model with user and admin modules ensures real-time usability, efficient communication, and reliable record management through the SQL database. Overall, the system bridges the gap between AI-based disaster detection and practical disaster management, offering a scalable solution for preparedness and response.
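The color-coded severity output described above can be sketched as a simple threshold mapping from a model-derived damage score to the Green/Yellow/Red labels. The score range [0, 1] and the 0.33/0.66 cut points are illustrative assumptions, not the thresholds used by the published system.

```python
def severity_color(damage_score: float) -> str:
    """Map a damage score in [0, 1] to a color-coded severity level.
    Thresholds below are assumed for illustration only."""
    if damage_score < 0.33:
        return "Green"   # low severity: minor or localized damage
    if damage_score < 0.66:
        return "Yellow"  # moderate severity: significant but contained damage
    return "Red"         # high severity: widespread or critical damage
```

In the deployed system such a label could be stored alongside the classification result and shown to emergency responders in the user module.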
Despite its promising results, the system has certain limitations. The dataset, while diverse, is restricted to four disaster categories, and performance may vary when applied to unseen or highly complex disaster scenarios. Additionally, severity estimation relies primarily on visual features, which may not capture the full extent of damage in certain cases.
How to Cite This Paper
Sradha Sreekumar, M. Subalakshitha, S. Yazhini (2026). Automated Intensity Classification of Natural Disaster with AI Technology. International Journal of Computer Techniques, 13(2). ISSN: 2394-2231.