Comparative Assessment of Different Architectures of Convolutional Neural Network for Semantic Segmentation of Forest Disturbances from Multi-Temporal Satellite Images
- Authors: Podoprigorova N.S.¹, Tarasov A.V.², Shikhov A.N.², Kanev A.I.¹
Affiliations:
- ¹ Bauman Moscow State Technical University
- ² Perm State University
- Issue: No. 3 (2024)
- Pages: 3–15
- Section: Methods and Means of Processing and Interpretation of Space Information
- URL: https://ogarev-online.ru/0205-9614/article/view/270599
- DOI: https://doi.org/10.31857/S0205961424030013
- EDN: https://elibrary.ru/FBIUOM
- ID: 270599
Abstract
Algorithms based on convolutional neural networks are the most effective for semantic segmentation of images, including the segmentation of forest cover disturbances in satellite imagery. In this study, we assess the applicability of several modifications of the U-Net convolutional neural network architecture for recognizing logged, burnt and windthrow areas in forests from multi-temporal and multi-seasonal Sentinel-2 satellite images. The assessment was carried out on three test sites that differ substantially in forest stand characteristics and forest management practices. The highest accuracy (average F-measure of 0.59) was obtained with the baseline U-Net model, whereas the models that performed best during training (Attention U-Net and MobileNetV2 U-Net) did not improve segmentation on independent data. The resulting accuracy estimates are close to those previously published for forests with a substantial proportion of selectively logged areas. The characteristics of logged areas and windthrows, namely their area and type, are the main factors determining the accuracy of semantic segmentation. Substantial differences were also revealed between images taken in different seasons: the highest segmentation accuracy was achieved with winter image pairs, whereas summer and cross-season pairs substantially underestimate the area of forest disturbances. Forest species composition has a less pronounced effect, although for two of the three test sites the accuracy was highest in dark coniferous forests and lowest in deciduous forests. The slope illumination factor calculated from a digital elevation model had no statistically significant effect on segmentation accuracy for winter image pairs. The segmentation accuracy for burnt areas, assessed on 14 large forest fires of 2021–2022, is unsatisfactory, which is probably due to the varying degree of damage to the forest cover within the burnt areas.
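The headline accuracy is an average per-class F-measure obtained by comparing predicted segmentation masks with reference masks. The sketch below is purely illustrative and is not the authors' evaluation code: it assumes integer-labelled NumPy masks and hypothetical class codes for logged, windthrow and burnt areas, and shows how such per-class and average F-measures could be computed.

```python
# Minimal sketch of a per-class F-measure for segmentation masks.
# Class codes (0 = intact forest, 1 = logged, 2 = windthrow, 3 = burnt)
# and array sizes are illustrative assumptions, not taken from the paper.
import numpy as np

def f_measure(pred: np.ndarray, truth: np.ndarray, class_id: int) -> float:
    """F-measure (F1) of one class for a labelled segmentation mask."""
    pred_c = (pred == class_id)
    truth_c = (truth == class_id)
    tp = np.logical_and(pred_c, truth_c).sum()
    fp = np.logical_and(pred_c, ~truth_c).sum()
    fn = np.logical_and(~pred_c, truth_c).sum()
    if tp == 0:
        return 0.0
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    truth = rng.integers(0, 4, size=(512, 512))
    pred = truth.copy()
    # Corrupt 20 % of the prediction to mimic segmentation errors.
    noise = rng.random(truth.shape) < 0.2
    pred[noise] = rng.integers(0, 4, size=noise.sum())
    scores = {c: f_measure(pred, truth, c) for c in (1, 2, 3)}
    print("Per-class F-measure:", scores)
    print("Average F-measure:", float(np.mean(list(scores.values()))))
```

In the study itself the scores are reported by test site, disturbance type, season of the image pair and forest type; the random masks here only exercise the metric.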
About the authors
N. Podoprigorova
Bauman Moscow State Technical University
Email: and3131@inbox.ru
Russia, Moscow
A. Tarasov
Perm State University
Email: and3131@inbox.ru
Russia, Perm
A. Shikhov
Perm State University
Corresponding author
Email: and3131@inbox.ru
Russia, Perm
A. Kanev
Bauman Moscow State Technical University
Email: and3131@inbox.ru
Russia, Moscow