Spectral-Spatial Deep Learning model for seaweed cultivation mapping using PlanetScope imagery in Pangkajene and Islands Regency

Authors

  • Marzuki, Master of Remote Sensing, Faculty of Geography, Universitas Gadjah Mada, Yogyakarta, Indonesia
  • Sanjiwana Arjasakusuma, Department of Geographic Information Science, Faculty of Geography, Universitas Gadjah Mada, Yogyakarta, Indonesia
  • Nurul Khakhim, Department of Geographic Information Science, Faculty of Geography, Universitas Gadjah Mada, Yogyakarta, Indonesia
  • Pramaditya Wicaksono, Department of Geographic Information Science, Faculty of Geography, Universitas Gadjah Mada, Yogyakarta, Indonesia
  • Nur Mohammad Farda, Department of Geographic Information Science, Faculty of Geography, Universitas Gadjah Mada, Yogyakarta, Indonesia
  • Nur Laila Eka Utami, Bachelor in Cartography and Remote Sensing Study Program, Faculty of Geography, Universitas Gadjah Mada, Yogyakarta, Indonesia

DOI:

https://doi.org/10.33175/mtr.2025.273926

Keywords:

Spectral-Spatial, Deep Learning, Seaweed Cultivation, Remote Sensing

Abstract

The efficient mapping of seaweed cultivation over large areas is essential for supporting the sustainable management of coastal resources. This study introduces a novel Spectral-Spatial Deep Learning model that integrates spectral and spatial information from high-resolution remote sensing imagery to automate and improve the accuracy of seaweed cultivation mapping. The model is based on a Convolutional Neural Network architecture, UNet, enhanced with a Spectral-Spatial Attention Module, which enables it to capture the complex relationships between seaweed and its environment. PlanetScope imagery, known for its high spectral and spatial resolution, serves as the primary input data. On the training data, the model achieved an accuracy of 94.71%, a loss of 13.09%, a precision of 80.93%, a recall of 73.63%, and an Intersection over Union (IoU) of 48.51%. On the validation data, it attained an accuracy of 93.64%, a loss of 16.75%, a precision of 84.34%, a recall of 57.57%, and an IoU of 42.98%. These results demonstrate the model's ability to map seaweed cultivation areas rapidly and accurately, making it a valuable tool for environmental monitoring.
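The abstract describes a UNet backbone augmented with a Spectral-Spatial Attention Module. The snippet below is a rough, illustrative sketch only, not the authors' published implementation: it shows one common way such a module re-weights feature maps first across bands ("spectral") and then across pixels ("spatial"). The class name, layer sizes, and reduction factor are assumptions made for the example.

```python
# Illustrative sketch of a spectral (channel) + spatial attention block of the kind
# often inserted into UNet encoder/decoder features. Not the paper's exact model.
import torch
import torch.nn as nn


class SpectralSpatialAttention(nn.Module):
    """Re-weights a feature map along its band/channel axis, then along space."""

    def __init__(self, channels: int, reduction: int = 8):
        super().__init__()
        # Channel ("spectral") attention: squeeze spatial dims, excite channels.
        self.channel_mlp = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, channels // reduction, kernel_size=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, kernel_size=1),
            nn.Sigmoid(),
        )
        # Spatial attention: collapse channels, learn a per-pixel weight map.
        self.spatial_conv = nn.Sequential(
            nn.Conv2d(2, 1, kernel_size=7, padding=3),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = x * self.channel_mlp(x)                      # spectral re-weighting
        avg_map = x.mean(dim=1, keepdim=True)            # per-pixel channel mean
        max_map = x.amax(dim=1, keepdim=True)            # per-pixel channel max
        x = x * self.spatial_conv(torch.cat([avg_map, max_map], dim=1))
        return x


# Example: attention over a batch of feature patches derived from 4-band imagery.
features = torch.randn(2, 64, 128, 128)   # (batch, channels, height, width)
out = SpectralSpatialAttention(64)(features)
print(out.shape)  # torch.Size([2, 64, 128, 128])
```

In a UNet, a block like this would typically be applied to skip-connection or decoder features before concatenation, so that the attention weights emphasize bands and pixels characteristic of seaweed plots.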

Highlights

  • Efficient mapping of seaweed cultivation over large areas is crucial for sustainable coastal resource management.
  • Remote sensing technology, integrated with Artificial Intelligence (AI) methods like Deep Learning, enhances the efficiency of seaweed cultivation mapping.
  • The integration of remote sensing’s spatial, spectral, and temporal advantages with Deep Learning-based image segmentation forms the foundation for automating seaweed cultivation mapping.
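The abstract reports per-pixel accuracy, loss, precision, recall, and IoU on the training and validation data. As a generic illustration only (not the authors' evaluation code), the sketch below shows how these mask-level metrics are commonly computed for a binary seaweed/non-seaweed prediction; the probability threshold and array shapes are assumptions for the example.

```python
# Illustrative sketch: per-pixel accuracy, precision, recall, and IoU for a binary
# segmentation mask. Threshold and patch size are assumptions, not from the paper.
import numpy as np


def segmentation_metrics(pred_prob: np.ndarray, truth: np.ndarray, threshold: float = 0.5):
    """Return accuracy, precision, recall, and IoU for one predicted mask."""
    pred = (pred_prob >= threshold).astype(bool)
    truth = truth.astype(bool)

    tp = np.logical_and(pred, truth).sum()    # seaweed predicted and present
    tn = np.logical_and(~pred, ~truth).sum()  # background correctly rejected
    fp = np.logical_and(pred, ~truth).sum()   # false seaweed pixels
    fn = np.logical_and(~pred, truth).sum()   # missed seaweed pixels

    eps = 1e-9  # avoid division by zero on empty masks
    return {
        "accuracy": (tp + tn) / (tp + tn + fp + fn + eps),
        "precision": tp / (tp + fp + eps),
        "recall": tp / (tp + fn + eps),
        "iou": tp / (tp + fp + fn + eps),
    }


# Example with random data standing in for a 256 x 256 prediction patch.
rng = np.random.default_rng(0)
scores = segmentation_metrics(rng.random((256, 256)), rng.random((256, 256)) > 0.7)
print(scores)
```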

References

Alam, M., Wang, J. F., Guangpei, C., Yunrong, L., & Chen, Y. (2021). Convolutional neural network for the semantic segmentation of remote sensing images. Mobile Networks and Applications, 26(1), 200-215. https://doi.org/10.1007/s11036-020-01703-3

Andréfouët, S., Dewantama, I. M. I., & Ampou, E. E. (2021). Seaweed farming collapse and fast changing socio-ecosystems exacerbated by tourism and natural hazards in Indonesia: A view from space and from the households of Nusa Lembongan Island. Ocean and Coastal Management, 207, 105586. https://doi.org/10.1016/j.ocecoaman.2021.105586

Bajpai, K., & Soni, R. (2017). Analysis of image enhancement techniques used in remote sensing satellite imagery. International Journal of Computer Applications, 169(10), 1-11. https://doi.org/10.5120/ijca2017914884

BPS Provinsi Sulawesi Selatan. (2021). Hasil Survei Komoditas Perikanan Potensi Rumput Laut Provinsi Sulawesi Selatan 2021 [Results of the 2021 fisheries commodity survey on seaweed potential of South Sulawesi Province]. Badan Pusat Statistik (BPS).

Cheng, J., Jia, N., Chen, R., Guo, X., Ge, J., & Zhou, F. (2022). High-resolution mapping of seaweed aquaculture along the Jiangsu Coast of China using Google Earth Engine (2016-2022). Remote Sensing, 14(24), 6202. https://doi.org/10.3390/rs14246202

Colliot, O. (2023). Machine learning for brain disorders. Humana Press. https://doi.org/10.1007/978-1-0716-3195-9

Dang, L., Weng, L., Dong, W., Li, S., & Hou, Y. (2022). Spectral-spatial attention transformer with dense connection for hyperspectral image classification. Computational Intelligence and Neuroscience, 2022, 7071485. https://doi.org/10.1155/2022/7071485

Dora, J. T., Lindberg, S. K., James, P., & Wang, X. (2024). Assessing the potential of fluorescence as a monitoring tool for reproductive tissue in selected macroalgae species. Journal of Applied Phycology, 36, 2153-2159. https://doi.org/10.1007/s10811-024-03211-3

Jin, R., Ye, Z., Chen, S., Gu, J., He, J., Huang, L., Christakos, G., Agusti, S., Duarte, C. M., & Wu, J. (2023). Accurate mapping of seaweed farms with high-resolution imagery in China. Geocarto International, 38(1), 2203114. https://doi.org/10.1080/10106049.2023.2203114

Karimi, D., Dou, H., Warfield, S. K., & Gholipour, A. (2020). Deep learning with noisy labels: Exploring techniques and remedies in medical image analysis. Medical Image Analysis, 65, 101759. https://doi.org/10.1016/j.media.2020.101759

Kington, J., & Collison, A. (2022). Scene level normalization and harmonization of Planet Dove imagery. San Francisco, USA: Planet Labs.

Kirillov, A., Mintun, E., Ravi, N., Mao, H., Rolland, C., Gustafson, L., Xiao, T., Whitehead, S., Berg, A. C., Lo, W. Y., Dollár, P., & Girshick, R. (2023). Segment anything. https://doi.org/10.48550/arXiv.2304.02643

Langford, A., Waldron, S., Sulfahri, & Saleh, H. (2021). Monitoring the COVID-19-affected Indonesian seaweed industry using remote sensing data. Marine Policy, 127, 104431. https://doi.org/10.1016/j.marpol.2021.104431

LeCun, Y., Bengio, Y., & Hinton, G. (2015). Deep learning. Nature, 521(7553), 436-444. https://doi.org/10.1038/nature14539

Li, C., Liu, Y., Yin, H., Li, Y., Guo, Q., Zhang, L., & Du, P. (2021). Attention residual U-Net for building segmentation in aerial images. In Proceedings of the 2021 IEEE International Geoscience and Remote Sensing Symposium (pp. 4047-4050). Brussels, Belgium. https://doi.org/10.1109/IGARSS47720.2021.9554058

Liu, J., Lu, Y., Guo, X., & Ke, W. (2023). A deep learning method for offshore raft aquaculture extraction based on medium resolution remote sensing images. IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, 16, 6296-6309. https://doi.org/10.1109/JSTARS.2023.3291499

Ma, L., Liu, Y., Zhang, X., Ye, Y., Yin, G., & Johnson, B. A. (2019). Deep learning in remote sensing applications: A meta-analysis and review. ISPRS Journal of Photogrammetry and Remote Sensing, 152, 166-177. https://doi.org/10.1016/j.isprsjprs.2019.04.015

Marquez, L., Fragkopoulou, E., Cavanaugh, K. C., Houskeeper, H. F., & Assis, J. (2022). Artificial intelligence convolutional neural networks map giant kelp forests from satellite imagery. Scientific Reports, 12(1), 22196. https://doi.org/10.1038/s41598-022-26439-w

Mei, X., Pan, E., Ma, Y., Dai, X., Huang, J., Fan, F., Du, Q., Zheng, H., & Ma, J. (2019). Spectral-spatial attention networks for hyperspectral image classification. Remote Sensing, 11(8), 963. https://doi.org/10.3390/rs11080963

Nurdin, N., Alevizos, E., Syamsuddin, R., Asis, H., Zainuddin, E. N., Aris, A., Oiry, S., Brunier, G., Komatsu, T., & Barillé, L. (2023). Precision aquaculture drone mapping of the spatial distribution of Kappaphycus alvarezii biomass and carrageenan. Remote Sensing, 15(14), 3674. https://doi.org/10.3390/rs15143674

Osco, L. P., Wu, Q., de Lemos, E. L., Gonçalves, W. N., Ramos, A. P. M., Li, J., & Junior, J. M. (2023). The segment anything model (SAM) for remote sensing applications: From zero to one shot. https://doi.org/10.48550/arXiv.2306.16623

Planet. (2023). Planet imagery product specifications. San Francisco, USA: Planet Labs.

Pratama, I., & Albasri, H. (2021). Mapping and estimating harvest potential of seaweed culture using Worldview-2 Satellite images: A case study in Nusa Lembongan, Bali - Indonesia. Aquatic Living Resources, 34, 15. https://doi.org/10.1051/alr/2021015

Ronneberger, O., Fischer, P., & Brox, T. (2015). U-net: Convolutional networks for biomedical image segmentation. Lecture Notes in Computer Science, 9351, 234-241. https://doi.org/10.1007/978-3-319-24574-4_28

Takahashi, R., Matsubara, T., & Uehara, K. (2020). Data augmentation using random image cropping and patching for deep CNNs. IEEE Transactions on Circuits and Systems for Video Technology, 30(9), 2917-2931. https://doi.org/10.1109/TCSVT.2019.2935128

Wang, X., Jing, S., Dai, H., & Shi, A. (2023). High-resolution remote sensing images semantic segmentation using improved UNet and SegNet. Computers and Electrical Engineering, 108, 108734. https://doi.org/10.1016/j.compeleceng.2023.108734

Wang, X., Wang, X., Zhao, K., Zhao, X., & Song, C. (2022). FSL-Unet: Full-scale linked Unet with spatial-spectral joint perceptual attention for hyperspectral and multispectral image fusion. IEEE Transactions on Geoscience and Remote Sensing, 60, 5539114. https://doi.org/10.1109/TGRS.2022.3208125

Wu, Q., & Osco, L. P. (2023). Samgeo: A Python package for segmenting geospatial data with the segment anything model (SAM). Journal of Open Source Software, 8(89), 5663. https://doi.org/10.21105/joss.05663

Yan, S., Xu, L., Yu, G., Yang, L., Yun, W., Zhu, D., Ye, S., & Yao, X. (2021). Glacier classification from Sentinel-2 imagery using spatial-spectral attention convolutional model. International Journal of Applied Earth Observation and Geoinformation, 102, 102445. https://doi.org/10.1016/j.jag.2021.102445

Zhao, Z., Chen, Y., Li, K., Ji, W., & Sun, H. (2024). Extracting photovoltaic panels from heterogeneous remote sensing images with spatial and spectral differences. IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, 17, 5553-5564. https://doi.org/10.1109/JSTARS.2024.3369660

Zhu, M., Jiao, L., Liu, F., Yang, S., & Wang, J. (2021). Residual spectral-spatial attention network for hyperspectral image classification. IEEE Transactions on Geoscience and Remote Sensing, 59(1), 449-462. https://doi.org/10.1109/TGRS.2020.2994057

Published

2024-11-23