Manuscript received October 22, 2023; revised December 18, 2023; accepted January 15, 2024.
Abstract—This paper presents the SARSNet architecture, developed to address the growing challenges of deep learning-based automatic water body extraction from Synthetic Aperture Radar (SAR) imagery. The task is riddled with significant challenges, including cloud interference, the scarcity of annotated datasets, and the intricacies of varied topography. Recent strides in Convolutional Neural Networks (CNNs) and multispectral segmentation techniques offer a promising avenue for addressing these problems. In this research, we propose a series of solutions to improve water body segmentation. These solutions span several domains, including image resolution enhancement, refined extraction techniques tailored to narrow water bodies, pixel-level class self-balancing, and a minority-class-weighted loss function, all aimed at improving prediction precision and reducing the computational complexity inherent in deep neural networks. Our framework introduces a multichannel Data-Fusion Register, a CNN-based Patch Adaptive Network augmentation method, pixel-level class balancing, and the Tversky loss function. We evaluated the model on the Sentinel-1 SAR dataset from the Earth flood water body extraction competition organized by the artificial intelligence department of Microsoft. We compared the proposed SARSNet with well-known semantic segmentation models, and a comprehensive assessment demonstrates that SARSNet consistently outperforms them on all data subsets, including the training, validation, and test sets.
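For readers unfamiliar with the Tversky loss mentioned in the abstract, the sketch below shows one common formulation for binary water / non-water segmentation. It is a minimal illustration only: the PyTorch framing, the function name, and the alpha/beta values are assumptions for exposition, not the settings used in the paper.

# Minimal sketch of a Tversky loss for binary water / non-water segmentation.
# Weighting false negatives more heavily (beta > alpha) penalizes missed
# minority-class (water) pixels; values here are illustrative, not the paper's.
import torch

def tversky_loss(pred, target, alpha=0.3, beta=0.7, eps=1e-6):
    """pred: sigmoid probabilities, target: binary mask, both shaped (N, H, W)."""
    pred = pred.reshape(pred.shape[0], -1)
    target = target.reshape(target.shape[0], -1)
    tp = (pred * target).sum(dim=1)            # true positives
    fp = (pred * (1.0 - target)).sum(dim=1)    # false positives
    fn = ((1.0 - pred) * target).sum(dim=1)    # false negatives (missed water)
    tversky_index = (tp + eps) / (tp + alpha * fp + beta * fn + eps)
    return (1.0 - tversky_index).mean()

Minimizing one minus the Tversky index generalizes the Dice loss (the special case alpha = beta = 0.5) and is one way to bias training toward the under-represented water class.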