Semi-Supervised Urban Change Detection Using Multi-Modal Sentinel-1 SAR and Sentinel-2 MSI Data
Abstract
1. Introduction
2. Methods
2.1. Problem Formulation
2.2. Dataset Preparation
2.3. Proposed Method
2.3.1. Network Architecture
2.3.2. Training Process
2.4. Experimental Setup
2.4.1. Comparison Experiments
- U-Net early fusion [17], a classical U-Net that takes the bitemporal image pair concatenated along the channel axis as input (i.e., early fusion).
- Siam-diff [17], which uses two U-Net encoders with shared weights to extract features from the images separately. The extracted bitemporal feature pair is subtracted and subsequently fed to a U-Net decoder via skip connections.
- Siamese SSL [44], which also uses the dual-task Siam-diff network but adds an unsupervised loss that enforces consistency between the output of the change decoder and a change prediction derived from the bitemporal building predictions of the semantic decoder.
- SemiCD [43], which employs an encoder with shared weights to extract features from the bitemporal image pair. Consistency is then enforced between the change prediction obtained by decoding the subtracted features and a second prediction obtained by decoding a perturbed version of the subtracted features with a separate decoder. While the original paper used several different perturbations, we only considered random feature noise, since the ablation study in [43] showed that adding further perturbations had little effect on model performance.
- Dual-stream U-Net [33], which processes the S1 and S2 image pairs in separate early fusion U-Nets before fusing the extracted change features at the decision level.
- Multi-modal Siam-diff [32], a multi-modal version of the Siam-diff network consisting of two Siamese encoders that separately extract features from the S1 and S2 image pairs. A single decoder detects changes from the concatenated multi-modal features (a minimal code sketch contrasting these fusion strategies follows this list).
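To make the architectural differences among these baselines concrete, the following PyTorch sketch contrasts the three fusion strategies: channel-wise early fusion, weight-shared Siamese feature differencing, and multi-modal feature-level fusion of the S1 and S2 difference features. It is a minimal illustration under stated assumptions, not the authors' implementation; single-scale convolution blocks stand in for the full U-Net encoders and decoders, and the module names and channel counts (e.g., `conv_block`, `base_ch`, two S1 and ten S2 bands) are illustrative assumptions.

```python
import torch
import torch.nn as nn


def conv_block(in_ch: int, out_ch: int) -> nn.Sequential:
    """Two 3x3 convolutions with ReLU, standing in for a U-Net encoder stage."""
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1), nn.ReLU(inplace=True),
        nn.Conv2d(out_ch, out_ch, kernel_size=3, padding=1), nn.ReLU(inplace=True),
    )


class EarlyFusionCD(nn.Module):
    """U-Net early fusion: the bitemporal pair is concatenated along the
    channel axis and processed by a single encoder-decoder."""

    def __init__(self, in_ch: int, base_ch: int = 16):
        super().__init__()
        self.encoder = conv_block(2 * in_ch, base_ch)
        self.head = nn.Conv2d(base_ch, 1, kernel_size=1)  # per-pixel change score

    def forward(self, x_t1: torch.Tensor, x_t2: torch.Tensor) -> torch.Tensor:
        fused = torch.cat([x_t1, x_t2], dim=1)
        return torch.sigmoid(self.head(self.encoder(fused)))


class SiamDiffCD(nn.Module):
    """Siam-diff: a weight-shared encoder processes each date separately and
    the absolute feature difference is decoded into a change map."""

    def __init__(self, in_ch: int, base_ch: int = 16):
        super().__init__()
        self.encoder = conv_block(in_ch, base_ch)  # shared weights for both dates
        self.head = nn.Conv2d(base_ch, 1, kernel_size=1)

    def forward(self, x_t1: torch.Tensor, x_t2: torch.Tensor) -> torch.Tensor:
        diff = torch.abs(self.encoder(x_t2) - self.encoder(x_t1))
        return torch.sigmoid(self.head(diff))


class MultiModalSiamDiffCD(nn.Module):
    """Multi-modal Siam-diff: separate Siamese encoders for the S1 and S2
    pairs; the two difference features are concatenated before decoding."""

    def __init__(self, s1_ch: int = 2, s2_ch: int = 10, base_ch: int = 16):
        super().__init__()
        self.enc_s1 = conv_block(s1_ch, base_ch)
        self.enc_s2 = conv_block(s2_ch, base_ch)
        self.head = nn.Conv2d(2 * base_ch, 1, kernel_size=1)

    def forward(self, s1_t1, s1_t2, s2_t1, s2_t2):
        d_s1 = torch.abs(self.enc_s1(s1_t2) - self.enc_s1(s1_t1))
        d_s2 = torch.abs(self.enc_s2(s2_t2) - self.enc_s2(s2_t1))
        return torch.sigmoid(self.head(torch.cat([d_s1, d_s2], dim=1)))
```

For instance, `MultiModalSiamDiffCD()(s1_t1, s1_t2, s2_t1, s2_t2)` yields a per-pixel change probability map. The semi-supervised baselines (Siamese SSL, SemiCD) additionally apply an unsupervised consistency loss on unlabeled pairs, for example a mean-squared error between a clean change prediction and one decoded from noise-perturbed difference features.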
2.4.2. Training Setup
2.4.3. Accuracy Metrics
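The results tables report the F1 score and intersection over union (IoU). Assuming the standard pixel-wise definitions in terms of true positives (TP), false positives (FP), and false negatives (FN) of the positive class (an assumption, since the metric definitions are not reproduced in this extract), the reported scores correspond to:

```latex
\mathrm{Precision} = \frac{TP}{TP + FP}, \qquad
\mathrm{Recall} = \frac{TP}{TP + FN},

F_1 = \frac{2 \cdot \mathrm{Precision} \cdot \mathrm{Recall}}{\mathrm{Precision} + \mathrm{Recall}}, \qquad
\mathrm{IoU} = \frac{TP}{TP + FP + FN}.
```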
3. Results
3.1. Change Detection Results
3.2. Semantic Segmentation Results
3.3. Ablation Study
4. Discussion
4.1. Fusion of SAR and Optical Data
4.2. Multi-Modal Consistency Regularization
4.3. Limitations and Perspective
5. Conclusions
Author Contributions
Funding
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
- Ban, Y.; Yousif, O. Change detection techniques: A review. Multitemporal Remote Sens. 2016, 19–43.
- Paolini, L.; Grings, F.; Sobrino, J.A.; Jiménez Muñoz, J.C.; Karszenbaum, H. Radiometric correction effects in Landsat multi-date/multi-sensor change detection studies. Int. J. Remote Sens. 2006, 27, 685–704.
- Dekker, R. Speckle filtering in satellite SAR change detection imagery. Int. J. Remote Sens. 1998, 19, 1133–1146.
- Lu, D.; Mausel, P.; Brondizio, E.; Moran, E. Change detection techniques. Int. J. Remote Sens. 2004, 25, 2365–2401.
- Lv, Z.; Zhong, P.; Wang, W.; You, Z.; Shi, C. Novel Piecewise Distance based on Adaptive Region Key-points Extraction for LCCD with VHR Remote Sensing Images. IEEE Trans. Geosci. Remote Sens. 2023, 61, 1–9.
- Ban, Y.; Yousif, O.A. Multitemporal spaceborne SAR data for urban change detection in China. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2012, 5, 1087–1094.
- Bazi, Y.; Bruzzone, L.; Melgani, F. An unsupervised approach based on the generalized Gaussian model to automatic change detection in multitemporal SAR images. IEEE Trans. Geosci. Remote Sens. 2005, 43, 874–887.
- Bovolo, F.; Marin, C.; Bruzzone, L. A hierarchical approach to change detection in very high resolution SAR images for surveillance applications. IEEE Trans. Geosci. Remote Sens. 2012, 51, 2042–2054.
- Marin, C.; Bovolo, F.; Bruzzone, L. Building change detection in multitemporal very high resolution SAR images. IEEE Trans. Geosci. Remote Sens. 2014, 53, 2664–2682.
- Sun, Y.; Lei, L.; Guan, D.; Wu, J.; Kuang, G.; Liu, L. Image regression with structure cycle consistency for heterogeneous change detection. IEEE Trans. Neural Netw. Learn. Syst. 2022.
- Bruzzone, L.; Prieto, D.F. Automatic analysis of the difference image for unsupervised change detection. IEEE Trans. Geosci. Remote Sens. 2000, 38, 1171–1182.
- Hu, H.; Ban, Y. Unsupervised change detection in multitemporal SAR images over large urban areas. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2014, 7, 3248–3261.
- Cao, G.; Li, Y.; Liu, Y.; Shang, Y. Automatic change detection in high-resolution remote-sensing images by means of level set evolution and support vector machine classification. Int. J. Remote Sens. 2014, 35, 6255–6270.
- Zhu, X.X.; Tuia, D.; Mou, L.; Xia, G.S.; Zhang, L.; Xu, F.; Fraundorfer, F. Deep learning in remote sensing: A comprehensive review and list of resources. IEEE Geosci. Remote Sens. Mag. 2017, 5, 8–36.
- Jiang, H.; Peng, M.; Zhong, Y.; Xie, H.; Hao, Z.; Lin, J.; Ma, X.; Hu, X. A Survey on Deep Learning-Based Change Detection from High-Resolution Remote Sensing Images. Remote Sens. 2022, 14, 1552.
- Daudt, R.C.; Le Saux, B.; Boulch, A.; Gousseau, Y. Urban change detection for multispectral earth observation using convolutional neural networks. In Proceedings of the IGARSS 2018 IEEE International Geoscience and Remote Sensing Symposium, Valencia, Spain, 22–27 July 2018; pp. 2115–2118.
- Daudt, R.C.; Le Saux, B.; Boulch, A. Fully convolutional siamese networks for change detection. In Proceedings of the 2018 25th IEEE International Conference on Image Processing (ICIP), Athens, Greece, 7–10 October 2018; pp. 4063–4067.
- Ronneberger, O.; Fischer, P.; Brox, T. U-net: Convolutional networks for biomedical image segmentation. In Proceedings of the International Conference on Medical Image Computing and Computer-Assisted Intervention, Munich, Germany, 5–9 October 2015; pp. 234–241.
- Jiang, H.; Hu, X.; Li, K.; Zhang, J.; Gong, J.; Zhang, M. Pga-siamnet: Pyramid feature-based attention-guided siamese network for remote sensing orthoimagery building change detection. Remote Sens. 2020, 12, 484.
- Fang, S.; Li, K.; Shao, J.; Li, Z. SNUNet-CD: A densely connected Siamese network for change detection of VHR images. IEEE Geosci. Remote Sens. Lett. 2021, 19, 1–5.
- Zhou, H.; Zhang, M.; Hu, X.; Li, K.; Sun, J. A Siamese convolutional neural network with high–low level feature fusion for change detection in remotely sensed images. Remote Sens. Lett. 2021, 12, 387–396.
- Wang, X.; Du, J.; Tan, K.; Ding, J.; Liu, Z.; Pan, C.; Han, B. A high-resolution feature difference attention network for the application of building change detection. Int. J. Appl. Earth Obs. Geoinf. 2022, 112, 102950.
- Basavaraju, K.; Sravya, N.; Lal, S.; Nalini, J.; Reddy, C.S.; Dell’Acqua, F. UCDNet: A Deep Learning Model for Urban Change Detection from Bi-temporal Multispectral Sentinel-2 Satellite Images. IEEE Trans. Geosci. Remote Sens. 2022, 60, 1–10.
- Lv, Z.; Zhong, P.; Wang, W.; You, Z.; Falco, N. Multi-scale Attention Network Guided with Change Gradient Image for Land Cover Change Detection Using Remote Sensing Images. IEEE Geosci. Remote Sens. Lett. 2023, 20, 1–5.
- Daudt, R.C.; Le Saux, B.; Boulch, A.; Gousseau, Y. Multitask learning for large-scale semantic change detection. Comput. Vis. Image Underst. 2019, 187, 102783.
- Liu, Y.; Pang, C.; Zhan, Z.; Zhang, X.; Yang, X. Building Change Detection for Remote Sensing Images Using a Dual-Task Constrained Deep Siamese Convolutional Network Model. IEEE Geosci. Remote Sens. Lett. 2020, 18, 811–815.
- Papadomanolaki, M.; Vakalopoulou, M.; Karantzalos, K. A Deep Multitask Learning Framework Coupling Semantic Segmentation and Fully Convolutional LSTM Networks for Urban Change Detection. IEEE Trans. Geosci. Remote Sens. 2021, 59, 7651–7668.
- Chen, H.; Shi, Z. A spatial-temporal attention-based method and a new dataset for remote sensing image change detection. Remote Sens. 2020, 12, 1662.
- Chen, H.; Qi, Z.; Shi, Z. Remote sensing image change detection with transformers. IEEE Trans. Geosci. Remote Sens. 2021, 60, 1–14.
- Liu, W.; Lin, Y.; Liu, W.; Yu, Y.; Li, J. An attention-based multiscale transformer network for remote sensing image change detection. ISPRS J. Photogramm. Remote Sens. 2023, 202, 599–609.
- Bandara, W.G.C.; Patel, V.M. A transformer-based siamese network for change detection. In Proceedings of the IGARSS 2022 IEEE International Geoscience and Remote Sensing Symposium, Kuala Lumpur, Malaysia, 17–22 July 2022; pp. 207–210.
- Ebel, P.; Saha, S.; Zhu, X.X. Fusing multi-modal data for supervised change detection. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2021, 43, 243–249.
- Hafner, S.; Nascetti, A.; Azizpour, H.; Ban, Y. Sentinel-1 and Sentinel-2 Data Fusion for Urban Change Detection Using a Dual Stream U-Net. IEEE Geosci. Remote Sens. Lett. 2022, 19, 1–5.
- Saha, S.; Shahzad, M.; Ebel, P.; Zhu, X.X. Supervised Change Detection Using Prechange Optical-SAR and Postchange SAR Data. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2022, 15, 8170–8178.
- Yousif, O.; Ban, Y. Fusion of SAR and optical data for unsupervised change detection: A case study in Beijing. In Proceedings of the 2017 Joint Urban Remote Sensing Event (JURSE), Dubai, United Arab Emirates, 6–8 March 2017; pp. 1–4.
- Saha, S.; Bovolo, F.; Bruzzone, L. Unsupervised deep change vector analysis for multiple-change detection in VHR images. IEEE Trans. Geosci. Remote Sens. 2019, 57, 3677–3693.
- Saha, S.; Solano-Correa, Y.T.; Bovolo, F.; Bruzzone, L. Unsupervised deep transfer learning-based change detection for HR multispectral images. IEEE Geosci. Remote Sens. Lett. 2020, 18, 856–860.
- Kondmann, L.; Toker, A.; Saha, S.; Schölkopf, B.; Leal-Taixé, L.; Zhu, X.X. Spatial Context Awareness for Unsupervised Change Detection in Optical Satellite Images. IEEE Trans. Geosci. Remote Sens. 2022, 60, 1–15.
- Chapelle, O.; Scholkopf, B.; Zien, A. Semi-Supervised Learning (Chapelle, O. et al., Eds.; 2006) [Book reviews]. IEEE Trans. Neural Netw. 2009, 20, 542.
- Oliver, A.; Odena, A.; Raffel, C.; Cubuk, E.D.; Goodfellow, I.J. Realistic evaluation of deep semi-supervised learning algorithms. arXiv 2018, arXiv:1804.09170.
- Laine, S.; Aila, T. Temporal ensembling for semi-supervised learning. arXiv 2016, arXiv:1610.02242.
- Sajjadi, M.; Javanmardi, M.; Tasdizen, T. Regularization with stochastic transformations and perturbations for deep semi-supervised learning. arXiv 2016, arXiv:1606.04586.
- Bandara, W.G.C.; Patel, V.M. Revisiting consistency regularization for semi-supervised change detection in remote sensing images. arXiv 2022, arXiv:2204.08454.
- Hafner, S.; Ban, Y.; Nascetti, A. Urban change detection using a dual-task Siamese network and semi-supervised learning. In Proceedings of the IGARSS 2022 IEEE International Geoscience and Remote Sensing Symposium, Kuala Lumpur, Malaysia, 17–22 July 2022; pp. 1071–1074.
- Shu, Q.; Pan, J.; Zhang, Z.; Wang, M. MTCNet: Multitask consistency network with single temporal supervision for semi-supervised building change detection. Int. J. Appl. Earth Obs. Geoinf. 2022, 115, 103110.
- Van Etten, A.; Hogan, D.; Martinez-Manso, J.; Shermeyer, J.; Weir, N.; Lewis, R. The Multi-Temporal Urban Development SpaceNet Dataset. arXiv 2021, arXiv:2102.04420.
- Ji, S.; Wei, S.; Lu, M. Fully convolutional networks for multisource building extraction from an open aerial and satellite imagery data set. IEEE Trans. Geosci. Remote Sens. 2018, 57, 574–586.
- Gorelick, N.; Hancher, M.; Dixon, M.; Ilyushchenko, S.; Thau, D.; Moore, R. Google Earth Engine: Planetary-scale geospatial analysis for everyone. Remote Sens. Environ. 2017, 202, 18–27.
- Chini, M.; Pelich, R.; Hostache, R.; Matgen, P.; Lopez-Martinez, C. Towards a 20 m global building map from Sentinel-1 SAR data. Remote Sens. 2018, 10, 1833.
- Hafner, S.; Ban, Y.; Nascetti, A. Exploring the Fusion of Sentinel-1 SAR and Sentinel-2 MSI Data for Built-Up Area Mapping Using Deep Learning. In Proceedings of the 2021 IEEE International Geoscience and Remote Sensing Symposium IGARSS, Brussels, Belgium, 11–16 July 2021; pp. 4720–4723.
- Schmitt, M.; Hughes, L.H.; Qiu, C.; Zhu, X.X. Aggregating cloud-free Sentinel-2 images with Google earth engine. ISPRS Ann. Photogramm. Remote Sens. Spat. Inf. Sci. 2019, 4, 145–152.
- Duque-Arias, D.; Velasco-Forero, S.; Deschaud, J.E.; Goulette, F.; Serna, A.; Decencière, E.; Marcotegui, B. On power Jaccard losses for semantic segmentation. In Proceedings of the VISAPP 2021: 16th International Conference on Computer Vision Theory and Applications, Online, 8–10 February 2021.
- Scheibenreif, L.; Hanna, J.; Mommert, M.; Borth, D. Self-supervised vision transformers for land-cover segmentation and classification. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, New Orleans, LA, USA, 18–24 June 2022; pp. 1422–1431.
- Hafner, S.; Ban, Y.; Nascetti, A. Unsupervised domain adaptation for global urban extraction using Sentinel-1 SAR and Sentinel-2 MSI data. Remote Sens. Environ. 2022, 280, 113192.
- Paszke, A.; Gross, S.; Massa, F.; Lerer, A.; Bradbury, J.; Chanan, G.; Killeen, T.; Lin, Z.; Gimelshein, N.; Antiga, L.; et al. PyTorch: An imperative style, high-performance deep learning library. arXiv 2019, arXiv:1912.01703.
- Bovolo, F.; Bruzzone, L. The time variable in data fusion: A change detection perspective. IEEE Geosci. Remote Sens. Mag. 2015, 3, 8–26.
- Yu, X.; Wu, X.; Luo, C.; Ren, P. Deep learning in remote sensing scene classification: A data augmentation enhanced convolutional neural network framework. GIScience Remote Sens. 2017, 54, 741–758.
- Loshchilov, I.; Hutter, F. Fixing Weight Decay Regularization in Adam. arXiv 2018, arXiv:1711.05101.
- Koppel, K.; Zalite, K.; Voormansik, K.; Jagdhuber, T. Sensitivity of Sentinel-1 backscatter to characteristics of buildings. Int. J. Remote Sens. 2017, 38, 6298–6318.
- Chen, Y.; Bruzzone, L. Self-Supervised Change Detection by Fusing SAR and Optical Multi-Temporal Images. In Proceedings of the 2021 IEEE International Geoscience and Remote Sensing Symposium IGARSS, Brussels, Belgium, 11–16 July 2021; pp. 3101–3104.
- Papadomanolaki, M.; Verma, S.; Vakalopoulou, M.; Gupta, S.; Karantzalos, K. Detecting urban changes with recurrent neural networks from multitemporal Sentinel-2 data. In Proceedings of the IGARSS 2019 IEEE International Geoscience and Remote Sensing Symposium, Yokohama, Japan, 28 July–2 August 2019; pp. 214–217.
- Hafner, S.; Ban, Y. Multi-Modal Deep Learning for Multi-Temporal Urban Mapping with a Partly Missing Optical Modality. In Proceedings of the 2023 IEEE International Geoscience and Remote Sensing Symposium IGARSS, Pasadena, CA, USA, 16–21 July 2023; pp. 6843–6846.
|  | Train (labeled) | Train (unlabeled) | Validation | Test |
|---|---|---|---|---|
| Number of sites | 30 | 20 | 15 | 15 |
Results for different fractions of the labeled training set used (F1 and IoU per fraction):

| Input | Network | F1 (10%) | IoU (10%) | F1 (20%) | IoU (20%) | F1 (40%) | IoU (40%) | F1 (100%) | IoU (100%) |
|---|---|---|---|---|---|---|---|---|---|
| S1 | U-Net EF | 0.291 | 0.170 | 0.339 | 0.204 | 0.357 | 0.217 | 0.363 | 0.222 |
| S1 | Siam-diff | 0.182 | 0.100 | 0.368 | 0.226 | 0.359 | 0.219 | 0.410 | 0.246 |
| S1 | Siam-diff DT | 0.267 | 0.154 | 0.341 | 0.206 | 0.363 | 0.222 | 0.414 | 0.261 |
| S2 | U-Net EF | 0.266 | 0.153 | 0.429 | 0.273 | 0.466 | 0.303 | 0.520 | 0.351 |
| S2 | Siam-diff | 0.347 | 0.210 | 0.447 | 0.288 | 0.522 | 0.353 | 0.522 | 0.353 |
| S2 | Siam-diff DT | 0.350 | 0.212 | 0.459 | 0.298 | 0.484 | 0.319 | 0.551 | 0.380 |
| S2 | Siamese SSL | 0.387 | 0.240 | 0.478 | 0.314 | 0.496 | 0.330 | 0.515 | 0.347 |
| S2 | SemiCD | 0.402 | 0.252 | 0.467 | 0.305 | 0.506 | 0.338 | 0.513 | 0.345 |
| S1S2 | DS U-Net | 0.309 | 0.183 | 0.397 | 0.248 | 0.522 | 0.354 | 0.559 | 0.388 |
| S1S2 | MM Siam-diff | 0.383 | 0.237 | 0.458 | 0.297 | 0.500 | 0.333 | 0.554 | 0.383 |
| S1S2 | Proposed | 0.491 | 0.325 | 0.501 | 0.335 | 0.537 | 0.367 | 0.555 | 0.384 |
Results for different fractions of the labeled training set used (F1 and IoU per fraction):

| Input | Network | F1 (10%) | IoU (10%) | F1 (20%) | IoU (20%) | F1 (40%) | IoU (40%) | F1 (100%) | IoU (100%) |
|---|---|---|---|---|---|---|---|---|---|
| S1 | Siam-diff DT | 0.302 | 0.178 | 0.399 | 0.249 | 0.504 | 0.337 | 0.480 | 0.316 |
| S1 | Proposed | 0.361 | 0.221 | 0.475 | 0.312 | 0.492 | 0.327 | 0.486 | 0.321 |
| S2 | Siam-diff DT | 0.356 | 0.216 | 0.488 | 0.323 | 0.524 | 0.355 | 0.589 | 0.417 |
| S2 | Siamese SSL | 0.416 | 0.263 | 0.473 | 0.310 | 0.515 | 0.346 | 0.524 | 0.355 |
| S2 | Proposed | 0.414 | 0.261 | 0.559 | 0.388 | 0.578 | 0.406 | 0.591 | 0.420 |
| S1S2 | Proposed | 0.526 | 0.356 | 0.590 | 0.418 | 0.586 | 0.415 | 0.612 | 0.441 |