
Journal of Remote Sensing, Year: 2024, Volume 4
Published: May 19, 2024
Agricultural applications of remote sensing data typically require both high spatial resolution and frequent observations. The increasing availability of high-spatial-resolution imagery meets the spatial resolution requirement well. However, the long revisit period and cloud contamination severely compromise its ability to monitor crop growth, which is characterized by strong temporal heterogeneity. Many spatiotemporal fusion methods have been developed to produce synthetic images with both high spatial and high temporal resolutions. However, these existing methods focus on fusing low- and medium-resolution satellite images in terms of model development and validation. When it comes to high-spatial-resolution images, their applicability remains unknown and may face various challenges. To address this issue, we propose a novel spatiotemporal fusion method, StarFusion, a dual-stream decoupling architecture model, to fully realize the prediction of high-spatial-resolution images. Compared with other fusion methods, StarFusion has two distinct advantages: (a) it maintains high prediction accuracy and good spatial detail by combining a deep-learning-based super-resolution method with partial least squares regression through an edge and color-based weighting loss function; and (b) it demonstrates improved transferability over time by introducing image gradient maps into the model. We tested StarFusion at 3 experimental sites and compared it with 4 traditional methods, STARFM (spatial and temporal adaptive reflectance fusion model), FSDAF (flexible spatiotemporal data fusion), Fit-FC (regression model fitting, spatial filtering, and residual compensation), and FIRST (fusion incorporating spectral autocorrelation), as well as a deep-learning-based method, the super-resolution generative adversarial network. In addition, we investigated the possibility of using multiple pairs of coarse and fine images in the training process of our method. The results show that multiple image pairs provide better overall performance, but both the single-pair and multiple-pair versions are superior to the comparison methods. Considering the difficulty of obtaining cloud-free images in practice, using a single pair of high-quality Gaofen-1 images is recommended in most cases, since the degradation caused by using a single pair is not significant.
Language: English
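
The abstract above credits two mechanisms for StarFusion's behavior: an edge and color-based weighting loss and the use of image gradient maps. The sketch below is a minimal illustration only, not the authors' implementation; it shows one common way a gradient map can be derived with Sobel filters and used to weight an L1 reconstruction loss. The function names (sobel_gradients, edge_weighted_l1), the weighting scheme 1 + alpha * gradient, and the omission of any color term are assumptions made for illustration.

```python
# Illustrative sketch (not the StarFusion implementation) of image gradient
# maps and an edge-weighted reconstruction loss, assuming PyTorch tensors
# shaped (batch, bands, height, width).
import torch
import torch.nn.functional as F


def sobel_gradients(img: torch.Tensor) -> torch.Tensor:
    """Per-band Sobel gradient magnitude of a batch of images."""
    kx = torch.tensor([[-1., 0., 1.],
                       [-2., 0., 2.],
                       [-1., 0., 1.]], device=img.device).view(1, 1, 3, 3)
    ky = kx.transpose(2, 3)  # transpose of the x-kernel gives the y-kernel
    c = img.shape[1]
    # Depthwise convolution: each spectral band is filtered independently.
    gx = F.conv2d(img, kx.repeat(c, 1, 1, 1), padding=1, groups=c)
    gy = F.conv2d(img, ky.repeat(c, 1, 1, 1), padding=1, groups=c)
    return torch.sqrt(gx ** 2 + gy ** 2 + 1e-12)


def edge_weighted_l1(pred: torch.Tensor, target: torch.Tensor,
                     alpha: float = 1.0) -> torch.Tensor:
    """L1 loss whose per-pixel weight grows with edge strength in the target,
    so sharp boundaries (e.g., field edges) contribute more than flat areas.
    alpha is a hypothetical scale factor; real reflectance data may need
    normalization before weighting."""
    weights = 1.0 + alpha * sobel_gradients(target)
    return (weights * (pred - target).abs()).mean()


# Usage sketch: loss = edge_weighted_l1(super_resolved, reference_fine_image)
```

A color- or spectral-consistency term could be added to the weights in the same way; how StarFusion actually combines the edge and color components, and how the gradient maps enter the dual-stream model, is specified in the paper itself.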