
Remote Sensing, Journal Year: 2025, Volume and Issue: 17(9), P. 1609 - 1609
Published: May 1, 2025
Novel view synthesis of remote sensing scenes from satellite images is a meaningful but challenging task. Due to the wide temporal span of image acquisition, collections often exhibit significant appearance variations, such as seasonal changes and shadow movements, as well as transient objects, making it difficult to reconstruct the original scene accurately. Previous work has noted that a large amount of the variation in appearance is caused by changing lighting conditions. To address this, researchers have proposed incorporating the direction of solar rays into neural radiance fields (NeRF) to model the sunlight reaching each point in the scene. However, this approach fails to effectively account for appearance variations and suffers from long training times and slow rendering speeds due to the need to evaluate numerous samples of the radiance field per pixel. To achieve fast, efficient, and high-quality novel view synthesis of multi-temporal satellite scenes, we propose SatGS, a method that leverages 3D Gaussian points for reconstruction together with an appearance-adaptive adjustment strategy. This strategy enables our model to adaptively adjust the appearance features of the rendered regions based on the characteristics of the input images and the solar angles. Additionally, the impact of transient objects is mitigated through the use of visibility maps and uncertainty optimization. Experiments conducted on WorldView-3 images demonstrate that SatGS not only renders with superior quality compared to existing state-of-the-art methods but also surpasses them in rendering speed, showcasing its potential for practical applications in remote sensing.
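To illustrate what an appearance-adaptive adjustment of this kind could look like, the following is a minimal PyTorch sketch, not the authors' implementation: it assumes a hypothetical `AppearanceAdapter` module in which a learnable per-image embedding and the sun direction modulate each Gaussian point's color through a small MLP. All names, dimensions, and the network structure are illustrative assumptions.

```python
# Hypothetical sketch (not the SatGS code): appearance-adaptive color
# adjustment for 3D Gaussian points, conditioned on a learnable per-image
# appearance embedding and the sun direction. Names and sizes are assumptions.
import torch
import torch.nn as nn


class AppearanceAdapter(nn.Module):
    def __init__(self, num_images: int, embed_dim: int = 16, feat_dim: int = 32):
        super().__init__()
        # One learnable embedding per training image captures its global
        # appearance (season, illumination, atmospheric conditions).
        self.image_embedding = nn.Embedding(num_images, embed_dim)
        # Small MLP maps (per-Gaussian feature, image embedding, sun direction)
        # to an adjusted RGB color for that Gaussian.
        self.mlp = nn.Sequential(
            nn.Linear(feat_dim + embed_dim + 3, 64),
            nn.ReLU(),
            nn.Linear(64, 3),
            nn.Sigmoid(),  # keep colors in [0, 1]
        )

    def forward(
        self,
        gaussian_feat: torch.Tensor,  # (N, feat_dim) per-Gaussian features
        image_idx: torch.Tensor,      # scalar index of the training image
        sun_dir: torch.Tensor,        # (3,) unit vector toward the sun
    ) -> torch.Tensor:
        n = gaussian_feat.shape[0]
        emb = self.image_embedding(image_idx).expand(n, -1)
        sun = sun_dir.expand(n, -1)
        return self.mlp(torch.cat([gaussian_feat, emb, sun], dim=-1))  # (N, 3)


# Usage: adjust the colors of 10,000 Gaussians for training image 3.
adapter = AppearanceAdapter(num_images=20)
feats = torch.randn(10_000, 32)
colors = adapter(feats, torch.tensor(3), torch.tensor([0.3, 0.2, 0.93]))
print(colors.shape)  # torch.Size([10000, 3])
```

The adjusted colors would then be passed to a standard Gaussian splatting rasterizer, so the per-image embedding absorbs seasonal and illumination differences while the shared geometry stays fixed; this is one plausible reading of the strategy described in the abstract, not a confirmed detail of the paper.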
Language: English