Combining hyperspectral UAV and multispectral FORMOSAT-2 imagery for precision agriculture applications

This is a Master's thesis from Lund University / Department of Physical Geography and Ecosystem Science

Abstract: Precision agriculture requires detailed information on crop status variability within a field. Remote sensing provides an efficient way to obtain such information by observing biophysical parameters such as canopy nitrogen content, leaf coverage, and plant biomass. However, individual remote sensing sensors often fail to provide information at the spatial and temporal resolution that precision agriculture requires. The purpose of this study is to investigate methods for combining imagery from multiple sensors to create a new dataset that comes closer to meeting these requirements. Specifically, this study combined multispectral satellite imagery (Formosat-2) and hyperspectral Unmanned Aerial Vehicle (UAV) imagery of a potato field in the Netherlands. The imagery from the two platforms was combined in two ways. First, data fusion methods brought the spatial resolution of the Formosat-2 imagery (8 m) down to that of the UAV imagery (1 m). Two data fusion methods were applied: an unmixing-based algorithm and the Spatial and Temporal Adaptive Reflectance Fusion Model (STARFM). The unmixing-based method produced vegetation indices that were highly correlated with the measured LAI (rs = 0.866) and canopy chlorophyll values (rs = 0.884), whereas STARFM obtained lower correlations. Second, a Spectral-Temporal Reflectance Surface (STRS) was constructed to interpolate daily 101-band reflectance spectra from both sources of imagery. A novel STRS method was presented which uses Bayesian theory to obtain realistic spectra while accounting for sensor uncertainties. The resulting surface correlated strongly with LAI (rs = 0.858) and canopy chlorophyll (rs = 0.788) measurements at field level. The multi-sensor datasets were able to characterize significant differences in crop status caused by differing nitrogen fertilization regimes from June to August. Yield prediction models based solely on vegetation indices extracted from the unmixing-based fusion dataset explained 52.7% of the yield variation, whereas the STRS dataset explained 72.9% of the yield variability. The results of this study indicate that the limitations of each individual sensor can be largely overcome by combining multiple sources of imagery, which benefits agricultural management. Further research could focus on integrating the data fusion and STRS techniques and on including imagery from additional sensors.
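
The unmixing-based fusion mentioned in the abstract can be illustrated with a minimal Python sketch: each coarse Formosat-2 pixel is modelled as a linear mixture of the classes visible in the fine-resolution UAV imagery, the per-class reflectances are solved for, and the results are reassigned to the 1 m grid. This is a hypothetical illustration, not the thesis's implementation; it assumes a fine-resolution class map and a single coarse band are available, and it uses one global least-squares solve where a real method would typically use a moving window.

    import numpy as np

    def unmixing_fusion(class_map, coarse_band, ratio=8):
        """Sketch of unmixing-based downscaling of one coarse band.

        class_map   : (H, W) int array of classes at fine resolution
                      (e.g. derived from the 1 m UAV imagery).
        coarse_band : (H//ratio, W//ratio) float array, one coarse band.
        ratio       : coarse/fine pixel size ratio (8 m / 1 m = 8).
        """
        n_classes = int(class_map.max()) + 1
        ch, cw = coarse_band.shape

        # Fraction of each class inside every coarse pixel.
        fractions = np.zeros((ch, cw, n_classes))
        for i in range(ch):
            for j in range(cw):
                block = class_map[i*ratio:(i+1)*ratio, j*ratio:(j+1)*ratio]
                counts = np.bincount(block.ravel(), minlength=n_classes)
                fractions[i, j] = counts / counts.sum()

        # Solve coarse = fractions @ class_reflectance in a
        # least-squares sense over the whole scene.
        A = fractions.reshape(-1, n_classes)
        b = coarse_band.ravel()
        class_refl, *_ = np.linalg.lstsq(A, b, rcond=None)

        # Assign the unmixed class reflectances back to the fine grid.
        return class_refl[class_map]

In practice the unmixed reflectances are then used to compute vegetation indices at the fine resolution, which is where the comparison against field measurements in the study takes place.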
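
The reported rank correlations (rs) and explained yield variance can be reproduced in form, though not in value, with standard tools. The arrays below are made-up toy numbers for illustration only, not the study's data.

    import numpy as np
    from scipy.stats import spearmanr

    # Hypothetical vegetation index values from fused 1 m imagery and
    # field-measured LAI at the same sample plots (illustrative only).
    ndvi = np.array([0.61, 0.72, 0.55, 0.80, 0.68, 0.74])
    lai = np.array([2.1, 3.0, 1.8, 3.9, 2.7, 3.2])

    rs, p = spearmanr(ndvi, lai)  # rank correlation, as reported
    print(f"Spearman rs = {rs:.3f} (p = {p:.3f})")

    # Share of yield variance explained by a simple linear model on the
    # index, analogous to the reported 52.7% / 72.9% figures.
    yield_t_ha = np.array([38.0, 45.5, 33.2, 51.0, 42.1, 47.3])
    coeffs = np.polyfit(ndvi, yield_t_ha, 1)
    pred = np.polyval(coeffs, ndvi)
    ss_res = np.sum((yield_t_ha - pred) ** 2)
    ss_tot = np.sum((yield_t_ha - yield_t_ha.mean()) ** 2)
    print(f"R^2 = {1 - ss_res / ss_tot:.3f}")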
