Fusion of multi-source UAV and PhenoCam data for advancing forage crop monitoring and yield prediction
Kang Xu*, Yihang Wu, Yifan Zhang and Xia Wang
Abstract
Accurate monitoring and reliable prediction of forage crop productivity are essential for promoting sustainable agriculture and ensuring food security. Low-cost, non-invasive remote sensing platforms, particularly Unmanned Aerial Vehicles (UAVs) and PhenoCams, offer substantial potential for achieving these objectives. This study reviews the main types of sensors deployed on UAV platforms and ground-based PhenoCams, including Red-Green-Blue (RGB), multispectral, hyperspectral, thermal infrared, and Light Detection and Ranging (LiDAR) sensors, as well as their applications in forage crop monitoring and yield estimation. It further examines the algorithmic models and predictive frameworks developed from Vegetation Indices (VIs) and plant-level traits derived from these diverse data sources. Existing research indicates that integrating data from multiple platforms leverages their complementary strengths, thereby enabling more precise yield prediction models. Furthermore, traditional regression models are increasingly being outperformed by Artificial Intelligence (AI)-driven models, which excel at processing large, multi-dimensional remote sensing datasets. Future research should focus on developing standardized, multi-scale monitoring protocols that integrate ground-based, satellite, and UAV platforms. This requires establishing more comprehensive, openly shared datasets and developing refined yield prediction models that incorporate the physiological adaptations of different forage species and include region-specific parameterization. Such integrated frameworks are essential for translating precise, data-driven insights into actionable management strategies, particularly for informing decisions on fertilization, irrigation, and harvesting.