Mini Review - Modern Phytomorphology ( 2025) Volume 19, Issue 6
Fusion of multi-source UAV and PhenoCam data for advancing forage crop monitoring and yield prediction
Kang Xu*, Yihang Wu, Yifan Zhang and Xia Wang
Kang Xu, Jiangsu Key Laboratory for Recognition and Remediation of Emerging Pollutants in Taihu Basin, School of Environmental Science and Engineering, Wuxi University, Wuxi, 214105, China, Email: xukang@cwxu.edu.cn
Received: 01-Dec-2025, Manuscript No. mp-25-176564; Editor assigned: 03-Dec-2025, Pre QC No. mp-25-176564 (PQ); Reviewed: 17-Dec-2025, QC No. mp-25-176564; Revised: 24-Dec-2025, Manuscript No. mp-25-176564 (R); Accepted: 30-Dec-2025; Published: 31-Dec-2025, DOI: 10.5281/zenodo.18231119
Abstract
Accurate monitoring and reliable prediction of forage crop productivity are essential for promoting sustainable agriculture and ensuring food security. Low-cost and non-invasive remote sensing platforms, particularly Unmanned Aerial Vehicles (UAVs) and PhenoCams, offer substantial potential for achieving these objectives. This study reviews the main types of sensors, including Red, Green, Blue (RGB) bands, multispectral, hyperspectral, thermal infrared, and Light Detection and Ranging (LiDAR), deployed on UAV platforms and ground- based PhenoCams, as well as their applications in forage crop monitoring and yield estimation. It further examines the algorithmic models and predictive frameworks developed from Vegetation Indices (VIs) and plant-level traits derived from these diverse data sources. Existing research indicates that integrating data from multiple platforms leverages their complementary strengths, thereby enabling more precise yield prediction models. Furthermore, traditional regression models are increasingly being outperformed by Artificial Intelligence (AI)-driven models, which excel at processing large, multi-dimensional datasets from remote sensing. Future research should focus on developing standardized, multi-scale monitoring protocols that integrate ground-based, satellite, and UAV platforms. This requires establishing more comprehensive, openly shared datasets and developing refined yield prediction models that incorporate the physiological adaptations of different forage species and include region-specific parameterization. Such integrated frameworks are essential for translating precise, data-driven insights into actionable management strategies, particularly for informing decisions on fertilization, irrigation, and harvesting.
Keywords
Forage crop, Yield prediction, Unmanned aerial system, Phenology, Precision agriculture, Vegetation indices, Machine learning
Introduction
The production of forage crops provides essential feed for global livestock and supports the increasing demand for ruminant livestock products (Mottet, et al. 2017). As an integral component of agriculture, the livestock industry not only supplies high-quality protein products for human consumption but also plays a pivotal role in promoting rural economic development (FAO, 2015). However, the continuous expansion of livestock production is placing increasing pressure on the supply of forage crops.
To address this supply challenge and ensure the sustainable development of the livestock sector, it is imperative to advance precision monitoring and predictive modeling for forage crops. Monitoring forage crops throughout the growing season, establishing quantitative relationships between key growth parameters and yield across different growth stages, and developing accurate yield prediction models based on phenological or growth indicators are of substantial practical significance. These efforts enable the timely acquisition of crop phenotypic information related to yield formation. Through real-time monitoring and precise prediction, planting, irrigation and fertilization strategies can be adjusted proactively, thereby enhancing the production efficiency of forage crops. These improvements provide a solid foundation for the stable development of the livestock industry and contribute to the advancement of sustainable agriculture.
Traditional field-based sampling surveys for crop yield and forage mass estimation are not only time-consuming and labor-intensive but also invasive (Psomiadis, et al. 2025). With advances in sensor technologies, cropland monitoring is increasingly carried out using remote sensing techniques, which offer a non-invasive and non-destructive alternative. Satellite remote sensing datasets, such as MODIS, Landsat, and Sentinel, serve as a major source for analyzing crop growth status and estimating productivity. However, the inability to simultaneously achieve high temporal and spatial resolution constrains satellite remote sensing for plot-scale agricultural monitoring (Weiss, et al. 2020).
Unmanned Aerial Vehicles (UAVs), as a proximal sensing platform, provide an effective approach for monitoring seasonal crops at fine spatial scales, delivering remote sensing imagery with high spatial and temporal resolution. Advances in UAV-mounted radar and sensor technologies, including centimeter-level multispectral and hyperspectral data acquisition and improved resistance to environmental interference, have further expanded UAV applications in agriculture, particularly in crop monitoring and yield estimation. This high-resolution data acquisition capability enables farmers and agronomists to make timely and informed decisions regarding irrigation, fertilization, pest management, and harvesting, ultimately improving crop productivity and resource-use efficiency.
PhenoCams, as near-surface remote sensing tools, can substantially reduce the influence of atmospheric conditions on observations and therefore complement and validate satellite- and UAV-based remote sensing data. PhenoCams capture crop growth dynamics at high temporal resolution, and real-time variations in greenness indices serve as critical biological indicators for characterizing phenological development and predicting crop productivity.
This work aims to review UAV and PhenoCam technologies and assess their applications in monitoring and predicting forage crop productivity.
Literature Review
UAV-based monitoring in forage crops
UAV-mounted sensors can be categorized into five main types: visible light (RGB), multispectral, hyperspectral, thermal infrared, and LiDAR (Ivošević, et al. 2023). RGB-based UAV remote sensing can readily achieve centimeter- or sub-centimeter-level resolution, and the resulting data generally require minimal processing. RGB imagery can be used to derive Vegetation Indices (VIs) such as the Greenness Index (GI), Excess Green Index (EGI), Green Leaf Index (GLI), and Triangular Greenness Index (TGI). Morgan, et al. 2021 reported that quadratic models based on EGI and TGI achieved the highest accuracy in estimating the biomass of Spartina alterniflora. Additionally, several studies have integrated RGB VIs with structural parameters such as crop height to construct models with improved estimation accuracy (Poley and McDermid, 2020).
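As an illustration of how little processing these RGB indices require, the sketch below computes GI, EGI, GLI, and TGI from the three visible bands. It follows commonly published formulations, which vary somewhat between studies (e.g., chromatic-coordinate versus raw-band forms), so it is a minimal example under those assumptions rather than the exact workflow of any cited study.

```python
import numpy as np

def rgb_indices(r, g, b):
    """Compute common RGB vegetation indices from UAV imagery bands.

    r, g, b: float arrays of reflectance or digital numbers, same shape.
    Formulations follow widely used definitions; some studies use
    chromatic coordinates (band / (R+G+B)) instead of raw band values.
    """
    eps = 1e-9
    total = r + g + b + eps
    rn, gn, bn = r / total, g / total, b / total   # chromatic coordinates

    gi = g / (r + eps)                              # Greenness Index
    egi = 2 * gn - rn - bn                          # Excess Green Index (chromatic form)
    gli = (2 * g - r - b) / (2 * g + r + b + eps)   # Green Leaf Index
    tgi = g - 0.39 * r - 0.61 * b                   # Triangular Greenness Index (simplified)
    return {"GI": gi, "EGI": egi, "GLI": gli, "TGI": tgi}
```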
Multispectral sensors provide Red-Edge (RE) and Near-Infrared (NIR) bands in addition to the RGB bands. VIs based on the NIR and RE bands, such as the Normalized Difference Vegetation Index (NDVI), Enhanced Vegetation Index (EVI), and Leaf Chlorophyll Index (LCI), have been shown to effectively capture phenological characteristics. For example, five VIs derived from NIR- and RE-based indices exhibited significant correlations with maize yield (Shrestha, et al. 2023). Thus, compared with RGB sensors, multispectral imaging is more widely used for classifying crop growth stages and predicting crop yield.
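These multispectral indices reduce to simple band arithmetic once reflectance mosaics are available. The following hedged sketch computes NDVI, a red-edge analogue (NDRE), and a MODIS-style EVI; the band inputs and coefficients are illustrative assumptions, and LCI is omitted because its formulation differs between sensors and studies.

```python
import numpy as np

def multispectral_indices(red, red_edge, nir, blue=None):
    """Multispectral VIs from UAV reflectance bands (values in 0-1).

    NDVI and NDRE follow standard normalized-difference formulations;
    EVI uses the MODIS-style coefficients and requires a blue band.
    """
    eps = 1e-9
    ndvi = (nir - red) / (nir + red + eps)
    ndre = (nir - red_edge) / (nir + red_edge + eps)   # red-edge analogue of NDVI
    indices = {"NDVI": ndvi, "NDRE": ndre}
    if blue is not None:
        indices["EVI"] = 2.5 * (nir - red) / (nir + 6 * red - 7.5 * blue + 1 + eps)
    return indices
```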
Hyperspectral sensors can capture hundreds of spectral bands across the electromagnetic spectrum, characterized by narrow bandwidths and nanometer-level spectral resolution. Continuous spectral data provide comprehensive information on crop characteristics and enable the development of more diverse VI models. However, they also introduce challenges related to feature extraction and data analysis (Thomas, et al. 2017). Full-spectrum hyperspectral data are widely recommended for characterizing crop growth (Thenkabail and Lyon, 2016) and predicting crop yield, particularly in crops with dense vegetation cover, such as maize (Guo, et al. 2023).
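Because hyperspectral cubes contain hundreds of strongly correlated narrow bands, some form of dimensionality reduction or band selection usually precedes VI construction and modeling. The sketch below shows one common, generic option, principal component analysis over the spectral dimension; the cube shape and component count are arbitrary assumptions, not values taken from the studies cited here.

```python
import numpy as np
from sklearn.decomposition import PCA

# Hypothetical hyperspectral cube: rows x cols x bands of reflectance values.
cube = np.random.rand(100, 100, 250).astype(np.float32)
pixels = cube.reshape(-1, cube.shape[-1])   # flatten to (n_pixels, n_bands)

# Reduce hundreds of correlated narrow bands to a few components that retain
# most of the spectral variance, easing downstream index and model building.
pca = PCA(n_components=10)
scores = pca.fit_transform(pixels)
print("explained variance retained:", pca.explained_variance_ratio_.sum())
```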
In crop growth and yield estimation, point-cloud data acquired through active LiDAR sensing provide accurate measurements of crop height, and LiDAR-derived height information is often integrated with other VI data to develop predictive models. UAVs equipped with thermal infrared sensors are primarily used to capture canopy temperature data. Owing to the strong relationship between canopy temperature, plant transpiration, and water status, thermal imagery can indirectly indicate leaf water potential and stomatal conductance. The negative association between canopy temperature and crop yield can therefore be exploited for yield estimation (Ergo, et al. 2018; Siegfried, et al. 2024). A comparison of different UAV-based sensors is presented in Tab. 1.
| Type of sensor | Detected signals | Advantages | Limitations |
|---|---|---|---|
| RGB | R, G, and B | 1. Low cost, lightweight design, and ease of operation 2. Requires only basic data processing 3. Minimal dependence on weather conditions | 1. Limited to the acquisition of visible RGB information 2. Exhibits significant limitations in vegetation index development and subsequent analytical applications |
| Multispectral | R, G, B, red edge, and near-infrared | 1. Relatively low cost 2. Captures spectral information extending beyond the RGB range 3. Requires less complex processing than hyperspectral data 4. Widely applied in vegetation index-based assessments | 1. Provides only a limited number of bands with relatively low and discontinuous spectral resolution 2. Limited capability for detailed spectral modeling and fine-scale extraction of vegetation growth information |
| Hyperspectral | Ultraviolet, R, G, B, red edge, near-infrared, and mid-infrared | 1. Provides narrow and continuous spectral bands with high spectral resolution 2. Facilitates the development of diverse spectral vegetation index models 3. Enables detailed characterization of crop growth dynamics and subtle spectral differences | 1. High cost and comparatively lower spatial resolution 2. Spectral redundancy and large data volume 3. Requires complex preprocessing procedures |
| Thermal infrared | Infrared thermal radiation | 1. Enables rapid, large-area acquisition of crop canopy temperature data | 1. Relatively heavy equipment 2. Environmental factors (e.g., soil background, wind) are difficult to fully mitigate 3. Presents challenges in detecting subtle temperature differences |
| LiDAR | Laser pulses | 1. Provides high-resolution measurements with strong low-altitude detection capability 2. Strong anti-interference performance 3. Independent of external illumination or target radiative properties 4. Smaller and more flexible than traditional microwave sensors | 1. Substantial weight, high cost, and intensive data processing demands 2. Water absorption of laser pulses may compromise measurement accuracy 3. Less suitable for retrieving phenotypic traits in low-stature, densely planted crop canopies |
Table 1. Advantages and limitations of different UAV-based sensors (modified from Liu, et al. 2018).
PhenoCam phenology metrics
Plant phenology refers to the key physiological developmental stages that plants go through, such as periodic germination, flowering, fruiting, and senescence. As an effective indicator of ecosystem functioning, phenology exerts a significant influence on the carbon and water cycles, energy exchange processes, and climate feedback mechanisms. Phenological shifts not only reflect vegetation responses to environmental variability (Yang, et al. 2023), but also directly influence plant developmental processes and overall productivity (Xue, et al. 2023).
PhenoCam systems are near-surface remote sensing platforms specifically designed for monitoring vegetation phenology. They enable continuous, high-temporal-resolution observations and provide detailed canopy imagery. By capturing changes in canopy color, particularly variations in greenness intensity, PhenoCams can effectively identify and track phenological development stages. Greenness indices such as the Green Chromatic Coordinate (GCC) and Excess Green (ExG), derived from the R, G, and B bands, are commonly used to quantify canopy greenness. Yan, et al. 2025 found that GCC was a more effective indicator than UAV-based NDVI and LCI for monitoring the height dynamics of Zea mays. Time-series phenological models constructed from PhenoCam-based greenness indices are highly valuable for identifying key growth periods and informing corresponding agricultural management decisions. Appropriate image acquisition timing and a fixed Region Of Interest (ROI) are essential for developing accurate phenological models (Sunoj, et al. 2025).
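For readers unfamiliar with these greenness indices, the following minimal sketch computes ROI-mean GCC and ExG from a single PhenoCam frame using the standard chromatic-coordinate and excess-green formulations; the image and mask variables are hypothetical placeholders rather than data from any cited site.

```python
import numpy as np

def phenocam_greenness(image, roi_mask):
    """Mean GCC and ExG over a fixed region of interest (ROI).

    image: (H, W, 3) array of R, G, B digital numbers.
    roi_mask: boolean (H, W) array marking the canopy ROI.
    """
    r = image[..., 0][roi_mask].astype(float)
    g = image[..., 1][roi_mask].astype(float)
    b = image[..., 2][roi_mask].astype(float)
    gcc = g / (r + g + b + 1e-9)   # Green Chromatic Coordinate
    exg = 2 * g - r - b            # Excess Green
    return gcc.mean(), exg.mean()
```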
UAV-PhenoCam data integration
The integration of UAV and PhenoCam data leverages the complementary strengths of both platforms, thereby mitigating their individual limitations and generating a more robust and temporally continuous dataset for monitoring forage crop dynamics. For example, Simpson, et al. 2025 found strong agreement between greenness indices derived from PhenoCam and UAV observations, validating the feasibility of using data from both platforms to capture vegetation dynamics. The sparse VI observations collected by UAVs can be used to calibrate and validate the complete seasonal phenological curves derived from continuous PhenoCam observations (Thapa, et al. 2021). Ground-based PhenoCam observations also serve as a reference for correcting atmospheric distortions, including cloud cover and aerosol effects, in UAV and satellite imagery (Berra, et al. 2019). Moreover, integrating multi-scale and multi-source VIs can improve the predictive performance of growth-stage or crop-yield models. Models combining PhenoCam and UAV-based VIs have been shown to predict the yields of Zea mays and Avena sativa more accurately than models relying on either data source alone (Yan, et al. 2025).
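One way such UAV-PhenoCam fusion is often operationalized is by fitting a continuous seasonal curve to the dense PhenoCam greenness series and then comparing, or calibrating, the sparse UAV acquisitions against it. The sketch below fits a double-logistic curve, a common choice for describing greenup and senescence, to synthetic GCC data; the data, initial parameters, and flight dates are illustrative assumptions, not results from the cited studies.

```python
import numpy as np
from scipy.optimize import curve_fit

def double_logistic(doy, base, amp, sos, r_up, eos, r_down):
    """Greenup/senescence curve commonly fitted to greenness time series."""
    return base + amp * (1 / (1 + np.exp(-r_up * (doy - sos)))
                         - 1 / (1 + np.exp(-r_down * (doy - eos))))

# Hypothetical daily PhenoCam GCC series and a handful of UAV survey dates.
doy = np.arange(100, 300, dtype=float)
gcc = double_logistic(doy, 0.33, 0.08, 140, 0.10, 250, 0.08)
gcc += np.random.normal(0, 0.004, doy.size)   # observation noise

p0 = [0.33, 0.08, 140, 0.10, 250, 0.08]       # initial guess for the fit
params, _ = curve_fit(double_logistic, doy, gcc, p0=p0, maxfev=10000)

# Sparse UAV flights can then be evaluated against the fitted curve on their
# acquisition dates to cross-check the two platforms.
uav_doy = np.array([130.0, 170.0, 220.0, 260.0])
print(double_logistic(uav_doy, *params))
```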
Yield prediction using multi-source data
Linear models are commonly employed to describe the relationships between various VIs and crop yield. UAV-based NDVI has exhibited a strong correlation with wheat height and grain yield (R² > 0.4; Hassan, et al. 2019). Multiple linear and nonlinear regression models that incorporate additional VIs derived from UAV and PhenoCam data, along with parameters such as soil nutrient and water content, can predict crop height and yield with greater precision. Furthermore, accounting for specific VIs at distinct growth stages, or using cumulative spectral VIs across multiple periods, can significantly enhance model accuracy for predicting crop yield and protein content (Yan, et al. 2025).
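A minimal, self-contained illustration of the linear VI-yield relationship described above is given below; the plot-level NDVI and yield values are synthetic and carry no empirical meaning.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score

# Synthetic plot-level data: mean UAV NDVI at a key growth stage vs. yield (t/ha).
ndvi = np.array([[0.52], [0.61], [0.58], [0.70], [0.66], [0.74], [0.48], [0.63]])
yield_t_ha = np.array([6.1, 7.4, 7.0, 8.6, 8.1, 9.0, 5.6, 7.7])

model = LinearRegression().fit(ndvi, yield_t_ha)
print("slope:", model.coef_[0], "intercept:", model.intercept_)
print("R2:", r2_score(yield_t_ha, model.predict(ndvi)))
```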
To manage the large volumes of data generated by remote sensing platforms, Artificial Intelligence (AI)-based processing is widely employed in image analysis and modeling. A data-driven AI automated processing pipeline efficiently handles agricultural near-surface hyperspectral big data through techniques including radiometric calibration, Bidirectional Reflectance Distribution Function (BRDF) correction, rule-based and Support Vector Machine (SVM)-integrated soil/shadow masking, and no-reference quality assessment, thereby supporting high-throughput phenotyping and crop yield modeling (Sagan, et al. 2022). Machine learning models incorporating multiple indicators, including multispectral and hyperspectral-based VIs, textural indices, nitrogen content, and crop height, generally outperform traditional regression models based on single VIs (Li, et al. 2021, Guo, et al. 2022).
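In the same spirit, the sketch below trains a random forest regressor on a synthetic multi-source feature table (UAV VIs, a texture metric, PhenoCam GCC, crop height, nitrogen content) and evaluates it by cross-validation; it is a generic machine-learning template under stated assumptions, not a reimplementation of the pipelines cited above.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 120

# Synthetic multi-source feature table per plot (columns are placeholders for
# UAV NDVI, red-edge index, texture metric, PhenoCam GCC, crop height, N content).
X = rng.normal(size=(n, 6))
y = 2.0 * X[:, 0] + 1.2 * X[:, 3] + 0.8 * X[:, 4] + rng.normal(0, 0.5, n)

rf = RandomForestRegressor(n_estimators=300, random_state=0)
scores = cross_val_score(rf, X, y, cv=5, scoring="r2")
print("cross-validated R2:", scores.mean())
```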
Discussion
Numerous factors influence crop yield, including crop height, Leaf Area Index (LAI), and environmental factors such as irrigation amount, fertilizer level, climatic conditions, and soil texture. High-precision data acquired via 3D LiDAR and canopy analyzers equipped with fisheye sensors can provide vertical structural information, including crop height, canopy volume, and LAI. UAV-derived thermal and hyperspectral data enable timely monitoring of plant water stress (Psomiadis, et al. 2025) and soil nutrient content (Hossen, et al. 2021). Future work should acquire comprehensive datasets that supply informative variables and thereby enhance the predictive performance of regression and machine learning models. Simulation results from yield and protein content models can also guide farmers in adjusting fertilization, irrigation, and harvesting practices.
Most forage crops complete only a single growth cycle per year, leaving limited data available for modeling. The scarcity of publicly accessible datasets further constrains the application of machine learning in yield estimation. Therefore, for diverse forage species and their cultivars, it is crucial to establish extensive monitoring networks and shared datasets. Data collection and input variables should account for the physiological mechanisms underlying distinct growth stages, as these mechanisms critically influence biomass accumulation and final yield. Furthermore, standardized monitoring protocols and associated predictive models should be developed for diverse crop types. Mechanistic models can more effectively capture the effects of environmental and structural factors (e.g., LAI, canopy temperature, and soil moisture) on crop yield (Weiss, et al. 2020). These models should emphasize adaptability and incorporate regional parameterization to improve practical accuracy across diverse geographical and climatic conditions.
A fundamental challenge in precision agriculture is the scale gap between fine-resolution field observations and coarse-resolution satellite monitoring. UAVs can be integrated into these systems to bridge this gap. Multi-scale platforms leveraging ground-satellite-UAV synergies should be established to enhance prediction accuracy and substantially reduce spatial uncertainty (Fathololoumi, et al. 2025). Integrated multi-scale platforms facilitate standardized monitoring and prediction, which are crucial for sustaining crop yield and ensuring food security under changing climatic conditions.
Conclusion
The integration of UAV and PhenoCam technologies represents a highly efficient and cost-effective approach to addressing the critical challenge of monitoring and predicting forage crop productivity. By leveraging the high spatial resolution of UAVs alongside the continuous temporal coverage provided by PhenoCams, this multi-scale approach effectively bridges the observational gap between ground-based surveys and satellite remote sensing. The fusion of complementary data, encompassing structural parameters from LiDAR, physiological indicators and VIs from multispectral and hyperspectral sensors, provides a robust foundation for monitoring and yield modeling. To address the inherent data scarcity in perennial forage systems, future efforts should prioritize the construction of extensive shared datasets and the development of adaptable AI-driven models incorporating regional parameterization. Ultimately, advancing standardized, integrated monitoring frameworks based on ground-satellite-UAV synergies is essential for translating precise, data-driven insights into actionable management strategies. This approach will optimize resource utilization, enhance yield stability, and strengthen the resilience of forage production systems within sustainable agricultural practices under changing climatic conditions.
Funding
This research was funded by the Wuxi University Research Start-up Fund for Introduced Talents (2023r031).
References
- Berra EF, Gaulton R, Barr S. (2019). Assessing spring phenology of a temperate woodland: A multiscale comparison of ground, unmanned aerial vehicle and Landsat satellite observations. Remote Sens Environ. 223:229-42.
- Ergo VV, Lascano R, Vega CR, Parola R, Carrera CS. (2018). Heat and water stressed field-grown soybean: A multivariate study on the relationship between physiological-biochemical traits and yield. Environ Exp Bot. 148:1-11.
- FAO. (2015). The second report on the state of the World’s animal genetic resources for food and agriculture. Rome.
- Fathololoumi S, Vasava HB, Firozjaei MK, Daggupati P, Sulik J, Biswas A. (2025). Reducing corn yield prediction uncertainty through multi-scale integration of ground, drone, and satellite data. Precision Agric. 26:84.
- Poley LG, McDermid GJ. (2020). A systematic review of the factors influencing the estimation of vegetation aboveground biomass using unmanned aerial systems. Remote Sensing. 12:1052.
- Guo Y, Xiao Y, Hao F, Zhang X, Chen J, de Beurs K, He Y, Fu YH. (2023). Comparison of different machine learning algorithms for predicting maize grain yield using UAV-based hyperspectral images. Int J Appl Earth Obs Geoinf. 124:103528.
- Guo Y, Zhang X, Chen S, Wang H, Jayavelu S, Cammarano D, Fu Y. (2022). Integrated UAV-based multi-source data for predicting maize grain yield using machine learning approaches. Remote Sensing. 14:6290.
- Hassan MA, Yang M, Rasheed A, Yang G, Reynolds M, Xia X, Xiao Y, He Z. (2019). A rapid monitoring of NDVI across the wheat growth cycle for grain yield prediction using a multi-spectral UAV platform. Plant Sci. 282:95-103.
- Hossen MA, Diwakar PK, Ragi S. (2021). Total nitrogen estimation in agricultural soils via aerial multispectral imaging and LIBS. Sci Rep. 11:12693.
- Ivošević B, Kostić M, Ljubičić N, Grbović Ž, Panić M. (2023). Chapter 2: A drone view for agriculture. In: Unmanned Aerial Systems in Agriculture. 25-47. Academic Press.
- Li D, Miao Y, Gupta SK, Rosen CJ, Yuan F, Wang C, Wang L, Huang Y. (2021). Improving potato yield prediction by combining cultivar information and UAV remote sensing data using machine learning. Remote Sensing. 13:3322.
- Liu Z, Wan W, Huang J, Han YW, Wang JY. (2018). Progress on key parameters inversion of crop growth based on unmanned aerial vehicle remote sensing. TCSAE. 34:60-71.
- Morgan GR, Wang C, Morris JT. (2021). RGB indices and canopy height modelling for mapping tidal marsh biomass from a small unmanned aerial system. Remote Sensing. 13:3406.
- Mottet A, De Haan C, Falcucci A, Tempio G, Opio C, Gerber P. (2017). Livestock: On our plates or eating at our table? A new analysis of the feed/food debate. Global Food Security. 14:1-8.
- Psomiadis E, Philippopoulos PI, Kakaletris G. (2025). Non-invasive estimation of crop water stress index and irrigation management with upscaling from field to regional level using remote sensing and agrometeorological data. Remote Sensing. 17:2522.
- Sagan VM, Maimaitijiang M, Paheding S, Bhadra S, Gosselin N, Burnette M, Demieville J, Hartling S, Lebauer D, Newcomb M, Pauli D, Peterson KT, Shakoor N, Stylianou A, Zender CS, Mockler TC. (2022). Data-driven artificial intelligence for calibration of hyperspectral big data. IEEE Transactions on Geoscience and Remote Sensing. 60:1-20.
- Shrestha A, Bheemanahalli R, Adeli A, Samiappan S, Czarnecki JM, McCraine CD, Reddy KR, Moorhead R. (2023). Phenological stage and vegetation index for predicting corn yield under rainfed environments. Front Plant Sci. 14:1168732.
- Siegfried J, Rajan N, Adams CB, Neely H, Hague S, Hardin R, Schnell R, Han X. (2024). High-accuracy infrared thermography of cotton canopy temperature by unmanned aerial systems (UAS): Evaluating in-season prediction of yield. Smart Agri Tech. 7:100393.
- Simpson G, Wade T, Helfter C, Jones MR, Yeung K, Nichol CJ. (2025). Inter-annual variability of peatland vegetation captured using PhenoCam and UAV imagery. Remote Sensing. 17:526.
- Sunoj S, Igathinathane C, Saliendra N, Hendrickson J, Archer D, Liebig M. (2025). PhenoCam Guidelines for phenological measurement and analysis in an agricultural cropping environment: A case study of soybean. Remote Sensing. 17:724.
- Thapa S, Millan VEG, Eklundh L. (2021). Assessing forest phenology: A multi-scale comparison of near-surface (UAV, spectral reflectance sensor, PhenoCam) and satellite (MODIS, Sentinel-2) remote sensing. Remote Sensing. 13:1597.
- Thenkabail PS, Lyon JG. (2016). Hyperspectral remote sensing of vegetation. CRC Press, Boca Raton.
- Thomas S, Kuska MT, Bohnenkamp D, Brugger A, Alisaac E, Wahabzada M, Behmann J, Mahlein AK. (2017). Benefits of hyperspectral imaging for plant disease detection and plant protection: A technical perspective. J Plant Dis Prot. 125:5-20.
- Weiss M, Jacob F, Duveiller G. (2020). Remote sensing for agricultural applications: A meta-review. Remote Sens Environ. 236:111402.
- Xue Y, Bai X, Zhao C, Tan Q, Li Y, Luo G, Wu L, Chen F, Li C, Ran C, Zhang S, Liu M, Gong S, Xiong L, Song F, Du C, Xiao B, Li Z, Long M. (2023). Spring photosynthetic phenology of Chinese vegetation in response to climate change and its impact on net primary productivity. Agric For Meteorol. 342:109734.
- Yan W, Zhao S, Zheng C, Zhang P, Shen H, Chang J, Xu K. (2025). Growth monitoring and yield estimation of forage based on multiple phenological indicators. Chin J Plant Ecol. 49:1096-1109.
- Yang X, Chen Y, Zhang T, Zhang P, Guo Z, Hu G, Zhang H, Sun Y, Huang L, Ma M. (2023). Different responses of functional groups to N addition increased synchrony and shortened community reproductive duration in an alpine meadow. J Ecol. 111:2231-2244.