1.
Sci Rep ; 14(1): 16927, 2024 Jul 23.
Article in English | MEDLINE | ID: mdl-39043833

ABSTRACT

Precision in grazing management depends heavily on accurate pasture monitoring. In practice, monitoring is often neglected because existing approaches are labour-intensive, require calibration, and are commonly perceived as inaccurate. Machine-learning methods harnessing big data, including remote sensing, can underpin a new generation of decision-support tools (DST) for pasture monitoring, but on-farm adoption remains low because evidence of their accuracy is scarce. This study aimed to quantify the minimum data required to train a machine-learning, satellite-based DST for accurate pasture biomass prediction. Management data were available from 14 farms in New South Wales, Australia, together with pasture biomass measured over 12 consecutive months using a calibrated rising plate meter (RPM) and pasture biomass estimated by a DST based on high temporal/spatial resolution satellite images. Data were balanced by farm and week of each month and randomly allocated to model evaluation (20%) and progressive training (80%) as follows: 25% training subset (1W: week 1 of each month); 50% (2W: weeks 1 and 3); 75% (3W: weeks 1, 3, and 4); and 100% (4W: weeks 1 to 4). Pasture biomass estimates from the DST under each training dataset were evaluated against the calibrated RPM using mean absolute error (MAE, kg DM/ha), among other statistics, and Tukey's HSD test was used to determine differences in MAE across training datasets. Relative to the control (no training; MAE: 498 kg DM/ha), the 1W dataset did not improve the prediction accuracy of the DST (P > 0.05). With the 2W training dataset, MAE decreased to 342 kg DM/ha (P < 0.001), while the larger training datasets decreased MAE only marginally (P > 0.05). This study establishes the minimal training data required for a machine-learning DST to monitor pastures from satellites with accuracy comparable to a calibrated RPM, the 'gold standard' for pasture biomass monitoring.
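
A minimal sketch in Python of the evaluation protocol described above. The file name, column names, and split logic are illustrative assumptions, not the study's actual pipeline, and the DST retraining step itself is not shown.

import pandas as pd
from scipy.stats import tukey_hsd

# Hypothetical columns: farm, month, week, rpm_kg_dm_ha (calibrated RPM
# measurement) and dst_kg_dm_ha (satellite DST prediction).
df = pd.read_csv("pasture_biomass.csv")

# Fixed 20% evaluation set; the remaining 80% is the progressive-training pool.
eval_set = df.sample(frac=0.2, random_state=0)
pool = df.drop(eval_set.index)

# Progressive training subsets defined by week of each month, as in the text.
subsets = {"1W": [1], "2W": [1, 3], "3W": [1, 3, 4], "4W": [1, 2, 3, 4]}

abs_errors = {}
for name, weeks in subsets.items():
    train = pool[pool["week"].isin(weeks)]
    # ... retrain the DST on `train` here (model code not shown; without it,
    # the per-subset errors below would all be identical) ...
    abs_errors[name] = (eval_set["dst_kg_dm_ha"] - eval_set["rpm_kg_dm_ha"]).abs()
    print(name, "MAE:", round(abs_errors[name].mean(), 1), "kg DM/ha")

# Tukey's HSD on per-observation absolute errors across training subsets.
print(tukey_hsd(*(abs_errors[k].to_numpy() for k in subsets)))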


Subject(s)
Biomass, Dairying, Machine Learning, Remote Sensing Technology, Remote Sensing Technology/methods, Animals, Dairying/methods, Australia, Cattle, New South Wales
2.
Front Plant Sci ; 14: 1206535, 2023.
Article in English | MEDLINE | ID: mdl-37404539

ABSTRACT

Maize silage is a key component of feed rations in dairy systems due to its high forage and grain yield, water use efficiency, and energy content. However, maize silage nutritive value can be compromised by in-season changes during crop development that alter plant partitioning between grain and other biomass fractions. Partitioning to grain (harvest index, HI) is affected by genotype (G) × environment (E) × management (M) interactions. Modelling tools could therefore assist in accurately predicting in-season changes in crop partitioning and composition and, from these, the HI of maize silage. Our objectives were to (i) identify the main drivers of grain yield and HI variability, (ii) calibrate the Agricultural Production Systems Simulator (APSIM) to estimate crop growth, development, and plant partitioning using detailed experimental field data, and (iii) explore the main sources of HI variance across a wide range of G × E × M combinations. Data on nitrogen (N) rates, sowing date, harvest date, plant density, irrigation rates, and genotype from four field experiments were used to assess the main drivers of HI variability and to calibrate the maize crop module in APSIM. The model was then run for a complete range of G × E × M combinations across 50 years. Experimental data demonstrated that the main drivers of observed HI variability were genotype and water status. The model accurately simulated phenology [leaf number and canopy green cover; Concordance Correlation Coefficient (CCC) = 0.79-0.97 and Root Mean Square Percentage Error (RMSPE) = 13%] and crop growth (total aboveground biomass, grain + cob, leaf, and stover weight; CCC = 0.86-0.94 and RMSPE = 23-39%). For HI, CCC was high (0.78) with an RMSPE of 12%. The long-term scenario analysis showed that genotype and N rate contributed 44% and 36% of the HI variance, respectively. Our study demonstrated that APSIM is a suitable tool for estimating maize HI as one potential proxy of silage quality. The calibrated model can now be used to compare the inter-annual variability of HI in maize forage crops under G × E × M interactions, providing new knowledge to (potentially) improve maize silage nutritive value and to aid genotype selection and harvest-timing decisions.
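
A minimal sketch of the two goodness-of-fit statistics named above (Lin's CCC and RMSPE) applied to observed versus simulated harvest index. The arrays and values are hypothetical placeholders, not the experimental or APSIM data.

import numpy as np

def ccc(obs, sim):
    # Lin's concordance correlation coefficient:
    # 2*cov(obs, sim) / (var(obs) + var(sim) + (mean(obs) - mean(sim))^2)
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    cov = np.mean((obs - obs.mean()) * (sim - sim.mean()))
    return 2 * cov / (obs.var() + sim.var() + (obs.mean() - sim.mean()) ** 2)

def rmspe(obs, sim):
    # Root mean square percentage error, relative to the observed mean.
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 100 * np.sqrt(np.mean((obs - sim) ** 2)) / obs.mean()

# Harvest index = grain (+ cob) weight / total aboveground biomass.
grain = np.array([9.1, 10.4, 8.2])       # t DM/ha, hypothetical
biomass = np.array([19.8, 21.5, 18.0])   # t DM/ha, hypothetical
hi_obs = grain / biomass

hi_sim = np.array([0.44, 0.50, 0.43])    # hypothetical simulated HI
print(f"CCC = {ccc(hi_obs, hi_sim):.2f}, RMSPE = {rmspe(hi_obs, hi_sim):.0f}%")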
