Results 1 - 2 of 2
1.
Gigascience ; 13: 2024 Jan 02.
Article in English | MEDLINE | ID: mdl-38897734

ABSTRACT

BACKGROUND: This study addresses the importance of precise referencing in 3-dimensional (3D) plant phenotyping, which is crucial for advancing plant breeding and improving crop production. Traditionally, reference data in plant phenotyping rely on invasive methods. Recent advancements in 3D sensing technologies make it possible to collect parameters that cannot be referenced by manual measurements. This work evaluates a 3D-printed sugar beet plant model as a referencing tool. RESULTS: Fused deposition modeling turned out to be a suitable 3D printing technique for creating reference objects in 3D plant phenotyping. Production deviations of the reference model were low and acceptable, ranging from -10 mm to +5 mm. In parallel, the reference model showed high dimensional stability, with only ±4 mm of deformation over the course of 1 year. Detailed print files, assembly descriptions, and benchmark parameters are provided, facilitating replication and benefiting the research community. CONCLUSION: Consumer-grade 3D printing was used to create a stable and reproducible 3D reference model of a sugar beet plant, addressing challenges in referencing morphological parameters in 3D plant phenotyping. The reference model is applicable in 3 demonstrated use cases: evaluating and comparing 3D sensor systems, investigating the potential accuracy of parameter extraction algorithms, and continuously monitoring these algorithms in practical greenhouse and field experiments. Using this approach, it is possible to monitor the extraction of an otherwise nonverifiable parameter and create reference data. The process serves as a model for developing reference models for other agricultural crops.
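A minimal sketch of how such a printed reference model could be used to quantify the deviations of a 3D sensor system, assuming the scan has already been registered to the reference and both clouds are available as ASCII XYZ files; file names, formats, and metrics are illustrative assumptions, not the authors' pipeline.

```python
# Sketch: cloud-to-cloud deviation between a sensor scan and a reference model.
# Assumes both point clouds are registered (aligned) and stored as "x y z" rows.
import numpy as np
from scipy.spatial import cKDTree

def load_xyz(path):
    """Load an ASCII point cloud with one 'x y z' triple per line (assumed format)."""
    return np.loadtxt(path, usecols=(0, 1, 2))

reference = load_xyz("reference_model.xyz")   # points sampled from the print model (hypothetical file)
scan = load_xyz("sensor_scan.xyz")            # registered scan from the 3D sensor (hypothetical file)

# For each scanned point, find the nearest reference point and report the
# distance as the local deviation (a simple nearest-neighbor deviation metric).
tree = cKDTree(reference)
deviation, _ = tree.query(scan, k=1)

print(f"mean deviation:  {deviation.mean():.2f} mm")
print(f"95th percentile: {np.percentile(deviation, 95):.2f} mm")
print(f"max deviation:   {deviation.max():.2f} mm")
```

The same comparison could be repeated over time to check the model's dimensional stability, or applied to the output of different sensors to compare them against the common reference.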


Subject(s)
Beta vulgaris , Phenotype , Printing, Three-Dimensional , Beta vulgaris/genetics , Plant Breeding/methods
2.
Plant Dis ; 108(3): 711-724, 2024 Mar.
Article in English | MEDLINE | ID: mdl-37755420

ABSTRACT

Rhizoctonia crown and root rot (RCRR), caused by Rhizoctonia solani, can cause severe yield and quality losses in sugar beet. The most common strategy to control the disease is the development of resistant varieties. In the breeding process, field experiments with artificial inoculation are carried out to evaluate the performance of genotypes and varieties. The phenotyping process in breeding trials requires constant monitoring and scoring by skilled experts. This work is time-consuming, and the resulting scores show bias and heterogeneity depending on the experience and capacity of each individual rater. Optical sensors and artificial intelligence have demonstrated great potential to achieve higher accuracy than human raters and to standardize phenotyping applications. A workflow combining red-green-blue and multispectral imagery acquired from an unmanned aerial vehicle (UAV) with machine learning techniques was applied to score diseased plants and plots affected by RCRR. Georeferenced annotation of UAV-orthorectified images was carried out. With the annotated images, five convolutional neural networks were trained to score individual plants, using different image analysis strategies and data augmentation. The custom convolutional neural network trained from scratch, together with the pretrained MobileNet, showed the best precision in scoring RCRR (0.73 to 0.85). The per-plot average of the spectral information was used to score the plots, and the benefit of adding the information obtained from the scores of individual plants was compared. For this purpose, machine learning models were trained together with different data management strategies, and the best-performing model was chosen. A combined pipeline of random forest and k-nearest neighbors showed the best weighted precision (0.67). This research provides a reliable workflow for detecting and scoring RCRR based on aerial imagery. RCRR is often distributed heterogeneously in trial plots; therefore, considering the information from individual plants of each plot showed a significant improvement in UAV-based automated monitoring routines.
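A minimal sketch of one possible plot-level scoring step, assuming the "combined pipeline of random forest and k-nearest neighbors" is read as a soft-voting ensemble over per-plot features (mean spectral values plus CNN-derived plant scores); the feature set, labels, and hyperparameters here are illustrative assumptions, not the study's exact configuration.

```python
# Sketch: plot-level RCRR scoring with a random forest + k-NN soft-voting ensemble.
import numpy as np
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

# X: per-plot features, e.g. mean multispectral reflectance per band plus the
# fraction of plants scored as diseased by the CNN (random placeholder data here).
rng = np.random.default_rng(0)
X = rng.normal(size=(120, 6))
y = rng.integers(0, 3, size=120)   # assumed plot-level RCRR severity classes

ensemble = VotingClassifier(
    estimators=[
        ("rf", RandomForestClassifier(n_estimators=200, random_state=0)),
        ("knn", make_pipeline(StandardScaler(), KNeighborsClassifier(n_neighbors=5))),
    ],
    voting="soft",   # average predicted class probabilities of both models
)

# Weighted precision mirrors the metric reported for plot-level scoring.
scores = cross_val_score(ensemble, X, y, cv=5, scoring="precision_weighted")
print(f"weighted precision: {scores.mean():.2f} ± {scores.std():.2f}")
```

With real data, X would be built from the UAV orthomosaics (per-plot band means) and from the individual-plant scores produced by the CNN step described above.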


Subject(s)
Beta vulgaris , Unmanned Aerial Devices , Humans , Rhizoctonia , Artificial Intelligence , Plant Breeding , Machine Learning , Vegetables , Sugars