Results 1 - 9 of 9
1.
Plant Phenomics ; 6: 0180, 2024.
Article in English | MEDLINE | ID: mdl-38779576

ABSTRACT

The last decades have witnessed rapid development of noninvasive plant phenotyping, which can detect plant stress at levels ranging from the subcellular to the whole population. However, even with such a broad range, most phenotyping studies are concerned only with leaves. This review offers a unique perspective on noninvasive plant stress phenotyping from a multi-organ view. First, plant sensing of and responses to abiotic stress in the diverse vegetative organs (leaves, stems, and roots), and the interplay between these vital components, are analyzed. Then, the corresponding noninvasive optical phenotyping techniques are presented, which can guide the practical implementation of appropriate noninvasive phenotyping techniques for each organ. Furthermore, we explore methods for analyzing compound stress situations, as field conditions frequently encompass multiple abiotic stressors. Thus, our work goes beyond the conventional approach of focusing solely on individual plant organs. The novel insights of this multi-organ, noninvasive phenotyping study provide a reference for testing hypotheses concerning the intricate dynamics of plant stress responses, as well as the potential interactive effects among various stressors.

2.
J Anim Sci ; 102, 2024 Jan 03.
Article in English | MEDLINE | ID: mdl-38134209

ABSTRACT

Computer vision (CV), a non-intrusive and cost-effective technology, has furthered the development of precision livestock farming by enabling optimized decision-making through timely and individualized animal care. The availability of affordable two- and three-dimensional camera sensors, combined with various machine learning and deep learning algorithms, has provided a valuable opportunity to improve livestock production systems. However, despite the availability of various CV tools in the public domain, applying these tools to animal data can be challenging, often requiring users to have programming and data analysis skills, as well as access to computing resources. Moreover, the rapid expansion of precision livestock farming is creating a growing need to educate and train animal science students in CV. This presents educators with the challenge of efficiently demonstrating the complex algorithms involved in CV. Thus, the objective of this study was to develop ShinyAnimalCV, an open-source cloud-based web application designed to facilitate CV teaching in animal science. This application provides a user-friendly interface for performing CV tasks, including object segmentation, detection, three-dimensional surface visualization, and extraction of two- and three-dimensional morphological features. Nine pre-trained CV models using top-view animal data are included in the application. ShinyAnimalCV has been deployed online using cloud computing platforms. The source code of ShinyAnimalCV is available on GitHub, along with detailed documentation on training CV models using custom data and deploying ShinyAnimalCV locally to allow users to fully leverage the capabilities of the application. ShinyAnimalCV can help to support the teaching of CV, thereby laying the groundwork to promote the adoption of CV in the animal science community.


The integration of cameras and data science has great potential to revolutionize livestock production systems, making them more efficient and sustainable by replacing human-based management with real-time, individualized animal care. However, applying these digital tools to animal data presents challenges that require computer programming and data analysis skills, as well as access to computing resources. Additionally, there is a growing need to train animal science students to analyze image or video data using data science algorithms, yet teaching computer programming to all types of students from the ground up can prove complicated and challenging. Therefore, the objective of this study was to develop ShinyAnimalCV, a user-friendly online web application that helps users learn to apply data science to animal image and video data without the need for complex coding. The application includes nine pre-trained models for detecting and segmenting animals in image data and can be easily accessed through a web browser. We have also made the source code and detailed documentation available online for advanced users who wish to run the application locally. This software tool facilitates the teaching of digital animal data analysis in the animal science community, with potential benefits for livestock production systems.
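For readers following along, a minimal Python sketch of the kind of 2D morphological feature extraction described above (area, length, and width from a top-view mask) is shown below. It is illustrative only, not ShinyAnimalCV's actual code; the mask file name and pixel-to-centimetre calibration factor are assumptions.

```python
# Illustrative sketch (not ShinyAnimalCV's code): extract simple 2-D
# morphological features from a binary top-view animal mask with OpenCV.
import cv2
import numpy as np

def morphological_features(mask: np.ndarray, px_per_cm: float) -> dict:
    """Compute area, length, and width from a binary mask (values 0/255)."""
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return {}
    body = max(contours, key=cv2.contourArea)          # largest blob = animal
    area_px = cv2.contourArea(body)
    (_, _), (w_px, l_px), _ = cv2.minAreaRect(body)    # rotated bounding box
    length_px, width_px = max(w_px, l_px), min(w_px, l_px)
    return {
        "area_cm2": area_px / px_per_cm ** 2,
        "length_cm": length_px / px_per_cm,
        "width_cm": width_px / px_per_cm,
    }

# Hypothetical usage with an assumed mask image and calibration factor:
# mask = cv2.imread("top_view_mask.png", cv2.IMREAD_GRAYSCALE)
# print(morphological_features(mask, px_per_cm=3.5))
```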


Subject(s)
Cloud Computing , Imaging, Three-Dimensional , Animals , Imaging, Three-Dimensional/veterinary , Software , Computers , Animal Husbandry , Livestock
3.
Plants (Basel) ; 12(21)2023 Oct 25.
Article in English | MEDLINE | ID: mdl-37960032

ABSTRACT

Rice blast has caused major production losses in rice, and thus the early detection of rice blast plays a crucial role in global food security. In this study, a semi-supervised contrastive unpaired translation iterative network based on unmanned aerial vehicle (UAV) images was specifically designed for rice blast detection. It incorporates multiple critic contrastive unpaired translation networks to generate fake images with different disease levels through an iterative process of data augmentation. These generated fake images, along with real images, are then used to establish a detection network called RiceBlastYolo. Notably, the RiceBlastYolo model integrates an improved feature pyramid network (FPN) and a general soft-labeling approach. The results show that the detection precision of RiceBlastYolo is 99.51% at an intersection-over-union threshold of 0.5 (IoU 0.5) and the average precision is 98.75% over IoU thresholds of 0.5-0.9. The precision and recall rates are 98.23% and 99.99%, respectively, which are higher than those of common detection models (YOLO, YOLACT, YOLACT++, Mask R-CNN, and Faster R-CNN). Additionally, external data further verified the ability of the model. The findings demonstrate that the proposed model can accurately identify rice blast under field-scale conditions.
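As an aside for readers reproducing detection metrics like those reported above, the following Python sketch shows one common way to compute precision and recall at an IoU threshold of 0.5 by greedily matching detections to ground-truth boxes. It is not the paper's code; the box format and function names are assumptions.

```python
# Illustrative sketch: precision/recall at a fixed IoU threshold
# (e.g., IoU 0.5, as reported above); not the paper's evaluation code.
import numpy as np

def iou(box_a, box_b):
    """Intersection over union of two [x1, y1, x2, y2] boxes."""
    x1, y1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    x2, y2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter + 1e-9)

def precision_recall(detections, ground_truth, iou_thr=0.5):
    """Greedy one-to-one matching of detections (sorted by confidence)
    to ground-truth boxes; returns (precision, recall)."""
    matched, tp = set(), 0
    for det in detections:
        best, best_iou = None, iou_thr
        for j, gt in enumerate(ground_truth):
            if j not in matched and iou(det, gt) >= best_iou:
                best, best_iou = j, iou(det, gt)
        if best is not None:
            matched.add(best)
            tp += 1
    fp = len(detections) - tp
    fn = len(ground_truth) - tp
    return tp / (tp + fp + 1e-9), tp / (tp + fn + 1e-9)
```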

4.
Molecules ; 28(14)2023 Jul 13.
Article in English | MEDLINE | ID: mdl-37513250

ABSTRACT

Tea polyphenols and epigallocatechin gallate (EGCG) are considered key components of tea. The rapid prediction of these two components can benefit tea quality control and product development for tea producers, breeders, and consumers. This study aimed to develop reliable models for predicting tea polyphenol and EGCG content during the breeding process using Fourier transform near-infrared (FT-NIR) spectroscopy combined with machine learning algorithms. Various spectral preprocessing methods, including Savitzky-Golay smoothing (SG), standard normal variate (SNV), vector normalization (VN), multiplicative scatter correction (MSC), and first derivative (FD), were applied to improve the quality of the collected spectra. Partial least squares regression (PLSR) and least squares support vector regression (LS-SVR) were used to establish models for tea polyphenol and EGCG content prediction based on the differently preprocessed spectral data. Variable selection algorithms, including competitive adaptive reweighted sampling (CARS) and random forest (RF), were further utilized to identify key spectral bands and improve the efficiency of the models. The results demonstrate that the optimal model for tea polyphenol calibration was LS-SVR with Rp = 0.975 and RPD = 4.540 based on SG-smoothed full spectra. For EGCG detection, the best model was LS-SVR with Rp = 0.936 and RPD = 2.841 using the full original spectra as model inputs. The application of variable selection algorithms further improved the predictive performance of the models. The LS-SVR model for tea polyphenol prediction reached Rp = 0.978 and RPD = 4.833 using 30 CARS-selected variables, while the LS-SVR model built on 27 RF-selected variables achieved the best predictive ability for EGCG, with Rp = 0.944 and RPD = 3.049. These results demonstrate the potential of FT-NIR spectroscopy combined with machine learning for the rapid screening of genotypes with high tea polyphenol and EGCG content in tea leaves.
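To illustrate the general workflow described above (spectral preprocessing, regression, and evaluation via Rp and RPD), here is a minimal Python sketch using Savitzky-Golay smoothing and PLSR on synthetic placeholder data. It is not the study's code; the data shapes, window settings, and number of latent variables are assumptions.

```python
# Illustrative sketch (synthetic data, not the study's code): SG smoothing
# of NIR spectra followed by PLSR calibration, reporting Rp and RPD.
import numpy as np
from scipy.signal import savgol_filter
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(120, 700))            # placeholder spectra (samples x wavelengths)
y = rng.normal(loc=20, scale=3, size=120)  # placeholder tea polyphenol content (%)

X_sg = savgol_filter(X, window_length=11, polyorder=2, axis=1)   # SG smoothing
X_tr, X_te, y_tr, y_te = train_test_split(X_sg, y, test_size=0.3, random_state=1)

pls = PLSRegression(n_components=10).fit(X_tr, y_tr)
y_pred = pls.predict(X_te).ravel()

rp = np.corrcoef(y_te, y_pred)[0, 1]                 # correlation of prediction
rmsep = np.sqrt(np.mean((y_te - y_pred) ** 2))       # root mean square error of prediction
rpd = np.std(y_te, ddof=1) / rmsep                   # ratio of performance to deviation
print(f"Rp = {rp:.3f}, RPD = {rpd:.3f}")
```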


Subject(s)
Polyphenols , Spectroscopy, Near-Infrared , Spectroscopy, Near-Infrared/methods , Polyphenols/analysis , Fourier Analysis , Least-Squares Analysis , Algorithms , Machine Learning , Tea/chemistry , Plant Leaves/chemistry , Support Vector Machine
5.
Curr Microbiol ; 78(8): 3201-3211, 2021 Aug.
Article in English | MEDLINE | ID: mdl-34213616

ABSTRACT

Cellulase plays an important role in addressing the energy crisis. However, the yield and degradation efficiency of cellulase remain a major challenge. In the present study, we aimed to verify whether the ammonium ion (NH4+) could induce cellulase synthesis in Trichoderma koningii AS3.2774 and to explore new functional genes related to cellulase production. Our results indicated that NH4+ induces cellulase production in a way different from that of other nitrogen sources. NH4+-treated mycelia displayed a significant increase in transport vesicles. Under NH4+ mediation, CBHI, CBHII, glycoside hydrolase family 5 proteins, Hap2/3/5 complexes, "ribosome biogenesis", and "heme binding" were significantly up-regulated, and differentially expressed genes (DEGs) were mainly involved in "Metabolism". Collectively, our findings illustrate that NH4+ induced cellulase production at the morphological and gene expression levels, which might be related to the Hap2/3/5 complex, ribosomes, and genes involved in amino acid metabolism, pyruvate metabolism, and glycolysis/gluconeogenesis. Taken together, our results provide valuable insights into the regulatory network of cellulase gene expression in filamentous fungi.


Subject(s)
Ammonium Compounds , Cellulase , Trichoderma , Cellulase/genetics , Cellulase/metabolism , Gene Expression Regulation, Fungal , Hypocreales , Ions , Trichoderma/genetics , Trichoderma/metabolism
6.
Plant J ; 107(6): 1837-1853, 2021 Sep.
Article in English | MEDLINE | ID: mdl-34216161

ABSTRACT

Brassinosteroids (BRs) are a group of plant steroid hormones involved in regulating growth, development, and stress responses. Many components of the BR pathway have previously been identified and characterized. However, BR phenotyping experiments are typically performed in a low-throughput manner, such as on Petri plates. Additionally, the BR pathway affects drought responses, but drought experiments are time consuming and difficult to control. To mitigate these issues and increase throughput, we developed the Robotic Assay for Drought (RoAD) system to perform BR and drought response experiments in soil-grown Arabidopsis plants. RoAD is equipped with a robotic arm, a rover, a bench scale, a precisely controlled watering system, an RGB camera, and a laser profilometer. It performs daily weighing, watering, and imaging tasks and is capable of administering BR response assays by watering plants with Propiconazole (PCZ), a BR biosynthesis inhibitor. We developed image processing algorithms for both plant segmentation and phenotypic trait extraction to accurately measure traits including plant area, plant volume, leaf length, and leaf width. We then applied machine learning algorithms that utilize the extracted phenotypic parameters to identify image-derived traits that can distinguish control, drought-treated, and PCZ-treated plants. We carried out PCZ and drought experiments on a set of BR mutants and Arabidopsis accessions with altered BR responses. Finally, we extended the RoAD assays to perform BR response assays using PCZ in Zea mays (maize) plants. This study establishes an automated and non-invasive robotic imaging system as a tool to accurately measure morphological and growth-related traits of Arabidopsis and maize plants in 3D, providing insights into the BR-mediated control of plant growth and stress responses.
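As an illustration of the final analysis step described above (distinguishing control, drought-treated, and PCZ-treated plants from image-derived traits), the following Python sketch trains a classifier on placeholder trait data. The random forest model, trait layout, and data are assumptions for illustration, not the RoAD study's exact pipeline.

```python
# Illustrative sketch (placeholder data): classifying treatment groups from
# extracted phenotypic traits; the classifier choice is an assumption.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)
n = 90
# Placeholder trait matrix: plant area, plant volume, leaf length, leaf width
traits = rng.normal(size=(n, 4))
labels = np.repeat(["control", "drought", "PCZ"], n // 3)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(clf, traits, labels, cv=5)
print(f"cross-validated accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")
```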


Subject(s)
Arabidopsis/physiology , Brassinosteroids/metabolism , Image Processing, Computer-Assisted/methods , Robotics/methods , Zea mays/physiology , Arabidopsis/drug effects , Arabidopsis Proteins/genetics , Droughts , Equipment Design , Machine Learning , Phenotype , Protein Kinases/genetics , Robotics/instrumentation , Seedlings/physiology , Soil/chemistry , Triazoles/pharmacology
7.
Sensors (Basel) ; 21(2)2021 Jan 13.
Article in English | MEDLINE | ID: mdl-33450839

ABSTRACT

Accurate corn stand counts in the field early in the season are of great interest to corn breeders and plant geneticists. However, the commonly used manual counting method is time consuming, laborious, and prone to error. Nowadays, unmanned aerial vehicles (UAVs) have become a popular platform for collecting plant images. However, detecting corn stands in the field is a challenging task, primarily because of camera motion, leaf fluttering caused by wind, plant shadows caused by direct sunlight, and the complex soil background. UAV systems also have two main limitations for early seedling detection and counting. First, the flying height cannot ensure high resolution for small objects; it is especially difficult to detect early corn seedlings at around one week after planting, because the plants are small and difficult to differentiate from the background. Second, the battery life and payload of UAV systems cannot support long-duration online counting work. In this research project, we developed an automated, robust, and high-throughput method for corn stand counting based on color images extracted from video clips. A pipeline based on the YOLOv3 network and a Kalman filter was used to count corn seedlings online. The results demonstrate that our method is accurate and reliable for stand counting, achieving an accuracy of over 98% at growth stages V2 and V3 (vegetative stages with two and three visible leaf collars) at an average frame rate of 47 frames per second (FPS). This pipeline can also be easily mounted on manned carts, tractors, or field robotic systems for online corn counting.
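The counting idea described above (detect seedlings per frame, track them across frames, and count each new track once) can be sketched in Python as follows. This is a simplified, hypothetical tracker with a constant-velocity Kalman filter, not the paper's YOLOv3-based pipeline; the gating distance and detection format are assumptions.

```python
# Illustrative sketch of counting by tracking: per-frame detections are
# associated with Kalman-predicted track positions; each new track is counted once.
import numpy as np

class Track:
    """Constant-velocity Kalman filter over state (x, y, vx, vy)."""
    F = np.array([[1, 0, 1, 0], [0, 1, 0, 1], [0, 0, 1, 0], [0, 0, 0, 1]], float)
    H = np.array([[1, 0, 0, 0], [0, 1, 0, 0]], float)

    def __init__(self, xy):
        self.x = np.array([xy[0], xy[1], 0.0, 0.0])
        self.P = np.eye(4) * 10.0

    def predict(self):
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + np.eye(4)      # process noise (assumed)
        return self.x[:2]

    def update(self, z):
        S = self.H @ self.P @ self.H.T + np.eye(2) * 2.0     # measurement noise (assumed)
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + K @ (np.asarray(z, float) - self.H @ self.x)
        self.P = (np.eye(4) - K @ self.H) @ self.P

def count_seedlings(frames, gate=30.0):
    """frames: list of per-frame detection centroids [(x, y), ...]."""
    tracks, total = [], 0
    for detections in frames:
        predictions = [t.predict() for t in tracks]
        unmatched = list(range(len(tracks)))
        for det in detections:
            if unmatched:
                j = min(unmatched, key=lambda i: np.linalg.norm(predictions[i] - det))
                if np.linalg.norm(predictions[j] - det) < gate:
                    tracks[j].update(det)
                    unmatched.remove(j)
                    continue
            tracks.append(Track(det))      # new seedling enters the view
            total += 1                     # each new track counted once
    return total
```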

8.
Cell Prolif ; 54(1): e12948, 2021 Jan.
Article in English | MEDLINE | ID: mdl-33145869

ABSTRACT

Metastasis refers to the progressive dissemination of primary tumour cells and their colonization of other tissues and is associated with most cancer-related mortalities. The disproportional and systematic distribution pattern of distant metastasis in different cancers has been well documented and is termed metastatic organotropism, a process orchestrated by a combination of anatomical, pathophysiological, genetic and biochemical factors. Extracellular vesicles (EVs), nanosized cell-derived membrane-bound particles known to mediate intercellular communication, are now considered crucial in organ-specific metastasis. Here, we review and summarize recent findings regarding EV-associated organotropic metastasis as well as some of the general mechanisms by which EVs contribute to this important process in cancer, and provide a future perspective on this emerging topic. We highlight studies that demonstrate a role of tumour-derived EVs in organotropic metastasis via pre-metastatic niche modulation. The bioactive cargo carried by EVs is of diagnostic and prognostic value, and counteracting the functions of such EVs may be a novel therapeutic strategy targeting metastasis. Further investigations are warranted to better understand the functions and mechanisms of EVs in organotropic metastasis and to accelerate the relevant clinical translation.


Subject(s)
Extracellular Vesicles/pathology , Neoplasm Metastasis , Neoplasms/pathology , Animals , Extracellular Vesicles/metabolism , Humans , Neoplasms/metabolism
9.
Sensors (Basel) ; 18(3)2018 Mar 07.
Article in English | MEDLINE | ID: mdl-29518958

ABSTRACT

Non-destructive plant growth measurement is essential for plant growth and health research. As a 3D sensor, the Kinect v2 has great potential in agricultural applications, benefiting from its low price and strong robustness. This paper proposes a Kinect-based automatic system for non-destructive growth measurement of leafy vegetables. The system uses a turntable to acquire multi-view point clouds of the measured plant. A series of suitable algorithms is then applied to obtain a fine 3D reconstruction of the plant while measuring key growth parameters, including relative/absolute height, total/projected leaf area, and volume. In the experiment, 63 pots of lettuce at different growth stages were measured. The results show that the Kinect-measured height and projected area have a good linear relationship with the reference measurements, while the measured total leaf area and volume both follow power-law relationships with the reference data. All of these fits show good goodness of fit (R² = 0.9457-0.9914). In the study of biomass correlations, the Kinect-measured volume was found to have a good power-law relationship (R² = 0.9281) with fresh weight. In addition, the system's practicality was validated by performance and robustness analysis.
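As a small illustration of the power-law analysis mentioned above (Kinect-measured volume versus fresh weight), the following Python sketch fits y = a·x^b by log-log least squares and reports R². The numbers are synthetic placeholders, not the study's data.

```python
# Illustrative sketch (synthetic numbers): fit a power-law relationship
# between measured volume and fresh weight via log-log least squares.
import numpy as np

rng = np.random.default_rng(7)
volume = rng.uniform(50, 2000, size=63)                           # placeholder volumes (cm^3)
fresh_weight = 0.4 * volume ** 0.9 * rng.lognormal(sigma=0.05, size=63)

# Power law y = a * x^b  ->  log(y) = log(a) + b * log(x)
b, log_a = np.polyfit(np.log(volume), np.log(fresh_weight), 1)
pred = np.exp(log_a) * volume ** b

ss_res = np.sum((fresh_weight - pred) ** 2)
ss_tot = np.sum((fresh_weight - fresh_weight.mean()) ** 2)
r2 = 1 - ss_res / ss_tot
print(f"fresh_weight ≈ {np.exp(log_a):.3f} * volume^{b:.3f}, R² = {r2:.4f}")
```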


Subject(s)
Vegetables , Algorithms , Automation , Biomass , Plant Leaves