Results 1 - 14 of 14
1.
Pest Manag Sci ; 79(2): 645-654, 2023 Feb.
Article in English | MEDLINE | ID: mdl-36223137

ABSTRACT

BACKGROUND: Ecballium elaterium (common name: squirting cucumber) is an emerging weed problem in hedgerow or superintensive olive groves under no tillage. It colonizes the inter-row area, infesting the natural or sown cover crops, and is considered a hard-to-control weed. Research in other woody crops has shown that E. elaterium has a patchy distribution, which makes it well suited to a site-specific control strategy targeting only the E. elaterium patches. Therefore, the aim of this work was to develop a methodology based on the analysis of imagery acquired with an uncrewed aerial vehicle (UAV) to detect and map E. elaterium infestations in hedgerow olive orchards. RESULTS: The study was conducted in two superintensive olive orchards, and the images were taken using a UAV equipped with an RGB sensor. Flights were conducted on two dates: in May, when various weeds were infesting the orchard, and in September, when E. elaterium was the only infesting weed. The UAV orthomosaics from the first scenario were classified using random forest models, and the orthomosaics from September, with E. elaterium as the only weed, were analyzed using an unsupervised algorithm. In both cases, the overall accuracies were over 0.85, and the producer's accuracies for E. elaterium ranged between 0.74 and 1.00. CONCLUSION: These results allow the design of a site-specific and efficient herbicide control protocol, which would represent a step forward in sustainable weed management. The development of these algorithms in free and open-source software fosters their application in small and medium farms. © 2022 The Authors. Pest Management Science published by John Wiley & Sons Ltd on behalf of Society of Chemical Industry.
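For illustration, a minimal Python sketch of the kind of supervised step described above (pixel-wise random forest classification of an RGB orthomosaic). It is not the authors' pipeline; the arrays, class names and parameters are all assumptions.

```python
# Minimal sketch, assuming labelled training pixels are already available as arrays.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Hypothetical data: RGB orthomosaic (rows, cols, 3) and training samples.
ortho = np.random.randint(0, 256, size=(500, 500, 3), dtype=np.uint8)
X_train = np.random.randint(0, 256, size=(1000, 3))                      # RGB values
y_train = np.random.choice(["soil", "cover_crop", "e_elaterium"], size=1000)

rf = RandomForestClassifier(n_estimators=200, random_state=0)
rf.fit(X_train, y_train)

# Classify every pixel and reshape back to the orthomosaic footprint.
pred = rf.predict(ortho.reshape(-1, 3)).reshape(ortho.shape[:2])

# Producer's accuracy for the weed class would then be computed from a confusion
# matrix against independent validation pixels (omitted here).
```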


Subject(s)
Cucurbitaceae , Olea , Plant Weeds , Weed Control/methods , Farms , Algorithms , Crops, Agricultural
3.
Sensors (Basel) ; 21(9)2021 Apr 28.
Article in English | MEDLINE | ID: mdl-33925169

ABSTRACT

Yield prediction is crucial for the management of harvest and the scheduling of wine production operations. Traditional yield prediction methods rely on manual sampling and are time-consuming, making it difficult to handle the intrinsic spatial variability of vineyards. There have been significant advances in automatic yield estimation in vineyards from on-ground imagery, but terrestrial platforms have some limitations since they can cause soil compaction and have problems on sloping and ploughed land. The analysis of photogrammetric point clouds generated from unmanned aerial vehicle (UAV) imagery has shown its potential for the characterization of woody crops, and point color analysis has been used for the detection of flowers in almond trees. For these reasons, the main objective of this work was to develop an unsupervised and automated workflow for the detection of grape clusters in red grapevine varieties using UAV photogrammetric point clouds and color indices. As leaf occlusion is recognized as a major challenge in fruit detection, the influence of partial leaf removal on the accuracy of the workflow was assessed. UAV flights were performed over two commercial vineyards with different grape varieties in 2019 and 2020, and the photogrammetric point clouds generated from these flights were analyzed using an automatic and unsupervised algorithm developed with free software. The proposed methodology achieved R² values higher than 0.75 between the harvest weight and the projected area of the points classified as grapes for vines under a partial two-sided leaf removal treatment, and an R² of 0.82 was achieved in one of the datasets for vines with an untouched full canopy. The accuracy achieved in grape detection opens the door to yield prediction in red grape vineyards. This would allow the creation of yield estimation maps that will ease the implementation of precision viticulture practices. To the authors' knowledge, this is the first time that UAV photogrammetric point clouds have been used for grape cluster detection.
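As a rough illustration of the color-index idea (not the published algorithm), the following Python sketch flags "grape-coloured" points in an RGB point cloud with an excess-red style index and derives a projected-area proxy. The index, threshold and per-point footprint are assumptions on synthetic data.

```python
import numpy as np

rng = np.random.default_rng(0)
rgb = rng.random((100_000, 3))                      # hypothetical point colours, scaled 0-1

r, g, b = rgb[:, 0], rgb[:, 1], rgb[:, 2]
exr = 1.4 * r - g                                   # excess-red style colour index (assumption)
grape_mask = exr > 0.2                              # assumed threshold for grape-coloured points

# Projected-area proxy: count of grape points times an assumed per-point footprint.
point_footprint_m2 = 0.01 * 0.01                    # assumed 1 cm x 1 cm per point
projected_area_m2 = grape_mask.sum() * point_footprint_m2
print(f"Grape points: {grape_mask.sum()}, projected area ~ {projected_area_m2:.3f} m2")
```

In the paper, the projected area of the classified points is what is regressed against harvest weight; a similar regression step is sketched under item 5 below.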

4.
Front Plant Sci ; 10: 1472, 2019.
Article in English | MEDLINE | ID: mdl-31803210

ABSTRACT

The need for olive farm modernization has encouraged research into more efficient crop management strategies through cross-breeding programs that release new olive cultivars more suitable for mechanization and use in intensive orchards, with high-quality production and resistance to biotic and abiotic stresses. The advancement of breeding programs is hampered by the lack of efficient phenotyping methods to quickly and accurately acquire crop traits such as morphological attributes (tree vigor and vegetative growth habits), which are key to identifying desirable genotypes as early as possible. In this context, a UAV-based high-throughput system for olive breeding program applications was developed to extract tree traits in large-scale phenotyping studies under field conditions. The system consisted of UAV flight configurations, in terms of flight altitude and image overlaps, and a novel, automatic, and accurate object-based image analysis (OBIA) algorithm based on point clouds, which was evaluated in two experimental trials in the framework of a table olive breeding program, with the aim of determining the earliest date at which tree architectural traits can be suitably quantified. Two training systems (intensive and hedgerow) were evaluated at two very early stages of tree growth: 15 and 27 months after planting. Digital Terrain Models (DTMs) were automatically and accurately generated by the algorithm, and every olive tree was identified, independently of the training system and tree age. The architectural traits, especially tree height and crown area, were estimated with high accuracy in the second flight campaign, i.e. 27 months after planting. Differences in the quality of the 3D crown reconstruction were found for the growth patterns derived from each training system. These key phenotyping traits could be used in several olive breeding programs, as well as to address some agronomical goals. In addition, this system is cost- and time-optimized, so that the requested architectural traits could be provided on the same day as the UAV flights. This high-throughput system may address the current bottleneck of plant phenotyping, "linking genotype and phenotype," considered a major challenge for crop research in the 21st century, and bring forward the crucial time of decision making for breeders.
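A simplified Python sketch of the core measurement idea (not the paper's OBIA algorithm): subtract a terrain model from a surface model to obtain a canopy height model, then read off height and crown area for a segmented tree. The arrays, pixel size and height threshold are assumptions.

```python
import numpy as np

pixel_size = 0.05                                   # metres per pixel (assumed)
dsm = np.random.rand(200, 200) * 2.0                # hypothetical Digital Surface Model (m)
dtm = np.zeros((200, 200))                          # hypothetical Digital Terrain Model (m)
chm = dsm - dtm                                     # canopy height model

tree_mask = chm > 0.5                               # assumed height threshold for one young tree
tree_height = chm[tree_mask].max() if tree_mask.any() else 0.0
crown_area = tree_mask.sum() * pixel_size ** 2      # m2
print(f"Tree height ~ {tree_height:.2f} m, crown area ~ {crown_area:.2f} m2")
```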

5.
Front Plant Sci ; 10: 948, 2019.
Article in English | MEDLINE | ID: mdl-31396251

ABSTRACT

Bioethanol production from cereal straw has aroused great interest in recent years, which has led to the development of breeding programs to improve the quality of lignocellulosic material in terms of biomass and sugar content. This process requires the analysis of genotype-phenotype relationships, and although genotyping tools are very advanced, phenotyping tools are usually not capable of the massive evaluation required to identify potential characters for bioethanol production in field trials. However, unmanned aerial vehicle (UAV) platforms have demonstrated their capacity for efficient and non-destructive acquisition of crop data with application in high-throughput phenotyping. This work shows the first evaluation of UAV-based multi-spectral images for estimating bioethanol-related variables (total biomass dry weight, sugar release, and theoretical ethanol yield) of several accessions of wheat, barley, and triticale (234 cereal plots). The full procedure involved several stages: (1) the acquisition of multi-temporal UAV images by a six-band camera along different crop phenology stages (94, 104, 119, 130, 143, 161, and 175 days after sowing), (2) the generation of ortho-mosaicked images of the full field experiment, (3) image analysis with an object-based (OBIA) algorithm and the calculation of vegetation indices (VIs), and (4) the statistical analysis of spectral data and bioethanol-related variables to predict a UAV-based ranking of cereal accessions in terms of theoretical ethanol yield. The UAV-based system captured the high variability observed in the field trials over time. Three VIs created with visible wavebands and four VIs that incorporated the near-infrared (NIR) waveband were studied, showing that the NIR-based VIs were the best at estimating crop biomass, while the visible-based VIs were suitable for estimating crop sugar release. The temporal factor was very helpful in achieving better estimations. The results obtained from single dates [i.e., temporal scenario 1 (TS-1)] were always less accurate for estimating sugar release than those obtained in TS-2 (i.e., averaging the values of each VI obtained during plant anthesis) and less accurate for estimating crop biomass and theoretical ethanol yield than those obtained in TS-3 (i.e., averaging the values of each VI obtained during full crop development). The highest correlation to theoretical ethanol yield was obtained with the normalized difference vegetation index (R² = 0.66), which made it possible to rank the cereal accessions in terms of their potential for bioethanol production.
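A hedged Python sketch of the statistical step described in stage (4): compute a plot-level NDVI from band means and regress it against theoretical ethanol yield to rank accessions. All data below are synthetic placeholders, not the trial's measurements.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
red = rng.uniform(0.05, 0.15, size=234)             # assumed mean red reflectance per plot
nir = rng.uniform(0.30, 0.60, size=234)             # assumed mean NIR reflectance per plot
ndvi = (nir - red) / (nir + red)                    # normalized difference vegetation index

# Hypothetical theoretical ethanol yield (L/ha), loosely related to NDVI for illustration.
yield_ethanol = 2000 * ndvi + rng.normal(0, 100, size=234)

slope, intercept, r, p, se = stats.linregress(ndvi, yield_ethanol)
print(f"R2 = {r**2:.2f}")                           # the paper reports R2 = 0.66 for NDVI

# Accessions ranked by predicted yield (highest first).
ranking = np.argsort(slope * ndvi + intercept)[::-1]
```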

6.
PLoS One ; 14(6): e0218132, 2019.
Article in English | MEDLINE | ID: mdl-31185068

ABSTRACT

The perennial and stoloniferous weed Cynodon dactylon (L.) Pers. (bermudagrass) is a serious problem in vineyards. The spectral similarity between bermudagrass and grapevines makes discrimination of the two species, based solely on spectral information from a multi-band imaging sensor, unfeasible. However, that challenge can be overcome by the use of object-based image analysis (OBIA) and ultra-high spatial resolution Unmanned Aerial Vehicle (UAV) images. This research aimed to automatically, accurately, and rapidly map bermudagrass and design maps for its management. Aerial images of two vineyards were captured using two multispectral cameras (RGB and RGNIR) attached to a UAV. First, spectral analysis was performed to select the optimum vegetation index (VI) for discriminating bermudagrass from bare soil. Then, the VI-based OBIA algorithm developed for each camera automatically mapped the grapevines, bermudagrass, and bare soil (accuracies greater than 97.7%). Finally, site-specific management maps were generated. Combining UAV imagery with a robust OBIA algorithm allowed the automatic mapping of bermudagrass. Analysis of the classified area made it possible to quantify grapevine growth and revealed the expansion of bermudagrass-infested areas. The generated bermudagrass maps could help farmers improve weed control through a well-programmed strategy. Therefore, the developed OBIA algorithm offers valuable geo-spatial information for designing site-specific bermudagrass management strategies, allowing farmers to potentially reduce herbicide use as well as optimize fuel, field operating time, and costs.
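A minimal Python sketch of a VI-based vegetation/soil split of the kind the study builds on (not the published OBIA rule set): compute the Excess Green index on an RGB orthomosaic and threshold it with Otsu's method to mask out bare soil. The orthomosaic here is a synthetic placeholder.

```python
import numpy as np
from skimage.filters import threshold_otsu

ortho = np.random.rand(400, 400, 3)                 # hypothetical normalised RGB orthomosaic
r, g, b = ortho[..., 0], ortho[..., 1], ortho[..., 2]
exg = 2 * g - r - b                                 # Excess Green index

veg_mask = exg > threshold_otsu(exg)                # vegetation (vines + bermudagrass)
soil_mask = ~veg_mask
print(f"Vegetation fraction: {veg_mask.mean():.2%}")
```

Separating vines from bermudagrass within the vegetation mask would then rely on object features such as position and shape rather than spectra alone, as the abstract notes.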


Subject(s)
Algorithms , Cynodon/growth & development , Farms , Image Processing, Computer-Assisted , Models, Biological , Plant Weeds/growth & development , Ultraviolet Rays
7.
Plant Methods ; 15: 160, 2019.
Article in English | MEDLINE | ID: mdl-31889984

ABSTRACT

BACKGROUND: Almond is an emerging crop due to the health benefits of almond consumption, including its nutritional, anti-inflammatory, and hypocholesterolaemic properties. Traditional almond producers were concentrated in California, Australia, and Mediterranean countries. However, almond is currently grown in more than 50 countries because breeding programs have modernized almond orchards by developing new varieties with improved traits related to late flowering (to reduce the risk of damage caused by late frosts) and tree architecture. Data on almond tree architecture and flowering are traditionally acquired and evaluated by breeders through intensive field labour. Flowering detection has traditionally been a very challenging objective. To our knowledge, there is no published information about monitoring the tree flowering dynamics of a crop at the field scale using color information from photogrammetric 3D point clouds and OBIA. As an alternative, a procedure was created, based on the generation of colored photogrammetric point clouds using a low-cost (RGB) camera on board an unmanned aerial vehicle (UAV) and a semi-automatic object-based image analysis (OBIA) algorithm, for monitoring the flower density and flowering period of every almond tree in the framework of two almond phenotypic trials with different planting dates. RESULTS: Our method was useful for detecting the phenotypic variability of every almond variety by mapping and quantifying every tree height and volume as well as the flowering dynamics and flower density. There was a high level of agreement between the tree height, flower density, and blooming calendar derived from our procedure in both fields and those created from on-ground measured data. Some of the almond varieties showed a significant linear fit between crown volume and yield. CONCLUSIONS: Our findings could help breeders and researchers to reduce the gap between phenomics and genomics by generating accurate almond tree information in an efficient, non-destructive, and inexpensive way. The method described is also useful for data mining to select the most promising accessions, making it possible to assess specific multi-criteria rankings of varieties, which is one of the main tools for breeders.
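An illustrative Python sketch of the flower-detection idea only: flag near-white points in an RGB photogrammetric point cloud and report a per-tree flower density proxy. The brightness/saturation thresholds and crown area value are assumptions, not the published rule set.

```python
import numpy as np

rng = np.random.default_rng(1)
rgb = rng.random((50_000, 3))                       # hypothetical point colours, 0-1

brightness = rgb.mean(axis=1)
saturation = rgb.max(axis=1) - rgb.min(axis=1)
flower_mask = (brightness > 0.8) & (saturation < 0.1)   # near-white points as flower candidates

crown_area_m2 = 2.5                                 # assumed crown area of one tree
flower_density = flower_mask.sum() / crown_area_m2  # candidate points per m2 as a density proxy
print(f"Candidate flower points: {flower_mask.sum()}, density ~ {flower_density:.0f} pts/m2")
```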

8.
Sensors (Basel) ; 15(8): 19688-708, 2015 Aug 12.
Article in English | MEDLINE | ID: mdl-26274960

ABSTRACT

Unmanned aerial vehicles (UAVs) combined with different spectral range sensors are an emerging technology for providing early weed maps for optimizing herbicide applications. Considering that weeds, at very early phenological stages, are similar both spectrally and in appearance, three major components are relevant: spatial resolution, type of sensor, and classification algorithm. Resampling is a technique to create a new version of an image with a different width and/or height in pixels, and it has been used in satellite imagery with different spatial and temporal resolutions. In this paper, the efficiency of resampled images (RS-images) created from real UAV-images (UAV-images; the UAVs were equipped with two types of sensors, i.e., visible and visible plus near-infrared spectra) captured at different altitudes is examined to test the quality of the RS-image output. The performance of the object-based image analysis (OBIA) implemented for early weed mapping using different weed thresholds was also evaluated. Our results showed that resampling accurately extracted the spectral values from the high-spatial-resolution UAV-images captured at an altitude of 30 m, and that the RS-image data for altitudes of 60 and 100 m were able to provide weed cover and herbicide application maps as accurate as those obtained from UAV-images of real flights.
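A conceptual Python sketch of the resampling step: degrade a high-resolution image (e.g., from a 30 m flight) to a coarser equivalent pixel size by block averaging. The 2x block factor and the synthetic image are assumptions for illustration only.

```python
import numpy as np

hi_res = np.random.rand(600, 600)                   # hypothetical single-band image
factor = 2                                          # e.g., 30 m -> 60 m equivalent pixel size

rows, cols = hi_res.shape
trimmed = hi_res[: rows - rows % factor, : cols - cols % factor]
low_res = trimmed.reshape(rows // factor, factor, cols // factor, factor).mean(axis=(1, 3))
print(hi_res.shape, "->", low_res.shape)
```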


Subject(s)
Plant Weeds/physiology , Remote Sensing Technology/methods , Satellite Imagery/methods , Photogrammetry/instrumentation
9.
PLoS One ; 10(6): e0130479, 2015.
Article in English | MEDLINE | ID: mdl-26107174

ABSTRACT

The geometric features of agricultural trees, such as canopy area, tree height and crown volume, provide useful information about plantation status and crop production. However, these variables are mostly estimated after time-consuming and laborious field work, by applying equations that treat the trees as geometric solids, which produces inconsistent results. As an alternative, this work presents an innovative procedure for computing the 3-dimensional geometric features of individual trees and tree-rows by applying two consecutive phases: 1) generation of Digital Surface Models with Unmanned Aerial Vehicle (UAV) technology, and 2) use of object-based image analysis techniques. Our UAV-based procedure produced successful results both in single-tree and in tree-row plantations, reporting up to 97% accuracy in area quantification and minimal deviations compared to in-field estimations of tree heights and crown volumes. The maps generated could be used to understand the linkages between tree growth and field-related factors or to optimize crop management operations in the context of precision agriculture, with relevant agro-environmental implications.
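A simplified Python sketch of how such 3-D features can be integrated from a canopy height model (CHM), as opposed to treating trees as geometric solids. The CHM array, pixel size and canopy threshold are assumptions, not the paper's data.

```python
import numpy as np

pixel_size = 0.10                                        # metres per pixel (assumed)
chm = np.clip(np.random.rand(150, 150) * 3.0, 0, None)   # hypothetical canopy height model (m)

canopy_mask = chm > 0.3                                  # assumed minimum canopy height
tree_height = chm.max()
crown_area = canopy_mask.sum() * pixel_size ** 2         # m2
crown_volume = chm[canopy_mask].sum() * pixel_size ** 2  # sum of per-pixel height columns, m3
print(f"height ~ {tree_height:.2f} m, area ~ {crown_area:.2f} m2, volume ~ {crown_volume:.2f} m3")
```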


Subject(s)
Agriculture , Remote Sensing Technology , Trees , High-Throughput Screening Assays
10.
Ann Hepatol ; 14(4): 524-30, 2015.
Article in English | MEDLINE | ID: mdl-26019039

ABSTRACT

BACKGROUND: Transient elastography (TE) is a useful tool for the assessment of hepatic fibrosis as an alternative to liver biopsy, but it has not been validated as a screening procedure in apparently healthy people. AIM: To determine the prevalence of advanced liver fibrosis diagnosed by TE in a socioeconomically challenged rural population. MATERIAL AND METHODS: We enrolled 299 participants aged over 18 years from a vulnerable population in Mexico who responded to an open invitation. All participants had their history recorded and underwent a general clinical examination and a liver stiffness measurement, performed by a single operator according to international standards. RESULTS: Overall, 7.35% of participants were found to be at high risk for cirrhosis. Three variables correlated with a risk of a TE measure ≥ 9 kPa and significant fibrosis: history of alcohol intake [7.95 vs. 92.04%, odds ratio (OR) 4.47, 95% confidence interval (CI) 1.45-13.78, P = 0.0167], body mass index (BMI) ≥ 30 kg/m2 (30.87 vs. 69.12%, OR 4.25, 95%CI 1.04-6.10, P = 0.049), and history of diabetes mellitus (14.87 vs. 85.12%, OR 2.76, 95%CI 1.002-7.63, P = 0.0419). In the multivariate analysis, BMI ≥ 30 kg/m2 was the only significant risk factor for advanced liver fibrosis or cirrhosis (OR 2.54, 95%CI 1.02-6.3, P = 0.0460). CONCLUSION: TE could be useful as a screening procedure to identify advanced liver fibrosis in the general and apparently healthy population.
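As a worked illustration of the univariate statistic reported above, the following Python sketch computes an odds ratio and Wald 95% confidence interval from a 2x2 table. The counts are invented for illustration and are not the study's data.

```python
import math

# Hypothetical 2x2 table: rows = fibrosis yes/no, columns = exposed/unexposed.
a, b = 8, 14        # fibrosis present: exposed, unexposed (invented counts)
c, d = 80, 197      # fibrosis absent: exposed, unexposed (invented counts)

odds_ratio = (a * d) / (b * c)
se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
ci_low = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
ci_high = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
print(f"OR = {odds_ratio:.2f} (95% CI {ci_low:.2f}-{ci_high:.2f})")
```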


Subject(s)
Elasticity Imaging Techniques , Liver Cirrhosis/diagnostic imaging , Liver Cirrhosis/epidemiology , Rural Population , Vulnerable Populations , Adult , Alcohol Drinking , Chi-Square Distribution , Comorbidity , Female , Humans , Logistic Models , Male , Mexico/epidemiology , Middle Aged , Multivariate Analysis , Odds Ratio , Predictive Value of Tests , Prevalence , Risk Factors , Rural Health , Socioeconomic Factors
11.
Sensors (Basel) ; 15(3): 5609-26, 2015 Mar 06.
Article in English | MEDLINE | ID: mdl-25756867

ABSTRACT

In order to optimize the application of herbicides in weed-crop systems, accurate and timely weed maps of the crop field are required. In this context, this investigation quantified the efficacy and limitations of remote images collected with an unmanned aerial vehicle (UAV) for early detection of weed seedlings. The ability to discriminate weeds was significantly affected by the spectral (type of camera), spatial (flight altitude) and temporal (date of the study) resolutions of the imagery. The colour-infrared images captured at 40 m and 50 days after sowing (date 2), when plants had 5-6 true leaves, had the highest weed detection accuracy (up to 91%). At this flight altitude, the images captured before date 2 gave slightly better results than the images captured later. However, this trend changed for the visible-light images captured at 60 m and higher, which gave notably better results on date 3 (57 days after sowing) because of the larger size of the weed plants. Our results showed the spectral and spatial resolutions required to generate a suitable weed map early in the growing season, as well as the best moment for UAV image acquisition, with the ultimate objective of applying site-specific weed management operations.
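To illustrate how detection accuracies such as the 91% figure above are typically derived, a short Python sketch computes overall and producer's accuracy from a confusion matrix. The matrix values are hypothetical.

```python
import numpy as np

# Rows = reference classes, columns = predicted classes: [weed, crop, soil]
cm = np.array([[91,  5,  4],
               [ 6, 90,  4],
               [ 3,  2, 95]])

overall_accuracy = np.trace(cm) / cm.sum()
producers_acc_weed = cm[0, 0] / cm[0, :].sum()      # share of reference weed pixels correctly mapped
print(f"Overall: {overall_accuracy:.2%}, producer's (weed): {producers_acc_weed:.2%}")
```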


Subject(s)
Agriculture , Seedlings , Weed Control , Aircraft , Humans , Plant Weeds/growth & development , Remote Sensing Technology
12.
PLoS One ; 8(10): e77151, 2013.
Article in English | MEDLINE | ID: mdl-24146963

ABSTRACT

The use of remote imagery captured by unmanned aerial vehicles (UAV) has tremendous potential for designing detailed site-specific weed control treatments in early post-emergence, which has not been possible previously with conventional airborne or satellite images. A robust and entirely automatic object-based image analysis (OBIA) procedure was developed on a series of UAV images acquired with a six-band multispectral camera (visible and near-infrared range), with the ultimate objective of generating a weed map of an experimental maize field in Spain. The OBIA procedure combines several contextual, hierarchical and object-based features and consists of three consecutive phases: 1) classification of crop rows by application of a dynamic and auto-adaptive classification approach, 2) discrimination of crops and weeds on the basis of their relative positions with reference to the crop rows, and 3) generation of a weed infestation map in a grid structure. The estimation of weed coverage from the image analysis yielded satisfactory results. The relationship of estimated versus observed weed densities had a coefficient of determination of R² = 0.89 and a root mean square error of 0.02. A map of three categories of weed coverage was produced with 86% overall accuracy. In the experimental field, the area free of weeds was 23%, and the area with low weed coverage (<5% weeds) was 47%, which indicated a high potential for reducing herbicide applications or other weed operations. The OBIA procedure computes multiple data and statistics derived from the classification outputs, which permits calculation of herbicide requirements and estimation of the overall cost of weed management operations in advance.
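A hedged Python sketch of phase 3 and the herbicide estimate described above: aggregate a binary weed mask into a grid of weed-coverage percentages, then estimate the area to treat and the herbicide need. Grid size, coverage threshold, pixel size and dose are assumptions on synthetic data.

```python
import numpy as np

weed_mask = np.random.rand(600, 600) < 0.08          # hypothetical per-pixel weed mask
pixel_area_m2 = 0.02 ** 2                            # assumed 2 cm pixels
cell = 100                                           # grid cell of 100 x 100 pixels

rows, cols = weed_mask.shape
coverage = weed_mask.reshape(rows // cell, cell, cols // cell, cell).mean(axis=(1, 3)) * 100

treat = coverage >= 5.0                              # spray cells with >= 5 % weed cover
treated_area_m2 = treat.sum() * (cell ** 2) * pixel_area_m2
herbicide_l = treated_area_m2 / 10_000 * 1.5         # assumed dose of 1.5 L/ha
print(f"Cells to treat: {treat.sum()}/{treat.size}, herbicide ~ {herbicide_l:.2f} L")
```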


Subject(s)
Agriculture , Plant Weeds , Weed Control , Zea mays , Remote Sensing Technology/instrumentation , Remote Sensing Technology/methods , Seasons , Weed Control/methods , Zea mays/growth & development
13.
PLoS One ; 8(3): e58210, 2013.
Article in English | MEDLINE | ID: mdl-23483997

ABSTRACT

A new aerial platform for image acquisition has recently emerged: the Unmanned Aerial Vehicle (UAV). This article describes the technical specifications and configuration of a UAV used to capture remote images for early season site-specific weed management (ESSWM). The image spatial and spectral properties required for weed seedling discrimination were also evaluated. Two different sensors, a still visible camera and a six-band multispectral camera, and three flight altitudes (30, 60 and 100 m) were tested over a naturally infested sunflower field. The main phases of the UAV workflow were the following: 1) mission planning, 2) UAV flight and image acquisition, and 3) image pre-processing. Three different aspects were needed to plan the route: flight area, camera specifications and UAV tasks. The pre-processing phase included the correct alignment of the six bands of the multispectral imagery and the orthorectification and mosaicking of the individual images captured in each flight. The image pixel size, the area covered by each image and the flight timing were very sensitive to flight altitude. At a lower altitude, the UAV captured images of finer spatial resolution, although the number of images needed to cover the whole field may be a limiting factor due to the energy required for a longer flight and the computational requirements of the subsequent mosaicking process. Spectral differences between weeds, crop and bare soil were significant in the vegetation indices studied (Excess Green Index, Normalised Green-Red Difference Index and Normalised Difference Vegetation Index), mainly at a 30 m altitude. However, greater spectral separability was obtained between vegetation and bare soil with the NDVI index. These results suggest that a balance between spectral and spatial resolutions is needed to optimise the flight mission according to each agronomic objective, as affected by the size of the smallest objects to be discriminated (weed plants or weed patches).
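A back-of-the-envelope Python sketch of why pixel size and image coverage are so sensitive to flight altitude: ground sample distance (GSD) and swath width for a hypothetical camera at the three altitudes tested. The sensor width, focal length and image width are assumptions, not the specifications used in the study.

```python
sensor_width_mm = 17.3        # assumed sensor width
focal_length_mm = 14.0        # assumed focal length
image_width_px = 4608         # assumed image width in pixels

for altitude_m in (30, 60, 100):
    gsd_cm = (sensor_width_mm * altitude_m * 100) / (focal_length_mm * image_width_px)
    swath_m = gsd_cm / 100 * image_width_px
    print(f"{altitude_m} m: GSD ~ {gsd_cm:.2f} cm/px, swath ~ {swath_m:.1f} m")
```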


Subject(s)
Aircraft , Robotics , Weed Control/instrumentation , Weed Control/methods , Altitude , Photography