Results 1 - 6 of 6
1.
PLoS One ; 14(6): e0218132, 2019.
Article in English | MEDLINE | ID: mdl-31185068

ABSTRACT

The perennial and stoloniferous weed Cynodon dactylon (L.) Pers. (bermudagrass) is a serious problem in vineyards. The spectral similarity between bermudagrass and grapevines makes discrimination of the two species based solely on spectral information from a multi-band imaging sensor unfeasible. However, this challenge can be overcome by the use of object-based image analysis (OBIA) and ultra-high spatial resolution Unmanned Aerial Vehicle (UAV) images. This research aimed to map bermudagrass automatically, accurately, and rapidly, and to design maps for its management. Aerial images of two vineyards were captured using two multispectral cameras (RGB and RGNIR) attached to a UAV. First, a spectral analysis was performed to select the optimum vegetation index (VI) for discriminating bermudagrass from bare soil. Then, the VI-based OBIA algorithm developed for each camera automatically mapped the grapevines, bermudagrass, and bare soil (accuracies greater than 97.7%). Finally, site-specific management maps were generated. Combining UAV imagery with a robust OBIA algorithm allowed the automatic mapping of bermudagrass. Analysis of the classified area made it possible to quantify grapevine growth and revealed the expansion of bermudagrass-infested areas. The generated bermudagrass maps could help farmers improve weed control through a well-programmed strategy. The developed OBIA algorithm therefore offers valuable geospatial information for designing site-specific bermudagrass management strategies, potentially allowing farmers to reduce herbicide use as well as optimize fuel, field operating time, and costs.
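
A minimal Python sketch of the first step described above (computing candidate vegetation indices and thresholding vegetation against bare soil); this is not the authors' OBIA rule set, and the band arrays and threshold values are illustrative assumptions:

import numpy as np

def excess_green(r, g, b):
    # ExG = 2g - r - b, with bands normalised by total brightness
    total = r + g + b + 1e-9
    return 2 * (g / total) - (r / total) - (b / total)

def ndvi(nir, red):
    # NDVI = (NIR - red) / (NIR + red)
    return (nir - red) / (nir + red + 1e-9)

def vegetation_mask(index, threshold):
    # Binary vegetation / bare-soil mask; the threshold is an assumption, not the study's value
    return index > threshold

rng = np.random.default_rng(0)
r, g, b, nir = (rng.random((100, 100)) for _ in range(4))  # stand-in reflectance bands
mask_rgb = vegetation_mask(excess_green(r, g, b), threshold=0.05)   # RGB camera case
mask_rgnir = vegetation_mask(ndvi(nir, r), threshold=0.3)           # RGNIR camera case
print(mask_rgb.mean(), mask_rgnir.mean())  # fraction of pixels flagged as vegetation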


Subject(s)
Algorithms , Cynodon/growth & development , Farms , Image Processing, Computer-Assisted , Models, Biological , Plant Weeds/growth & development , Ultraviolet Rays
2.
Sci Rep ; 8(1): 2793, 2018 02 12.
Article in English | MEDLINE | ID: mdl-29434226

ABSTRACT

Several diseases have threatened tomato production in Florida, resulting in large losses, especially in fresh markets. In this study, a high-resolution portable spectral sensor was used to investigate the feasibility of detecting multi-diseased tomato leaves at different stages, including early or asymptomatic stages. Tomato leaves in one healthy condition and three diseased conditions (late blight, target spot and bacterial spot) were collected from a field and grouped into four stages (healthy, asymptomatic, early stage and late stage). Fifty-seven spectral vegetation indices (SVIs) were calculated, following methods published in previous studies or established in this study. Principal component analysis was conducted to evaluate the SVIs. Results revealed six principal components (PCs) whose eigenvalues were greater than 1. SVIs whose weight coefficients ranked from 1 to 30 in each selected PC were applied to a K-nearest neighbour classifier. Amongst the examined leaves, the healthy ones had the highest accuracy (100%) and the lowest error rate (0) because of their uniform tissues. Late-stage leaves could be distinguished more easily than the two other disease categories, whose symptoms on the multi-diseased leaves were similar. Further work may incorporate the proposed technique into an imaging system that can be operated to monitor multi-diseased tomato plants in fields.
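
An illustrative Python sketch of an SVI-PCA-KNN pipeline of the kind described above; the feature matrix and labels are synthetic placeholders, and the PC scores are used directly as classifier features, which simplifies the SVI-selection step of the abstract:

import numpy as np
from sklearn.decomposition import PCA
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(42)
X = rng.random((200, 57))         # 200 leaves x 57 SVIs (placeholder values)
y = rng.integers(0, 4, size=200)  # healthy / asymptomatic / early stage / late stage

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

scaler = StandardScaler().fit(X_train)
pca = PCA(n_components=6)         # six PCs with eigenvalues > 1, as in the abstract
Z_train = pca.fit_transform(scaler.transform(X_train))
Z_test = pca.transform(scaler.transform(X_test))

knn = KNeighborsClassifier(n_neighbors=5).fit(Z_train, y_train)
print("Accuracy on synthetic data:", knn.score(Z_test, y_test))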


Subject(s)
Plant Leaves/microbiology , Solanum lycopersicum/microbiology , Spectrum Analysis/methods , Solanum lycopersicum/metabolism , Plant Diseases/microbiology , Plant Leaves/metabolism , Principal Component Analysis
3.
PLoS One ; 9(3): e91275, 2014.
Article in English | MEDLINE | ID: mdl-24604031

ABSTRACT

A procedure called ARIN was developed to achieve the semi-automatic relative normalization of multitemporal remote images of an agricultural scene, using the following steps: 1) defining the same parcel of selected vegetative pseudo-invariant features (VPIFs) in each multitemporal image; 2) extracting the VPIF spectral band data from each image; 3) calculating the correction factors (CFs) that fit each image band to the average value of the image series; and 4) obtaining the normalized images by linear transformation of each original image band through the corresponding CF. ARIN software was developed to perform the ARIN procedure semi-automatically. We validated ARIN using seven GeoEye-1 satellite images taken over the same location in southern Spain from early April to October 2010 at intervals of approximately 3 to 4 weeks. The following three VPIFs were chosen: citrus orchards (CIT), olive orchards (OLI) and poplar groves (POP). In the ARIN-normalized images, the range, standard deviation (s.d.) and root mean square error (RMSE) of the spectral bands and vegetation indices were considerably reduced compared to the original images, regardless of the VPIF or the combination of VPIFs selected for normalization, which demonstrates the method's efficacy. The correlation coefficients between the CFs among VPIFs for any spectral band (and all bands overall) were at least 0.85 and were significant at P = 0.95, indicating that the normalization procedure performed comparably regardless of the VPIF chosen. The ARIN method was designed only for agricultural and forestry landscapes where VPIFs can be identified.
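
A minimal Python sketch of the normalization idea described above, assuming the image series is a stacked numpy array and the VPIF parcel is supplied as a boolean mask (this is not the published ARIN software):

import numpy as np

def arin_normalize(images, vpif_mask):
    # images: array of shape (n_images, n_bands, rows, cols); vpif_mask: boolean (rows, cols).
    # For each image and band, a correction factor (CF) scales the band so that its VPIF mean
    # matches the average VPIF mean of the whole series (a linear transformation).
    vpif_means = images[:, :, vpif_mask].mean(axis=2)  # (n_images, n_bands)
    series_means = vpif_means.mean(axis=0)             # (n_bands,)
    cf = series_means / vpif_means                     # correction factors per image and band
    return images * cf[:, :, None, None]

rng = np.random.default_rng(1)
imgs = rng.random((7, 4, 50, 50))                      # e.g. 7 acquisition dates x 4 bands
mask = np.zeros((50, 50), dtype=bool)
mask[10:20, 10:20] = True                              # illustrative VPIF parcel
norm = arin_normalize(imgs, mask)
print(norm[:, :, mask].mean(axis=2).std(axis=0))       # VPIF means now nearly equal across dates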


Subject(s)
Agriculture , Imaging, Three-Dimensional , Remote Sensing Technology , Automation , Crops, Agricultural/anatomy & histology , Reference Standards , Software , Time Factors
4.
PLoS One ; 8(10): e77151, 2013.
Article in English | MEDLINE | ID: mdl-24146963

ABSTRACT

The use of remote imagery captured by unmanned aerial vehicles (UAV) has tremendous potential for designing detailed site-specific weed control treatments in the early post-emergence period, which has not been possible previously with conventional airborne or satellite images. A robust and entirely automatic object-based image analysis (OBIA) procedure was developed on a series of UAV images acquired with a six-band multispectral camera (visible and near-infrared range), with the ultimate objective of generating a weed map of an experimental maize field in Spain. The OBIA procedure combines several contextual, hierarchical and object-based features and consists of three consecutive phases: 1) classification of crop rows by application of a dynamic and auto-adaptive classification approach, 2) discrimination of crops and weeds on the basis of their relative positions with reference to the crop rows, and 3) generation of a weed infestation map in a grid structure. The estimation of weed coverage from the image analysis yielded satisfactory results. The relationship between estimated and observed weed densities had a coefficient of determination of r² = 0.89 and a root mean square error of 0.02. A map of three categories of weed coverage was produced with an overall accuracy of 86%. In the experimental field, the area free of weeds was 23%, and the area with low weed coverage (<5% weeds) was 47%, which indicated a high potential for reducing herbicide application or other weed control operations. The OBIA procedure computes multiple data and statistics derived from the classification outputs, which permits calculation of herbicide requirements and estimation of the overall cost of weed management operations in advance.
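
A hedged Python sketch of the positional crop/weed rule and grid summary described above; it is not the published OBIA rule set, and the row positions, buffer width and grid size are illustrative assumptions:

import numpy as np

def classify_and_grid(veg_mask, row_cols, buffer_px=3, cell=25):
    # Vegetation pixels within buffer_px columns of a crop-row line are labelled crop,
    # the remaining vegetation is labelled weed, and weed coverage is summarised per grid cell.
    rows, cols = veg_mask.shape
    col_idx = np.arange(cols)
    dist = np.min(np.abs(col_idx[None, :] - np.asarray(row_cols)[:, None]), axis=0)
    on_row = np.broadcast_to(dist <= buffer_px, veg_mask.shape)
    crop = veg_mask & on_row
    weed = veg_mask & ~on_row
    n_r, n_c = rows // cell, cols // cell
    weed_grid = weed[:n_r * cell, :n_c * cell].reshape(n_r, cell, n_c, cell).mean(axis=(1, 3)) * 100
    return crop, weed, weed_grid

rng = np.random.default_rng(3)
veg = rng.random((200, 200)) > 0.8                     # stand-in vegetation mask
crop, weed, grid = classify_and_grid(veg, row_cols=[20, 60, 100, 140, 180])
print(grid.round(1))                                   # weed coverage per grid cell, in percent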


Subject(s)
Agriculture , Plant Weeds , Weed Control , Zea mays , Remote Sensing Technology/instrumentation , Remote Sensing Technology/methods , Seasons , Weed Control/methods , Zea mays/growth & development
5.
PLoS One ; 8(3): e58210, 2013.
Article in English | MEDLINE | ID: mdl-23483997

ABSTRACT

A new aerial platform for image acquisition has recently emerged: the Unmanned Aerial Vehicle (UAV). This article describes the technical specifications and configuration of a UAV used to capture remote images for early-season site-specific weed management (ESSWM). The image spatial and spectral properties required for weed seedling discrimination were also evaluated. Two different sensors, a visible-light still camera and a six-band multispectral camera, and three flight altitudes (30, 60 and 100 m) were tested over a naturally infested sunflower field. The main phases of the UAV workflow were: 1) mission planning, 2) UAV flight and image acquisition, and 3) image pre-processing. Three different aspects were needed to plan the route: the flight area, the camera specifications and the UAV tasks. The pre-processing phase included the correct alignment of the six bands of the multispectral imagery and the orthorectification and mosaicking of the individual images captured in each flight. The image pixel size, the area covered by each image and the flight timing were very sensitive to flight altitude. At lower altitudes the UAV captured images of finer spatial resolution, although the number of images needed to cover the whole field may be a limiting factor because of the energy required for a longer flight and the computational requirements of the subsequent mosaicking process. Spectral differences between weeds, crop and bare soil were significant in the vegetation indices studied (Excess Green Index, Normalised Green-Red Difference Index and Normalised Difference Vegetation Index), mainly at the 30 m altitude. However, greater spectral separability between vegetation and bare soil was obtained with the NDVI. These results suggest that a trade-off between spectral and spatial resolution is needed to optimise the flight mission for each agronomic objective, depending on the size of the smallest object to be discriminated (weed plants or weed patches).
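
A back-of-the-envelope Python sketch of why pixel size and image count depend on flight altitude, as discussed above; the camera parameters, overlap and field size are illustrative assumptions, not the study's hardware:

def ground_sample_distance(altitude_m, focal_mm, sensor_width_mm, image_width_px):
    # GSD (m/pixel) = altitude x pixel pitch / focal length
    pixel_pitch_mm = sensor_width_mm / image_width_px
    return altitude_m * pixel_pitch_mm / focal_mm

def images_needed(altitude_m, focal_mm, sensor_width_mm, sensor_height_mm,
                  image_width_px, field_area_m2, overlap=0.6):
    gsd = ground_sample_distance(altitude_m, focal_mm, sensor_width_mm, image_width_px)
    footprint_w = gsd * image_width_px
    footprint_h = footprint_w * sensor_height_mm / sensor_width_mm
    effective = footprint_w * footprint_h * (1 - overlap) ** 2   # rough usable area per image
    return gsd, max(1, round(field_area_m2 / effective))

for alt in (30, 60, 100):                                        # the three altitudes tested
    gsd, n = images_needed(alt, focal_mm=8.5, sensor_width_mm=6.66,
                           sensor_height_mm=5.32, image_width_px=1280,
                           field_area_m2=10_000)
    print(f"{alt:>3} m: GSD ~ {gsd * 100:.1f} cm/px, ~{n} images for 1 ha")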


Subject(s)
Aircraft , Robotics , Weed Control/instrumentation , Weed Control/methods , Altitude , Photography
6.
ScientificWorldJournal ; 2012: 630390, 2012.
Article in English | MEDLINE | ID: mdl-22629171

ABSTRACT

In the context of detecting weeds in crops for site-specific weed control, on-ground spectral reflectance measurements are the first step in determining the potential of remote spectral data to classify weeds and crops. Field studies were conducted for four years at different locations in Spain. We aimed to distinguish cruciferous weeds in wheat and broad bean crops, using hyperspectral and multispectral readings in the visible and near-infrared spectrum. To identify differences in reflectance between the cruciferous weeds and the crops, we applied three classification methods: stepwise discriminant (STEPDISC) analysis and two neural networks, namely a multilayer perceptron (MLP) and a radial basis function (RBF) network. Hyperspectral and multispectral signatures of cruciferous weeds and of wheat and broad bean crops could be classified by STEPDISC analysis and by the MLP and RBF neural networks with varying success; the MLP model was the most accurate, with classification performance of 100%, or higher than 98.1%, for all years. Classification accuracy from hyperspectral signatures was similar to that from multispectral signatures and spectral indices, suggesting that little advantage would be obtained by using more expensive airborne hyperspectral imagery. Therefore, for future investigations, we recommend using multispectral remote imagery to explore whether it can discriminate these weeds and crops.
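
A hedged Python sketch comparing a discriminant classifier and an MLP network on spectral signatures, in the spirit of the comparison above; the data are synthetic, scikit-learn's LinearDiscriminantAnalysis stands in for the stepwise discriminant analysis, and no RBF network is included:

import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(7)
X = rng.random((300, 50))          # placeholder reflectance signatures (300 samples x 50 bands)
y = rng.integers(0, 3, size=300)   # cruciferous weed / wheat / broad bean

models = [("LDA", LinearDiscriminantAnalysis()),
          ("MLP", MLPClassifier(hidden_layer_sizes=(20,), max_iter=2000, random_state=0))]
for name, model in models:
    scores = cross_val_score(model, X, y, cv=5)
    print(f"{name}: mean CV accuracy {scores.mean():.2f} (random data, so about chance level)")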


Subject(s)
Algorithms , Brassicaceae/anatomy & histology , Neural Networks, Computer , Pattern Recognition, Automated/methods , Plant Weeds/anatomy & histology , Seasons , Spectrum Analysis/methods , Brassicaceae/physiology , Discriminant Analysis , Plant Weeds/physiology