1.
Plant Dis ; 103(5): 890-896, 2019 May.
Article in English | MEDLINE | ID: mdl-30807245

ABSTRACT

In the Pacific Northwest, chasmothecia formation is not observed in vineyards until the beginning of véraison, despite heavy infestations in which 100% of leaf tissue is covered by Erysiphe necator. Mating type proximity and distribution were sampled from individual lesions (∼71 mm²) on leaf tissue in a stratified sampling from three canopy heights at three times during the 2013, 2014, and 2015 growing seasons. Both mating types were observed at every sampling point and within the same lesions at all sampling dates and canopy heights. The effects of temperature and day length were examined by inoculating seedlings with known mating type 1 and 2 isolates and placing them in incubators at different temperatures (5, 10, 15, 20, 25, and 30°C) or under different day length changes (long day to long day, long day to short day, short day to short day, and short day to long day). Chasmothecia were produced at all temperatures at which E. necator was able to colonize tissue, and the greatest numbers of chasmothecia were produced at 15 and 20°C (P ≤ 0.02). Day length shifts from short day (8 h) to long day (16 h) resulted in a significant increase in chasmothecia production (P < 0.001). End-of-season plant stress observed in the Pacific Northwest, such as water stress or host senescence, was assessed under naturally infested field conditions by either girdling canes or applying a 150 mg·liter⁻¹ abscisic acid solution to vines, respectively, and quantifying chasmothecia production. No differences in chasmothecia production were observed in the plant stress assessment, likely due to the high vigor of the plants and their ability to overcome the stress treatments.


Subject(s)
Ascomycota , Plant Diseases , Vitis , Ascomycota/physiology , Northwestern United States , Plant Diseases/microbiology , Seasons , Stress, Physiological , Vitis/microbiology
2.
Plant Dis ; 102(8): 1500-1508, 2018 Aug.
Article in English | MEDLINE | ID: mdl-30673425

ABSTRACT

Predictive models have been developed in several major grape-growing regions to correlate environmental conditions to Erysiphe necator ascospore release; however, these models may not be broadly applicable in regions with different climatic conditions. To assess ascospore release in near-coastal regions of western Oregon, chasmothecia (syn. cleistothecia) were collected prior to leaf drop and placed onto natural and artificial grape trunk segments and overwintered outside. Ascospore release was monitored for three overwintering seasons using custom impaction spore traps from leaf drop (Biologische Bundesanstalt, Bundessortenamt und Chemische Industrie [BBCH] 97) until the onset of the disease epidemic in the following growing season. Airborne inoculum was concurrently monitored in a naturally infested research vineyard. Weather and ascospore release data were used to assess previously developed models and correlate environmental conditions to ascospore release. Ascospore release was predicted by all models prior to bud break (BBCH 08), and was observed from the first rain event following the start of inoculum monitoring until monitoring ceased. Previously developed models overpredicted ascospore release in the Willamette Valley and predicted exhaustion of inoculum prior to bud break. The magnitude of ascospore release could not be correlated to environmental conditions; thus, a binary ascospore release model was developed where release is a function of the collective occurrence of the following factors within a 24-h period: >6 h of cumulative leaf wetness during temperatures >4°C, precipitation >2.5 mm, and relative humidity >80%. The Oregon model was validated using field-collected ascospore datasets, and predicted ascospore release with 66% accuracy (P = 0.02). Extant methods for estimating ascospore release may not be sufficiently accurate to use as predictive models in wet, temperate climatic regions.
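The binary release rule described above reduces to three co-occurring daily conditions. A minimal sketch of that rule, assuming daily summary inputs (the function and variable names are illustrative assumptions, not the authors' code):

```python
# Sketch of the Oregon binary ascospore release rule as stated in the
# abstract; inputs summarize one 24-h period (names are assumptions).

def ascospores_released(wet_hours_above_4c: float,
                        precip_mm: float,
                        rh_pct: float) -> bool:
    """True when all three criteria co-occur within the 24-h window:
    >6 h cumulative leaf wetness at temperatures >4°C, precipitation
    >2.5 mm, and relative humidity >80%."""
    return (wet_hours_above_4c > 6.0
            and precip_mm > 2.5
            and rh_pct > 80.0)

# Example: a rainy day with 8 wet hours above 4°C and RH reaching 92%.
print(ascospores_released(8.0, 4.2, 92.0))  # True
```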


Subject(s)
Ascomycota/physiology , Climate , Models, Biological , Spores, Fungal/physiology , Oregon , Plant Diseases/microbiology , Plant Leaves/microbiology , Rain , Seasons , Temperature , Vitis/microbiology , Weather
3.
Plant Dis ; 101(7): 1246-1252, 2017 Jul.
Article in English | MEDLINE | ID: mdl-30682951

ABSTRACT

Management of grape powdery mildew (Erysiphe necator) and other polycyclic diseases relies on numerous fungicide applications that follow calendar- or model-based application intervals, both of which assume that inoculum is always present. Quantitative molecular assays have previously been developed to initiate fungicide applications, and could be used to optimize fungicide application intervals throughout the growing season based on inoculum concentration. Airborne inoculum samplers were placed at one research and six commercial vineyards in the Willamette Valley of Oregon. Fungicide applications in all plots were initiated at the first detection of E. necator inoculum, and all subsequent applications were made based on either the growers' standard calendar program or the measured inoculum concentration. In adjusted-interval plots, fungicides were applied at the shortest labeled application interval when >10 spores were detected and at the longest labeled application interval when <10 spores were detected. Fungicide applications in control plots followed the growers' standard management practice. The adjusted-interval treatment used an average of 2.3 fewer fungicide applications in 2013 and 1.6 fewer in 2014 in grower fields, without significant differences in berry or leaf disease incidence between treatments.
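The interval-adjustment rule lends itself to a one-line decision function. A minimal sketch, assuming hypothetical 7- and 14-day labeled intervals (actual intervals depend on the product label):

```python
# Illustrative sketch of the inoculum-based spray interval rule described
# above; interval lengths are parameters because they vary by label.

def next_spray_interval(spores_detected: int,
                        shortest_days: int,
                        longest_days: int) -> int:
    """Shortest labeled interval when >10 spores are trapped, otherwise
    the longest labeled interval."""
    return shortest_days if spores_detected > 10 else longest_days

print(next_spray_interval(23, 7, 14))  # 7  (high inoculum)
print(next_spray_interval(3, 7, 14))   # 14 (low inoculum)
```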

4.
Plant Dis ; 96(5): 726-736, 2012 May.
Article in English | MEDLINE | ID: mdl-30727517

ABSTRACT

Many plant disease epidemic models, and the disease management decision aids developed from them, are created based on temperature or other weather conditions measured in or above the crop canopy at intervals of 15 or 30 min. Disease management decision aids, however, commonly are implemented based on hourly weather measurements made from sensors sited at a standard placement of 1.5 m above the ground or are estimated from off-site weather measurements. We investigated temperature measurement errors introduced when sampling interval was increased from 15 to 60 min, and when actual in-canopy conditions were represented by temperature measurements collected by standard-placement sensors (1.5 m above the ground, outside the canopy) in each of three crops (grass seed, grape, and hops) and assessed the impact of these errors on outcomes of decision aids for grass stem rust as well as grape and hops powdery mildews. Decreasing time resolution from 15 to 60 min resulted in statistically significant underestimates of daily maximum temperatures and overestimates of daily minimum temperatures that averaged 0.2 to 0.4°C. Sensor location (in-canopy versus standard-placement) also had a statistically significant effect on measured temperature, and this effect was significantly less in grape or hops than in the grass seed crop. Effects of these temperature errors on performance of disease management decision aids were affected by magnitude of the errors as well as the type of decision aid. The grape and hops powdery mildew decision aids used rule-based indices, and the relatively small (±0.8°C) differences in temperature observed between in-canopy and standard placement sensors in these crops resulted in differences in rule outcomes when actual in-canopy temperatures were near a threshold for declaring that a rule had been met. However, there were only minor differences in the management decision (i.e., fungicide application interval). The decision aid for grass stem rust was a simulation model, for which temperature recording errors associated with location of the weather station resulted in incremental (not threshold) effects on the model of pathogen growth and plant infection probability. Simple algorithms were devised to correct the recorded temperatures or the computed infection probability to produce outcomes similar to those resulting from in-canopy temperature measurements. This study illustrates an example of evaluating (and, if necessary, correcting) temperature measurement errors from weather station sensors not located within the crop canopy, and provides an estimate of uncertainty in temperature measurements associated with location and sampling interval of weather station sensors.
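The sampling-interval effect reported above can be reproduced on synthetic data: subsampling a 15-min temperature series to hourly readings can only miss, never exceed, the true daily extremes. A minimal sketch with an idealized diurnal curve (not the study's data):

```python
# Toy demonstration: coarsening a 15-min temperature series to hourly
# readings underestimates the daily maximum and overestimates the minimum.

import numpy as np

rng = np.random.default_rng(0)
t = np.arange(0, 24, 0.25)                     # one day at 15-min steps
temps = 15 + 8 * np.sin((t - 9) * np.pi / 12)  # idealized diurnal cycle
temps += rng.normal(0, 0.3, t.size)            # short-term fluctuation

hourly = temps[::4]                            # one reading per hour
print("max underestimate:", temps.max() - hourly.max())
print("min overestimate:", hourly.min() - temps.min())
```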

5.
Phytopathology ; 101(6): 644-53, 2011 Jun.
Article in English | MEDLINE | ID: mdl-21091182

ABSTRACT

Many disease management decision support systems (DSSs) rely, exclusively or in part, on weather inputs to calculate an indicator for disease hazard. Error in the weather inputs, typically due to forecasting, interpolation, or estimation from off-site sources, may affect model calculations and management decision recommendations. The extent to which errors in weather inputs affect the quality of the final management outcome depends on a number of aspects of the disease management context, including whether management consists of a single dichotomous decision, or of a multi-decision process extending over the cropping season(s). Decision aids for multi-decision disease management typically are based on simple or complex algorithms of weather data which may be accumulated over several days or weeks. It is difficult to quantify accuracy of multi-decision DSSs due to temporally overlapping disease events, existence of more than one solution to optimizing the outcome, opportunities to take later recourse to modify earlier decisions, and the ongoing, complex decision process in which the DSS is only one component. One approach to assessing importance of weather input errors is to conduct an error analysis in which the DSS outcome from high-quality weather data is compared with that from weather data with various levels of bias and/or variance from the original data. We illustrate this analytical approach for two types of DSS, an infection risk index for hop powdery mildew and a simulation model for grass stem rust. Further exploration of analysis methods is needed to address problems associated with assessing uncertainty in multi-decision DSSs.
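The error-analysis approach can be illustrated with a toy rule-based index: run the same calculation on reference weather and on biased or noisy copies, then compare the outcomes. The index below is invented for illustration and is not the hop powdery mildew or stem rust model:

```python
# Hedged sketch of a weather-input error analysis: compare a toy risk
# index computed from reference temperatures against biased/noisy inputs.

import numpy as np

def risk_index(temp_c: np.ndarray) -> int:
    """Toy index: hours with temperature in a favorable 15-25°C band."""
    return int(np.sum((temp_c >= 15) & (temp_c <= 25)))

rng = np.random.default_rng(1)
reference = 18 + 6 * np.sin(np.linspace(0, 2 * np.pi, 24))  # hourly day

for bias in (0.0, 0.5, 1.0, 2.0):
    perturbed = reference + bias + rng.normal(0, 0.5, reference.size)
    print(f"bias={bias:+.1f}  index={risk_index(perturbed)}  "
          f"reference={risk_index(reference)}")
```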


Subject(s)
Decision Support Techniques , Plant Diseases/prevention & control , Research Design/standards , Weather , Agriculture/economics , Agriculture/methods , Agriculture/trends , Algorithms , Models, Biological , Plant Diseases/microbiology , Software , Time Factors
6.
Plant Dis ; 94(5): 581-588, 2010 May.
Article in English | MEDLINE | ID: mdl-30754461

ABSTRACT

The blackberry rust pathogen Phragmidium violaceum was first observed in Oregon in spring 2005 on both commercially cultivated Rubus laciniatus (Evergreen blackberry) and naturalized R. armeniacus (Himalayan blackberry). Several commercial plantings suffered severe economic losses. In 2006 to 2008, all five spore stages of this autoecious, macrocyclic rust pathogen were observed annually, and asexual perennation of the pathogen on old leaves or in leaf buds was not evident in the disease cycle. In field experiments, teliospore germination and infection by basidiospores occurred mostly during April. On potted "trap" plants exposed for periods of 1 week under dense collections of dead leaves bearing teliospores, basidiospore infection was associated with wetness durations of >16 h with mean temperatures >8°C. Trap plants placed under the bundles of collected leaves frequently developed spermagonia, whereas only 1 of 630 trap plants placed in a production field of R. laciniatus became diseased, an indication that the effective dispersal distance of basidiospores may be limited. In growth chambers programmed for constant temperatures of 5, 10, 15, 20, 25, and 30°C, a minimum of six continuous hours of leaf wetness was required for infection by urediniospores, with >9 h required for moderate infection (>4 pustules/cm²) at 15 and 20°C. With diurnal temperature regimes averaging 5, 10, 15, 20, or 25°C, urediniospore germination and infection were highest in the range of 5 to 15°C; similarly, in the diurnal environment, >9 h of leaf wetness was required to attain moderate infection. In the field, lime sulfur applied as a delayed dormant treatment significantly suppressed teliospore germination and basidiospore infection. Over two seasons, one application of myclobutanil, a demethylation-inhibitor fungicide, applied in early May near the time of spermagonial appearance provided effective suppression of the summer epidemic.

7.
Phytopathology ; 98(10): 1107-17, 2008 Oct.
Article in English | MEDLINE | ID: mdl-18943457

ABSTRACT

The incidence of hop powdery mildew on leaves, caused by Podosphaera macularis, collected from 1,606 transects in 77 commercial hop yards in Oregon and Washington over 9 years was used to assess variability in the heterogeneity of disease and the estimated binary power law parameters. Spatial analyses of data sets were conducted at the level of individual rows (row level) and multiple rows within a yard (yard level). The binary power law provided a good fit to all data sets, with R² values ranging from 0.933 to 0.993. At the row level, the intercept parameter ln(Ax) was >0 for 8 years, but was not significantly greater than 0 in 2006. The parameter b was greater than 1 for all row-level data sets collected from 1999 to 2005, but was <1 in 2006 and not significantly different from 1 in 2007. Covariance analysis indicated the factor 'region' affected ln(Ax) in 3 years, and b in 2 years. 'Cultivar' had an effect on ln(Ax) in 3 years and on b in 1 year. At the yard level, ln(Ax) was greater than 0 for 6 years, but in 2006 and 2007, ln(Ax) was not significantly different from 0. The slope parameter b was greater than 1 in 6 years, but was not significantly different from 1 in 2006 and 2007. Differences in b among years were large enough to have practical implications for sample sizes and the precision of fixed and sequential sampling. Although the binary power law parameters tended to be relatively stable, variability of the estimated parameters may have practical consequences for sampling precision and costs.
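The binary power law relates the observed variance of disease incidence to the binomial (random) variance on a log-log scale, ln(V_obs) = ln(Ax) + b·ln(V_bin) with V_bin = p(1 − p)/n. A minimal fitting sketch on synthetic data (not the study's data):

```python
# Fit the binary power law ln(V_obs) = ln(Ax) + b*ln(V_bin) by ordinary
# least squares on log-transformed variances; data are synthetic.

import numpy as np

n = 10                                         # plants per sampling unit
p = np.array([0.05, 0.10, 0.20, 0.35, 0.50])   # mean incidence per data set
v_bin = p * (1 - p) / n                        # binomial variance
true_lnA, true_b = 0.6, 1.3                    # aggregation: A > 1, b > 1
v_obs = np.exp(true_lnA) * v_bin ** true_b     # exact power-law variances

b, lnA = np.polyfit(np.log(v_bin), np.log(v_obs), 1)
print(f"ln(Ax) = {lnA:.2f}, b = {b:.2f}")      # recovers 0.60 and 1.30
```

Estimates of b > 1 with A > 1, as found for most years here, indicate aggregation of diseased plants beyond what a random (binomial) pattern would produce.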


Subject(s)
Ascomycota/growth & development , Plant Diseases/microbiology , Plant Diseases/statistics & numerical data , Plant Leaves/microbiology , Ascomycota/isolation & purification , Binomial Distribution , Incidence , Models, Statistical , Oregon , Washington
8.
J Nematol ; 34(2): 98-105, 2002 Jun.
Article in English | MEDLINE | ID: mdl-19265915

ABSTRACT

A 3-year field rotation study was conducted to assess the potential of switchgrass (Panicum virgatum) to suppress root-knot nematodes (Meloidogyne arenaria), southern blight (Sclerotium rolfsii), and aflatoxigenic fungi (Aspergillus spp.) in peanut (Arachis hypogaea L.), and to assess shifts in microbial populations following crop rotation. Switchgrass did not support populations of root-knot nematodes but supported high populations of nonparasitic nematodes. Peanut with no nematicide applied and following 2 years of switchgrass had the same nematode populations as continuous peanut plus nematicide. Neither previous crop nor nematicide significantly reduced the incidence of pods infected with Aspergillus. However, pod invasion by A. flavus was highest in plots previously planted with peanut and not treated with nematicide. Peanut with nematicide applied at planting following 2 years of switchgrass had a significantly lower incidence of southern blight than either continuous peanut without nematicide application or peanut without nematicide following 2 years of cotton. Peanut yield did not differ among rotations in either sample year. Effects of crop rotation on the microbial community structure associated with peanut were examined using indices for diversity, richness, and similarity derived from culture-based analyses. Continuous peanut supported a distinctly different rhizosphere bacterial microflora compared with peanut following 1 year of switchgrass or continuous switchgrass. Richness and diversity indices for the continuous peanut rhizosphere and geocarposphere were not consistently different from those for peanut following switchgrass, but the communities always differed in the specific genera present. These shifts in community structure were associated with changes in parasitic nematode populations.

9.
Appl Environ Microbiol ; 63(4): 1617-22, 1997 Apr.
Article in English | MEDLINE | ID: mdl-9097457

ABSTRACT

Field releases of the wild-type plant growth-promoting rhizobacterium Pseudomonas fluorescens 89B-27, its bioluminescent derivative GEM-8 (89B-27::Tn4431), and a spontaneous rifampin-resistant variant (R34) were conducted to compare marker systems for estimating the wild-type population. Seed and root samples were taken 0, 7, 14, 21 or 28, 35 or 42, and 70 days after planting in each year and processed for enumeration by spiral plating or immunofluorescent colony staining (IFC). In both years, the populations of 89B-27, R34, and GEM-8, as measured by IFC, were not significantly different (P > 0.05) from each other at each sampling time. However, the populations of R34 and GEM-8, as measured by spiral plating and differentiated based on their respective phenotypes, were significantly lower (P < 0.05) than the wild-type populations and their own IFC-determined populations. These data indicate that traditional marker systems may underestimate populations, and hence the survival and colonization, of genetically marked bacteria.


Subject(s)
Bacterial Typing Techniques , Environmental Microbiology , Rhizobiaceae/classification , Anti-Bacterial Agents/pharmacology , Drug Resistance, Microbial , Fluorescent Antibody Technique , Luminescence , Rhizobiaceae/drug effects
10.
Can J Microbiol ; 43(4): 344-53, 1997 Apr.
Article in English | MEDLINE | ID: mdl-9115091

ABSTRACT

The future use of genetically modified microorganisms in the environment will depend on the ability to assess potential or theoretical risks associated with their introduction into natural ecosystems. To assess potential risks, several ecological parameters must be examined, including the impact of the introduced genetically modified organism on the microbial communities associated with the environment into which the introduction will occur. A 2-year field study was established to examine whether the indigenous bacterial communities of the rhizosphere and endorhiza (internal root tissues) were affected differently by the introduction of an unaltered wild type and its genetically modified derivative. Treatments consisted of the wild-type strain Pseudomonas fluorescens 89B-27 and a bioluminescent derivative, GEM-8 (89B-27::Tn4431). Cucumber root or seed samples were taken 0, 7, 14, 21, 35, and 70 days after planting (DAP) in 1994 and 0, 7, 14, 28, 42, and 70 DAP in 1995. Samples were processed to examine the bacterial communities of both the rhizosphere and endorhiza. Over 7,200 bacterial colonies were isolated from the rhizosphere. Community structure at the genus level was assessed using genus richness and Hill's diversity numbers, N1 and N2. The aerobic-heterotrophic bacterial community structure at the genus level did not vary significantly between treatments but did differ temporally. The data indicate that the introduction of the genetically modified derivative of 89B-27 did not pose a greater environmental risk than its unaltered wild type with respect to aerobic-heterotrophic bacterial community structure.
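Hill's diversity numbers N1 and N2 used above are the exponential of Shannon entropy and the inverse Simpson index, respectively. A minimal sketch computing them from genus-level colony counts (the counts are hypothetical, not the study's data):

```python
# Hill's diversity numbers from genus counts: N1 = exp(H') and
# N2 = 1 / sum(p_i^2); counts below are hypothetical.

import numpy as np

def hill_numbers(counts):
    p = np.asarray(counts, dtype=float)
    p = p[p > 0] / p.sum()                # relative abundances
    n1 = np.exp(-np.sum(p * np.log(p)))   # effective number of common genera
    n2 = 1.0 / np.sum(p ** 2)             # effective number of dominant genera
    return n1, n2

n1, n2 = hill_numbers([120, 45, 30, 10, 3])  # colonies per genus (example)
print(f"N1 = {n1:.2f}, N2 = {n2:.2f}")
```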


Subject(s)
Cucumis sativus/microbiology , Plant Roots/microbiology , Pseudomonas fluorescens/isolation & purification , Pseudomonas fluorescens/classification