Results 1 - 6 of 6
1.
Phytopathology; 110(4): 734-743, 2020 Apr.
Article in English | MEDLINE | ID: mdl-31859585

ABSTRACT

Studies in plant pathology, agronomy, and plant breeding requiring disease severity assessment often use quantitative ordinal scales (i.e., a special type of ordinal scale that uses defined numeric ranges); a frequently used example of such a scale is the Horsfall-Barratt scale. Parametric proportional odds models (POMs) may be used to analyze the ratings obtained from quantitative ordinal scales directly, without converting ratings to percent area affected using range midpoints of such scales (currently a standard procedure). Our aim was to evaluate the performance of the POM for comparing treatments using ordinal estimates of disease severity relative to two alternatives, the midpoint conversions (MCs) and nearest percent estimates (NPEs). A simulation method was implemented, and the parameters of the simulation were estimated using actual disease severity data from the field. The criterion for comparison of the three approaches was the power of the hypothesis test (the probability of rejecting the null hypothesis when it is false). Most often, NPEs had superior performance. The performance of the POM was never inferior to using the MC at severity <40%. Especially at low disease severity (≤10%), the POM was superior to the MC method. Thus, for early disease onset or when comparing treatments with severities <40%, the POM is preferable for analyzing disease severity data based on quantitative ordinal scales; at severities >40%, it is equivalent to the other methods.


Subject(s)
Plant Diseases, Plant Pathology, Data Collection, Probability, Research Design
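
The midpoint-conversion (MC) baseline discussed above is straightforward to reproduce. The sketch below converts Horsfall-Barratt grades to percent-area midpoints using the commonly cited 12-grade HB intervals; the function name and example ratings are illustrative assumptions, not taken from the study.

```python
# Sketch: convert Horsfall-Barratt (HB) ordinal grades to percent-area
# midpoints, i.e., the "MC" procedure the abstract compares against the POM.
# The interval boundaries below are the commonly cited 12-grade HB scale;
# verify them against the scale actually used before relying on this.

# HB grade -> (lower %, upper %) interval of percent leaf area affected
HB_INTERVALS = {
    1: (0.0, 0.0),
    2: (0.0, 3.0),
    3: (3.0, 6.0),
    4: (6.0, 12.0),
    5: (12.0, 25.0),
    6: (25.0, 50.0),
    7: (50.0, 75.0),
    8: (75.0, 88.0),
    9: (88.0, 94.0),
    10: (94.0, 97.0),
    11: (97.0, 100.0),
    12: (100.0, 100.0),
}

def hb_to_midpoint(grade: int) -> float:
    """Return the arithmetic midpoint of the HB interval for a grade."""
    lo, hi = HB_INTERVALS[grade]
    return (lo + hi) / 2.0

ratings = [2, 5, 6, 3]  # hypothetical HB ratings for four leaves
print([hb_to_midpoint(g) for g in ratings])  # [1.5, 18.5, 37.5, 4.5]
```

A proportional odds model can instead be fitted to the raw grades themselves (for example with statsmodels' OrderedModel) rather than to these midpoints; that direct analysis versus the MC detour is the comparison the study makes.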
2.
Plant Dis; 99(8): 1104-1112, 2015 Aug.
Article in English | MEDLINE | ID: mdl-30695946

ABSTRACT

Assessment of disease severity is required for several purposes in plant pathology; most often, the estimates are made visually. It is established that visual estimates can be inaccurate and unreliable. The ramifications of biased or imprecise estimates by raters have not been fully explored using empirical data, partly because of the logistical difficulties involved in having different raters assess the same leaves for which actual disease has been measured in a replicated experiment with multiple treatments. In this study, nearest percent estimates (NPEs) of Septoria leaf blotch (SLB) on leaves of winter wheat from nontreated and fungicide-treated plots were assessed in both 2006 and 2007 by four raters and compared with assumed actual values measured using image analysis. Lin's concordance correlation (LCC, ρc) was used to assess agreement between the two approaches. NPEs were converted to Horsfall-Barratt (HB) midpoints and compared with actual values. The estimates of SLB severity from fungicide-treated and nontreated plots were analyzed using generalized linear mixed modeling to ascertain rater effects using both the NPE and HB values. Rater 1 showed good accuracy (ρc = 0.986 to 0.999), while raters 3 and 4 were less accurate (ρc = 0.205 to 0.936). Conversion to the HB scale had little effect on bias but numerically reduced both precision and accuracy for most raters on most assessment dates (precision, r = -0.001 to -0.132; and accuracy, ρc = -0.003 to -0.468). Interrater reliability was also reduced slightly by conversion of estimates to HB midpoint values. Estimates of mean SLB severity were significantly different between image analysis and raters 2, 3, and 4, and there were frequently significant differences among raters (F = 151 to 1,260, P = 0.001 to P < 0.0001). Only on 26 June 2007 did conversion to the HB scale change the means separation ranking of rater estimates. Nonetheless, image analysis and all raters were able to differentiate the nontreated and fungicide-treated plots (F = 116 to 1,952, P = 0.002 to P < 0.0001, depending on date and rater). Conversion of NPEs to the HB scale tended to reduce F values slightly (2006: NPEs, F = 116 to 276, P = 0.002 to 0.0005; and, for the HB-converted values, F = 101 to 270, P = 0.002 to 0.0005; 2007: NPEs, F = 164 to 1,952, P = 0.001 to P < 0.0001; and, for HB-converted values, F = 126 to 1,633, P = 0.002 to P < 0.0001). The results reaffirm the need for accurate and reliable disease assessment to minimize over- or underestimates compared with actual disease, and the data we present support the view that, where multiple raters are deployed, they should be assigned in a manner that reduces any potential effect of rater differences on the analysis.
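
Lin's concordance correlation coefficient (LCC, ρc), the agreement statistic used above, is simple to compute from paired measurements. A minimal sketch follows, with hypothetical severity values standing in for the study's data.

```python
import numpy as np

def lins_ccc(x, y):
    """Lin's concordance correlation coefficient between two raters/methods.

    rho_c = 2*cov(x, y) / (var(x) + var(y) + (mean(x) - mean(y))**2),
    using population (1/n) variance and covariance, per Lin (1989).
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    sxy = np.mean((x - x.mean()) * (y - y.mean()))
    return 2.0 * sxy / (x.var() + y.var() + (x.mean() - y.mean()) ** 2)

# Hypothetical example: visual estimates vs. image-analysis "actual" severity (%)
actual = [2.0, 5.5, 11.0, 24.0, 40.0]
estimate = [3.0, 6.0, 14.0, 22.0, 45.0]
print(round(lins_ccc(actual, estimate), 3))  # values near 1 = good agreement
```

Unlike Pearson's r, ρc penalizes systematic bias (the mean-difference term in the denominator) as well as scatter, which is why it suits accuracy comparisons between raters and a measured reference.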

3.
Plant Dis; 95(4): 384-393, 2011 Apr.
Article in English | MEDLINE | ID: mdl-30743337

ABSTRACT

The Septoria leaf blotch prediction model PROCULTURE was used to assess the impact on simulated infection rates of using rainfall estimated by radar instead of rain gauge measurements. When infection events simulated by PROCULTURE from radar-derived data were compared with those simulated from gauge-derived data, the probability of detection (POD) was high (0.83 on average), but the false alarm ratio (FAR) was not negligible (0.24 on average). For most stations, the FAR decreased to 0 and the POD increased (0.85 on average) when the model outputs for both datasets were compared against visual observations of disease symptoms. An analysis of 148 infection events over 3 years at four locations showed no significant difference in the number of infection events simulated with either dataset, indicating that, for a given location, radar estimates were as reliable as rain gauges for predicting infection events. Radar also provided better estimates of rainfall occurrence over a continuous space than weather station networks. This high spatial resolution gives radar an important advantage that could significantly improve existing warning systems.
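
The two verification statistics used above come from a standard 2 × 2 event-contingency table: POD = hits / (hits + misses) and FAR = false alarms / (hits + false alarms). A minimal sketch, with hypothetical event pairs rather than the study's data:

```python
def pod_far(events):
    """Compute probability of detection (POD) and false alarm ratio (FAR)
    from paired (predicted, observed) booleans, one pair per candidate event.

    POD = hits / (hits + misses)
    FAR = false alarms / (hits + false alarms)
    """
    hits = sum(1 for pred, obs in events if pred and obs)
    misses = sum(1 for pred, obs in events if not pred and obs)
    false_alarms = sum(1 for pred, obs in events if pred and not obs)
    pod = hits / (hits + misses) if hits + misses else float("nan")
    far = false_alarms / (hits + false_alarms) if hits + false_alarms else float("nan")
    return pod, far

# Hypothetical (simulated infection event, observed infection event) pairs
events = [(True, True), (True, False), (False, True), (True, True), (True, True)]
print(pod_far(events))  # (0.75, 0.25)
```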

4.
Plant Dis; 93(10): 983-992, 2009 Oct.
Article in English | MEDLINE | ID: mdl-30754378

ABSTRACT

A mechanistic model, PROCULTURE, for assessing the development of each of the last five leaf layers and the progress of Septoria leaf blotch, caused by Septoria tritici (teleomorph Mycosphaerella graminicola), was applied to susceptible and weakly susceptible winter wheat (Triticum aestivum) cultivars at two locations (Everlange and Reuland) in Luxembourg over a 3-year period (2000 to 2002). A twofold performance assessment of PROCULTURE was conducted in this study. First, the capability of PROCULTURE to correctly simulate S. tritici incidence was checked. Second, the model's ability to accurately estimate disease severity was assessed on the basis of the difference between simulated and observed levels of disease development at each leaf layer. The model accurately predicted disease occurrence in the 2000 and 2002 seasons on susceptible and weakly susceptible cultivars, with a probability of detection (POD) exceeding 0.90. However, in 2001, even though the POD never fell below 0.90, the false alarm ratio (FAR) was too high to consider the simulations satisfactory. Regarding disease severity, statistical tests showed that PROCULTURE simulated severity accurately for susceptible cultivars in 2000 and 2002. By contrast, for weakly susceptible cultivars, the model overestimated disease severity over the same period, especially for the upper leaves.
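
The abstract does not name the statistical tests used for the severity assessment; as a hedged illustration of the difference-based check it describes (simulated minus observed severity at each leaf layer), one might compute bias and RMSE per layer, as in the sketch below. All names and values are illustrative, not the study's.

```python
import math

def bias_and_rmse(simulated, observed):
    """Mean error (bias) and RMSE between simulated and observed severities (%).

    A positive bias indicates the model overestimates severity, as reported
    for weakly susceptible cultivars above.
    """
    diffs = [s - o for s, o in zip(simulated, observed)]
    bias = sum(diffs) / len(diffs)
    rmse = math.sqrt(sum(d * d for d in diffs) / len(diffs))
    return bias, rmse

# Hypothetical severities (%) for one leaf layer over several assessment dates
simulated = [1.0, 4.0, 12.0, 30.0]
observed = [0.5, 3.0, 10.0, 22.0]
print(bias_and_rmse(simulated, observed))  # positive bias = overestimation
```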

5.
Plant Dis; 93(9): 971, 2009 Sep.
Article in English | MEDLINE | ID: mdl-30754567

ABSTRACT

Wheat leaf rust caused by Puccinia triticina Eriks. was identified for the first time in the Grand Duchy of Luxembourg in 2000 on the basis of orange-to-brown, round-to-ovoid, erumpent uredinia (1 to 1.5 mm in diameter) scattered on the upper and lower leaf surfaces and producing orange-brown urediniospores that are subgloboid, approximately 20 µm in diameter, with up to eight germ pores scattered in thick, echinulate walls. Subsequently, wheat was monitored weekly for leaf rust (starting from Zadoks growth stage 30, pseudostem erection) during the 2003-2008 cropping seasons. Disease severity (percentage of leaf area with symptoms) was recorded in four replicated field experiments located in three villages (Diekirch District: Reuler; Grevenmacher District: Burmerange and Christnach) that are representative of the different agroclimatological zones of Luxembourg. A significant difference in severity was observed between sites (P < 0.01) and years (P < 0.05). Over the 6-year period, Burmerange and Reuler consistently showed the highest and lowest disease severity, respectively. In 2003 and 2007, Burmerange (a southern site with the highest average spring temperatures, 13.6 and 14.0°C, respectively) showed the highest disease severity, 66 and 57%, respectively, whereas the lowest severity (<1% in both years) was observed in the north at Reuler (the site with the lowest average spring temperatures, 12.0 and 12.4°C, respectively). Christnach, located midway between Reuler and Burmerange, showed intermediate disease severity of 7% (2003) and 22% (2007). The disease appeared at growth stages 77 (late milk) and 87 (hard dough) in 2003-2005, but at an earlier stage (45, boots swollen) in 2006-2008 (P < 0.001). In 2005, low severity was recorded because of a severe drought during May, June, and July. The earlier appearance of leaf rust may be related to an increase in the average spring temperature (the average March-to-May temperature for Luxembourg was 8.3°C for 1971-2000, 9.5°C for 2003-2005, and 9.9°C for 2006-2008; 2007 was exceptional at 11.9°C; P < 0.01). In the past, cereal disease management strategies were oriented toward the control of predominant, yield-reducing diseases such as that caused by Septoria tritici Desm. Because a succession of mild winters and warm springs over the last 5 years has allowed the early occurrence and fast development of wheat leaf rust in the Grand Duchy of Luxembourg, it is advisable to take this disease into account in fungicide application schemes.

6.
Plant Dis; 92(11): 1587, 2008 Nov.
Article in English | MEDLINE | ID: mdl-30764457

ABSTRACT

Following a comparatively mild winter (1.9°C above the 2000-2007 average), Fusarium head blight (FHB) on winter wheat was observed during the 2007 season at 17 sites representing all three districts: Diekirch, Grevenmacher, and Luxembourg. The cultivars encountered were diverse and included Achat, Akteur, Aron, Bussard, Cubus, Enorm, Exclusiv, Flair, Rosario, Tommi, and Urban. The preceding crops were maize (six sites), rapeseed (three sites), and one site each of pea, triticale, winter barley, and winter wheat. Rainfall recorded during the flowering period (June 1-23; mean date June 12 for GS 65) ranged from 13 to 62 mm (mean 38 mm). An overall FHB prevalence (percentage of infected spikes) of 8.9 ± 15.5% (mean ± SD) and a severity (percentage of infected grains per spike) of 21.0 ± 17.8% were recorded. A significant difference in FHB severity was observed between the cantons north and south of Luxembourg City, 13.4 ± 13.1% (range 0.01 to 46.4) and 35.1 ± 18.1% (range 6.2 to 61.9), respectively (Mann-Whitney, P = 0.027), indicating the importance of taking regional specificities, such as topoclimatological aspects, into account. Maize as a preceding crop resulted in significantly higher FHB prevalence than the other preceding crops (5.9 ± 1.6% versus 3.3 ± 2.2%; Mann-Whitney, P = 0.022).
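
The regional comparison above uses the Mann-Whitney U test, a nonparametric test appropriate for the skewed severity distributions reported. A minimal sketch with scipy follows; the severity values are illustrative, not the study's data.

```python
from scipy.stats import mannwhitneyu

# Hypothetical FHB severities (%) for sites north vs. south of Luxembourg City
north = [0.1, 5.2, 13.4, 20.0, 28.7, 46.4]
south = [6.2, 22.5, 35.1, 41.0, 55.3, 61.9]

# Two-sided test of whether the two regions' severity distributions differ
stat, p = mannwhitneyu(north, south, alternative="two-sided")
print(f"U = {stat}, p = {p:.3f}")  # small p -> severities differ between regions
```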
