Results 1 - 20 of 35
1.
Plant Dis ; 103(2): 315-323, 2019 Feb.
Article in English | MEDLINE | ID: mdl-30540536

ABSTRACT

Heart rot, caused by Alternaria alternata, is a major pomegranate disease that impacts production worldwide; most fruits in orchards are colonized by A. alternata, yet symptoms are apparent on only a small proportion of the colonized fruits. During our previous research we noticed that, within individual orchards, the incidence of pomegranate fruits exhibiting heart rot symptoms was related to the visual appearance of the trees: trees that appeared visually frail bore more diseased fruits than robust trees. Furthermore, the disease responses of different pomegranate accessions, and possibly of different variants of the same cultivar, varied markedly. The specific objectives of the present study were: (i) to characterize the relationship between the visual appearance of pomegranate plants or individual stems and their vulnerability to heart rot; and (ii) to elucidate factors affecting the response of pomegranate fruit to A. alternata. Analysis of heart rot incidence in four orchards in 2014 revealed large differences among trees growing side by side in the same orchard; these differences were related to the visual appearance of the trees. There were significant differences among germination rates of A. alternata spores in juice prepared from asymptomatic fruits originating from these trees, and comparable differences were found among the acidity levels (pH) of the juices. These differences may reflect differences among the physiological responses of pomegranate trees to heart rot. Fruits collected from the pomegranate collection located in Newe Ya'ar, which comprised 95 accessions in 2015 and 110 accessions in 2016, were also examined. There were differences among the acidity levels (pH) of the juices produced from these fruits and among the germination rates of A. alternata spores in the juices. These differences may reflect variation among the genetic responses of pomegranate accessions to heart rot. Results of studying the relationship between the acidity of pomegranate juice and the germination rates of A. alternata spores supported the hypothesis that, apart from pH, a compound (or compounds) present in the juice regulates the germination of A. alternata spores.


Subject(s)
Alternaria, Fruit, Host-Pathogen Interactions, Lythraceae, Alternaria/physiology, Fruit/microbiology, Lythraceae/microbiology
2.
Phytopathology ; 106(8): 833-41, 2016 Aug.
Article in English | MEDLINE | ID: mdl-27050576

ABSTRACT

Wild Pisum populations prevail in Israel in regions with diverse climatic conditions. A comprehensive survey was conducted in the winters of 2007-08 and 2008-09 at two sites in northern Israel, aiming to (i) document the density of Pisum elatius plants in natural ecosystems and elucidate factors related to their initial infection by Ascochyta blight, and (ii) determine the factors governing disease development over time on individual plants. The surveyors identified P. elatius plants growing in designated quadrats, inspected each plant visually, and recorded the incidence and severity of its Ascochyta blight symptoms. Ascochyta blight, caused by Peyronellaea pinodes, was ubiquitous in P. elatius populations at both survey sites in both seasons. However, the total leaf area of individual plants exhibiting disease symptoms was very low, and stem and pod infections were rarely observed. Based on analyses of the survey data, it was suggested that, in natural ecosystems, the teleomorph stage of P. pinodes serves as the main source of both the primary and the secondary inoculum of the disease. In addition, it was found that infected leaves dropped off soon after infection, thereby precluding the development of stem lesions. The plants continued growing and did not die; thus, they overcame the disease and could be considered "cured". This phenomenon was examined and confirmed in artificially inoculated, potted-plant experiments. It would be worthwhile to exploit the potential of this unique resistance mechanism as a tool for Ascochyta blight management in pea breeding.


Subject(s)
Ascomycota/physiology, Ecosystem, Pisum sativum/microbiology, Plant Diseases/microbiology, Israel, Population Density
3.
Phytopathology ; 106(8): 824-32, 2016 Aug.
Article in English | MEDLINE | ID: mdl-27050578

ABSTRACT

Domesticated pea fields are grown in relatively close proximity to wild pea species in Israel. Despite the major role attributed to ascochyta blight in causing yield losses in domesticated pea, very limited information is available on the pathogens prevailing in natural ecosystems. The objectives of this study were (i) to identify the species causing ascochyta blight symptoms on leaves, stems, and petioles of domesticated pea and wild Pisum plants in Israel, and (ii) to quantify the temperature response(s) and aggressiveness of such pathogens originating from Pisum plants growing in sympatric and allopatric contexts. Eighteen fungal isolates were examined and identified; three of them were sampled from Pisum sativum, 11 from Pisum fulvum, and four from Pisum elatius. All isolates were identified as Peyronellaea pinodes. Spore germination and mycelial growth took place over a wide range of temperatures, the lower and upper cardinal temperatures being 2 to 9 and 33 to 38°C, respectively; the optimal temperatures ranged from 22 to 26°C. At an optimal temperature, disease severity was significantly higher for plants maintained under moist conditions for 24 h postinoculation than for those exposed to humidity for 5 or 10 h. Analyses of the data revealed that temperature responses, spore germination rates, and aggressiveness of isolates sampled from domesticated pea plants did not differ from those of isolates sampled from adjacent or distant wild populations. Host specificity was not observed. These observations suggest that Israel may be inhabited by a single metapopulation of P. pinodes.


Subject(s)
Ascomycota/physiology, Pisum sativum/microbiology, Plant Diseases/microbiology, Israel, Species Specificity, Temperature
4.
Phytopathology ; 106(3): 254-61, 2016 Mar.
Article in English | MEDLINE | ID: mdl-26623996

ABSTRACT

Clavibacter michiganensis subsp. michiganensis, the causal agent of bacterial canker and wilt of tomato, is considered one of the most important bacterial pathogens worldwide. In 2000, the number of infected greenhouses and the severity of the disease increased in Israel. As part of the effort to cope with the disease, a comprehensive survey was conducted. Scouts recorded disease severity monthly in 681 production units. At the end of the season, the scouts met with the growers and together recorded relevant details about the crop and the cultural practices employed. The results suggested an absence of anisotropy in the study region. Global Moran's I analysis showed that disease severity had significant spatial autocorrelation. The strongest spatial autocorrelation occurred within a 1,500-m neighborhood, which is comparable to the distance between production units maintained by a single grower (Farm). Next, we tested three groups of variables, either including or excluding the Farm as a variable. When the Farm was included, the explained variation increased in all of the studied models. Overall, the results of this study demonstrate that the factor most strongly influencing bacterial canker severity was the Farm. This variable probably encompasses variation in experience, differences in agricultural practices between growers, and the quality of implementation of management practices.
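Global Moran's I, the spatial-autocorrelation statistic used in this survey, can be sketched as follows; the severity values and the binary neighborhood weight matrix below are hypothetical, for illustration only:

```python
def morans_i(values, weights):
    """Global Moran's I: spatial autocorrelation of `values` given a
    symmetric spatial weight matrix `weights` (with weights[i][i] = 0).
    Positive values indicate that similar severities cluster in space."""
    n = len(values)
    mean = sum(values) / n
    dev = [v - mean for v in values]
    w_sum = sum(sum(row) for row in weights)
    num = sum(weights[i][j] * dev[i] * dev[j]
              for i in range(n) for j in range(n))
    den = sum(d * d for d in dev)
    return (n / w_sum) * (num / den)

# Hypothetical example: 4 production units along a road; units are
# "neighbors" (weight 1) when within 1,500 m of each other.
severity = [0.8, 0.7, 0.1, 0.2]   # disease severity per production unit
w = [[0, 1, 0, 0],
     [1, 0, 1, 0],
     [0, 1, 0, 1],
     [0, 0, 1, 0]]
print(round(morans_i(severity, w), 3))   # -> 0.315 (positive: clustered)
```

A significantly positive I, as reported in the abstract, means nearby production units tend to share similar severity levels.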


Subject(s)
Actinobacteria/physiology, Environment, Controlled, Plant Diseases/microbiology, Solanum lycopersicum/microbiology, Israel, Risk Factors
5.
Phytopathology ; 102(8): 769-78, 2012 Aug.
Article in English | MEDLINE | ID: mdl-22624774

ABSTRACT

Alternaria alternata is the predominant fungus involved in moldy core and core rot of Red Delicious apples. The effects of environmental conditions during bloom on moldy core and core rot, and on the need for fungicide application, were examined in 10 experiments carried out in 2007. In untreated experimental plots, typical moldy core symptoms were very common, with relatively low variability among experiments (coefficient of variation: 22.2%); core rot incidence ranged from 2 to 26%, with large variability among experiments (coefficient of variation: 90.0%). No evidence was detected that the environmental conditions prevailing during bloom affected the development of moldy core or core rot. No effect of fungicide application (a mixture of bromuconazole + captan three times a week at bloom) on moldy core or core rot was found. A random distribution of moldy core and occasional aggregation of core rot in the orchards were indicated by Morisita's index of dispersion (Iδ). The hypothesis that core rot incidence is governed by host physiology and that yield load can be used as an indicator of trees' susceptibility was examined in a set of eight observations and four experiments. No correlation was found between tree yield load and moldy core, but core rot incidence was inversely related to yield load. Furthermore, irrespective of tree yield load, core rot was more abundant on large fruits than on small ones. It is concluded that host physiology, rather than pathogen occurrence or environmental conditions at the bloom stage, governs the development of core rot caused by A. alternata in Red Delicious apples in Israel.


Subject(s)
Alternaria/pathogenicity, Malus/microbiology, Malus/physiology, Plant Diseases/microbiology
6.
Phytopathology ; 100(1): 97-104, 2010 Jan.
Article in English | MEDLINE | ID: mdl-19968555

ABSTRACT

The individual and joint effects on tomato late blight (caused by Phytophthora infestans) of covering the soil with polyethylene mulch before planting and of fungicides commonly used by organic growers were studied in three experiments conducted from 2002 to 2005. Application of fungicides resulted in inconsistent and insufficient late blight suppression (control efficacy ± standard error of 34.5 ± 14.3%), but the polyethylene mulch resulted in consistent, effective, and highly significant suppression of the disease (control efficacy of 83.6 ± 5.5%). The combined effect of the two measures was additive. In a second set of three experiments, carried out between 2004 and 2006, the type of polyethylene mulch used (bicolor aluminized, clear, or black) did not affect the efficacy of late blight suppression (control efficacy of 60.1 to 95.8%); the differences among the mulch types were not significant. Next, the ability of the mulch to suppress cucumber downy mildew (caused by Pseudoperonospora cubensis) was studied in four experiments carried out between 2006 and 2008. The mulch effectively suppressed cucumber downy mildew, but the effect was less substantial (control efficacy of 34.9 ± 4.8%) than that achieved for tomato late blight. The disease-suppressing effect of the mulch appeared to stem from a reduction in leaf wetness duration, because mulching reduced both the frequency of nights on which dew formed and the number of dew hours per night when it did form. Mulching also reduced relative humidity in the canopy, which may have reduced sporulation.
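The control efficacy values quoted in this abstract follow the standard definition relative to untreated plots; a minimal sketch with hypothetical severities:

```python
def control_efficacy(treated, untreated):
    """Percent control efficacy relative to the untreated check:
    100 * (1 - severity_treated / severity_untreated)."""
    return 100.0 * (1.0 - treated / untreated)

# Hypothetical severities (% leaf area diseased) chosen to mimic the
# mulch-level suppression reported above:
print(round(control_efficacy(8.0, 50.0), 1))   # -> 84.0
```

A value near 84% corresponds to the consistent mulch effect reported (83.6 ± 5.5%), whereas a value near 35% would correspond to the fungicide treatments.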


Subject(s)
Cucumis sativus/microbiology, Oomycetes/drug effects, Oomycetes/physiology, Phytophthora infestans/drug effects, Phytophthora infestans/physiology, Polyethylene/pharmacology, Solanum lycopersicum/microbiology, Fungicides, Industrial/pharmacology, Microclimate, Plant Diseases/microbiology, Plant Diseases/prevention & control
7.
Phytopathology ; 99(6): 775-81, 2009 Jun.
Article in English | MEDLINE | ID: mdl-19453238

ABSTRACT

Conditions affecting germination and growth of Fusarium mangiferae, causal agent of mango malformation disease, were studied in vitro. Both conidial germination and colony growth required temperatures >5°C and peaked at 28 and 25°C, respectively. A minimum 2-h wetness period was required for conidial germination, which peaked after 8 h of wetness. A high incidence of fungal colonization was detected in buds, predominantly the apical buds, compared with inoculated leaves. The pathogen was detected in the roots of plants grown in inoculated soil 19 weeks postinoculation, but not in aboveground parts of the plants, and no disease symptoms were observed. Dry, malformed inflorescence debris serving as a source of inoculum caused significantly higher colonization of inoculated buds (52 and 20%) than of the untreated controls (0%). The incidence of sampled leaf disks bearing propagules of F. mangiferae from an infected orchard peaked in June and July and decreased during the following months, whereas airborne infections on 1-month-old branches were highest in May and June, corresponding with the availability of inoculum released from infected inflorescences. The colonization pattern, determined in naturally infected vegetative and woody branches, showed significantly higher colonization in node sections than in internode sections. This study sheds light on the infection dynamics, colonization patterns, and disease cycle of F. mangiferae in mango.


Subject(s)
Fusarium/pathogenicity, Mangifera/microbiology, Mycoses/microbiology, Plant Diseases/microbiology, Climate, Flowers/microbiology, Germination/physiology, Israel, Mangifera/growth & development, Plant Leaves/microbiology, Plant Roots/microbiology, Seasons, Temperature, Trees/microbiology
8.
Phytopathology ; 99(2): 160-6, 2009 Feb.
Article in English | MEDLINE | ID: mdl-19159308

ABSTRACT

Inoculum availability and conidial dispersal patterns of Fusarium mangiferae, causal agent of mango malformation disease, were studied during 2006 and 2007 in an experimental orchard. The spatial pattern of primary infections in a heavily infected commercial mango orchard corresponded with a typical dispersal pattern caused by airborne propagules. Malformed inflorescences were first observed in mid-March, increased gradually to a peak in May, and declined to negligible levels by August. The sporulation capacity of the malformed inflorescences was evaluated during three consecutive months. Significantly higher numbers of conidia per gram of malformed inflorescence were detected in May and June than in April. Annual conidial dissemination patterns were evaluated by active and passive trapping of conidia. A peak in trapped airborne conidia was detected in May and June in both years. The daily pattern of conidial dispersal was not associated with a specific, discernible time of day, and an exponential relationship was found between mean relative humidity (RH) and the mean number of trapped conidia: higher numbers of conidia were trapped when RH values were low (<55%). This is the first detailed report on airborne dispersal of F. mangiferae, which serves as the primary means of inoculum spread.
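The exponential RH-versus-conidia relationship described here can be sketched as a log-linear least-squares fit; the trap counts below are hypothetical, chosen only to illustrate the declining-with-RH shape:

```python
import math

def fit_exponential(rh, counts):
    """Least-squares fit of counts = a * exp(b * rh), performed
    linearly on log(counts); returns (a, b)."""
    n = len(rh)
    logs = [math.log(c) for c in counts]
    mx, my = sum(rh) / n, sum(logs) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(rh, logs)) /
         sum((x - mx) ** 2 for x in rh))
    a = math.exp(my - b * mx)
    return a, b

# Hypothetical trap data: mean RH (%) vs. mean trapped conidia.
rh = [40, 50, 60, 70, 80]
counts = [120, 60, 30, 15, 8]
a, b = fit_exponential(rh, counts)
print(b < 0)   # negative slope: fewer conidia trapped at high RH -> True
```

A negative fitted exponent b is consistent with the abstract's observation that trapping peaked at low RH (<55%).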


Subject(s)
Fusarium/physiology, Mangifera/microbiology, Spores, Fungal/physiology, Circadian Rhythm, Host-Pathogen Interactions, Plant Diseases/microbiology, Time Factors
9.
Phytopathology ; 98(5): 600-8, 2008 May.
Article in English | MEDLINE | ID: mdl-18943229

ABSTRACT

Domesticated chickpea (Cicer arietinum) and its wild relative C. judaicum grow in sympatric distribution in Israel, and both are susceptible to Ascochyta blight caused by Didymella rabiei. C. arietinum has been grown for millennia under the drier and hotter Levantine spring conditions, whereas C. judaicum grows in the wetter and milder winters. Accordingly, it is possible that D. rabiei isolates originating from C. arietinum are adapted to the less favorable spring conditions. Here, 60 isolates from both origins were tested in vitro for their hyphal growth at 15 and 25°C. Isolates from C. arietinum had a significantly larger colony area at 25°C than at 15°C (P < 0.001), whereas no such difference was detected for isolates from C. judaicum. D. rabiei isolates of wild and domesticated origins were used to inoculate nine C. judaicum accessions and two domesticated chickpea cultivars, and their aggressiveness patterns were determined using five measures. On domesticated chickpea, isolates of domesticated origin were significantly more aggressive than isolates of wild origin in four of the five aggressiveness measures. On C. judaicum, isolates of wild origin were generally more aggressive than isolates of domesticated origin. The results suggest that the habitat segregation between wild and domesticated Cicer influences the pathogen's ecological affinities and aggressiveness patterns.


Subject(s)
Ascomycota/pathogenicity, Cicer/microbiology, Fabaceae/microbiology, Plant Diseases/microbiology, Ascomycota/isolation & purification, Species Specificity, Temperature, Virulence
10.
Phytopathology ; 97(10): 1284-9, 2007 Oct.
Article in English | MEDLINE | ID: mdl-18943686

ABSTRACT

Foliar symptoms of Eutypa dieback, caused by Eutypa lata, in grapevines (cv. Shiraz) varied from year to year in a 6-year study conducted in South Australia and, although trends were similar for vineyards within geographical regions, differences were observed between regions. We attempted to elucidate the causes underlying this variation and hypothesized that it was influenced by climatic factors. Several possible relationships were identified between climate and symptom expression: (i) increased symptom expression was related to increased winter rainfall 18 months earlier, (ii) decreased disease incidence and prevalence were related to increased temperature in spring, and (iii) a reduction in disease incidence was related to both very high and very low rainfall in October. Theories for these relationships are proposed and require further investigation. A conceptual model, which requires validation, was developed and has the potential to predict the incidence of foliar symptoms of Eutypa dieback. Information from this study could lead to an improved integrated pest management system to suppress foliar symptoms and sustain the productivity of vines infected with E. lata.

11.
Ann Bot ; 96(6): 1137-40, 2005 Nov.
Article in English | MEDLINE | ID: mdl-16157627

ABSTRACT

AIMS: To develop an in-situ, non-destructive method for observation and monitoring of the underground developmental stages of the root parasite Orobanche cumana. SCOPE: The parasitic weed Orobanche causes severe damage to vegetables and field crops. Most of the damage to the crops occurs during the underground, unobservable parasitism stage. Sunflower (Helianthus annuus 'Adi') plants were planted in soil artificially inoculated with O. cumana seeds, and clear Plexiglas mini-rhizotron observation tubes were inserted into the soil. Seed germination, the early stage of penetration, and the formation of tubercles and spikes were observed non-destructively and monitored throughout the growing season by means of a mini-rhizotron camera. This technology enabled the complete life cycle of individual parasites, from very early development (including germination) to the Orobanche shoot, to be monitored. In addition, the effect of the systemic herbicide Cadre (imazapic) on the development of O. cumana was inspected and quantified. CONCLUSIONS: This novel methodology facilitates the in-situ study of major aspects of the host-parasite interaction and of parasite suppression, such as parasitism dynamics, parasite growth rate, and the effect of chemical treatments on the parasite.


Subject(s)
Helianthus/parasitology, Orobanche/growth & development, Host-Parasite Interactions, Orobanche/physiology, Plant Roots/parasitology, Seedlings/growth & development, Time Factors
12.
Phytopathology ; 95(11): 1279-86, 2005 Nov.
Article in English | MEDLINE | ID: mdl-18943358

ABSTRACT

Temperature and wetness conditions required for the development and maturation of Didymella rabiei pseudothecia were determined in a series of experiments conducted under controlled-environment conditions. Initial stages of pseudothecium formation occurred at temperatures ranging from 5 to 15°C. Incubation at low temperatures was essential for subsequent pseudothecium maturation. This requirement was satisfied for chickpea stem segments incubated at 5 or 10°C for three consecutive weeks or during periods of 3 or 5 days separated by periods at higher temperatures. Once the low-temperature requirement had been met, subsequent pseudothecium development was independent of temperature in the range tested (5 to 20°C). Wetness was essential for pseudothecium production: pseudothecia formed and matured on stem segments maintained continuously wet, but also on those exposed to periods of three or five wet days separated by dry periods. The dispersal of D. rabiei ascospores was studied using chickpea plants as living traps in the field. Trap plants were infected mainly when exposed during rain, but also in rainless periods. The results of this study enabled us to describe the developmental events leading to the production of the teleomorph stage and the dispersal of ascospores of D. rabiei in the Mediterranean climate of Israel.

13.
Plant Dis ; 89(10): 1027-1034, 2005 Oct.
Article in English | MEDLINE | ID: mdl-30791268

ABSTRACT

The significance of preventing primary infections resulting from the teleomorph stage of Didymella rabiei was tested in field experiments in 1998 and 2000. Control efficacy was greater and yield and its components were higher in plots where the fungicide difenoconazole had been sprayed in time to protect the plants from infections resulting from airborne ascospores than in plots where sprays were not applied on time. Forty empirical models reflecting the influence of temperature and interrupted wetness on initial maturation of D. rabiei pseudothecia were developed and verified by using data recorded in chickpea fields in 1998. Seven of the models then were validated with data recorded in 1999 and 2000. The following model provided the best predictions: starting at the beginning of the rainy season (October to December), the predictor of the model was assigned one severity value unit when there was a rain event (1 day or more) with ≥10 mm of rain and an average daily temperature (during the rainy days) of ≤15°C. According to the model, pseudothecia mature after accumulation of six severity values and ascospores will be discharged during the following rain.
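The best-performing model described above is simple enough to state as code; a minimal sketch, with a hypothetical season of rain events (the ≥10 mm, ≤15°C, and six-unit thresholds come from the abstract):

```python
def pseudothecia_mature(rain_events):
    """Apply the published rule: a rain event (1 day or more) contributes
    one severity unit when total rain >= 10 mm and the average daily
    temperature during the rainy days is <= 15 C. Returns the index of
    the event at which six units have accumulated (pseudothecia mature;
    ascospores discharged in the following rain), or None."""
    units = 0
    for i, (rain_mm, mean_temp_c) in enumerate(rain_events):
        if rain_mm >= 10 and mean_temp_c <= 15:
            units += 1
        if units >= 6:
            return i
    return None

# Hypothetical season: (total rain in mm, mean temp in C) per rain event,
# starting at the beginning of the rainy season (October to December).
season = [(12, 18), (25, 14), (8, 12), (15, 13), (30, 10),
          (11, 15), (40, 9), (22, 12)]
print(pseudothecia_mature(season))   # sixth qualifying event is index 7
```

Events failing either threshold (warm rain, or light rain in cool weather) contribute no severity units, which is why maturation is predicted only late in the hypothetical season.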

14.
Phytopathology ; 93(8): 931-40, 2003 Aug.
Article in English | MEDLINE | ID: mdl-18943859

ABSTRACT

The coliform agar produced by Merck was tested for rapid diagnosis of Erwinia amylovora (the causal agent of fire blight) in pear blossoms. The medium enabled the diagnosis to be completed within 36 h. Diagnoses performed with the medium were confirmed by the BIOLOG and fatty-acid profile methods. The diagnostic medium was used to determine the spatial distribution of colonized blossoms in orchards, and it was found that E. amylovora may be distributed both in clusters and at random. These findings were used in the development of a statistical model for sampling blossoms in the orchard. The model determines the number of trees to be sampled in the orchard and the number of blossoms to be taken from each tree that would enable the true colonization incidence of blossoms in the orchard to be estimated at desired levels of accuracy and confidence. The parameters included in the model are: the total number of trees in the orchard (T), the number of trees to be sampled in the orchard (t), the number of blossoms to be sampled from each tree (n), the true colonization incidence of blossoms (π), a coefficient of aggregation (ρ), the required level of confidence (1 − α), and the required level of accuracy (L). Sensitivity analyses revealed that the parameter governing sample size is the required level of accuracy. Sampling of 20 blossoms from each of several hundred trees is required to achieve an accuracy of ±1%, but only a few trees are needed for an accuracy level of ±10%. A sampling procedure then was developed, validated with an independent data set, and found to be accurate. It was concluded that sampling pear blossoms and estimating the incidence of blossom colonization by E. amylovora could improve fire blight management, though not in all cases.
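The abstract lists the model's parameters but not its closed form; one standard cluster-sampling approximation consistent with those parameters (normal approximation with design effect 1 + (n − 1)ρ) is sketched below. The parameter names follow the abstract, but this particular formula is an assumption, not the paper's model:

```python
import math

def trees_to_sample(pi, rho, n, L, alpha=0.05):
    """Approximate number of trees t so that the estimated blossom
    colonization incidence lands within +/-L of pi with confidence
    1 - alpha, sampling n blossoms per tree. ASSUMED closed form:
    normal approximation with the cluster-sampling design effect
    1 + (n - 1) * rho; finite-population correction (T) is ignored."""
    z = 1.959964  # two-sided normal quantile for alpha = 0.05
    deff = 1 + (n - 1) * rho
    return math.ceil(z ** 2 * pi * (1 - pi) * deff / (n * L ** 2))

# 20 blossoms per tree, mild aggregation; compare +/-1% vs +/-10% accuracy:
print(trees_to_sample(0.2, 0.1, 20, 0.01))  # several hundred trees
print(trees_to_sample(0.2, 0.1, 20, 0.10))  # only a few trees
```

Under these assumed numbers the sketch reproduces the abstract's sensitivity result: tightening the accuracy from ±10% to ±1% inflates the required number of trees by roughly two orders of magnitude.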

15.
Phytopathology ; 93(10): 1320-8, 2003 Oct.
Article in English | MEDLINE | ID: mdl-18944332

ABSTRACT

The efficacy of chemical (i.e., foliar fungicide sprays), genetic (i.e., moderately resistant cultivars), and cultural (i.e., drip-irrigation system) control measures was quantified, individually and in combination, in the management of Alternaria dauci, the causal agent of Alternaria leaf blight of carrot. Whereas host resistance and drip irrigation affected both the time of disease onset and the rate of disease progression, chemical control affected only the latter. In no case did a single control measure provide an acceptable level of disease suppression. Control efficacy values (based on the relative area under the disease progress curve) for chemical, genetic, and cultural control were 58 ± 11, 39 ± 20, and 60 ± 22%, respectively (values are means ± standard error). By contrast, implementing two control measures concurrently always improved disease suppression significantly compared with the individual measures. Control efficacy values were 91 ± 8% for the integration of chemical and genetic measures and 82 ± 23% for the integration of chemical and cultural measures. Moreover, yields in plots protected by two control measures simultaneously were higher by 10.1 to 28.6 t/ha than those in the respective plots protected by single measures. The joint effect of chemical control and host resistance was additive, whereas that of chemical control and drip irrigation was synergistic in most cases. A literature review was performed to determine whether these findings represent a general relationship between chemical and genetic, and between chemical and cultural, measures. Based on 19 reviewed cases, it was concluded that additive effects are the rule and synergistic or antagonistic effects are the exception. Synergistic effects of two control measures were observed when one control measure directly improved the efficacy of the other, or when one control measure induced host resistance or predisposed the pathogen to increased susceptibility.
These results may enable a more effective selection of candidate control measures for integration in the future.
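One common way to operationalize the additive-versus-synergistic distinction drawn above is to compare the observed combined efficacy against the multiplicative-survival (independence) expectation; a sketch with hypothetical efficacies and an arbitrary tolerance (the study's own statistical criterion may differ):

```python
def expected_combined_efficacy(e1, e2):
    """Expected combined efficacy of two independent control measures
    (multiplicative-survival model), with efficacies in percent."""
    p1, p2 = e1 / 100.0, e2 / 100.0
    return 100.0 * (1 - (1 - p1) * (1 - p2))

def joint_effect(observed, e1, e2, tol=5.0):
    """Classify a joint effect against the independence expectation;
    `tol` (percentage points) is an arbitrary illustration threshold."""
    expected = expected_combined_efficacy(e1, e2)
    if observed > expected + tol:
        return "synergistic"
    if observed < expected - tol:
        return "antagonistic"
    return "additive"

# Hypothetical single-measure efficacies of 60% and 40%:
print(round(expected_combined_efficacy(60, 40), 1))  # -> 76.0
print(joint_effect(76, 60, 40))                      # -> additive
print(joint_effect(90, 60, 40))                      # -> synergistic
```

Observed efficacy near the independence expectation is classed as additive; clearly above it, synergistic; clearly below, antagonistic.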

16.
Phytopathology ; 93(3): 356-63, 2003 Mar.
Article in English | MEDLINE | ID: mdl-18944347

ABSTRACT

The possibility of using local and imported warning systems for the management of fire blight (caused by the bacterium Erwinia amylovora) in pears was tested in Israel from 1997 to 2000. Three imported systems (MARYBLYT 4.3, BIS95, and Cougarblight 98C) and one local system (Fire Blight Control Advisory [FBCA]) were used. All systems were tested in simulation experiments; MARYBLYT 4.3 and FBCA were also tested in orchard experiments under natural infection. The simulation experiments included 193 orchard plots in which the time of disease onset enabled us to determine the date of infection. Thirty-five experiments were conducted in commercial orchards; in 10 of these, fire blight developed naturally. The performance of the imported warning systems was too variable for them to be used reliably under Israeli conditions. In the simulation experiments, the success rate of the imported systems (i.e., their capacity to predict the exact date of occurrence of infection episodes) was low (3 to 55%), with considerable variability among years (CV = 30 to 67%). Similar results were obtained in the orchard experiments for MARYBLYT 4.3: in only two of the five experiments in which plots were managed according to that system was disease severity significantly lower than that recorded in untreated control plots. In comparison, the local system, FBCA, predicted most infection episodes in the simulation experiments (99%) with low variability (CV = 1.0%). In the orchard experiments, adequate disease suppression was achieved in all eight experiments in which FBCA recommendations were followed. We concluded that fire blight warning systems developed in regions with dissimilar environmental conditions could not be imported and successfully implemented in Israel.

17.
Plant Dis ; 87(9): 1083-1088, 2003 Sep.
Article in English | MEDLINE | ID: mdl-30812822

ABSTRACT

The efficacy of pruning infected pear tissues to combat fire blight (caused by Erwinia amylovora) was evaluated in two sets of experiments conducted from 1999 to 2001 in Israel. In the first set of two experiments, diseased tissues were removed soon after the observation of blossom infections. Pruning resulted in complete eradication of E. amylovora in only 0 to 50% of the treated trees. In the remaining trees, pruning not only failed to eradicate the bacteria from the tree tissues but made the situation worse: the disease invaded the main branches and limbs of a significantly larger proportion of pruned trees than of non-pruned ones, because pruning altered the physiological status of the host plant. In the five experiments of the second set, the efficacy of pruning fire blight infections on main branches and limbs was studied; the time of pruning varied among the experiments. The effectiveness of cutting and removing infected branches and limbs was linearly related to the time of treatment: efficacy improved significantly the later the treatment was applied. The best results were obtained when pruning was carried out while the trees were dormant, in December: none of these trees had a severely infected canopy the following spring. Based on the results of this study, it was concluded that factors related to all three components of the disease triangle (i.e., pathogen, host, and environment), rather than only the actual presence of diseased tissues, should be taken into account when considering the need for cutting and removing fire blight-diseased tissues. Accordingly, the recommendations for Israeli growers were revised and updated.

18.
Plant Dis ; 87(9): 1077-1082, 2003 Sep.
Article in English | MEDLINE | ID: mdl-30812821

ABSTRACT

The role of autumn infections in the progression of fire blight (caused by Erwinia amylovora) symptoms in perennial pear branches was studied in orchard-grown trees in Israel. The extent of symptom progression and the final length of fire blight cankers in perennial branches were variably affected by the vigor of the trees and the season of infection. Following spring infections, when all trees supported active shoot growth, fire blight symptoms progressed more rapidly and to longer distances in trees that exhibited high vigor (i.e., with numerous annual shoots on most terminal branches) than in low-vigor trees (i.e., few or no annual shoots on terminal branches). Irrespective of the vigor of the trees, the progression of fire blight symptoms in perennial branches ceased between mid-May and mid-July, and only a small proportion (0 to 14.2%) of the infections had invaded main limbs or trunks of trees. Progression of fire blight symptoms following autumn infections was related to the preceding summer (August to No-vember) shoot regrowth: in trees in which the shoots did not restore their growth in the summer, the rate of symptom progression in perennial branches was higher in trees with a low vigor than in those with a high vigor, whereas for those with summer regrowth the relationship between rates of symptom expression was reversed. Irrespective of the vigor group and of whether there was summer regrowth, symptoms in perennial branches continued to progress through the winter until the following spring. Most of the autumn infections (50 to 78.5%) that developed in susceptible trees had invaded main limbs or trunks of trees. The results of this study indicate that factors related to host phenology and physiology, rather than factors related to environmental influences (such as temperature), govern the extent, rate, and duration of fire blight progression in perennial pear branches. 
Furthermore, autumn infections were found to play a substantial role in fire blight epidemiology in Israel.

19.
Plant Dis ; 87(6): 650-654, 2003 Jun.
Article in English | MEDLINE | ID: mdl-30812855

ABSTRACT

A survey of streptomycin resistance in the fire blight pathogen, Erwinia amylovora, conducted in pear, apple, and quince orchards in Israel during 1998 to 2001 revealed a decrease in the frequency of locations with streptomycin-resistant strains, from 57% in 1998 to 15% in 2001. In 2001, streptomycin-resistant strains were detected in only five locations in two restricted areas in western Galilee and the Golan Heights, compared with 16 locations found in 1998 throughout the northern part of the country. Since the use of streptomycin for fire blight control was terminated in 1997, this antibiotic has been replaced with oxolinic acid (Starner) in commercial orchards. Strains resistant to oxolinic acid were isolated from two pear orchards in the northern part of Israel in 1999. In a nationwide survey conducted during the spring and winter of 2000 and 2001, 51 and 47 pome fruit orchards, respectively, were sampled. Oxolinic acid-resistant strains were detected in several orchards located in two restricted areas in northern Galilee. Strains with resistance to both streptomycin and oxolinic acid were not found during 2000 to 2001. Results of this survey are used in managing fire blight with bactericides.

20.
Phytopathology ; 92(4): 417-23, 2002 Apr.
Article in English | MEDLINE | ID: mdl-18942955

ABSTRACT

Historically, the response of chickpea (Cicer arietinum L.) to Didymella rabiei (causal agent of Ascochyta blight) has mainly been regarded as complete resistance and was commonly assayed with qualitative (nonparametric) scales. Two reciprocal populations, derived from intraspecific crosses between a moderately resistant, late-flowering Israeli cultivar and a highly susceptible, early-flowering Indian accession, were tested in the F3 and F4 generations in 1998 and 1999, respectively. A quantitative (parametric) assessment (percent disease severity) was used to evaluate the chickpea field response to Ascochyta blight. The transformed relative area under the disease progress curve (tRAUDPC) was calculated for each experimental unit for further analyses. Heritability estimates of the tRAUDPC were relatively high (0.67 to 0.85) in both generations for both reciprocal populations. The frequency distributions of tRAUDPC of the populations were continuous and departed significantly from normality (Shapiro-Wilk W test; P of W < 0.0001), all being platykurtic and skewed toward either the resistant or the susceptible parental lines. The presence of major genes was examined by testing the relationship between the F3 and F4 family means and the within-family variances (Fain's test). Analyses of these relationships suggested that segregation of a single (or a few) quantitative trait locus with major effect, and possibly other minor loci, was the predominant mode of inheritance. The correlation estimates between resistance and days to flowering (r = -0.19 to -0.44) were negative and significantly (P = 0.054 to 0.001) different from zero, which represents a breeding constraint on the development of early-flowering cultivars with Ascochyta blight resistance.
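The AUDPC summary statistic used in the abstract above can be sketched as follows. This is a minimal illustration, not the authors' code: the abstract does not specify which transformation was applied to the relative AUDPC, so the arcsine-square-root transform used in `traudpc` below is an assumption (it is a common variance-stabilizing choice for proportion data); the function names are also hypothetical.

```python
import math

def audpc(times, severities):
    """Trapezoidal area under the disease progress curve.

    times      -- assessment dates (e.g., days after planting), ascending
    severities -- disease severity at each date (e.g., percent, 0-100)
    """
    return sum((severities[i] + severities[i + 1]) / 2 * (times[i + 1] - times[i])
               for i in range(len(times) - 1))

def relative_audpc(times, severities, max_severity=100.0):
    """AUDPC scaled by total duration and maximum possible severity (0-1 range)."""
    return audpc(times, severities) / ((times[-1] - times[0]) * max_severity)

def traudpc(times, severities, max_severity=100.0):
    """Transformed relative AUDPC; the arcsine-square-root transform here
    is an assumed variance-stabilizing choice, not stated in the abstract."""
    return math.asin(math.sqrt(relative_audpc(times, severities, max_severity)))
```

For example, severities of 0, 20, and 40% assessed on days 0, 10, and 20 give an AUDPC of 400 %-days and a relative AUDPC of 0.2.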
