Results 1 - 3 of 3
1.
Hortic Res ; 11(2): uhad286, 2024 Feb.
Article in English | MEDLINE | ID: mdl-38487294

ABSTRACT

Accurate and real-time monitoring of grapevine freezing tolerance is crucial for the sustainability of the grape industry in cool-climate viticultural regions. However, on-site data are limited due to the complexity of measurement. Current prediction models underperform under diverse climate conditions, which limits the large-scale deployment of these methods. We combined grapevine freezing tolerance data from multiple regions in North America and generated a predictive model based on hourly temperature-derived features and cultivar features using AutoGluon, an automated machine learning engine. Feature importance was quantified by AutoGluon and SHAP (SHapley Additive exPlanations) values. The final model was evaluated and compared with previous models for its performance under different climate conditions. The final model achieved an overall 1.36°C root-mean-square error during model testing and outperformed two previous models on three test cultivars in all testing regions. The two feature-importance quantification methods identified five shared essential features. Detailed analysis of these features indicates that the model adequately extracted some biological mechanisms during training. The final model, named NYUS.2, was deployed along with two previous models as an R Shiny-based application in the 2022-23 dormancy season, enabling large-scale, real-time simulation of grapevine freezing tolerance in North America for the first time.
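The pipeline described above rests on two simple building blocks: summary features derived from hourly temperatures, and root-mean-square error as the evaluation metric. The sketch below illustrates both in plain Python; the feature names and the 7.2°C chilling threshold are illustrative assumptions, not the actual NYUS.2 feature set, and the AutoGluon training step itself is omitted.

```python
import math

def temperature_features(hourly_temps, chill_threshold=7.2):
    """Toy temperature-derived features from a list of hourly readings (°C).
    Names and the chilling threshold are hypothetical stand-ins for the
    paper's real feature engineering."""
    n = len(hourly_temps)
    return {
        "mean_temp": sum(hourly_temps) / n,
        "min_temp": min(hourly_temps),
        # hours in the (0, threshold] chilling band
        "chill_hours": sum(1 for t in hourly_temps if 0 < t <= chill_threshold),
        # hours below freezing
        "freeze_hours": sum(1 for t in hourly_temps if t < 0),
    }

def rmse(predicted, observed):
    """Root-mean-square error, the metric reported for the final model (1.36°C)."""
    return math.sqrt(
        sum((p - o) ** 2 for p, o in zip(predicted, observed)) / len(predicted)
    )
```

In the study's setting, features like these (computed per cultivar and site) would form the tabular input to AutoGluon's regression, with RMSE in °C comparing predicted against field-measured freezing tolerance.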

2.
Ann Bot ; 133(2): 217-224, 2024 Apr 10.
Article in English | MEDLINE | ID: mdl-37971306

ABSTRACT

BACKGROUND: Dormancy of buds is an important phase in the life cycle of perennial plants growing in environments where unsuitable growth conditions occur seasonally. In regions where low temperature defines these unsuitable conditions, the attainment of cold hardiness is also required for survival. The end of the dormant period culminates in budbreak and flower emergence, or spring phenology, one of the most appreciated and studied phenological events - a time also understood to be most sensitive to low-temperature damage. Despite this, we have a limited physiological and molecular understanding of dormancy, which has negatively affected our ability to model budbreak. This is also true for cold hardiness.

SCOPE: Here we highlight the importance of including cold hardiness in dormancy studies that typically only characterize time to budbreak. We show how different temperature treatments may lead to increases in cold hardiness, and by doing so also (potentially inadvertently) increase time to budbreak.

CONCLUSIONS: We present a theory that describes evaluation of cold hardiness as being key to clarifying physiological changes throughout the dormant period, delineating dormancy statuses, and improving both chill and phenology models. Erroneous interpretations of budbreak datasets are possible by not phenotyping cold hardiness. Changes in cold hardiness were very probably present in previous experiments that studied dormancy, especially when those included below-freezing temperature treatments. Separating the effects between chilling accumulation and cold acclimation in future studies will be essential for increasing our understanding of dormancy and spring phenology in plants.


Subject(s)
Cold Temperature , Seasons
3.
Proc Natl Acad Sci U S A ; 119(19): e2112250119, 2022 May 10.
Article in English | MEDLINE | ID: mdl-35500120

ABSTRACT

Budbreak is one of the most observed and studied phenological phases in perennial plants, but predictions remain a challenge, largely due to our poor understanding of dormancy. Two dimensions of exposure to temperature are generally used to model budbreak: accumulation of time spent at low temperatures (chilling) and accumulation of heat units (forcing). These two effects have a well-established negative correlation; with more chilling, less forcing is required for budbreak. Furthermore, temperate plant species are assumed to vary in chilling requirements for dormancy completion allowing proper budbreak. Here, dormancy is investigated from the cold hardiness standpoint across many species, demonstrating that it should be accounted for to study dormancy and accurately predict budbreak. Most cold hardiness is lost prior to budbreak, but rates of cold hardiness loss (deacclimation) vary among species, leading to different times to budbreak. Within a species, deacclimation rate increases with accumulation of chill. When inherent differences between species in deacclimation rate are accounted for by normalizing rates throughout winter by the maximum rate observed, a standardized deacclimation potential is produced. Deacclimation potential is a quantitative measurement of dormancy progression based on responsiveness to forcing as chill accumulates, which increases similarly for all species, contradicting estimations of dormancy transition based on budbreak assays. This finding indicates that comparisons of physiologic and genetic control of dormancy require an understanding of cold hardiness dynamics. Thus, an updated framework for studying dormancy and its effects on spring phenology is suggested where cold hardiness in lieu of (or in addition to) budbreak is used.


Subject(s)
Acclimatization , Cold Temperature , Plant Physiological Phenomena , Climate , Seasons , Temperature
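The "deacclimation potential" defined in the third abstract is a simple normalization: each species' observed rate of cold hardiness loss is divided by the maximum rate observed for that species over the winter, yielding a 0-1 index of responsiveness to forcing. A minimal sketch of that calculation (the function name and inputs are illustrative, not from the paper's code):

```python
def deacclimation_potential(rates):
    """Normalize observed deacclimation rates (e.g., °C of hardiness lost
    per day under forcing) by the maximum rate observed, giving a 0-1
    deacclimation potential that is comparable across species."""
    r_max = max(rates)
    return [r / r_max for r in rates]
```

Because every species is scaled by its own maximum, species with inherently fast or slow deacclimation can be compared on the same 0-1 axis as chill accumulates.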