Review of statistical methods for survival analysis using genomic data
Article in En | WPRIM | ID: wpr-830120
Responsible library: WPRO
ABSTRACT
Survival analysis deals mainly with the time to an event, such as death, onset of disease, or bankruptcy. The defining characteristic of survival data is the presence of “censored” observations, for which the time to event cannot be completely observed and the recorded time instead represents only a lower bound of the true time to event; for each subject, either the event time or the censoring time is observed. Many traditional statistical methods have been used effectively for analyzing survival data with censored observations. However, with the development of high-throughput technologies that produce “omics” data, more advanced statistical methods, such as regularization, are required to construct predictive survival models from high-dimensional genomic data. Furthermore, machine learning approaches have been adapted to survival analysis to capture nonlinear and complex interaction effects among predictors and to achieve more accurate prediction of individual survival probabilities. Because most clinicians and medical researchers can now easily access statistical programs for analyzing survival data, a review article is helpful for understanding the statistical methods used in survival analysis. We review traditional survival methods and regularization methods with various penalty functions for the analysis of high-dimensional genomic data, and we describe machine learning techniques that have been adapted to survival analysis.
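To make the regularization idea concrete, the sketch below fits a lasso-penalized Cox proportional hazards model to simulated high-dimensional, censored survival data. It is a minimal illustration, not code from the review: the scikit-survival package (CoxnetSurvivalAnalysis, Surv.from_arrays), the simulated data, and the penalty settings are all assumptions chosen for this example.

# Minimal sketch (not from the article): penalized Cox regression on
# simulated high-dimensional survival data, assuming scikit-survival is installed.
import numpy as np
from sksurv.linear_model import CoxnetSurvivalAnalysis
from sksurv.util import Surv

rng = np.random.default_rng(0)

# Simulated "omics"-style design matrix: many more features than samples.
n_samples, n_features = 100, 500
X = rng.standard_normal((n_samples, n_features))

# Simulated survival times driven by the first five features, with random censoring.
risk = X[:, :5] @ np.array([1.0, -1.0, 0.5, -0.5, 0.25])
time = rng.exponential(scale=np.exp(-risk))
censor_time = rng.exponential(scale=np.median(time) * 2, size=n_samples)
event = time <= censor_time                  # True = event observed, False = censored
observed_time = np.minimum(time, censor_time)

# Structured outcome array (event indicator, observed time) expected by scikit-survival.
y = Surv.from_arrays(event=event, time=observed_time)

# Elastic-net penalized Cox model; l1_ratio=1.0 corresponds to the lasso penalty,
# which shrinks most coefficients exactly to zero and selects a sparse signature.
model = CoxnetSurvivalAnalysis(l1_ratio=1.0, alpha_min_ratio=0.01)
model.fit(X, y)

# Number of features retained at the smallest regularization strength on the path.
selected = np.sum(model.coef_[:, -1] != 0)
print(f"non-zero coefficients: {selected} of {n_features}")

With l1_ratio=1.0 the elastic-net penalty reduces to the lasso, so most of the 500 coefficients are shrunk exactly to zero; this sparsity is what makes regularized Cox models practical when the number of genomic predictors far exceeds the number of patients.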
Full text: 1 Database: WPRIM Type of study: Prognostic_studies Language: En Journal: Genomics & Informatics Year: 2019 Document type: Article