Results 1 - 20 of 61
1.
bioRxiv ; 2024 Jul 05.
Article in English | MEDLINE | ID: mdl-39005357

ABSTRACT

Background: Alzheimer's disease (AD), a progressive neurodegenerative disorder, continues to increase in prevalence, with no effective treatments to date. In this context, knowledge graphs (KGs) have emerged as a pivotal tool in biomedical research, offering new perspectives on drug repurposing and biomarker discovery through the analysis of intricate network structures. Our study builds an AD-specific knowledge graph that highlights interactions among AD, genes, variants, chemicals, drugs, and other diseases, with the goal of shedding light on existing treatments, potential targets, and diagnostic methods for AD, thereby aiding drug repurposing and biomarker identification. Results: We annotated 800 PubMed abstracts and leveraged GPT-4 for text augmentation to enrich the training data for named entity recognition (NER) and relation classification. A comprehensive data mining model integrating NER and relation classification was trained on the annotated corpus and then applied to extract relation triplets from unannotated abstracts. To enhance entity linking, we utilized a suite of reference biomedical databases and refined linking accuracy through abbreviation resolution. As a result, we identified 3,199,276 entity mentions and 633,733 triplets, elucidating connections between 5,000 unique entities. These connections were pivotal in constructing a comprehensive Alzheimer's Disease Knowledge Graph (ADKG), which we further integrated with other biomedical databases after entity linking. The ADKG served as training data for knowledge graph embedding models, and the highest-ranking predicted triplets were supported by literature evidence, underscoring the utility of the ADKG in generating testable scientific hypotheses.
A further application of the ADKG in predictive modeling with UK Biobank data showed that models based on the ADKG outperformed alternatives, as evidenced by higher areas under the receiver operating characteristic (ROC) curves. Conclusion: The ADKG is a valuable resource for generating hypotheses and enhancing predictive models, highlighting its potential to advance AD research and treatment strategies.
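The abstract does not specify which knowledge graph embedding model was trained on the ADKG; as an illustrative sketch only, a TransE-style scorer ranks candidate triplets as below. The entity and relation names are hypothetical placeholders, not taken from the actual graph.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy vocabulary (hypothetical examples, not from the ADKG).
entities = ["AD", "APOE", "donepezil", "tau"]
relations = ["associated_with", "treats"]
dim = 8

# Random initial embeddings; a real model would train these by
# minimizing a margin ranking loss over observed triplets.
E = {e: rng.normal(size=dim) for e in entities}
R = {r: rng.normal(size=dim) for r in relations}

def transe_score(h, r, t):
    """TransE plausibility: smaller ||h + r - t|| means more plausible."""
    return float(np.linalg.norm(E[h] + R[r] - E[t]))

# Rank candidate tails for the query (APOE, associated_with, ?).
scores = {t: transe_score("APOE", "associated_with", t)
          for t in entities if t != "APOE"}
ranked = sorted(scores, key=scores.get)
print(ranked)
```

High-ranking completions such as these are the "predicted triplets" that would then be checked against literature evidence.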

2.
Stat Med ; 42(29): 5491-5512, 2023 12 20.
Article in English | MEDLINE | ID: mdl-37816678

ABSTRACT

Joint models for longitudinal and survival data (JMLSs) have been widely used in recent years to investigate the relationship between longitudinal and survival data in clinical trials. However, existing studies mainly focus on independent survival data, whereas in many clinical trials survival data may be bivariately correlated. To this end, this paper proposes a novel JMLS accommodating multivariate longitudinal and bivariate correlated time-to-event data. Nonparametric marginal survival hazard functions are transformed to bivariate normal random variables, and Bayesian penalized splines are employed to approximate the unknown baseline hazard functions. Incorporating the Metropolis-Hastings algorithm into the Gibbs sampler, we develop a Bayesian adaptive Lasso method that simultaneously estimates parameters and baseline hazard functions and selects important predictors in the considered JMLS. Simulation studies and an example taken from the International Breast Cancer Study Group are used to illustrate the proposed methodologies.
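The Metropolis-within-Gibbs idea used in this sampler can be sketched on a toy target, a correlated bivariate normal standing in for the joint posterior of two model parameters; this is illustrative only, not the actual JMLS likelihood.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy target: standard bivariate normal with correlation rho.
rho = 0.8

def log_target(x, y):
    return -(x**2 - 2*rho*x*y + y**2) / (2*(1 - rho**2))

x, y = 0.0, 0.0
draws = []
for _ in range(5000):
    # Gibbs step for x: its full conditional is an exact normal.
    x = rng.normal(rho*y, np.sqrt(1 - rho**2))
    # MH step for y: random-walk proposal, used when a conditional
    # (e.g., for baseline hazard coefficients) is not tractable.
    prop = y + rng.normal(0, 0.5)
    if np.log(rng.uniform()) < log_target(x, prop) - log_target(x, y):
        y = prop
    draws.append((x, y))

xs, ys = np.array(draws[1000:]).T       # discard burn-in
print(round(float(np.corrcoef(xs, ys)[0, 1]), 2))
```

The sample correlation recovers the target's correlation, showing the mixed exact-Gibbs/MH updates sample the right joint distribution.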


Subjects
Algorithms, Statistical Models, Humans, Bayes Theorem, Multivariate Analysis, Computer Simulation
3.
Lifetime Data Anal ; 29(4): 888-918, 2023 10.
Article in English | MEDLINE | ID: mdl-37581774

ABSTRACT

We consider a novel class of semiparametric joint models for multivariate longitudinal and survival data with dependent censoring. In these models, cumulative baseline hazard functions of unknown form are fitted by a novel class of penalized splines (P-splines) with linear constraints. The dependence between the failure time of interest and the censoring time is accommodated by a normal transformation model, in which both the nonparametric marginal survival function and the censoring function are transformed to standard normal random variables with a bivariate normal joint distribution. Based on a hybrid algorithm that embeds the Metropolis-Hastings algorithm within the Gibbs sampler, we propose a feasible Bayesian method to simultaneously estimate the unknown parameters of interest and fit the baseline survival and censoring functions. Intensive simulation studies are conducted to assess the performance of the proposed method, and its use is also illustrated in the analysis of a data set from the International Breast Cancer Study Group.


Subjects
Algorithms, Statistical Models, Humans, Bayes Theorem, Computer Simulation
4.
Stat Methods Med Res ; 32(9): 1694-1710, 2023 09.
Article in English | MEDLINE | ID: mdl-37408456

ABSTRACT

Many joint models of multivariate skew-normal longitudinal and survival data have been presented in recent years to accommodate the non-normality of longitudinal outcomes, but existing work has not considered variable selection. This article investigates simultaneous parameter estimation and variable selection in joint modeling of longitudinal and survival data. The penalized splines technique is used to estimate the unknown log baseline hazard function, and the rectangle integral method is adopted to approximate the conditional survival function. A Monte Carlo expectation-maximization algorithm is developed to estimate model parameters. Based on local linear approximations to the conditional expectation of the likelihood function and to the penalty function, a one-step sparse estimation procedure is proposed to circumvent the computational challenge of optimizing the penalized conditional expectation of the likelihood function; this procedure is used to select significant covariates and trajectory functions and to identify departures from normality in the longitudinal data. A Bayesian information criterion based on the conditional expectation of the likelihood function is developed to select the optimal tuning parameter. Simulation studies and a real example from a clinical trial are used to illustrate the proposed methodologies.


Subjects
Algorithms, Statistical Models, Bayes Theorem, Computer Simulation, Likelihood Functions, Monte Carlo Method, Longitudinal Studies
5.
Stat Methods Med Res ; 31(12): 2368-2382, 2022 12.
Article in English | MEDLINE | ID: mdl-36154344

ABSTRACT

Alzheimer's disease (AD) can be diagnosed by fitting traditional logistic regression models to brain magnetic resonance imaging (MRI) data regarded as a vector of covariates, but parameter estimation is then inefficient and computationally expensive because of the ultrahigh dimensionality and complicated structure of MRI data. To overcome this deficiency, this paper proposes a tensor logistic regression model (TLRM) for AD MRI data that regards the MRI tensor itself as the covariate. Under this framework, a CANDECOMP/PARAFAC (CP) tensor decomposition is employed to reduce the ultrahigh-dimensional tensor to a high-dimensional level, and a novel Bayesian adaptive Lasso method is developed to simultaneously select important components of the tensor and estimate model parameters; incorporating the Pólya-Gamma method yields a closed-form likelihood, avoiding the Metropolis-Hastings algorithm so that only the Gibbs sampler is needed within Markov chain Monte Carlo (MCMC). A tensor product technique is utilized to optimize the computations and speed up the MCMC, and the Bayes factor together with the path sampling approach is presented to select the tensor rank in the CP decomposition. The effectiveness of the proposed method is illustrated through simulation studies and an MRI data analysis.
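The dimension reduction that makes this model tractable comes from the CP decomposition itself, which can be sketched with NumPy on toy dimensions (not actual MRI sizes):

```python
import numpy as np

rng = np.random.default_rng(2)

# CP (CANDECOMP/PARAFAC) writes a 3-way tensor as a sum of R rank-one
# terms: X[i,j,k] = sum_r A[i,r] * B[j,r] * C[k,r].
I, J, K, R = 6, 5, 4, 2
A = rng.normal(size=(I, R))
B = rng.normal(size=(J, R))
C = rng.normal(size=(K, R))
X = np.einsum("ir,jr,kr->ijk", A, B, C)   # exactly rank-2 by construction

# The payoff for tensor regression: a rank-R coefficient tensor needs
# (I + J + K) * R parameters instead of I * J * K.
print((I + J + K) * R, I * J * K)

# The linear predictor of a tensor logistic regression contracts a
# CP-form coefficient tensor against the covariate tensor, then applies
# the logistic link (here the coefficient factors are reused from X).
eta = np.einsum("ir,jr,kr,ijk->", A, B, C, X)
prob = 1.0 / (1.0 + np.exp(-eta))
```

With realistic MRI dimensions the parameter count drops by several orders of magnitude, which is what makes the Bayesian selection step feasible.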


Subjects
Alzheimer Disease, Humans, Bayes Theorem, Logistic Models, Alzheimer Disease/diagnostic imaging, Data Analysis, Monte Carlo Method, Neuroimaging
6.
J Appl Stat ; 49(12): 3063-3089, 2022.
Article in English | MEDLINE | ID: mdl-36035614

ABSTRACT

Methodological development and application of joint models for longitudinal and time-to-event data have mostly coupled a linear mixed-effects model for a single normally distributed longitudinal outcome with a Cox proportional hazards model. In practice, however, (i) the profile of a subject's longitudinal response may follow a `broken-stick nonlinear' (piecewise) trajectory, and such multiple phases are an important indicator to help quantify treatment effect, disease diagnosis, and clinical decision-making; (ii) normality in longitudinal models is a routine assumption, but it may unrealistically obscure important features of subject variation; (iii) the data collected often feature multivariate longitudinal outcomes that are significantly correlated, and ignoring their correlation may lead to biased estimation; and (iv) it is important to investigate how multivariate longitudinal outcomes are associated with the event time of interest. In this article, driven by a motivating example, we propose Bayesian multivariate piecewise joint models with a skewed distribution and random change-points for longitudinal measures, in an attempt to cope with correlated multivariate longitudinal data, adjust for departures from normality, capture longitudinal trajectories with random change-points, and tailor the linkage in specifying a time-to-event process. A real example is analyzed to demonstrate the methodology, and simulation studies are conducted to evaluate the performance of the proposed models and method.

7.
8.
BMC Genomics ; 23(1): 504, 2022 Jul 12.
Article in English | MEDLINE | ID: mdl-35831808

ABSTRACT

BACKGROUND: Using single-cell RNA sequencing (scRNA-seq) data to diagnose disease is an effective technique in medical research. Several statistical methods have been developed for the classification of RNA sequencing (RNA-seq) data, including, for example, Poisson linear discriminant analysis (PLDA), negative binomial linear discriminant analysis (NBLDA), and zero-inflated Poisson logistic discriminant analysis (ZIPLDA). Nevertheless, few existing methods perform well for large-sample scRNA-seq data, in particular when the distribution assumption is also violated. RESULTS: We propose a deep learning classifier (scDLC) for large-sample scRNA-seq data based on long short-term memory (LSTM) recurrent neural networks. scDLC does not require prior knowledge of the data distribution; instead, it takes into account the dependency among the most informative feature genes in the LSTM model. An LSTM is a special recurrent neural network that can learn long-term dependencies in a sequence. CONCLUSIONS: Simulation studies show that scDLC performs consistently better than existing methods in a wide range of settings with large sample sizes. Four real scRNA-seq datasets are also analyzed, and the results coincide with the simulations: scDLC always performs best. The code, named "scDLC", is publicly available at https://github.com/scDLC-code/code .
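The LSTM building block behind scDLC can be illustrated with a from-scratch cell in NumPy; this sketch is untrained and uses toy sizes (the actual scDLC implementation is at the linked repository).

```python
import numpy as np

rng = np.random.default_rng(3)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h, c, W, U, b):
    """One LSTM step. W: (4H, D), U: (4H, H), b: (4H,); gates [i, f, o, g]."""
    z = W @ x + U @ h + b
    H = h.shape[0]
    i, f, o = sigmoid(z[:H]), sigmoid(z[H:2*H]), sigmoid(z[2*H:3*H])
    g = np.tanh(z[3*H:])
    c_new = f * c + i * g          # cell state mixes old memory with new input
    h_new = o * np.tanh(c_new)     # hidden state gates the memory
    return h_new, c_new

D, H, T = 10, 16, 20               # feature genes per step, hidden size, length
W = rng.normal(0, 0.1, (4*H, D))
U = rng.normal(0, 0.1, (4*H, H))
b = np.zeros(4*H)

h, c = np.zeros(H), np.zeros(H)
for t in range(T):                 # run over a toy expression "sequence"
    h, c = lstm_step(rng.normal(size=D), h, c, W, U, b)

# A classifier head would map the final hidden state to class logits.
logits = rng.normal(0, 0.1, (3, H)) @ h
```

The gating is what lets the model carry dependencies across a long sequence of feature genes, the property the abstract highlights.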


Subjects
Deep Learning, Discriminant Analysis, Gene Expression Profiling/methods, RNA/genetics, RNA-Seq, Sequence Analysis, RNA/methods, Single-Cell Analysis/methods
9.
Front Big Data ; 5: 812725, 2022.
Article in English | MEDLINE | ID: mdl-35574573

ABSTRACT

Joint models of longitudinal and time-to-event data have received a lot of attention in epidemiological and clinical research, typically under a linear mixed-effects model with the normal assumption for a single longitudinal outcome and a Cox proportional hazards model. However, such model-based analyses may not provide robust inference when longitudinal measurements exhibit skewness and/or heavy tails. In addition, the data collected often feature multivariate longitudinal outcomes that are significantly correlated, and ignoring their correlation may lead to biased estimation. Under the umbrella of Bayesian inference, this article introduces multivariate joint (MVJ) models with a skewed distribution for multiple longitudinal exposures in an attempt to cope with correlated multiple longitudinal outcomes, adjust for departures from normality, and tailor the linkage in specifying a time-to-event process. We develop a Bayesian joint modeling approach to MVJ models that couples a multivariate linear mixed-effects (MLME) model having a skew-normal (SN) distribution with a Cox proportional hazards model. Our proposed models and method are evaluated by simulation studies and applied to a real example from a diabetes study.

10.
Lifetime Data Anal ; 28(3): 335-355, 2022 07.
Article in English | MEDLINE | ID: mdl-35352270

ABSTRACT

This paper discusses the fitting of the proportional hazards model to interval-censored failure time data with missing covariates. Many authors have discussed this problem when complete covariate information is available or the missingness is completely at random; in contrast, we focus on the situation where the missingness is at random. For this problem, a sieve maximum likelihood estimation approach is proposed that uses I-spline functions to approximate the unknown cumulative baseline hazard function in the model. For the implementation of the proposed method, we develop an EM algorithm based on a two-stage data augmentation. Furthermore, we show that the proposed estimators of the regression parameters are consistent and asymptotically normal. The proposed approach is then applied to a set of data concerning Alzheimer's disease that motivated this study.


Subjects
Algorithms, Computer Simulation, Humans, Likelihood Functions, Proportional Hazards Models
11.
J Biopharm Stat ; 32(5): 768-788, 2022 09 03.
Article in English | MEDLINE | ID: mdl-35213275

ABSTRACT

A three-arm non-inferiority trial including a test treatment, a reference treatment, and a placebo is recommended, when applicable, to assess the assay sensitivity and internal validity of a trial. Existing methods for designing and analyzing three-arm trials with binary endpoints are mainly developed from a frequentist viewpoint, but they largely depend on large-sample theory. To alleviate this problem, we propose two fully Bayesian approaches, the posterior variance approach and the Bayes factor approach, to determine the sample size required in a three-arm non-inferiority trial with binary endpoints. Simulation studies are conducted to investigate the performance of the proposed Bayesian methods, and the proposed methodologies are illustrated with an example. The Bayes factor method always leads to smaller sample sizes than the posterior variance method; utilizing historical data can reduce the required sample size; the simultaneous test requires a larger sample size to achieve the desired power than the non-inferiority test; and the selection of the hyperparameters has a relatively large effect on the required sample size. When only the posterior variance is controlled, the posterior variance criterion is a simple and effective option for obtaining a rough outcome; when data from a previous clinical trial are available, the Bayes factor criterion is recommended in practical applications.


Subjects
Research Design, Bayes Theorem, Computer Simulation, Humans, Sample Size
12.
Biometrics ; 78(1): 151-164, 2022 03.
Article in English | MEDLINE | ID: mdl-33031576

ABSTRACT

This paper discusses variable selection in the context of the joint analysis of longitudinal data and failure time data. A large literature has been developed for either variable selection or the joint analysis, but only limited literature exists on variable selection in the context of the joint analysis when failure time data are right-censored. Correspondingly, we consider the situation where, instead of right-censored data, one observes interval-censored failure time data, a more general and commonly occurring form of failure time data. For this problem, a class of penalized likelihood-based procedures is developed for simultaneous variable selection and estimation of the relevant covariate effects for both the longitudinal and failure time variables of interest. In particular, a Monte Carlo EM (MCEM) algorithm is presented for the implementation of the proposed approach. The proposed method allows the number of covariates to diverge with the sample size and is shown to have the oracle property. An extensive simulation study is conducted to assess the finite-sample performance of the proposed approach and indicates that it works well in practical situations. An application is also provided.


Subjects
Algorithms, Research Design, Computer Simulation, Likelihood Functions, Statistical Models, Sample Size
13.
Brain Imaging Behav ; 16(1): 281-290, 2022 Feb.
Article in English | MEDLINE | ID: mdl-34313906

ABSTRACT

Neuroimaging is a powerful tool for characterizing the abnormality of brain networks in schizophrenia, but the neurophysiological substrate of schizophrenia is still unclear. Here we investigated patterns of brain functional and structural changes in female patients with schizophrenia using elastic net logistic regression analysis of resting-state functional magnetic resonance imaging data. Data from 52 participants (25 female schizophrenia patients and 27 healthy controls) were obtained. Using an elastic net penalty, the brain regions most relevant to schizophrenia pathology were identified in models based on the amplitude of low-frequency fluctuations (ALFF) and gray matter, respectively. Receiver operating characteristic analysis showed reliable classification accuracy of 85.7% in the ALFF analysis and 77.1% in the gray matter analysis. Notably, our results showed eight regions common to the ALFF and gray matter analyses: the Frontal-Inf-Orb-R, Rolandic-Oper-R, Olfactory-R, Angular-L, Precuneus-L, Precuneus-R, Heschl-L, and Temporal-Pole-Mid-R. In addition, symptom severity was positively associated with the ALFF within the Rolandic-Oper-R and Frontal-Inf-Orb-R. Our findings indicate that elastic net logistic regression can be a useful tool for identifying the characteristics of schizophrenia-related brain deterioration, providing novel insights into schizophrenia diagnosis and prediction.
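Elastic net logistic regression combines L1 and L2 penalties so that only a few imaging features keep nonzero weights. A minimal from-scratch sketch on simulated data is below (in practice one would use a packaged solver, e.g. scikit-learn's `LogisticRegression(penalty="elasticnet", solver="saga")`; the data here are synthetic, not the study's MRI features).

```python
import numpy as np

rng = np.random.default_rng(4)

# Simulated design: n subjects, p candidate imaging features, of which
# only the first three truly carry signal.
n, p = 200, 50
X = rng.normal(size=(n, p))
true_w = np.zeros(p)
true_w[:3] = [2.0, -2.0, 1.5]
y = (rng.uniform(size=n) < 1.0 / (1.0 + np.exp(-X @ true_w))).astype(float)

# Proximal gradient descent: gradient step on the smooth part
# (logistic loss + L2 ridge), then soft-thresholding for the L1 part.
lam1, lam2, lr = 0.01, 0.01, 0.1
w = np.zeros(p)
for _ in range(500):
    p_hat = 1.0 / (1.0 + np.exp(-X @ w))
    grad = X.T @ (p_hat - y) / n + lam2 * w
    w = w - lr * grad
    w = np.sign(w) * np.maximum(np.abs(w) - lr * lam1, 0.0)

# The L1 part shrinks irrelevant coefficients toward zero, so the
# largest weights point at the relevant features.
print(np.argsort(-np.abs(w))[:3])
```

The same mechanism, applied to ALFF or gray matter values per region, is what yields the small set of discriminative regions reported in the abstract.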


Subjects
Schizophrenia, Brain/diagnostic imaging, Brain Mapping, Female, Humans, Logistic Models, Magnetic Resonance Imaging, Schizophrenia/diagnostic imaging
14.
Stat Med ; 40(15): 3604-3624, 2021 07 10.
Article in English | MEDLINE | ID: mdl-33851463

ABSTRACT

Alzheimer's disease can be diagnosed by analyzing brain images (eg, magnetic resonance imaging, MRI) and neuropsychological tests (eg, the mini-mental state examination, MMSE). A partially linear mean shift model (PLMSM) is here proposed to investigate the relationship between MMSE score and high-dimensional regions of interest in MRI, and to detect outliers. In the presence of high-dimensional data, existing Bayesian approaches (eg, Markov chain Monte Carlo) to analyzing a PLMSM incur intensive computational cost, require huge memory, and have a low convergence rate. To address these issues, a variational Bayesian inference is developed to simultaneously estimate parameters and nonparametric functions and to identify outliers in a PLMSM. A Bayesian P-splines method is presented to approximate the nonparametric functions, a Bayesian adaptive Lasso approach is employed to select predictors, and outliers are detected via a classification variable. Two simulation studies are conducted to assess the finite-sample performance of the proposed method, and an MRI dataset on elderly cognitive ability is analyzed to corroborate it.


Subjects
Alzheimer Disease, Aged, Algorithms, Bayes Theorem, Humans, Linear Models, Monte Carlo Method, Neuroimaging
15.
Entropy (Basel) ; 22(11)2020 Nov 05.
Article in English | MEDLINE | ID: mdl-33287025

ABSTRACT

Distance weighted discrimination (DWD) is an appealing classification method that is capable of overcoming data piling problems in high-dimensional settings. Especially when various sparsity structures are assumed in these settings, variable selection in multicategory classification poses great challenges. In this paper, we propose a multicategory generalized DWD (MgDWD) method that maintains intrinsic variable group structures during selection using a sparse group lasso penalty. Theoretically, we derive minimizer uniqueness for the penalized MgDWD loss function and consistency properties for the proposed classifier. We further develop an efficient algorithm based on the proximal operator to solve the optimization problem. The performance of MgDWD is evaluated using finite sample simulations and miRNA data from an HIV study.

16.
BioData Min ; 13: 14, 2020.
Article in English | MEDLINE | ID: mdl-32905307

ABSTRACT

BACKGROUND: Various combinations of ultrasonographic (US) characteristics are increasingly utilized to classify thyroid nodules. However, such combinations lack a theoretical basis, depend heavily on radiologists' experience, and may not classify thyroid nodules correctly. Hence, the main purpose of this manuscript is to select the US characteristics significantly associated with malignancy and to develop an efficient scoring system that helps ultrasound clinicians correctly identify thyroid malignancy. METHODS: A logistic regression (LR) model is utilized to identify potential thyroid malignancy, and the least absolute shrinkage and selection operator (LASSO) method is adopted to simultaneously select US characteristics significantly associated with malignancy and estimate the parameters of the LR model. Based on the selected US characteristics, we calculate the probability of malignancy for each thyroid nodule via random forest (RF) and extreme learning machine (ELM), and develop a scoring system to classify thyroid nodules. For comparison, we also consider eight state-of-the-art methods such as support vector machine (SVM), neural network (NET), etc. The area under the receiver operating characteristic curve (AUC) is employed to measure the accuracy of the various classifiers. RESULTS: The US characteristics nodule size, AP/T≥1, solid component, micro-calcifications, hackly border, hypoechogenicity, presence of halo, unclear border, irregular margin, and central vascularity are selected as the significant predictors associated with thyroid malignancy via the LASSO LR (LLR). Using the developed scoring system, thyroid nodules are classified into the following four categories: benign, low suspicion, intermediate suspicion, and high suspicion, for which the rates of correctly identified malignancy with the RF (ELM) method on the testing dataset are 0.0% (4.3%), 14.3% (50.0%), 58.1% (59.1%), and 96.1% (97.7%), respectively.
CONCLUSION: LLR together with RF performs better than the other methods in identifying malignancy in terms of risk scores, especially for abnormal nodules. The developed scoring system can predict the risk of malignancy well and guide medical doctors in making management decisions, reducing the number of unnecessary biopsies for benign nodules.

17.
Stat Med ; 39(20): 2621-2638, 2020 09 10.
Article in English | MEDLINE | ID: mdl-32390284

ABSTRACT

In a matched-pair study, when the outcomes of two diagnostic tests are ordinal or continuous, the difference between two correlated areas under ROC curves (AUCs) is usually used to compare the overall discriminatory ability of the two tests. This article considers confidence interval (CI) construction for the difference between two correlated AUCs in a matched-pair experiment, and proposes 13 hybrid CIs based on variance estimate recovery with maximum likelihood estimation, DeLong's statistic, the Wilson score statistic (WS) and WS with continuity correction, the modified Wald statistic (MW) and MW with continuity correction, and the Agresti-Coull statistic, as well as three bootstrap-resampling-based CIs. For comparison, we present traditional parametric and nonparametric CIs. Simulation studies are conducted to assess the performance of the proposed CIs in terms of empirical coverage probabilities, empirical interval widths, and ratios of mesial noncoverage probabilities to noncoverage probabilities. Two examples from clinical studies illustrate the proposed methodologies. Empirical results show that the hybrid Agresti-Coull CI with empirical estimation (EAC) behaved most satisfactorily, because its coverage probability was quite close to the prespecified confidence level with a short interval width. Hence, we recommend the EAC CI in applications.
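Of the interval families compared here, the bootstrap-resampling variant is the simplest to sketch: resample subjects (keeping each subject's pair of scores together, so the correlation is preserved) and take percentiles of the AUC difference. The data below are simulated, not from the paper's clinical examples.

```python
import numpy as np

rng = np.random.default_rng(5)

def auc(y, s):
    """Mann-Whitney form of the AUC: P(case score > control score)."""
    pos, neg = s[y == 1], s[y == 0]
    return float(np.mean(pos[:, None] > neg[None, :]))  # ignores ties

# Paired data: two diagnostic scores measured on the same n subjects.
n = 300
y = rng.integers(0, 2, n)
s1 = y + rng.normal(0, 1.0, n)     # stronger test
s2 = y + rng.normal(0, 2.0, n)     # weaker test

# Percentile bootstrap CI for the paired AUC difference.
diffs = []
for _ in range(1000):
    idx = rng.integers(0, n, n)    # resample whole subjects
    yb = y[idx]
    if yb.min() == yb.max():       # need both classes in the resample
        continue
    diffs.append(auc(yb, s1[idx]) - auc(yb, s2[idx]))
lo, hi = np.percentile(diffs, [2.5, 97.5])
print(round(auc(y, s1) - auc(y, s2), 3), round(lo, 3), round(hi, 3))
```

The hybrid Wilson/Agresti-Coull constructions the paper recommends are more involved; this sketch only shows the resampling baseline they are compared against.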


Subjects
Statistical Models, Area Under Curve, Computer Simulation, Confidence Intervals, Humans, Probability, ROC Curve
18.
Pharm Stat ; 19(5): 518-531, 2020 09.
Article in English | MEDLINE | ID: mdl-32112669

ABSTRACT

A three-arm trial including an experimental treatment, an active reference treatment, and a placebo is often used to assess the non-inferiority (NI), with assay sensitivity, of an experimental treatment. Various hypothesis-test-based approaches via a fraction or a pre-specified margin have been proposed to assess NI with assay sensitivity in a three-arm trial, but little work has been done on confidence intervals. This paper develops a hybrid approach to construct a simultaneous confidence interval for assessing NI and assay sensitivity in a three-arm trial. For comparison, we present normal-approximation-based and bootstrap-resampling-based simultaneous confidence intervals. Simulation studies show that the hybrid approach with the Wilson score statistic performs better than the other approaches in terms of empirical coverage probability and mesial non-coverage probability. An example is used to illustrate the proposed approaches.


Subjects
Controlled Clinical Trials as Topic/methods, Endpoint Determination, Research Design, Computer Simulation, Confidence Intervals, Statistical Data Interpretation, Humans, Probability
19.
Biom J ; 62(4): 1038-1059, 2020 07.
Article in English | MEDLINE | ID: mdl-31957095

ABSTRACT

This paper considers statistical inference for the receiver operating characteristic (ROC) curve in the presence of missing biomarker values, utilizing estimating equations (EEs) together with smoothed empirical likelihood (SEL). Three approaches are developed to estimate the ROC curve and construct its SEL-based confidence intervals, based on kernel-assisted EE imputation, multiple imputation, and a hybrid imputation combining inverse-probability-weighted imputation with multiple imputation. Under some regularity conditions, we show asymptotic properties of the proposed maximum SEL estimators for the ROC curve. Simulation studies are conducted to investigate the performance of the proposed SEL approaches, and an example illustrates the proposed methodologies. Empirical results show that the hybrid imputation method behaves better than the kernel-assisted and multiple imputation methods, and that the three proposed SEL methods outperform the existing nonparametric method.


Subjects
Biometry/methods, Likelihood Functions, Statistical Models, ROC Curve, Nonparametric Statistics
20.
BMC Med Res Methodol ; 18(1): 172, 2018 12 18.
Article in English | MEDLINE | ID: mdl-30563454

ABSTRACT

BACKGROUND: The main purpose of dose-finding studies in Phase I trials is to estimate the maximum tolerated dose (MTD), which is the maximum test dose that can be assigned with an acceptable level of toxicity. Existing methods developed for single-agent dose-finding assume that the dose-toxicity relationship follows a specific parametric potency curve; this assumption may lead to bias and unsafe dose escalations when the parametric curve is misspecified. METHODS: This paper relaxes the parametric assumption on the dose-toxicity relationship by imposing a Dirichlet process prior on the unknown dose-toxicity curve. A hybrid algorithm combining the Gibbs sampler and the adaptive rejection Metropolis sampling (ARMS) algorithm is developed to estimate the dose-toxicity curve, and a two-stage Bayesian nonparametric adaptive design is presented to estimate the MTD. RESULTS: For comparison, we consider two classical continual reassessment methods (CRMs) (i.e., logistic and power models). Numerical results show the flexibility of the proposed method for single-agent dose-finding trials, and the proposed method behaves better than the two classical CRMs under the considered scenarios. CONCLUSIONS: The proposed dose-finding procedure is model-free and robust, and behaves satisfactorily even in small-sample cases.
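For context, the power-model CRM used here as a comparator (not the paper's Dirichlet-process method) can be sketched in a few lines: toxicity at dose i is modeled as skeleton_i ** exp(a), the posterior over a is computed on a grid, and the next dose is the one whose posterior-mean toxicity is closest to the target. The skeleton, prior, and trial data below are hypothetical.

```python
import numpy as np

# Prior guesses of toxicity per dose (the "skeleton") and target rate.
skeleton = np.array([0.05, 0.10, 0.20, 0.30, 0.45])
target = 0.25

# Hypothetical accumulated data: (dose index, n treated, n toxicities).
data = [(0, 3, 0), (1, 3, 0), (2, 3, 1), (3, 3, 2)]

grid = np.linspace(-3, 3, 601)          # grid over the model parameter a
logpost = -grid**2 / (2 * 1.5**2)       # weakly informative normal prior
for d, n, tox in data:
    p = skeleton[d] ** np.exp(grid)     # power-model toxicity at dose d
    logpost += tox * np.log(p) + (n - tox) * np.log(1 - p)

post = np.exp(logpost - logpost.max())
post /= post.sum()                      # normalized posterior on the grid

# Posterior-mean toxicity per dose; recommend the dose closest to target.
p_hat = (skeleton[:, None] ** np.exp(grid)[None, :]) @ post
mtd = int(np.argmin(np.abs(p_hat - target)))
print(mtd, np.round(p_hat, 3))
```

The bias risk the abstract describes comes from the fixed skeleton and one-parameter curve; the paper's nonparametric prior removes exactly that restriction.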


Subjects
Algorithms, Bayes Theorem, Drug-Related Side Effects and Adverse Reactions/diagnosis, Theoretical Models, Nonparametric Statistics, Phase I Clinical Trials as Topic, Computer Simulation, Dose-Response Relationship (Drug), Humans, Logistic Models, Maximum Tolerated Dose, Reproducibility of Results