Results 1 - 20 of 39
1.
Sci Rep ; 14(1): 8771, 2024 Apr 16.
Article in English | MEDLINE | ID: mdl-38627533

ABSTRACT

The stress measurements determined by the overcoring (OC) and hydraulic fracturing (HF) methods in the Shuichang iron mine and the Sanshandao gold mine were compared and evaluated. The results indicate that the independent OC and HF data in the two mines reveal the same dominant faulting stress regime. The σH orientations derived from the OC and HF methods in the Shuichang iron mine lie predominantly within N81.1°W-N89.4°W and N77.0°E-N88.0°E, respectively, and the σH orientations obtained with the OC and HF techniques in the Sanshandao gold mine lie predominantly within N30°W-N90°W and N55.5°W-N60.4°W, respectively; hence, the σH orientations obtained by the two different methods in each mine are quite similar. In addition, the probability density diagrams of the three principal stresses measured by the OC and HF methods in the same mine, constructed with an improved Bayesian regression approach, have very similar shapes, and all of the Kolmogorov-Smirnov test p-values are larger than the selected significance level of 0.01, indicating that the stress data interpreted by the two methods approximately follow the same distribution law. Thus, the performance of the two techniques and the reliability of the measured data are satisfactory.
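A minimal sketch of the distributional comparison described above: a two-sample Kolmogorov-Smirnov test on principal-stress magnitudes at the 0.01 significance level. The stress values below are synthetic placeholders, not the mines' measurements.

```python
# Hypothetical comparison of OC and HF stress magnitudes with the two-sample K-S test.
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(0)
sigma_h_oc = rng.normal(loc=30.0, scale=4.0, size=25)   # MPa, placeholder overcoring data
sigma_h_hf = rng.normal(loc=31.5, scale=5.0, size=18)   # MPa, placeholder hydrofracturing data

stat, p_value = ks_2samp(sigma_h_oc, sigma_h_hf)
alpha = 0.01
same_distribution = p_value > alpha   # p > 0.01 -> data consistent with the same distribution law
print(f"KS statistic = {stat:.3f}, p = {p_value:.3f}, same distribution: {same_distribution}")
```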

2.
MethodsX ; 12: 102586, 2024 Jun.
Article in English | MEDLINE | ID: mdl-38357636

ABSTRACT

Monitoring life-testing experiments on a product or material typically takes a long time. To shorten the testing period, units can be tested under harsher conditions than usual, known as accelerated life tests. The goal of this study is to investigate partially accelerated life testing that uses generalized progressive hybrid censored samples to estimate the stress-strength reliability in the multicomponent case. The fuzziness of the model is also considered, which gives a more sensitive and accurate analysis of the underlying system. Maximum likelihood estimation under the inverse Weibull distribution and the generalized progressively hybrid censoring scheme is introduced to obtain an estimator of the fuzzy multicomponent stress-strength reliability. An asymptotic confidence interval is also derived to examine the fuzzy multicomponent stress-strength reliability. A simulation study is conducted using maximum likelihood estimates and confidence intervals for the fuzzy multicomponent stress-strength reliability for different parameter values and different schemes. A real data application, representing failure times for a certain software model, is presented to obtain the fuzzy multicomponent stress-strength reliability for different schemes.
• The fuzzy multicomponent stress-strength reliability is investigated under partially accelerated life testing and the generalized progressively hybrid censored scheme.
• An algorithm is introduced to simulate data for the censoring scheme.
• A real data application is presented to obtain the fuzzy multicomponent stress-strength reliability at different schemes.

3.
Sensors (Basel) ; 23(23)2023 Nov 21.
Article in English | MEDLINE | ID: mdl-38067669

ABSTRACT

This paper proposes a novel and reliable leak-detection method for pipeline systems based on acoustic emission (AE) signals. The proposed method analyzes signals from two AE sensors installed on the pipeline to detect leaks located between them. First, the raw AE signals are preprocessed using empirical mode decomposition. The time difference of arrival (TDOA) is then extracted as a statistical feature of the two AE signals. The state of the pipeline (leakage/normal) is determined by comparing the statistical distribution of the TDOA in the current state with that of the prior normal state. Specifically, the two-sample Kolmogorov-Smirnov (K-S) test is applied to compare the statistical distribution of the TDOA feature for leak and non-leak scenarios, and the K-S test statistic functions as a leakage indicator. A new criterion called leak sensitivity is introduced to evaluate and compare the performance of leak-detection methods. Extensive experiments were conducted on an industrial pipeline system, and the results demonstrate the effectiveness of the proposed method: compared with traditional feature-based indicators, it achieves significantly higher leak-detection performance.
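A hedged sketch of the leak-indicator idea: estimate one TDOA value per signal frame by cross-correlation, then compare the current TDOA distribution against a known-normal baseline with the two-sample K-S statistic. The frame length, sampling rate, and the use of a plain cross-correlation peak are illustrative assumptions, not the paper's exact pipeline.

```python
# Sketch: TDOA features per frame + two-sample K-S statistic as a leak indicator.
import numpy as np
from scipy.signal import correlate
from scipy.stats import ks_2samp

def frame_tdoa(x1, x2, fs):
    """TDOA (seconds) between two AE channels for one frame, via cross-correlation peak."""
    xc = correlate(x1 - x1.mean(), x2 - x2.mean(), mode="full")
    lag = np.argmax(xc) - (len(x2) - 1)   # zero lag sits at index len(x2) - 1
    return lag / fs

def tdoa_features(sig1, sig2, fs, frame_len):
    n = min(len(sig1), len(sig2)) // frame_len
    return np.array([frame_tdoa(sig1[i * frame_len:(i + 1) * frame_len],
                                sig2[i * frame_len:(i + 1) * frame_len], fs)
                     for i in range(n)])

def leak_indicator(baseline_tdoa, current_tdoa):
    # Larger K-S statistic -> current TDOA distribution departs further from the normal state.
    stat, _ = ks_2samp(baseline_tdoa, current_tdoa)
    return stat
```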

4.
Sensors (Basel) ; 23(5)2023 Mar 02.
Article in English | MEDLINE | ID: mdl-36904966

ABSTRACT

The signal measured by the maglev gyro sensor is sensitive to the instantaneous disturbance torque caused by instantaneous strong wind or ground vibration, which reduces the north-seeking accuracy of the instrument. To address this issue, we proposed a novel method combining the heuristic segmentation algorithm (HSA) and the two-sample Kolmogorov-Smirnov (KS) test (named the HSA-KS method) to process the gyro signals and improve the north-seeking accuracy of the gyro. There are two key steps in the HSA-KS method: (i) all potential change points are automatically and accurately detected by the HSA, and (ii) the jumps in the signal caused by the instantaneous disturbance torque are quickly located and eliminated by the two-sample KS test. The effectiveness of our method was verified through a field experiment on a high-precision global positioning system (GPS) baseline at the 5th sub-tunnel of the Qinling water conveyance tunnel of the Hanjiang-to-Weihe River Diversion Project in Shaanxi Province, China. Our results from the autocorrelograms indicate that the jumps in the gyro signals can be automatically and accurately eliminated by the HSA-KS method. After processing, the absolute difference between the gyro and high-precision GPS north azimuths improved by 53.5%, which was superior to the optimized wavelet transform and the optimized Hilbert-Huang transform.
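A simplified sketch of the second step of the HSA-KS idea: given candidate change points (assumed to come from a segmentation algorithm such as the HSA), the two-sample K-S test decides whether adjacent segments differ in distribution, flagging segments likely to contain disturbance-induced jumps. The significance level and the segmentation input are assumptions.

```python
# Sketch: flag segments whose distribution differs from the preceding segment (possible jumps).
import numpy as np
from scipy.stats import ks_2samp

def flag_jump_segments(signal, change_points, alpha=0.01):
    """Return (start, end) index pairs of segments that differ from the previous segment."""
    bounds = [0, *sorted(change_points), len(signal)]
    flagged = []
    for i in range(1, len(bounds) - 1):
        prev_seg = signal[bounds[i - 1]:bounds[i]]
        seg = signal[bounds[i]:bounds[i + 1]]
        if len(prev_seg) > 1 and len(seg) > 1:
            _, p = ks_2samp(prev_seg, seg)
            if p < alpha:                      # distributions differ -> candidate jump segment
                flagged.append((bounds[i], bounds[i + 1]))
    return flagged
```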

5.
J Biopharm Stat ; 33(3): 386-399, 2023 05 04.
Article in English | MEDLINE | ID: mdl-36511635

ABSTRACT

The Weibull distribution is applied to the number of days between the start date of drug administration and the date of occurrence of an adverse event. The tendency of occurrence of adverse events can be clarified by estimating the two- or three-parameter Weibull distribution from these day counts. Our purpose is to estimate the parameters of the Weibull distribution with high accuracy, even for adverse events with few reports, such as those involving new drugs, polypharmacy and small clinical trials. Furthermore, the two-sample Kolmogorov-Smirnov test (two-sided) is used to examine whether the tendency of occurrence of adverse events differs between two Weibull distributions estimated from two drugs with similar efficacy. We used discrete data derived from the FDA Adverse Event Reporting System (FAERS), as FAERS data are recorded in years, months and days without hours and minutes. Because this study focuses on early-onset adverse events, the data may contain values of 0 days. The discreteness of the data and the fact that they may include zero make this distribution different from the general Weibull distribution, which is defined for continuous data greater than zero. We search for the optimal parameter estimation method for the Weibull distribution under these two conditions, and verify its effectiveness using Monte Carlo simulations and FAERS data. Because the results obtained from FAERS data may differ depending on data handling, we describe the data handling technique and provide sample code that can reproduce the results.
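A hedged sketch of the workflow: fit a two-parameter Weibull to days-to-onset data that are discrete and may contain 0, then compare two drugs with the two-sided two-sample K-S test. The +0.5-day shift used to make zero-day reports positive is an illustrative assumption, not the estimation method selected in the paper, and the day counts are invented.

```python
# Sketch: Weibull fit to days-to-adverse-event data (with zeros) and a two-drug K-S comparison.
import numpy as np
from scipy.stats import weibull_min, ks_2samp

def fit_weibull_days(days, shift=0.5):
    x = np.asarray(days, dtype=float) + shift          # make 0-day reports strictly positive (assumption)
    shape, loc, scale = weibull_min.fit(x, floc=0.0)   # two-parameter fit: location fixed at 0
    return shape, scale

drug_a_days = [0, 1, 1, 2, 3, 5, 7, 10, 14, 30]        # placeholder FAERS-like day counts
drug_b_days = [0, 0, 1, 2, 2, 4, 6, 9, 21, 60]

print("drug A (shape, scale):", fit_weibull_days(drug_a_days))
print("drug B (shape, scale):", fit_weibull_days(drug_b_days))
stat, p = ks_2samp(drug_a_days, drug_b_days)           # two-sided comparison of onset tendencies
print(f"two-sample KS: D = {stat:.3f}, p = {p:.3f}")
```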


Subjects
Drug-Related Side Effects and Adverse Reactions, United States, Humans, Drug-Related Side Effects and Adverse Reactions/diagnosis, Drug-Related Side Effects and Adverse Reactions/epidemiology, Adverse Drug Reaction Reporting Systems, United States Food and Drug Administration, Software, Statistical Distributions
6.
Commun Stat Simul Comput ; 51(12): 7444-7457, 2022.
Article in English | MEDLINE | ID: mdl-36583130

ABSTRACT

It is a common approach to dichotomize a continuous biomarker in clinical settings for convenience of application. Analytically, results from using a dichotomized biomarker are often more reliable and resistant to outliers, bimodal distributions and other unknown distributions. There are two commonly used methods for selecting the best cut-off value for dichotomizing a continuous biomarker: the maximally selected chi-square statistic or a ROC curve, specifically the Youden Index. In this paper, we explain that in many situations it is inappropriate to use the former. Using the Maximum Absolute Youden Index (MAYI), we demonstrate that the integration of the MAYI and the Kolmogorov-Smirnov test is not only a robust non-parametric method but also provides a more meaningful p-value for selecting the cut-off value than a Mann-Whitney test. In addition, our method can be applied directly in clinical settings.
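A minimal sketch of the idea that the Maximum Absolute Youden Index over all cut-offs coincides with the two-sample K-S distance between the biomarker distributions of the two groups, so the K-S test supplies a p-value for the selected cut-off. The group data below are synthetic and the decision rule (positive when marker ≥ cut-off) is an assumption.

```python
# Sketch: MAYI-based cut-off selection with a two-sample K-S p-value.
import numpy as np
from scipy.stats import ks_2samp

def mayi_cutoff(marker_pos, marker_neg):
    """Return (best cut-off, MAYI) by scanning the pooled observed marker values."""
    cuts = np.unique(np.concatenate([marker_pos, marker_neg]))
    best_cut, best_j = None, -1.0
    for c in cuts:
        sens = np.mean(marker_pos >= c)
        spec = np.mean(marker_neg < c)
        j = abs(sens + spec - 1.0)          # absolute Youden index at this cut-off
        if j > best_j:
            best_cut, best_j = c, j
    return best_cut, best_j

rng = np.random.default_rng(1)
pos = rng.normal(2.0, 1.0, 80)              # biomarker in cases (placeholder)
neg = rng.normal(1.2, 1.0, 120)             # biomarker in controls (placeholder)
cut, mayi = mayi_cutoff(pos, neg)
ks_stat, p = ks_2samp(pos, neg)             # K-S distance matches the MAYI up to the scan grid
print(f"cut-off = {cut:.2f}, MAYI = {mayi:.3f}, KS D = {ks_stat:.3f}, p = {p:.4f}")
```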

7.
Stat Med ; 41(25): 5134-5149, 2022 11 10.
Article in English | MEDLINE | ID: mdl-36005293

ABSTRACT

With advances in cancer treatments and improved patient survival, more patients may go through multiple lines of treatment. It is of clinical importance to choose a sequence of effective treatments (eg, lines of treatment) for individual patients with the goal of optimizing their long-term clinical outcome (eg, survival). Several important issues arise in cancer studies. First, cancer clinical trials are usually conducted separately for each line of treatment; for a treatment sequence, we may therefore have first-line and second-line treatment data from two different studies. Second, there is typically a treatment initiation period, varying from patient to patient, between progression of disease and the start of the second-line treatment for administrative reasons. Additionally, the choice of the second-line treatment for patients with progression of disease may depend on their characteristics. We address all these issues and develop semiparametric methods under the potential outcome framework for estimating the overall survival probability of a treatment sequence and for comparing different treatment sequences. We establish the large sample properties of the proposed inferential procedures. Simulation studies and an application to a colorectal clinical trial are provided.


Subjects
Neoplasms, Humans, Neoplasms/therapy, Nonparametric Statistics
8.
Entropy (Basel) ; 24(8)2022 Aug 03.
Article in English | MEDLINE | ID: mdl-36010735

ABSTRACT

A new nonparametric test of equality of two densities is investigated. The test statistic is an average of log-Bayes factors, each of which is constructed from a kernel density estimate. Prior densities for the bandwidths of the kernel estimates are required, and it is shown how to choose priors so that the log-Bayes factors can be calculated exactly. Critical values of the test statistic are determined by a permutation distribution, conditional on the data. An attractive property of the methodology is that a critical value of 0 leads to a test for which both type I and II error probabilities tend to 0 as sample sizes tend to ∞. Existing results on Kullback-Leibler loss of kernel estimates are crucial to obtaining these asymptotic results, and also imply that the proposed test works best with heavy-tailed kernels. Finite sample characteristics of the test are studied via simulation, and extensions to multivariate data are straightforward, as illustrated by an application to bivariate connectionist data.

9.
Entropy (Basel) ; 24(5)2022 May 23.
Article in English | MEDLINE | ID: mdl-35626622

ABSTRACT

Gene-set enrichment analysis is the key methodology for obtaining biological information from statistical results in transcriptomic space. Since its introduction, gene-set enrichment analysis methods have become more reliable and have gained a wider range of application. Great attention has been devoted to global tests, in contrast to competitive methods, which have been largely ignored although they appear more flexible because they are independent of the source of gene profiles. We analyzed the properties of the Mann-Whitney-Wilcoxon test, a competitive method, and adapted its interpretation in the context of enrichment analysis by introducing a Normalized Enrichment Score that summarizes two interpretations: a probability estimate and a location index. Two implementations are presented and compared with relevant methods from the literature: an R package and an online web tool. Both provide tabular and graphical results with attention to reproducible research.
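A hedged sketch of a competitive gene-set test in the spirit described above: the Mann-Whitney-Wilcoxon U statistic comparing in-set versus out-of-set gene scores, rescaled to U/(n1·n2), which estimates P(in-set score > out-of-set score) and can serve as a probability-scale enrichment score. The rescaling is my reading of the abstract, not necessarily the exact definition used by the cited package, and the scores are simulated.

```python
# Sketch: Mann-Whitney-Wilcoxon competitive enrichment with a probability-scale score.
import numpy as np
from scipy.stats import mannwhitneyu

def enrichment_score(gene_scores, in_set_mask):
    in_set = gene_scores[in_set_mask]
    out_set = gene_scores[~in_set_mask]
    u, p = mannwhitneyu(in_set, out_set, alternative="greater")
    nes = u / (len(in_set) * len(out_set))   # estimate of P(in-set score > out-of-set score), in [0, 1]
    return nes, p

rng = np.random.default_rng(2)
scores = rng.normal(size=1000)               # e.g. per-gene log fold changes (placeholder)
mask = np.zeros(1000, dtype=bool)
mask[:50] = True                             # hypothetical 50-gene set
scores[mask] += 0.5                          # simulate a modestly enriched set
print(enrichment_score(scores, mask))
```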

10.
Sensors (Basel) ; 21(24)2021 Dec 10.
Article in English | MEDLINE | ID: mdl-34960342

ABSTRACT

Pipeline leakage remains a challenge in various industries. Acoustic emission (AE) technology has recently shown great potential for leak diagnosis. Many AE features, such as root mean square (RMS), peak value, standard deviation, mean value, and entropy, have been suggested for detecting leaks, but background noise in AE signals makes these features ineffective. The present paper proposes a pipeline leak detection technique based on acoustic emission event (AEE) features and a Kolmogorov-Smirnov (KS) test. The AEE features, namely peak amplitude, energy, rise time, decay time, and counts, are inherent properties of AE signals and are therefore more suitable for recognizing leak attributes; surprisingly, they have received negligible attention. In the proposed technique, the AEE features are first extracted from the AE signals using a sliding window with an adaptive threshold, so that the properties of both burst- and continuous-type emissions are retained. The AEE features form distributions whose shape changes when the pipeline condition changes from normal to leakage. The AEE feature distributions for leak and healthy conditions are discriminated using the two-sample KS test, yielding a pipeline leak indicator (PLI). The experimental results demonstrate that the developed PLI accurately distinguishes the leak and no-leak conditions without any prior leak information and performs better than traditional features such as mean, variance, RMS, and kurtosis.


Subjects
Acoustics, Noise, Nonparametric Statistics
11.
Sensors (Basel) ; 20(24)2020 Dec 12.
Article in English | MEDLINE | ID: mdl-33322805

ABSTRACT

This paper proposes a new test method for detecting the presence of impulsive noise based on the complementary cumulative distribution function (CCDF). Impulsive noise severely degrades the performance of communication systems, and the conventional Kolmogorov-Smirnov (K-S) test may not perform well because it does not consider the characteristics of impulsive noise. In order to detect the presence of impulsive noise reliably, the CCDF of the measurement samples is analyzed and compared with the CCDF of additive white Gaussian noise to find the difference between the two CCDFs. Due to the heavy-tailed nature of impulsive noise, the maximum difference alone may not be sufficient for accurate detection. Therefore, the proposed method builds the test hypothesis on the weighted sum of all differences between the two CCDFs. Simulation results show that the proposed test is more robust and provides a lower missed-detection probability than the K-S test in the presence of impulsive noise.
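A hedged sketch of the CCDF-based detector: compare the empirical complementary CDF of the sample magnitudes with the CCDF expected under purely Gaussian noise, and use a weighted sum of the differences as the test statistic. The tail-emphasizing weights, the threshold grid, and the impulsive-noise model are illustrative assumptions.

```python
# Sketch: weighted CCDF-difference statistic for detecting impulsive noise.
import numpy as np
from scipy.stats import norm

def ccdf_statistic(samples, noise_std, thresholds=None):
    x = np.abs(np.asarray(samples))
    if thresholds is None:
        thresholds = np.linspace(0.0, 6.0 * noise_std, 61)
    emp_ccdf = np.array([(x > t).mean() for t in thresholds])
    gauss_ccdf = 2.0 * norm.sf(thresholds, scale=noise_std)   # CCDF of |N(0, sigma^2)|
    weights = thresholds / thresholds.max()                   # emphasize tail differences (assumption)
    return np.sum(weights * (emp_ccdf - gauss_ccdf))

rng = np.random.default_rng(3)
gauss_only = rng.normal(0, 1, 10_000)
impulsive = gauss_only + (rng.random(10_000) < 0.01) * rng.normal(0, 10, 10_000)
print("Gaussian only:", ccdf_statistic(gauss_only, 1.0))      # near zero
print("with impulses:", ccdf_statistic(impulsive, 1.0))       # clearly positive
```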

12.
Entropy (Basel) ; 22(4)2020 Apr 15.
Article in English | MEDLINE | ID: mdl-33286221

ABSTRACT

The Bayesian information criterion (BIC), the Akaike information criterion (AIC), and some other indicators derived from them are widely used for model selection. In their original form, they contain the likelihood of the data given the models. Unfortunately, in many applications, it is practically impossible to calculate the likelihood, and, therefore, the criteria have been reformulated in terms of descriptive statistics of the residual distribution: the variance and the mean-squared error of the residuals. These alternative versions are strictly valid only in the presence of additive noise of Gaussian distribution, not a completely satisfactory assumption in many applications in science and engineering. Moreover, the variance and the mean-squared error are quite crude statistics of the residual distributions. More sophisticated statistical indicators, capable of better quantifying how close the residual distribution is to the noise, can be profitably used. In particular, specific goodness of fit tests have been included in the expressions of the traditional criteria and have proved to be very effective in improving their discriminating capability. These improved performances have been demonstrated with a systematic series of simulations using synthetic data for various classes of functions and different noise statistics.
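A minimal sketch of the residual-based criteria discussed above: the usual Gaussian-residual forms of BIC and AIC, plus an illustrative variant that adds a goodness-of-fit term (here a one-sample K-S distance between the standardized residuals and a normal law). The way the cited work combines the terms is not reproduced here; the combination and the weight are assumptions.

```python
# Sketch: residual-based BIC/AIC and a goodness-of-fit-augmented variant.
import numpy as np
from scipy.stats import kstest

def bic_aic_from_residuals(residuals, k_params):
    n = len(residuals)
    mse = np.mean(np.square(residuals))
    bic = n * np.log(mse) + k_params * np.log(n)
    aic = n * np.log(mse) + 2 * k_params
    return bic, aic

def gof_augmented_bic(residuals, k_params, weight=1.0):
    bic, _ = bic_aic_from_residuals(residuals, k_params)
    z = (residuals - residuals.mean()) / residuals.std(ddof=1)
    d, _ = kstest(z, "norm")                 # how far the residuals are from Gaussian noise
    return bic + weight * len(residuals) * d # hypothetical penalty for non-Gaussian residuals
```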

13.
Entropy (Basel) ; 22(5)2020 May 04.
Article in English | MEDLINE | ID: mdl-33286295

ABSTRACT

In Italy, elections occur often; almost every year, citizens are involved in a democratic choice of the leaders of different administrative entities. Sometimes citizens are called to vote to fill more than one office in more than one administrative body. This phenomenon has occurred 35 times since 1948, and it creates the peculiar condition of having the same sample of people expressing political decisions at the same time. Therefore, contemporaneous Italian ballots are an occasion to measure coherence and chaos in the way political opinion is expressed. In this paper, we address all Italian elections that occurred between 1948 and 2018. We collect the number of votes per party at each administrative level and treat each election as a manifestation of a complex system. We then use Shannon entropy and the Gini index to study the degree of disorder manifested during different types of elections at the municipality level, with a particular focus on contemporaneous elections. Such cases exhibit different disorder dynamics in the contemporaneous ballots when different administrative levels are involved. Furthermore, some features that characterize different entropic regimes emerge.
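A small sketch of the two disorder measures named above, applied to the vote shares of parties in a single municipality. The party shares are invented, and entropy is reported in bits as an arbitrary choice of base.

```python
# Sketch: Shannon entropy and Gini index of party vote shares in one municipality.
import numpy as np

def shannon_entropy(shares):
    p = np.asarray(shares, dtype=float)
    p = p[p > 0] / p.sum()
    return -np.sum(p * np.log2(p))

def gini_index(shares):
    p = np.sort(np.asarray(shares, dtype=float))
    n = len(p)
    cum = np.cumsum(p)
    return (n + 1 - 2 * np.sum(cum) / cum[-1]) / n

votes = [0.42, 0.31, 0.15, 0.08, 0.04]   # hypothetical party shares
print(f"entropy = {shannon_entropy(votes):.3f} bits, Gini = {gini_index(votes):.3f}")
```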

14.
J Med Signals Sens ; 10(3): 145-157, 2020.
Article in English | MEDLINE | ID: mdl-33062607

ABSTRACT

BACKGROUND: With the increasing advancement of technology, it is necessary to develop more accurate, convenient, and cost-effective security systems. The handwritten signature, as one of the most popular and applicable biometrics, is widely used to register ownership in banking systems, including checks, as well as in administrative and financial applications in everyday life all over the world. Automatic signature verification and recognition systems, especially for online signatures, are potentially the most powerful and publicly accepted means of personal authentication. METHODS: In this article, a novel procedure for online signature verification and recognition is presented based on the Dual-Tree Complex Wavelet Packet Transform (DT-CWPT). RESULTS: In the presented method, a three-level DT-CWPT decomposition is computed for three time signals of dynamic information: the horizontal and vertical positions, in addition to the pressure signal. Then, to construct the feature vector corresponding to each signature, log energy entropy measures are computed for each subband of the DT-CWPT decomposition. Finally, to classify the query signature, three classifiers are examined: k-nearest neighbor, support vector machine, and the Kolmogorov-Smirnov test. Experiments were conducted using three benchmark datasets: SVC2004 and MCYT-100, two Latin online signature datasets, and NDSD, a Persian signature dataset. CONCLUSION: The favorable experimental results obtained, in comparison with the literature, confirm the effectiveness of the presented method for both online signature verification and recognition.

15.
Proc Math Phys Eng Sci ; 476(2241): 20190742, 2020 Sep.
Article in English | MEDLINE | ID: mdl-33071564

ABSTRACT

The putative scale-free nature of real-world networks has generated a lot of interest in the past 20 years: if networks from many different fields share a common structure, then perhaps this suggests some underlying 'network law'. Testing the degree distribution of networks for power-law tails has been a topic of considerable discussion. Ad hoc statistical methodology has been used both to discredit power-laws as well as to support them. This paper proposes a statistical testing procedure that considers the complex issues in testing degree distributions in networks that result from observing a finite network, having dependent degree sequences and suffering from insufficient power. We focus on testing whether the tail of the empirical degrees behaves like the tail of a de Solla Price model, a two-parameter power-law distribution. We modify the well-known Kolmogorov-Smirnov test to achieve even sensitivity along the tail, considering the dependence between the empirical degrees under the null distribution, while guaranteeing sufficient power of the test. We apply the method to many empirical degree distributions. Our results show that power-law network degree distributions are not rare, classifying almost 65% of the tested networks as having a power-law tail with at least 80% power.

16.
MethodsX ; 7: 100875, 2020.
Article in English | MEDLINE | ID: mdl-32346527

ABSTRACT

We introduce an alternative method that is simple and can be used to test scale invariance or self-similarity in any type of data, irrespective of its distribution. Our method is based on estimating the Lorenz curve and the Kolmogorov-Smirnov test. This alternative method can be used as a preliminary screening before investigating further which types of distributions would fit the actual observations.
• We introduce a simple method to test scale invariance, regardless of data distribution.
• Our method is based on estimating the Lorenz curve and the Kolmogorov-Smirnov test.
• This alternative method can serve as an initial screening before investigating further which types of distributions would fit the actual data.
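A hedged sketch in the spirit of the method: build the empirical Lorenz curve of the data and compare it, via a K-S-type maximum distance, with the Lorenz curve of a fitted Pareto (scale-invariant) benchmark. How the cited method combines the two ingredients is not detailed in the abstract, so this pairing, and the Hill-type tail-index estimate, are assumptions.

```python
# Sketch: Lorenz curve of the data vs. the Lorenz curve of a fitted Pareto benchmark.
import numpy as np

def lorenz_curve(x):
    x = np.sort(np.asarray(x, dtype=float))
    cum = np.cumsum(x) / x.sum()
    p = np.arange(1, len(x) + 1) / len(x)
    return p, cum

def pareto_lorenz(p, alpha):
    # Lorenz curve of a Pareto(alpha) distribution, assuming alpha > 1: L(p) = 1 - (1 - p)^(1 - 1/alpha)
    return 1.0 - (1.0 - p) ** (1.0 - 1.0 / alpha)

def ks_distance_to_pareto(x):
    x = np.asarray(x, dtype=float)
    alpha_hat = len(x) / np.sum(np.log(x / x.min()))   # Hill-type MLE for the tail index
    p, emp = lorenz_curve(x)
    return np.max(np.abs(emp - pareto_lorenz(p, alpha_hat)))
```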

17.
Forensic Sci Int ; 309: 110227, 2020 Apr.
Article in English | MEDLINE | ID: mdl-32146301

ABSTRACT

The presence of traces of narcotics, particularly cocaine, on banknotes in circulation is a known and widespread fact in all countries. While linked to consumption and trafficking (primary contamination), their spread is due to direct contact with other banknotes during machine counting and cash financial transactions. The mere detection of traces of cocaine on a sample of banknotes is therefore not sufficient evidence to establish the banknotes' illegal origin. Increasing levels of contamination are recorded close (in terms of both place and time) to the first direct contact with the substance. The analysis must thus be able to demonstrate that the concentration of narcotics on the banknotes is statistically significantly higher, in terms of value and frequency, than would be expected from background noise alone. Even in that event, however, this evidence has to be substantiated with additional confirmations linking the banknotes to the person and the person to drug trafficking and/or dealing. In general, an in-depth and systematic analysis of all seized banknotes to search for traces of narcotics is not only prohibitive in terms of cost, but also unnecessary. If the sampling procedure is respected, the Swiss Federal Supreme Court recognizes IMS (ion mobility spectrometry) as a lawful method for checking the degree of banknote contamination, as well as all the statistical conclusions that can be drawn from it. In special cases, the prosecutor may require confirmation of IMS results by a laboratory test (liquid/gas chromatography-mass spectrometry). Using a non-destructive sampling procedure (suction onto swabs), we determined the presence of cocaine on 978 circulating euro banknotes, randomly collected at five Swiss customs offices, with IMS and LC-MS/MS in order to establish a normal (background) contamination level. A significant proportion (46.4%) of the euro banknotes analysed by LC-MS/MS had cocaine concentrations above the quantification limit (1 ng/swab). However, the extent of contamination is a determining factor: 94.6% of the banknotes in circulation have cocaine concentrations equal to or less than 10 ng/swab and only 3.4% have cocaine concentrations above 20 ng/swab. By comparison, only 27.3% and 13.4%, respectively, of the seized banknotes (two real cases) had cocaine concentrations equal to or less than 10 ng/swab, but 63.5% and 86.7%, respectively, had cocaine concentrations above 20 ng/swab. We also describe a Kolmogorov-Smirnov test model used to determine the presence of an "abnormal" level of contamination relative to the reference banknotes (banknotes in circulation, i.e. background noise) effectively and within realistic practical and theoretical frameworks. This model provides a quantifiable and statistically significant result that not only simplifies data interpretation, but also facilitates admissibility as forensic evidence in proceedings. When applied to the seized banknotes using both IMS and LC-MS/MS data, we obtained fully consistent and sound conclusions.
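A minimal sketch of the statistical step described above: compare the cocaine concentrations (ng/swab) measured on seized banknotes with a reference sample of circulating banknotes using the two-sample K-S test, where a large statistic indicates an abnormal contamination level. The concentrations below are placeholders, not the paper's data, and the two-sided form of the test is a simplifying choice.

```python
# Sketch: seized-banknote contamination vs. circulating-background reference via the K-S test.
import numpy as np
from scipy.stats import ks_2samp

background = np.array([0.0, 0.5, 1.2, 2.0, 3.1, 4.0, 5.5, 7.0, 9.0, 10.0])    # ng/swab, circulating notes
seized     = np.array([1.0, 4.0, 12.0, 18.0, 25.0, 40.0, 55.0, 80.0, 120.0])  # ng/swab, seized notes

stat, p = ks_2samp(background, seized)   # large D / small p -> contamination departs from background
print(f"D = {stat:.3f}, p = {p:.4f}")
```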

18.
Ecology ; 101(6): e03011, 2020 06.
Article in English | MEDLINE | ID: mdl-32065669

ABSTRACT

The maximum entropy theory of ecology (METE) applies the concept of "entropy" from information theory to predict macroecological patterns. The energetic predictions of the METE rely on predetermined metabolic scaling from external theories, and this reliance diminishes the testability of the theory. In this work, I build parameterized METE models by treating the metabolic scaling exponent as a free parameter, and I use the maximum-likelihood method to obtain empirically plausible estimates of the exponent. I test the models using the individual tree data from an oak-dominated deciduous forest in the northeastern United States and from a tropical forest in central Panama. My analysis shows that the metabolic scaling exponents predicted from the parameterized METE models deviate from that of the metabolic theory of ecology and exhibit large variation, at both community and population levels. Assemblage and population abundance may act as ecological constraints that regulate the individual-level metabolic scaling behavior. This study provides a novel example of the use of the parameterized METE models to reveal the biological processes of individual organisms. The implication and possible extensions of the parameterized METE models are discussed.


Subjects
Biological Models, Trees, Entropy, Forests, Panama
19.
Mem. Inst. Invest. Cienc. Salud (Impr.); 17(2): 44-55, Aug. 2019. tab, ilus
Article in Spanish | LILACS, BDNPAR | ID: biblio-1008416

ABSTRACT

Lotka's mathematical model describes the relationship between authors and their productivity within an area of science. This study was carried out to verify that the scientific productivity of researchers in the area of Medical and Health Sciences, categorized in 2016 in the National Program of Incentive to Researchers (PRONII), follows Lotka's mathematical model. To this end, publications hosted in the Web of Science in which these researchers appear as first author, applying the first-authorship criterion used by the model's formulator, and with affiliation to Paraguayan institutions, were considered. The model was applied to 236 publications generated by 77 researchers, 21 of whom had a single publication. The observed data did not conform to the model proposed by Lotka, which motivated the selection of 70 of the 77 researchers initially analyzed. The general form of Lotka's model was applied to the productivity of these 70 researchers, and the data were found to fit the model. Of every 10 researchers sampled, only four had a single publication, a fact that could indicate a limited presence of occasional authors. These findings are important because they show the behavior of the relationship between researchers and their productivity. Moreover, under certain conditions, the established model makes it possible to predict the number of researchers with a given number of publications.
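A hedged sketch of checking the general form of Lotka's law, y_x = C / x^n, where y_x is the number of authors with exactly x publications: the exponent is fitted by least squares on the log-log scale and the fit is screened with a K-S-type comparison of observed and predicted cumulative proportions. The author counts are invented, and this least-squares fit is only one of several estimation conventions used in Lotka studies.

```python
# Sketch: fitting the general Lotka model and measuring the maximum cumulative deviation.
import numpy as np

pubs_per_author = np.array([1, 2, 3, 4, 5, 6])        # x: publications per author
authors         = np.array([40, 12, 6, 4, 2, 1])      # y_x: observed number of authors (placeholder)

slope, log_c = np.polyfit(np.log(pubs_per_author), np.log(authors), 1)
n_hat = -slope                                        # Lotka exponent
c_hat = np.exp(log_c)
predicted = c_hat / pubs_per_author ** n_hat

obs_cum = np.cumsum(authors) / authors.sum()
pred_cum = np.cumsum(predicted) / predicted.sum()
d_max = np.max(np.abs(obs_cum - pred_cum))            # K-S-type maximum deviation
print(f"n = {n_hat:.2f}, C = {c_hat:.1f}, max deviation D = {d_max:.3f}")
```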


Subjects
Bibliometrics, Health Sciences, Scientific and Technical Publications, Cross-Sectional Studies, Nonparametric Statistics
20.
Lifetime Data Anal ; 25(1): 97-127, 2019 01.
Article in English | MEDLINE | ID: mdl-29512005

ABSTRACT

We rigorously extend the widely used wild bootstrap resampling technique to the multivariate Nelson-Aalen estimator under Aalen's multiplicative intensity model. Aalen's model covers general Markovian multistate models including competing risks subject to independent left-truncation and right-censoring. This leads to various statistical applications such as asymptotically valid confidence bands or tests for equivalence and proportional hazards. This is exemplified in a data analysis examining the impact of ventilation on the duration of intensive care unit stay. The finite sample properties of the new procedures are investigated in a simulation study.
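A simplified sketch of the wild (multiplier) bootstrap idea for the Nelson-Aalen cumulative-hazard estimator in the one-sample case: the estimator's increments are multiplied by i.i.d. standard-normal weights to mimic the limit process, and the resampled paths yield a confidence band. This is a toy illustration under assumptions (no special handling of ties, equal-width band), not the paper's multivariate, multistate implementation.

```python
# Sketch: one-sample Nelson-Aalen estimator with a wild (multiplier) bootstrap band.
import numpy as np

def nelson_aalen(times, events):
    order = np.argsort(times)
    t, d = np.asarray(times)[order], np.asarray(events)[order]   # events: 1 = observed, 0 = censored
    at_risk = np.arange(len(t), 0, -1)                           # risk-set size, assuming no ties
    increments = np.where(d == 1, 1.0 / at_risk, 0.0)
    return t, np.cumsum(increments), increments

def wild_bootstrap_band(times, events, n_boot=2000, level=0.95, seed=None):
    rng = np.random.default_rng(seed)
    t, na, inc = nelson_aalen(times, events)
    sup_stats = np.empty(n_boot)
    for b in range(n_boot):
        g = rng.standard_normal(len(t))            # multiplier weights on the increments
        sup_stats[b] = np.max(np.abs(np.cumsum(g * inc)))
    q = np.quantile(sup_stats, level)
    return t, na, na - q, na + q                   # equal-width band, a deliberate simplification
```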


Subjects
Computer Simulation, Multivariate Analysis, Proportional Hazards Models, Nonparametric Statistics, Biometry/methods, Data Analysis, Humans, Intensive Care Units, Length of Stay, Statistical Models, Artificial Respiration, Sensitivity and Specificity, Survival Analysis