Results 1 - 20 of 1,412
1.
Front Nucl Med ; 4: 1380518, 2024.
Article in English | MEDLINE | ID: mdl-39355208

ABSTRACT

The diagnosis of medical conditions and subsequent treatment often involves radionuclide imaging techniques. To refine localisation accuracy and improve diagnostic confidence, compared with the use of a single scanning technique, a combination of two (or more) techniques can be used, but with a higher risk of misalignment. For this to be reliable and accurate, recorded data undergo processing to suppress noise and enhance resolution. A standard step in image processing techniques for such inverse problems is the inclusion of smoothing. Standard approaches, however, are usually limited to applying identical models globally. In this study, we propose a novel Laplace and Gaussian mixture prior distribution that incorporates different smoothing strategies with automatic model-based estimation of mixture component weightings, creating a locally adaptive model. A fully Bayesian approach is presented using multi-level hierarchical modelling and Markov chain Monte Carlo (MCMC) estimation methods to sample from the posterior distribution and hence perform estimation. The proposed methods are assessed using simulated γ-eye™ camera images and demonstrate greater noise reduction than existing methods without compromising resolution. In addition to image estimates, the MCMC methods provide posterior variance estimates, so uncertainty quantification takes potential sources of variability into consideration. The use of mixture prior models, part Laplace random field and part Gaussian random field, within a Bayesian modelling approach is not limited to medical imaging applications but provides a more general framework for analysing other spatial inverse problems. Locally adaptive prior distributions provide a more realistic model, which leads to robust results and hence more reliable decision-making, especially in nuclear medicine. They can become a standard part of the toolkit of everyone working in image processing applications.

2.
Lifetime Data Anal ; 2024 Oct 04.
Article in English | MEDLINE | ID: mdl-39367291

ABSTRACT

Individuals with end-stage kidney disease (ESKD) on dialysis experience high mortality and excessive burden of hospitalizations over time relative to comparable Medicare patient cohorts without kidney failure. A key interest in this population is to understand the time-dynamic effects of multilevel risk factors that contribute to the correlated outcomes of longitudinal hospitalization and mortality. For this we utilize multilevel data from the United States Renal Data System (USRDS), a national database that includes nearly all patients with ESKD, where repeated measurements/hospitalizations over time are nested in patients and patients are nested within (health service) regions across the contiguous U.S. We develop a novel spatiotemporal multilevel joint model (STM-JM) that accounts for the aforementioned hierarchical structure of the data while considering the spatiotemporal variations in both outcomes across regions. The proposed STM-JM includes time-varying effects of multilevel (patient- and region-level) risk factors on hospitalization trajectories and mortality and incorporates spatial correlations across the spatial regions via a multivariate conditional autoregressive correlation structure. Efficient estimation and inference are performed via a Bayesian framework, where multilevel varying coefficient functions are targeted via thin-plate splines. The finite sample performance of the proposed method is assessed through simulation studies. An application of the proposed method to the USRDS data highlights significant time-varying effects of patient- and region-level risk factors on hospitalization and mortality and identifies specific time periods on dialysis and spatial locations across the U.S. with elevated hospitalization and mortality risks.

3.
Sci Rep ; 14(1): 22885, 2024 Oct 02.
Article in English | MEDLINE | ID: mdl-39358373

ABSTRACT

Predicting rock tunnel squeezing in underground projects is challenging due to its intricate and unpredictable nature. This study proposes an innovative approach to enhance the accuracy and reliability of tunnel squeezing prediction. The proposed method combines ensemble learning techniques with Q-learning and online Markov chain integration. A deep learning model is trained on a comprehensive database comprising tunnel parameters including diameter (D), burial depth (H), support stiffness (K), and tunneling quality index (Q). Multiple deep learning models are trained concurrently, leveraging ensemble learning to capture diverse patterns and improve prediction performance. Integration of the Q-learning online Markov chain further refines predictions. The online Markov chain analyzes historical sequences of tunnel parameters and squeezing class transitions, establishing transition probabilities between different squeezing classes. The Q-learning algorithm optimizes decision-making by learning the optimal policy for transitioning between tunnel states. The proposed model is evaluated using a dataset from various tunnel construction projects, assessing performance through metrics such as accuracy, precision, recall, and F1-score. Results demonstrate the efficiency of the ensemble deep learning model combined with the Q-learning online Markov chain in predicting surrounding rock tunnel squeezing. This approach offers insights into parameter interrelationships and dynamic squeezing characteristics, enabling proactive planning and the implementation of support measures to mitigate tunnel squeezing hazards and ensure underground structure safety. Experimental results show the model achieves a prediction accuracy of 98.11%, surpassing individual CNN and RNN models, with an AUC value of 0.98.
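The online Markov chain described above builds transition probabilities between squeezing classes from historical class sequences. A minimal sketch of that counting step is maximum-likelihood estimation of a transition matrix from one observed sequence; the class labels and the `estimate_transition_matrix` name below are illustrative, not from the paper.

```python
from collections import defaultdict

def estimate_transition_matrix(sequence):
    """Maximum-likelihood transition probabilities from one observed
    state sequence: P[i][j] = count(i -> j) / count(i -> anything)."""
    counts = defaultdict(lambda: defaultdict(int))
    for current, nxt in zip(sequence, sequence[1:]):
        counts[current][nxt] += 1
    matrix = {}
    for state, row in counts.items():
        total = sum(row.values())
        matrix[state] = {nxt: c / total for nxt, c in row.items()}
    return matrix

# Hypothetical historical sequence of squeezing classes
history = ["none", "none", "mild", "severe", "mild", "none", "mild", "mild"]
P = estimate_transition_matrix(history)
```

Each row of the resulting matrix sums to one, so it can be iterated directly to project class membership forward in time.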

4.
Stat Methods Med Res ; : 9622802241281027, 2024 Oct 07.
Article in English | MEDLINE | ID: mdl-39370826

ABSTRACT

Recurrent event data, which represent the occurrence of repeated incidences, are common in observational studies. Furthermore, incorporating possible spatial correlations in health and environmental data is likely to provide more information for risk prediction. This article proposes a comprehensive proportional intensity model considering spatial random effects for recurrent event data using a Bayesian approach. The spatial information for areal data (where the spatial location is known only up to a geographic unit such as a county) and georeferenced data (where the location is exactly observed) is examined. A traditional constant baseline intensity function, as well as a flexible piecewise constant baseline intensity function, are both under consideration. To estimate the parameters, a Markov chain Monte Carlo method with the Metropolis-Hastings algorithm and the adaptive Metropolis algorithm is applied. To assess the performance of model fitting, the deviance information criterion and the log pseudo marginal likelihood are used. Overall, simulation studies demonstrate that the proposed model is significantly better than models that do not consider spatial effects when spatial correlations exist. Finally, our approach is applied to a dataset on the recurrence of cardiovascular diseases that incorporates spatial information.
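The Metropolis-Hastings step mentioned above can be sketched in a few lines. This is a generic random-walk sampler on a one-dimensional log-density, not the paper's adaptive variant for spatial intensity models; the standard-normal target and all names are illustrative assumptions.

```python
import math
import random

def metropolis_hastings(log_post, x0, n_samples, step=1.0, seed=0):
    """Random-walk Metropolis-Hastings: propose x' ~ N(x, step^2) and
    accept with probability min(1, post(x') / post(x))."""
    rng = random.Random(seed)
    x, samples = x0, []
    for _ in range(n_samples):
        proposal = x + rng.gauss(0.0, step)
        if math.log(rng.random()) < log_post(proposal) - log_post(x):
            x = proposal          # accept the move
        samples.append(x)         # on rejection the chain repeats x
    return samples

# Illustrative target: standard normal log-density (up to a constant)
samples = metropolis_hastings(lambda x: -0.5 * x * x, x0=0.0, n_samples=20000)
mean = sum(samples) / len(samples)
```

An adaptive Metropolis variant would additionally tune `step` from the running sample covariance; the acceptance rule itself is unchanged.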

5.
BMC Bioinformatics ; 25(1): 285, 2024 Sep 02.
Article in English | MEDLINE | ID: mdl-39223484

ABSTRACT

We consider the problem of inferring a contact network from nodal states observed during an epidemiological process. In a black-box Bayesian optimisation framework, this problem reduces to a discrete likelihood optimisation over the set of possible networks. The cardinality of this set grows combinatorially with the number of network nodes, which makes the optimisation computationally challenging. For each network, its likelihood is the probability of the observed data appearing during the evolution of the epidemiological process on that network. This probability can be very small, particularly if the network is significantly different from the ground-truth network from which the observed data actually arise. A commonly used stochastic simulation algorithm struggles to recover rare events and hence to estimate small probabilities and likelihoods. In this paper, we replace the stochastic simulation with solving the chemical master equation for the probabilities of all network states. Since this equation also suffers from the curse of dimensionality, we apply tensor train approximations to overcome it and enable fast and accurate computations. Numerical simulations demonstrate efficient black-box Bayesian inference of the network.
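To make "solving the chemical master equation for the probabilities of all network states" concrete, here is a brute-force sketch for a toy SIS epidemic on a two-node network: enumerate all 2^n states, assemble the rate matrix Q, and integrate dp/dt = p·Q by explicit Euler steps. The tensor-train machinery that makes this scale, which is the paper's actual contribution, is not reproduced; all rates and names are illustrative.

```python
from itertools import product

def cme_probabilities(edges, n_nodes, beta, gamma, p0, t_end, dt=1e-3):
    """Integrate the chemical master equation dp/dt = p.Q for an SIS
    epidemic on a small contact network with explicit Euler steps.
    A state is a tuple of 0 (susceptible) / 1 (infected) per node."""
    states = list(product((0, 1), repeat=n_nodes))
    index = {s: k for k, s in enumerate(states)}
    neighbors = {i: [] for i in range(n_nodes)}
    for a, b in edges:
        neighbors[a].append(b)
        neighbors[b].append(a)
    # Rate matrix Q: off-diagonal transition rates, diagonal = -row sum.
    n = len(states)
    Q = [[0.0] * n for _ in range(n)]
    for s in states:
        i = index[s]
        for node in range(n_nodes):
            if s[node] == 1:                       # recovery I -> S
                rate = gamma
            else:                                  # infection S -> I
                rate = beta * sum(s[v] for v in neighbors[node])
            if rate > 0.0:
                flipped = list(s)
                flipped[node] = 1 - s[node]
                j = index[tuple(flipped)]
                Q[i][j] += rate
                Q[i][i] -= rate
    p = [p0.get(s, 0.0) for s in states]
    for _ in range(int(t_end / dt)):
        p = [p[k] + dt * sum(p[i] * Q[i][k] for i in range(n)) for k in range(n)]
    return {s: p[index[s]] for s in states}

# Two connected nodes, node 0 initially infected (illustrative rates)
probs = cme_probabilities(edges=[(0, 1)], n_nodes=2, beta=1.0, gamma=0.5,
                          p0={(1, 0): 1.0}, t_end=2.0)
```

Unlike a stochastic simulation, this gives the exact state probabilities (up to discretization error), including arbitrarily small ones, which is precisely what rare-event likelihood evaluation needs.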


Subject(s)
Algorithms; Bayes Theorem; Humans; Computer Simulation
6.
Heliyon ; 10(16): e35390, 2024 Aug 30.
Article in English | MEDLINE | ID: mdl-39247261

ABSTRACT

Enhancing the reliability of photovoltaic (PV) systems is of paramount importance, given their expanding role in sustainable energy production, carbon emissions reduction, and supporting industrial growth. However, PV panels commonly encounter issues that significantly impact their performance. Specifically, the accumulation of dust and the rise in internal temperature lead to a drop in energy production efficiency. The primary issue addressed in this paper is using mathematical modeling to determine the optimal cleaning frequency. This paper first focuses on stochastic modeling for dust accumulation and temperature changes in PV panels, considering varying environmental conditions and proposing a model-based approach to determine the optimal cleaning frequency. Dust accumulation is described using a non-homogeneous compound Poisson process (NHCPP), while temperature evolution is modeled using Markov chains. Within this framework, we consider the impact of wind speed and rainfall on dust accumulation and temperature. These factors, treated as covariates, are modeled using a two-dimensional continuous-time Markov chain with a finite state space. A condition-based cleaning policy is proposed and assessed based on the degradation model. Optimal preventive cleaning thresholds and cleaning frequencies (periodic and non-periodic) are determined to minimize the long-term average maintenance cost. The gain achieved by non-periodic inspections compared with periodic inspections ranges from 3.83% to 9.37%. Numerical experiments demonstrate the performance of the proposed cleaning policy, highlighting its potential to improve PV system efficiency and reliability.

7.
Heliyon ; 10(16): e36109, 2024 Aug 30.
Article in English | MEDLINE | ID: mdl-39247362

ABSTRACT

The Erlang loss formula, also known as the Erlang B formula, has been known for over a century and has been used in a wide range of applications, from telephony to hospital intensive care unit management. It provides the blocking probability of arriving customers in a loss system with a finite number of servers and no waiting room. Because of the need to introduce priorities in many services, an extension of the Erlang B formula to a loss system with preemptive priority is valuable and essential. This paper analytically establishes the consistency between the global balance (steady-state) equations for a loss system with preemptive priorities and a known result obtained using traffic loss arguments for the same problem. For the first time, this known result is derived directly from the global balance equations based on the relevant multidimensional Markov chain. The paper also addresses whether the well-known insensitivity property of the Erlang loss system carries over to a loss system with preemptive priorities. It provides explanations and demonstrates through simulations that, except for the blocking probability of the highest-priority customers, the blocking probabilities of the other customers are sensitive to the service time distributions, and that a larger service time variance leads to a lower blocking probability for lower-priority traffic.
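The classic Erlang B blocking probability referenced above admits a numerically stable recursion, B(0) = 1 and B(n) = a·B(n-1) / (n + a·B(n-1)) with offered load a = λ/μ. The sketch below implements only this standard single-class formula; the paper's preemptive-priority extension is not reproduced.

```python
def erlang_b(servers, offered_load):
    """Blocking probability of an M/M/c/c loss system via the standard
    recursion B(0) = 1, B(n) = a*B(n-1) / (n + a*B(n-1))."""
    b = 1.0
    for n in range(1, servers + 1):
        b = offered_load * b / (n + offered_load * b)
    return b

# Example: 2 servers offered a = 2 erlangs blocks 40% of arrivals
blocking = erlang_b(servers=2, offered_load=2.0)
```

The recursion avoids the overflow-prone factorials of the closed-form expression, which matters once the server count reaches the hundreds.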

8.
Sensors (Basel) ; 24(17)2024 Aug 30.
Article in English | MEDLINE | ID: mdl-39275559

ABSTRACT

Land-use and land-cover change (LULCC) is a critical environmental issue with significant effects on biodiversity, ecosystem services, and climate change. This study examines land-use and land-cover (LULC) spatiotemporal dynamics over a three-decade period (1998-2023) in a district area. To forecast LULCC patterns for the year 2035, the study employs a hybrid technique that combines the random forest method with multi-layer perceptron and Markov chain model analysis (MLP-MCA). The area of developed land has increased significantly, while the amounts of bare land, vegetation, and forest cover have all decreased, as the principal land types have changed under population growth and economic expansion. The study also found that between 1998 and 2023 the built-up area increased by 468 km² at the expense of natural land cover. Urbanization is projected to cover 25.04% of the study area by 2035. Model performance was confirmed with an overall accuracy of 90% and a kappa coefficient of about 0.89. Such predictive models can guide sustainable urban development strategies, providing valuable insights for policymakers, land managers, and researchers to support sustainable land-use planning, conservation efforts, and climate change mitigation.
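The Markov-chain half of an MLP-MCA pipeline amounts to propagating land-cover class proportions through a row-stochastic transition matrix, p_{t+1} = p_t·P. The sketch below shows only that projection step; the classes, the matrix entries, and the step count are hypothetical, not taken from the study.

```python
def project_markov(proportions, transition, steps):
    """Propagate class proportions through a row-stochastic transition
    matrix: p_{t+1}[j] = sum_i p_t[i] * P[i][j], repeated `steps` times."""
    p = list(proportions)
    n = len(p)
    for _ in range(steps):
        p = [sum(p[i] * transition[i][j] for i in range(n)) for j in range(n)]
    return p

# Hypothetical classes: [built-up, vegetation, bare land]; annual steps
P = [[0.95, 0.03, 0.02],
     [0.10, 0.85, 0.05],
     [0.15, 0.05, 0.80]]
p2035 = project_markov([0.25, 0.50, 0.25], P, steps=12)
```

In a real MLP-MCA workflow the transition matrix is estimated from cross-tabulating two classified maps, and the MLP allocates the projected quantities spatially.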

9.
J Am Stat Assoc ; 119(546): 798-810, 2024.
Article in English | MEDLINE | ID: mdl-39280355

ABSTRACT

Medical imaging is a form of technology that has revolutionized the medical field over the past decades. Digital pathology imaging, which captures histological details at the cellular level, is rapidly becoming a routine clinical procedure for cancer diagnosis support and treatment planning. Recent developments in deep-learning methods have facilitated tumor region segmentation from pathology images. The traditional shape descriptors that characterize tumor boundary roughness at the anatomical level are no longer suitable, and new statistical approaches to modeling tumor shapes are urgently needed. In this paper, we consider the problem of modeling a tumor boundary as a closed polygonal chain. A Bayesian landmark-based shape analysis model is proposed. The model partitions the polygonal chain into mutually exclusive segments, accounting for boundary roughness. Our Bayesian inference framework provides uncertainty estimates of both the number and locations of landmarks, while outputting metrics that can be used to quantify boundary roughness. The performance of our model is comparable with that of a recently developed landmark detection model for planar elastic curves. In a case study of 143 consecutive patients with stage I to IV lung cancer, we demonstrated that the heterogeneity of tumor boundary roughness derived from our model effectively predicted patient prognosis (p-value < 0.001).

10.
Biometrics ; 80(3)2024 Jul 01.
Article in English | MEDLINE | ID: mdl-39248120

ABSTRACT

Prior distributions, which represent one's belief in the distributions of unknown parameters before observing the data, impact Bayesian inference in a critical and fundamental way. With the ability to incorporate external information from expert opinions or historical datasets, the priors, if specified appropriately, can improve the statistical efficiency of Bayesian inference. In survival analysis, based on the concept of unit information (UI) under parametric models, we propose the unit information Dirichlet process (UIDP) as a new class of nonparametric priors for the underlying distribution of time-to-event data. By deriving the Fisher information in terms of the differential of the cumulative hazard function, the UIDP prior is formulated to match its prior UI with the weighted average of UI in historical datasets and thus can utilize both parametric and nonparametric information provided by historical datasets. With a Markov chain Monte Carlo algorithm, simulations and real data analysis demonstrate that the UIDP prior can adaptively borrow historical information and improve statistical efficiency in survival analysis.


Subject(s)
Bayes Theorem; Computer Simulation; Markov Chains; Models, Statistical; Monte Carlo Method; Survival Analysis; Humans; Algorithms; Biometry/methods; Data Interpretation, Statistical
11.
Heliyon ; 10(17): e36668, 2024 Sep 15.
Article in English | MEDLINE | ID: mdl-39263093

ABSTRACT

Ensuring stable power flow and reliable supply maintains system security, improves system efficiency, minimizes power loss, and reduces the risk of supply outages. Power flow management can be employed to enhance bus voltage and decrease power losses. System reliability is critical for both customers and the utility to ensure supply continuity and improved revenue. With the growing demand for reliable power supplies, it is crucial that utilities devote efforts to ensuring a consistent power supply that meets customer needs. However, the frequent occurrence and prolonged duration of power interruptions pose significant challenges to the power distribution system in the town of Wolaita Sodo. This study explores power flow and reliability control through optimal distribution network reconfiguration (DNR). The optimal placement of tie-switches (TS) to address the power flow and reliability issues is determined through the adaptive particle swarm optimization (APSO) algorithm. With the help of APSO, five TS units brought the reliability indices within the national standard boundary. The backward/forward sweep (BFS) and Markov chain-based Monte Carlo simulation (MCMCS) methods are used for load flow and reliability analysis. Through simulation, with the integration of five TS, SAIFI decreases from 557 to about 34, SAIDI decreases from 573.59 h to about 43.87 h, EENS decreases from 1835.5 MWh to about 140.38 MWh annually, active power loss decreases from 1631.15 kW to about 559.35 kW, and the minimum bus voltage increases from 0.7537 pu to 0.9502 pu. Finally, the suggested algorithm variants are evaluated in terms of response time, convergence, and the extent of power loss reduction.

12.
Cancers (Basel) ; 16(18)2024 Sep 11.
Article in English | MEDLINE | ID: mdl-39335104

ABSTRACT

Chimeric antigen receptor (CAR)-T cell therapy represents a breakthrough in treating resistant hematologic cancers. It is based on genetically modifying T cells transferred from the patient or a donor. Although its implementation has increased over the last few years, CAR-T has many challenges to be addressed, for instance, the associated severe toxicities, such as cytokine release syndrome. To model CAR-T cell dynamics, focusing on their proliferation and cytotoxic activity, we developed a mathematical framework using ordinary differential equations (ODEs) with Bayesian parameter estimation. Bayesian statistics were used to estimate model parameters through Monte Carlo integration, Bayesian inference, and Markov chain Monte Carlo (MCMC) methods. This paper explores MCMC methods, including the Metropolis-Hastings algorithm and DEMetropolis and DEMetropolisZ algorithms, which integrate differential evolution to enhance convergence rates. The theoretical findings and algorithms were validated using Python and Jupyter Notebooks. A real medical dataset of CAR-T cell therapy was analyzed, employing optimization algorithms to fit the mathematical model to the data, with the PyMC library facilitating Bayesian analysis. The results demonstrated that our model accurately captured the key dynamics of CAR-T cell therapy. This conclusion underscores the potential of parameter estimation to improve the understanding and effectiveness of CAR-T cell therapy in clinical settings.

13.
Environ Sci Technol ; 58(39): 17235-17246, 2024 Oct 01.
Article in English | MEDLINE | ID: mdl-39287556

ABSTRACT

Molecular, cellular, and organismal alterations are important descriptors of toxic effects, but our ability to extrapolate and predict ecological risks is limited by the availability of studies that link measurable end points to adverse population relevant outcomes such as cohort survival and growth. In this study, we used laboratory gene expression and behavior data from two populations of Atlantic killifish Fundulus heteroclitus [one reference site (SCOKF) and one PCB-contaminated site (NBHKF)] to inform individual-based models simulating cohort growth and survival from embryonic exposures to environmentally relevant concentrations of neurotoxicants. Methylmercury exposed SCOKF exhibited brain gene expression changes in the si:ch211-186j3.6, si:dkey-21c1.4, scamp1, and klhl6 genes, which coincided with changes in feeding and swimming behaviors, but our models simulated no growth or survival effects of exposures. PCB126-exposed SCOKF had lower physical activity levels coinciding with a general upregulation in nucleic and cellular brain gene sets (BGS) and downregulation in signaling, nucleic, and cellular BGS. The NBHKF, known to be tolerant to PCBs, had altered swimming behaviors that coincided with 98% fewer altered BGS. Our models simulated PCB126 decreased growth in SCOKF and survival in SCOKF and NBHKF. Overall, our study provides a unique demonstration linking molecular and behavioral data to develop quantitative, testable predictions of ecological risk.


Subject(s)
Fundulidae; Water Pollutants, Chemical; Animals; Water Pollutants, Chemical/toxicity; Fundulidae/genetics; Polychlorinated Biphenyls/toxicity; Methylmercury Compounds/toxicity; Behavior, Animal/drug effects; Neurotoxins/toxicity; Fundulus heteroclitus
14.
JMIR Med Inform ; 12: e59392, 2024 Sep 24.
Article in English | MEDLINE | ID: mdl-39316426

ABSTRACT

BACKGROUND: The World Health Organization (WHO) reported that cardiovascular diseases (CVDs) are the leading cause of death worldwide. CVDs are chronic, with complex progression patterns involving episodes of comorbidities and multimorbidities. When dealing with chronic diseases, physicians often adopt a "watchful waiting" strategy, postponing action until information is available. Population-level transition probabilities and progression patterns can be revealed by applying time-variant stochastic modeling methods to longitudinal patient data from cohort studies. Input from CVD practitioners indicates that tools to generate and visualize cohort transition patterns have many impactful clinical applications, and the resultant computational model can be embedded in digital decision support tools for clinicians. However, to date, no study has attempted to accomplish this for CVDs. OBJECTIVE: This study aims to apply advanced stochastic modeling methods to uncover the transition probabilities and progression patterns from longitudinal episodic data of patient cohorts with CVD, and thereafter use the computational model to build a digital clinical cohort analytics artifact demonstrating the actionability of such models. METHODS: Our data were sourced from 9 epidemiological cohort studies by the National Heart, Lung, and Blood Institute and comprised chronological records of 1274 patients associated with 4839 CVD episodes across 16 years. We then used the continuous-time Markov chain method to develop our model, which offers a robust approach to time-variant transitions between disease states in chronic diseases. RESULTS: Our study presents time-variant transition probabilities of CVD state changes, revealing patterns of CVD progression against time. We found that the transition from myocardial infarction (MI) to stroke has the fastest transition rate (mean transition time 3, SD 0 days, because only 1 patient had an MI-to-stroke transition in the dataset), and the transition from MI to angina is the slowest (mean transition time 1457, SD 1449 days). Congestive heart failure is the most probable first episode (371/840, 44.2%), followed by stroke (216/840, 25.7%). The resultant artifact is actionable as an eHealth cohort analytics tool, helping physicians gain insights into treatment and intervention strategies. Through expert panel interviews and surveys, we identified 9 application use cases for our model. CONCLUSIONS: Past research does not provide actionable cohort-level decision support tools based on a comprehensive, 10-state, continuous-time Markov chain model that unveils complex CVD progression patterns from real-world patient data and supports clinical decision-making; this paper aims to address that limitation. Our stochastic model-embedded artifact can help clinicians monitor disease efficiently and make intervention decisions guided by objective, data-driven insights from real patient data. Furthermore, the proposed model can unveil progression patterns of any chronic disease of interest by inputting only 3 data elements: a synthetic patient identifier, episode name, and episode time in days from a baseline date.
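A continuous-time Markov chain of the kind described can be simulated by alternating exponential holding times with jumps from the embedded chain. The sketch below is a generic CTMC path simulator, not the study's fitted 10-state model; the three states and all rates are hypothetical, for illustration only.

```python
import random

def simulate_ctmc(Q, states, start, t_max, seed=1):
    """Simulate one CTMC path: hold in state i for an Exp(-Q[i][i]) time,
    then jump to state j with probability Q[i][j] / (-Q[i][i])."""
    rng = random.Random(seed)
    i, t, path = states.index(start), 0.0, [(0.0, start)]
    while True:
        exit_rate = -Q[i][i]
        if exit_rate <= 0.0:          # absorbing state: stop
            break
        t += rng.expovariate(exit_rate)
        if t >= t_max:
            break
        r, acc = rng.random() * exit_rate, 0.0
        for j, rate in enumerate(Q[i]):
            if j != i:
                acc += rate
                if r < acc:
                    i = j             # jump to the sampled next state
                    break
        path.append((t, states[i]))
    return path

# Hypothetical 3-state model with per-year rates (illustrative only)
states = ["angina", "MI", "death"]
Q = [[-0.30, 0.25, 0.05],
     [0.10, -0.50, 0.40],
     [0.00, 0.00, 0.00]]
path = simulate_ctmc(Q, states, start="angina", t_max=10.0)
```

The mean transition times reported in the abstract correspond to the reciprocals of such rates, estimated from the observed episode intervals rather than assumed.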


Subject(s)
Cardiovascular Diseases; Disease Progression; Stochastic Processes; Humans; Cardiovascular Diseases/epidemiology; Cardiovascular Diseases/diagnosis; Cohort Studies; Markov Chains; Female; Male; Longitudinal Studies
15.
Heliyon ; 10(17): e36778, 2024 Sep 15.
Article in English | MEDLINE | ID: mdl-39319167

ABSTRACT

The study examines M[X]/G(P1, P2)/1 feedback retrial queues with starting failure, repair, delay to repair, working vacation, and general retrial times. It also explores how different batch sizes affect performance and how bulk arrival affects system behavior. When the server is idle, a single customer initiates the system while the remaining customers transition to orbit. A new customer must turn on the server to receive two phases of mandatory service at any time. The server may have starting issues: if the service starts successfully (with probability α), the customer receives service immediately; otherwise a starting failure occurs (with probability 1 − α = ᾱ), the server is taken for repair after some delay, and the customer is transferred to orbit. When the server is busy or unavailable, arriving customers queue in the orbit under FCFS. We also discuss reworking with probability p, restarting unsuccessful service attempts to improve customer satisfaction and service efficiency. We further introduce the concept of working vacation, which permits the server to temporarily stop providing services, affecting system performance and availability at both peak and off-peak times. A supplementary variable technique is adopted to obtain the probability-generating functions of the system size and orbit size. Various performance measures are provided with appropriate numerical examples.

16.
IEEE Access ; 12: 100772-100791, 2024.
Article in English | MEDLINE | ID: mdl-39286062

ABSTRACT

Antimicrobial resistance (AMR) emerges when disease-causing microorganisms develop the ability to withstand the effects of antimicrobial therapy. This phenomenon is often fueled by the human-to-human transmission of pathogens and the overuse of antibiotics. Over the past 50 years, increased computational power has facilitated the application of Bayesian inference algorithms. In this comprehensive review, the basic theory of Markov chain Monte Carlo (MCMC) and sequential Monte Carlo (SMC) methods is explained. These inference algorithms are instrumental in calibrating complex statistical models to the vast amounts of AMR-related data. Popular statistical models include hierarchical and mixture models as well as discrete and stochastic epidemiological compartmental and agent-based models. The studies covered encompass multi-drug resistance, the economic implications of vaccines, and modeling AMR in vitro as well as within specific populations. We describe how combining these topics in a coherent framework can result in effective antimicrobial stewardship. We also outline recent advancements in the methodology of Bayesian inference algorithms and provide insights into their prospective applicability for modeling AMR in the future.

17.
Curr Biol ; 2024 Sep 24.
Article in English | MEDLINE | ID: mdl-39332401

ABSTRACT

A long-standing question in biology is whether there are common principles that characterize the development of ecological systems (the appearance of a group of taxa), regardless of organismal diversity and environmental context.1,2,3,4,5,6,7,8,9,10,11 Classic ecological theory holds that these systems develop following a sequenced, orderly process that generally proceeds from fast-growing to slow-growing taxa and depends on life-history trade-offs.2,12,13 However, it is also possible that this developmental order is simply the path with the least environmental resistance for survival of the component species and hence favored by probability alone. Here, we use theory and data to show that the order from fast- to slow-growing taxa is the most likely developmental path for diverse systems when local taxon interactions self-organize in light of environmental resistance. First, we demonstrate theoretically that a sequenced development is more likely than a simultaneous one, at least until the number of iterations becomes so large as to be ecologically implausible. We then show that greater diversity of taxa and life histories improves the likelihood of a sequenced order from fast- to slow-growing taxa. Using data from bacterial and metazoan systems,14,15,16,17,18,19 we present empirical evidence that the developmental order of ecological systems moves along the paths of least environmental resistance. The capacity of simple principles to explain the trend in the developmental order of diverse ecological systems paves the way to an enhanced understanding of collective features of life.

18.
Biometrics ; 80(3)2024 Jul 01.
Article in English | MEDLINE | ID: mdl-39136277

ABSTRACT

Time-to-event data are often recorded on a discrete scale with multiple competing risks as potential causes for the event. In this context, applying continuous survival analysis methods with a single risk leads to biased estimation. We therefore propose the multivariate Bernoulli detector for competing risks with discrete times, involving a multivariate change point model on the cause-specific baseline hazards. Through the prior on the number of change points and their locations, we impose dependence between change points across risks, while allowing data-driven learning of their number. Conditionally on these change points, a multivariate Bernoulli prior is used to infer which risks are involved. The focus of posterior inference is on cause-specific hazard rates and dependence across risks. Such dependence is often present due to subject-specific changes over time that affect all risks. Full posterior inference is performed through a tailored local-global Markov chain Monte Carlo (MCMC) algorithm, which exploits a data augmentation trick and MCMC updates from nonconjugate Bayesian nonparametric methods. We illustrate our model in simulations and on ICU data, comparing its performance with existing approaches.


Subject(s)
Algorithms; Bayes Theorem; Computer Simulation; Markov Chains; Monte Carlo Method; Humans; Survival Analysis; Models, Statistical; Multivariate Analysis; Biometry/methods
19.
BMC Public Health ; 24(1): 2243, 2024 Aug 19.
Article in English | MEDLINE | ID: mdl-39160542

ABSTRACT

BACKGROUND: This article applies a variant of the Markov chain that explicitly incorporates spatial effects. It is an extension of the Markov class that allows a more complete analysis of the spatial dimensions of transition dynamics. The aim is to provide a methodology for applying the explicit model to spatial dependency analysis. METHODS: Here, the question is to study and quantify whether neighborhood context affects transition dynamics. Rather than estimating a single homogeneous law, the model requires the estimation of k transition laws, each dependent on the state of spatial neighbors. This article used published data on confirmed cases of COVID-19 in the 22 regions of Madagascar. These data were discretized to obtain discrete states of propagation intensity. RESULTS: The analysis yielded the transition probabilities between COVID-19 intensity states given the context of neighboring regions, and the propagation time laws given the spatial contexts. The results showed that neighboring regions had an effect on the propagation of COVID-19 in Madagascar. CONCLUSION: Based on these spatial transition matrices, we conclude that spatial dependency is present.
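Estimating "k transition laws, each dependent on the state of spatial neighbors" can be sketched by binning own-state transition counts by the neighbor's concurrent state, yielding one empirical transition matrix per spatial context. The two-region setup, the discretized intensity labels, and the function name below are illustrative assumptions, not the article's data.

```python
from collections import defaultdict

def spatial_transition_laws(own_seq, neighbor_seq):
    """One transition matrix per neighbor context: transitions i -> j of
    the own state are binned by the neighbor's state at the same time t."""
    counts = defaultdict(lambda: defaultdict(lambda: defaultdict(int)))
    for t in range(len(own_seq) - 1):
        counts[neighbor_seq[t]][own_seq[t]][own_seq[t + 1]] += 1
    laws = {}
    for ctx, rows in counts.items():
        laws[ctx] = {}
        for i, row in rows.items():
            total = sum(row.values())
            laws[ctx][i] = {j: c / total for j, c in row.items()}
    return laws

# Hypothetical discretized intensity states: L(ow) / H(igh)
own      = ["L", "L", "H", "H", "L", "L"]
neighbor = ["L", "H", "H", "L", "L", "H"]
laws = spatial_transition_laws(own, neighbor)
```

Comparing the context-specific matrices (e.g. `laws["L"]` vs `laws["H"]`) against a pooled, homogeneous matrix is exactly the kind of spatial-dependency check the article describes.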


Subject(s)
COVID-19; Markov Chains; Spatial Analysis; Madagascar/epidemiology; COVID-19/epidemiology; Humans; SARS-CoV-2
20.
Entropy (Basel) ; 26(8)2024 Aug 16.
Article in English | MEDLINE | ID: mdl-39202165

ABSTRACT

Several disciplines, such as econometrics, neuroscience, and computational psychology, study the dynamic interactions between variables over time. A Bayesian nonparametric model known as the Wishart process has been shown to be effective in this situation, but its inference remains highly challenging. In this work, we introduce a sequential Monte Carlo (SMC) sampler for the Wishart process and show how it compares to conventional inference approaches, namely MCMC and variational inference. Using simulations, we show that SMC sampling results in the most robust estimates and out-of-sample predictions of dynamic covariance. SMC especially outperforms the alternative approaches when using composite covariance functions with correlated parameters. We further demonstrate the practical applicability of our proposed approach on a dataset of clinical depression (n=1), and show how an accurate representation of the posterior distribution can be used to test for dynamics in covariance.
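The reweight-resample-rejuvenate loop at the heart of an SMC sampler can be sketched on a trivial one-dimensional target; the Wishart-process sampler in the paper is far more involved. Everything below (the Gaussian initial distribution, the linear tempering schedule, the single Metropolis rejuvenation move) is an illustrative assumption.

```python
import math
import random

def smc_tempering(log_target, n_particles=2000, n_steps=10, seed=0):
    """Minimal tempered SMC sampler: start from a wide Gaussian, anneal
    the target in with temperatures t_k = k/K, reweight, resample, and
    rejuvenate each particle with one Metropolis step."""
    rng = random.Random(seed)
    sigma0 = 3.0
    def log_init(x):
        return -0.5 * (x / sigma0) ** 2
    particles = [rng.gauss(0.0, sigma0) for _ in range(n_particles)]
    for k in range(1, n_steps + 1):
        t_prev, t_now = (k - 1) / n_steps, k / n_steps
        # Incremental importance weights for the tempered bridge
        logw = [(t_now - t_prev) * (log_target(x) - log_init(x))
                for x in particles]
        m = max(logw)
        w = [math.exp(v - m) for v in logw]
        particles = rng.choices(particles, weights=w, k=n_particles)
        def log_temp(x):
            return (1.0 - t_now) * log_init(x) + t_now * log_target(x)
        moved = []
        for x in particles:          # one rejuvenation move per particle
            y = x + rng.gauss(0.0, 0.5)
            ok = math.log(rng.random()) < log_temp(y) - log_temp(x)
            moved.append(y if ok else x)
        particles = moved
    return particles

# Illustrative target: N(1, 1) log-density up to a constant
final = smc_tempering(lambda x: -0.5 * (x - 1.0) ** 2)
mean = sum(final) / len(final)
```

Unlike a single MCMC chain, the particle population gives a direct, parallel representation of the posterior at every temperature, which is what makes the out-of-sample covariance predictions in the paper possible.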
