Results 1 - 20 of 33,028
1.
J Med Syst ; 48(1): 58, 2024 Jun 01.
Article in English | MEDLINE | ID: mdl-38822876

ABSTRACT

Modern anesthetic drugs ensure the efficacy of general anesthesia. Goals include reducing variability in surgical, tracheal extubation, post-anesthesia care unit, or intraoperative response recovery times. Generalized confidence intervals based on the log-normal distribution compare variability between groups, specifically ratios of standard deviations. The alternative statistical approaches, robust variance-comparison tests, give P-values but neither point estimates nor confidence intervals for the ratios of the standard deviations. We performed Monte Carlo simulations to learn what happens to confidence intervals for ratios of standard deviations of anesthesia-associated times when analyses are based on the log-normal but the true distributions are Weibull. We used simulation conditions comparable to meta-analyses of most randomized trials in anesthesia: n ≈ 25 and coefficients of variation ≈ 0.30. The estimates of the ratios of standard deviations were positively, but only slightly, biased, the ratios being 0.11% to 0.33% greater than nominal. In contrast, the 95% confidence intervals were very wide (i.e., > 95% of P ≥ 0.05). Although substantive inferentially, the differences in the confidence limits were small from a clinical or managerial perspective, with a maximum absolute difference in ratios of 0.016. Thus, P < 0.05 is reliable, but investigators should plan for Type II errors at greater-than-nominal rates.
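The simulation setup this abstract describes can be sketched roughly as follows. This is an illustrative reconstruction, not the authors' code: the Weibull shape parameter (~3.7, which gives a coefficient of variation near 0.30), the sample size, and the replication count are assumptions chosen to match the stated conditions.

```python
import math
import random

def weibull_sample(shape, scale, n, rng):
    # Inverse-transform sampling from a Weibull(shape, scale) distribution.
    return [scale * (-math.log(1.0 - rng.random())) ** (1.0 / shape)
            for _ in range(n)]

def sample_sd(xs):
    m = sum(xs) / len(xs)
    return math.sqrt(sum((x - m) ** 2 for x in xs) / (len(xs) - 1))

def mean_sd_ratio(n=25, shape=3.7, reps=2000, seed=1):
    # Both groups are drawn from the same Weibull, so the true SD ratio is 1;
    # any departure of the mean estimated ratio from 1 is small-sample bias.
    rng = random.Random(seed)
    ratios = [sample_sd(weibull_sample(shape, 1.0, n, rng)) /
              sample_sd(weibull_sample(shape, 1.0, n, rng))
              for _ in range(reps)]
    return sum(ratios) / reps
```

Running `mean_sd_ratio()` shows the slight positive bias of the ratio estimator under these conditions.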


Subject(s)
Monte Carlo Method , Humans , Confidence Intervals , Anesthesia, General , Time Factors , Models, Statistical
2.
Front Public Health ; 12: 1406566, 2024.
Article in English | MEDLINE | ID: mdl-38827615

ABSTRACT

Background: Emerging infectious diseases pose a significant threat to global public health. Timely detection and response are crucial in mitigating the spread of such epidemics. Inferring the onset time and epidemiological characteristics is vital for accelerating early interventions, but accurately predicting these parameters in the early stages remains challenging. Methods: We introduce a Bayesian inference method to fit epidemic models to time series data based on state-space modeling, employing a stochastic Susceptible-Exposed-Infectious-Removed (SEIR) model for transmission dynamics analysis. Our approach uses the particle Markov chain Monte Carlo (PMCMC) method to estimate key epidemiological parameters, including the onset time, the transmission rate, and the recovery rate. The PMCMC algorithm integrates the advantageous aspects of both MCMC and particle filtering methodologies to yield a computationally feasible and effective means of approximating the likelihood function, especially when it is computationally intractable. Results: To validate the proposed method, we conduct case studies on COVID-19 outbreaks in Wuhan, Shanghai, and Nanjing, China. Using early-stage case reports, the PMCMC algorithm accurately predicted the onset time, key epidemiological parameters, and the basic reproduction number. These findings are consistent with empirical studies and the literature. Conclusion: This study presents a robust Bayesian inference method for the timely investigation of emerging infectious diseases. By accurately estimating the onset time and essential epidemiological parameters, our approach is versatile and efficient, extending its utility beyond COVID-19.
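The stochastic SEIR dynamics underlying this approach can be sketched as a discrete-time simulation with per-individual transition draws. This is a generic illustration, not the paper's state-space model or PMCMC code; the parameter values and daily time step are assumptions.

```python
import math
import random

def stochastic_seir(beta, sigma, gamma, s0, e0, i0, r0, days, seed=0):
    """Discrete-time stochastic SEIR with daily Bernoulli transition draws.

    beta: transmission rate, sigma: 1/latent period, gamma: recovery rate.
    Returns the daily (S, E, I, R) trajectory.
    """
    rng = random.Random(seed)
    n = s0 + e0 + i0 + r0
    s, e, i, r = s0, e0, i0, r0
    history = [(s, e, i, r)]
    p_ei = 1.0 - math.exp(-sigma)   # daily probability of E -> I
    p_ir = 1.0 - math.exp(-gamma)   # daily probability of I -> R
    for _ in range(days):
        p_se = 1.0 - math.exp(-beta * i / n)  # daily probability of S -> E
        new_e = sum(1 for _ in range(s) if rng.random() < p_se)
        new_i = sum(1 for _ in range(e) if rng.random() < p_ei)
        new_r = sum(1 for _ in range(i) if rng.random() < p_ir)
        s -= new_e
        e += new_e - new_i
        i += new_i - new_r
        r += new_r
        history.append((s, e, i, r))
    return history
```

In a PMCMC setting, a particle filter would run many such trajectories to approximate the likelihood of observed case counts given (beta, sigma, gamma) and the onset time.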


Subject(s)
Algorithms , Bayes Theorem , COVID-19 , Communicable Diseases, Emerging , Markov Chains , Humans , Communicable Diseases, Emerging/epidemiology , COVID-19/epidemiology , COVID-19/transmission , China/epidemiology , Monte Carlo Method , SARS-CoV-2 , Disease Outbreaks/statistics & numerical data , Time Factors , Epidemiological Models
3.
Eur Phys J E Soft Matter ; 47(6): 39, 2024 Jun 03.
Article in English | MEDLINE | ID: mdl-38831117

ABSTRACT

Small-Angle Scattering (SAS), encompassing both X-ray (SAXS) and Neutron (SANS) techniques, is a crucial tool for structural analysis at the nanoscale, particularly in the realm of biological macromolecules. This paper explores the intricacies of SAS, emphasizing its application in studying complex biological systems and the challenges associated with sample preparation and data analysis. We highlight the use of neutron-scattering properties of hydrogen isotopes and isotopic labeling in SANS for probing structures within multi-subunit complexes, employing techniques like contrast variation (CV) for detailed structural analysis. However, traditional SAS analysis methods, such as Guinier and Kratky plots, are limited by their partial use of available data and inability to operate without substantial a priori knowledge of the sample's chemical composition. To overcome these limitations, we introduce a novel approach integrating α-SAS, a computational method for simulating SANS with CV, with machine learning (ML). This approach enables the accurate prediction of scattering contrast in multicomponent macromolecular complexes, reducing the need for extensive sample preparation and computational resources. α-SAS, utilizing Monte Carlo methods, generates comprehensive datasets from which structural invariants can be extracted, enhancing our understanding of the macromolecular form factor in dilute systems. The paper demonstrates the effectiveness of this integrated approach through its application to two case studies: Janus particles, an artificial structure with a known SAS intensity and contrast, and a biological system involving RNA polymerase II in complex with Rtt103. These examples illustrate the method's capability to provide detailed structural insights, showcasing its potential as a powerful tool for advanced SAS analysis in structural biology.


Subject(s)
Machine Learning , Scattering, Small Angle , Macromolecular Substances/chemistry , Neutron Diffraction , X-Ray Diffraction , Monte Carlo Method
4.
Environ Geochem Health ; 46(6): 183, 2024 May 02.
Article in English | MEDLINE | ID: mdl-38696054

ABSTRACT

Pollution of water resources with nitrate is currently one of the major challenges at the global level. In order to make macro-policy decisions in water safety plans, it is necessary to carry out a nitrate risk assessment in groundwater, which had not been done in Fars province for all urban areas. In the current study, 9494 drinking water samples were collected in four seasons in 32 urban areas of Fars province in Iran between 2017 and 2021 to investigate the non-carcinogenic health risk. Geographical distribution maps of the hazard quotient (HQ) were drawn using geographic information system software. The results showed that nitrate concentrations exceeded the World Health Organization guideline value (50 mg/L) in 4% of the samples in 2021, 2.5% of the samples in 2020, and 3% of the samples in 2019. In these cases, the maximum nitrate concentrations were between 82 and 123 mg/L. The HQ values for infants did not exceed 1 in any year, but the values for children (44% ± 10.8), teenagers (10.8% ± 8.4), and adults (3.2% ± 1.7) exceeded 1 across cities, years, and seasons, indicating that these three age groups in the studied area are at significant non-carcinogenic risk. The results of the Monte Carlo simulation showed that the average non-carcinogenic risk was less than 1 for all age groups. Moreover, the maximum (95th percentile) HQ values were higher than 1 for both children and teenagers, indicating a significant non-carcinogenic risk for these two age groups.


Subject(s)
Drinking Water , Geographic Information Systems , Monte Carlo Method , Nitrates , Water Pollutants, Chemical , Nitrates/analysis , Risk Assessment , Iran , Drinking Water/chemistry , Drinking Water/analysis , Water Pollutants, Chemical/analysis , Humans , Adolescent , Cities , Infant , Child , Adult , Environmental Monitoring/methods
5.
PLoS One ; 19(5): e0289822, 2024.
Article in English | MEDLINE | ID: mdl-38691561

ABSTRACT

Histograms are frequently used to perform a preliminary study of data, such as finding outliers and determining the shape of a distribution. It is common knowledge that choosing an appropriate number of bins is crucial to revealing the right information. It is also well known that using bins of different widths, called unequal bin widths, is preferable to using bins of equal width if the bin widths are selected carefully; however, this selection is a much more difficult problem. In this research, a novel approach to AIC for histograms with unequal bin widths is proposed. We demonstrate the advantage of the suggested approach in comparison to others using both extensive Monte Carlo simulations and empirical examples.
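The abstract does not spell out the unequal-bin-width criterion, but the baseline such methods extend, an AIC over the histogram's piecewise-constant density, can be sketched for equal-width bins. One common form is shown below: the log-likelihood of the histogram density plus a 2(k-1) penalty for the k-1 free bin probabilities; the penalty convention is an assumption.

```python
import math
import random

def histogram_aic(data, k):
    """AIC for an equal-width histogram density estimate with k bins."""
    n = len(data)
    lo, hi = min(data), max(data)
    w = (hi - lo) / k
    counts = [0] * k
    for x in data:
        j = min(int((x - lo) / w), k - 1)  # clamp x == hi into the last bin
        counts[j] += 1
    # Log-likelihood of the piecewise-constant density n_j / (n * w).
    loglik = sum(c * math.log(c / (n * w)) for c in counts if c > 0)
    return -2.0 * loglik + 2.0 * (k - 1)

def best_bin_count(data, k_max=30):
    # Choose the bin count minimizing AIC over a candidate range.
    return min(range(2, k_max + 1), key=lambda k: histogram_aic(data, k))
```

An unequal-width variant would additionally search over bin boundary placements, with the penalty counting both bin probabilities and boundaries as parameters.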


Subject(s)
Monte Carlo Method , Models, Statistical , Computer Simulation , Algorithms , Humans
6.
AAPS J ; 26(3): 53, 2024 Apr 23.
Article in English | MEDLINE | ID: mdl-38722435

ABSTRACT

The standard errors (SE) of the maximum likelihood estimates (MLE) of the population parameter vector in nonlinear mixed effect models (NLMEM) are usually estimated using the inverse of the Fisher information matrix (FIM). However, at a finite distance, i.e., far from the asymptotic regime, the FIM can underestimate the SE of NLMEM parameters. Alternatively, the standard deviation of the posterior distribution, obtained in Stan via the Hamiltonian Monte Carlo algorithm, has been shown to be a proxy for the SE, since, under some regularity conditions on the prior, the limiting distributions of the MLE and of the maximum a posteriori estimator in a Bayesian framework are equivalent. In this work, we develop a similar method using the Metropolis-Hastings (MH) algorithm in parallel to the stochastic approximation expectation maximisation (SAEM) algorithm, implemented in the saemix R package. We assess this method on different simulation scenarios and on data from a real case study, comparing it to other SE computation methods. The simulation study shows that our method improves on the results obtained with frequentist methods at finite distance. However, it performed poorly in a scenario with the high variability and correlations observed in the real case study, stressing the need for calibration.
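The core idea, using the posterior standard deviation from an MH chain as a proxy for the SE, can be illustrated on a toy model. This is a generic random-walk Metropolis-Hastings sketch, not the saemix implementation; the normal likelihood, flat prior, step size, and burn-in are assumptions.

```python
import math
import random

def metropolis_hastings(loglik, theta0, n_iter=8000, step=0.3, seed=1):
    """Random-walk Metropolis-Hastings for a scalar parameter.

    Returns the full chain; the standard deviation of the post-burn-in
    chain approximates the posterior SD (the SE proxy).
    """
    rng = random.Random(seed)
    chain = [theta0]
    cur, cur_ll = theta0, loglik(theta0)
    for _ in range(n_iter):
        prop = cur + rng.gauss(0.0, step)
        prop_ll = loglik(prop)
        # Accept with probability min(1, exp(prop_ll - cur_ll)) (flat prior).
        if math.log(rng.random()) < prop_ll - cur_ll:
            cur, cur_ll = prop, prop_ll
        chain.append(cur)
    return chain
```

For a normal mean with known unit variance and n observations, the posterior SD of the chain should approach the analytic SE 1/sqrt(n).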


Subject(s)
Algorithms , Computer Simulation , Monte Carlo Method , Nonlinear Dynamics , Uncertainty , Likelihood Functions , Bayes Theorem , Humans , Models, Statistical
7.
J Biomed Opt ; 29(9): 093502, 2024 Sep.
Article in English | MEDLINE | ID: mdl-38715718

ABSTRACT

Significance: Developing stable, robust, and affordable tissue-mimicking phantoms is a prerequisite for any new clinical application within biomedical optics. To this end, a thorough understanding of the phantom structure and optical properties is paramount. Aim: We characterized the structural and optical properties of PlatSil SiliGlass phantoms using experimental and numerical approaches to examine the effects of phantom microstructure on their overall optical properties. Approach: We employed scanning electron microscopy (SEM), hyperspectral imaging (HSI), and spectroscopy in combination with Mie theory modeling and inverse Monte Carlo to investigate the relationship between phantom constituents and the overall phantom optical properties. Results: SEM revealed that the microspheres had a broad range of sizes, with an average of (13.47 ± 5.98) µm, and were also aggregated, which may affect the overall optical properties and warrants careful preparation to minimize these effects. Spectroscopy was used to measure the pigment and SiliGlass absorption coefficients in the VIS-NIR range. The size distribution was used to calculate scattering coefficients and observe the impact of phantom microstructure on scattering properties. The results were summarized in an inverse-problem solution that enabled absolute determination of component volume fractions that agree with the values obtained during preparation and explained experimentally observed spectral features. HSI microscopy revealed pronounced single-scattering effects. Conclusions: We show that knowledge of the phantom microstructure enables absolute measurements of phantom constitution without prior calibration. Further, we show a connection across different length scales, where knowledge of the precise phantom component constitution can help understand macroscopically observable optical properties.


Subject(s)
Monte Carlo Method , Phantoms, Imaging , Microscopy, Electron, Scanning , Scattering, Radiation , Microspheres , Hyperspectral Imaging/methods , Hyperspectral Imaging/instrumentation
8.
J Biomed Opt ; 29(9): 093503, 2024 Sep.
Article in English | MEDLINE | ID: mdl-38715717

ABSTRACT

Significance: Hyperspectral dark-field microscopy (HSDFM) and data cube analysis algorithms demonstrate successful detection and classification of various tissue types, including carcinoma regions in human post-lumpectomy breast tissues excised during breast-conserving surgeries. Aim: We expand the application of HSDFM to the classification of tissue types and tumor subtypes in pre-histopathology human breast lumpectomy samples. Approach: Breast tissues excised during breast-conserving surgeries were imaged by the HSDFM and analyzed. The performance of the HSDFM is evaluated by comparing the backscattering intensity spectra of polystyrene microbead solutions with the Monte Carlo simulation of the experimental data. For classification, two analysis approaches, a supervised technique based on the spectral angle mapper (SAM) algorithm and an unsupervised technique based on the K-means algorithm, are applied to classify various tissue types, including carcinoma subtypes. In the supervised technique, the SAM algorithm with manually extracted endmembers guided by H&E annotations is used as the reference spectra, allowing for segmentation maps with classified tissue types including carcinoma subtypes. Results: The manually extracted endmembers of known tissue types and their corresponding threshold spectral correlation angles for classification make a good reference library that validates the endmembers computed by the unsupervised K-means algorithm. The unsupervised K-means algorithm, with no a priori information, produces abundance maps with dominant endmembers of various tissue types, including the carcinoma subtypes of invasive ductal carcinoma and invasive mucinous carcinoma. The two carcinomas' unique endmembers produced by the two methods agree with each other to within a <2% residual error margin. Conclusions: Our report demonstrates a robust procedure for the validation of an unsupervised algorithm with the essential set of parameters based on the ground truth, i.e., histopathological information. We have demonstrated that a trained library of histopathology-guided endmembers and associated threshold spectral correlation angles, computed against well-defined reference data cubes, serves as such a set of parameters. Two classification algorithms, one supervised and one unsupervised, are employed to identify regions with the carcinoma subtypes of invasive ductal carcinoma and invasive mucinous carcinoma present in the tissues. This high-quality library, collected in an environment with no ambient background, may be instrumental in developing or validating more advanced unsupervised data cube analysis algorithms, such as effective neural networks for efficient subtype classification.


Subject(s)
Algorithms , Breast Neoplasms , Mastectomy, Segmental , Microscopy , Humans , Breast Neoplasms/diagnostic imaging , Breast Neoplasms/surgery , Breast Neoplasms/pathology , Female , Mastectomy, Segmental/methods , Microscopy/methods , Breast/diagnostic imaging , Breast/pathology , Breast/surgery , Hyperspectral Imaging/methods , Margins of Excision , Monte Carlo Method , Image Processing, Computer-Assisted/methods
9.
Phys Med ; 121: 103367, 2024 May.
Article in English | MEDLINE | ID: mdl-38701625

ABSTRACT

PURPOSE: Diffusing alpha-emitters radiation therapy (DaRT) is a brachytherapy technique using α-particles to treat solid tumours. The high linear energy transfer (LET) and short range of α-particles make them good candidates for the targeted treatment of cancer. Treatment planning of DaRT requires a good understanding of the dose from α-particles and the other particles released in the 224Ra decay chain. METHODS: The Geant4 Monte Carlo toolkit has been used to simulate a DaRT seed to better understand the dose contribution from all particles and to simulate the DNA damage due to this treatment. RESULTS: Close to the seed, α-particles deliver the majority of the dose; however, at radial distances greater than 4 mm, the contribution of β-particles is greater. The RBE has been estimated as a function of the number of double strand breaks (DSBs) and complex DSBs. A maximum seed spacing of 5.5 mm and 6.5 mm was found to deliver at least 20 Gy RBE-weighted dose between the seeds for RBEDSB and RBEcDSB, respectively. CONCLUSIONS: The DNA damage changes with radial distance from the seed and has been found to become less complex with distance, which is potentially easier for the cell to repair. Close to the seed, α-particles contribute the majority of the dose; however, the contribution from other particles cannot be neglected and may influence the choice of seed spacing.


Subject(s)
Alpha Particles , DNA Damage , Monte Carlo Method , Alpha Particles/therapeutic use , Radiotherapy Dosage , Radiation Dosage , Relative Biological Effectiveness , Diffusion , Brachytherapy/methods , Humans , Linear Energy Transfer , Radiotherapy Planning, Computer-Assisted/methods , DNA Breaks, Double-Stranded/radiation effects
10.
PLoS One ; 19(5): e0298897, 2024.
Article in English | MEDLINE | ID: mdl-38722980

ABSTRACT

To estimate the economic and financial viability of a pig farm in central sub-tropical Mexico within a 5-year planning horizon, a Monte Carlo simulation model was utilized. Net returns were projected using simulated values for the distributions of input and product prices, establishing 2021 as the base scenario. A stochastic modelling approach was employed to determine the economic and financial outlook. The findings reveal a panorama of economic and financial viability. Net income increased by 555%, return on assets rose from 3.36% in 2022 to 11.34% in 2026, and the probability of decapitalization dropped from 58% to 13% over the same period. Similarly, the probability of obtaining negative net income decreased from 40% in 2022 to 18% in 2026. The technological, productive, and economic management of the production unit allowed for a favorable scenario within the planning horizon. There is growing interest in predicting which economic sectors are worth investing in and supporting, considering their economic and development performance. This research offers both methodological and scientific evidence to demonstrate the feasibility of establishing a planning schedule and validating the suitability of the pork sector for public investment and support.
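The general shape of such a stochastic viability analysis, drawing net returns from an assumed distribution and summarizing the resulting NPV and loss probability, can be sketched as follows. All numbers here (triangular distribution bounds, discount rate, initial investment, currency units) are hypothetical placeholders, not values from the study.

```python
import random

def simulate_npv(n_sims=10000, years=5, discount=0.10, seed=7):
    """Monte Carlo NPV of a 5-year project with stochastic yearly net returns.

    Yearly net returns are drawn from a triangular distribution
    (pessimistic, optimistic, most likely) in arbitrary currency units.
    Returns the mean NPV and the probability of a negative NPV.
    """
    rng = random.Random(seed)
    npvs = []
    for _ in range(n_sims):
        npv = -80.0  # assumed initial investment
        for t in range(1, years + 1):
            net_return = rng.triangular(-10.0, 60.0, 30.0)  # low, high, mode
            npv += net_return / (1.0 + discount) ** t
        npvs.append(npv)
    p_loss = sum(1 for v in npvs if v < 0) / n_sims
    return sum(npvs) / n_sims, p_loss
```

The probability of decapitalization reported in the abstract plays the role of `p_loss` here: the fraction of simulated futures in which the farm loses value.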


Subject(s)
Farms , Mexico , Animals , Swine , Farms/economics , Models, Economic , Animal Husbandry/economics , Monte Carlo Method , Prospective Studies , Income
11.
Clin Oral Investig ; 28(6): 301, 2024 May 07.
Article in English | MEDLINE | ID: mdl-38710794

ABSTRACT

OBJECTIVES: To undertake a cost-effectiveness analysis of restorative treatments for a first permanent molar with severe molar incisor hypomineralization from the perspective of the Brazilian public system. MATERIALS AND METHODS: Two models were constructed: a one-year decision tree and a ten-year Markov model, each based on a hypothetical cohort of one thousand individuals through Monte Carlo simulation. Eight restorative strategies were evaluated: high viscosity glass ionomer cement (HVGIC); encapsulated GIC; etch and rinse adhesive + composite; self-etch adhesive + composite; preformed stainless steel crown; HVGIC + etch and rinse adhesive + composite; HVGIC + self-etch adhesive + composite, and encapsulated GIC + etch and rinse adhesive + composite. Effectiveness data were sourced from the literature. Micro-costing was applied using 2022 USD market averages with a 5% variation. Incremental cost-effectiveness ratio (ICER), net monetary benefit (%NMB), and the budgetary impact were obtained. RESULTS: Cost-effective treatments included HVGIC (%NMB = 0%/ 0%), encapsulated GIC (%NMB = 19.4%/ 19.7%), and encapsulated GIC + etch and rinse adhesive + composite (%NMB = 23.4%/ 24.5%) at 1 year and 10 years, respectively. The benefit gain of encapsulated GIC + etch and rinse adhesive + composite in relation to encapsulated GIC was small when compared to the cost increase at 1 year (gain of 3.28% and increase of USD 24.26) and 10 years (gain of 4% and increase of USD 15.54). CONCLUSION: Within the horizon and perspective analyzed, the most cost-effective treatment was encapsulated GIC restoration. CLINICAL RELEVANCE: This study can provide information for decision-making.


Subject(s)
Cost-Benefit Analysis , Dental Enamel Hypoplasia , Dental Restoration, Permanent , Glass Ionomer Cements , Humans , Brazil , Dental Enamel Hypoplasia/therapy , Dental Restoration, Permanent/methods , Dental Restoration, Permanent/economics , Glass Ionomer Cements/therapeutic use , Decision Trees , Molar , Monte Carlo Method , Markov Chains , Molar Hypomineralization
12.
Lasers Med Sci ; 39(1): 130, 2024 May 16.
Article in English | MEDLINE | ID: mdl-38750285

ABSTRACT

The aim of this study is to investigate how the introduction of gold nanoparticles (GNPs) into a skin tumor affects its ability to absorb laser light during multicolor laser exposure. The Geant4 Monte Carlo toolkit was used to construct a cubic geometry simulating human skin, with a 5 mm tumor spheroid implanted at an adjustable depth x. Our findings show that injecting a very low concentration (0.01%) of GNPs into a tumor located 1 cm below the skin's surface causes significant additional laser absorption of up to 25%, particularly in the 900 nm to 1200 nm range, resulting in a temperature increase of approximately 20%. This is an effective way to raise a tumor's temperature and cause cell death while preserving healthy cells. The addition of GNPs to a tumor during polychromatic laser exposure at wavelengths ranging from 900 nm to 1200 nm increases laser absorption, and thus temperature, while preserving areas without GNPs.


Subject(s)
Gold , Metal Nanoparticles , Monte Carlo Method , Photothermal Therapy , Skin Neoplasms , Humans , Photothermal Therapy/methods , Skin Neoplasms/therapy , Skin Neoplasms/pathology , Skin/radiation effects
13.
Opt Lett ; 49(10): 2669-2672, 2024 May 15.
Article in English | MEDLINE | ID: mdl-38748132

ABSTRACT

Central venous oxygen saturation (ScvO2) is an important parameter for assessing global oxygen usage and guiding clinical interventions. However, measuring ScvO2 requires invasive catheterization. As an alternative, we aim to noninvasively and continuously measure changes in oxygen saturation of the internal jugular vein (SijvO2) by a multi-channel near-infrared spectroscopy system. The relation between the measured reflectance and changes in SijvO2 is modeled by Monte Carlo simulations and used to build a prediction model using deep neural networks (DNNs). The prediction model is tested with simulated data to show robustness to individual variations in tissue optical properties. The proposed technique is promising to provide a noninvasive tool for monitoring the stability of brain oxygenation in broad patient populations.


Subject(s)
Jugular Veins , Monte Carlo Method , Oxygen Saturation , Jugular Veins/physiology , Humans , Oxygen Saturation/physiology , Neural Networks, Computer , Oxygen/metabolism , Spectroscopy, Near-Infrared/methods , Male
14.
Biometrics ; 80(2)2024 Mar 27.
Article in English | MEDLINE | ID: mdl-38742907

ABSTRACT

We propose a new non-parametric conditional independence test for a scalar response and a functional covariate over a continuum of quantile levels. We build a Cramér-von Mises type test statistic based on an empirical process indexed by random projections of the functional covariate, effectively avoiding the "curse of dimensionality" under the projected hypothesis, which is almost surely equivalent to the null hypothesis. The asymptotic null distribution of the proposed test statistic is obtained under some mild assumptions. The asymptotic global and local power properties of our test statistic are then investigated. We specifically demonstrate that the statistic is able to detect a broad class of local alternatives converging to the null at the parametric rate. Additionally, we recommend a simple multiplier bootstrap approach for estimating the critical values. The finite-sample performance of our statistic is examined through several Monte Carlo simulation experiments. Finally, an analysis of an EEG data set is used to show the utility and versatility of our proposed test statistic.


Subject(s)
Computer Simulation , Models, Statistical , Monte Carlo Method , Humans , Electroencephalography/statistics & numerical data , Data Interpretation, Statistical , Biometry/methods , Statistics, Nonparametric
15.
Sci Rep ; 14(1): 11120, 2024 05 15.
Article in English | MEDLINE | ID: mdl-38750131

ABSTRACT

Very High Energy Electron (VHEE) beams are a promising alternative to conventional radiotherapy due to their highly penetrating nature and their applicability as a modality for FLASH (ultra-high dose-rate) radiotherapy. The dose distributions due to VHEE need to be optimised; one option is the use of quadrupole magnets to focus the beam, reducing the dose to healthy tissue and allowing for targeted dose delivery at conventional or FLASH dose-rates. This paper presents an in-depth exploration of the focusing achievable at the current CLEAR (CERN Linear Electron Accelerator for Research) facility for beam energies >200 MeV. A shorter, more optimal quadrupole setup was also investigated in Monte Carlo simulations using the TOPAS code, with dimensions and beam parameters more appropriate to a clinical situation. This work provides insight into how a focused VHEE radiotherapy beam delivery system might be achieved.
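The beam-optics principle behind quadrupole focusing can be illustrated with thin-lens transfer matrices in one transverse plane. This is a textbook sketch, not the CLEAR beamline or the TOPAS setup; all lengths and focal lengths below are arbitrary, and a real quadrupole focuses in one plane while defocusing in the other (hence the doublets and triplets used in practice).

```python
def drift(length):
    # 2x2 transfer matrix for a field-free drift (lengths in metres).
    return [[1.0, length], [0.0, 1.0]]

def thin_quad(f):
    # Thin-lens quadrupole of focal length f (focusing in this plane for f > 0).
    return [[1.0, 0.0], [-1.0 / f, 1.0]]

def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def track(x, xp, elements):
    """Propagate a particle (position x, angle xp) through a list of elements,
    given in beamline order (first element encountered first)."""
    m = [[1.0, 0.0], [0.0, 1.0]]
    for el in elements:
        m = matmul(el, m)
    return (m[0][0] * x + m[0][1] * xp,
            m[1][0] * x + m[1][1] * xp)
```

A parallel ray passing through a lens of focal length f and then drifting a distance f crosses the axis at the focal point, which is the mechanism used to concentrate dose at depth.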


Subject(s)
Electrons , Monte Carlo Method , Radiotherapy Dosage , Humans , Particle Accelerators/instrumentation , Radiotherapy Planning, Computer-Assisted/methods , Radiotherapy/methods , Radiotherapy, High-Energy/methods , Radiotherapy, High-Energy/instrumentation
16.
PLoS One ; 19(5): e0302699, 2024.
Article in English | MEDLINE | ID: mdl-38781185

ABSTRACT

In anticipation of growing wildfire management challenges, the Canadian government is investing in WildFireSat, an Earth observation satellite mission designed to collect data in support of Canadian wildfire management. Although costs of the mission can be reasonably estimated, the benefits of such an investment are unknown. Here we forecast the possible benefits of WildFireSat via an avoided cost approach. We consider five socio-economic components: suppression costs (fixed and variable), timber losses, property, asset and infrastructure losses, evacuation costs, and smoke related health costs. Using a Monte Carlo analysis, we evaluated a range of possible changes to these components based on expert opinions. The resulting Net Present Value (NPV) estimates depend on the presumed impact of using WildFireSat decision support data products, with pessimistic and conservative assumptions generating mission costs that typically exceed potential benefits by 1.16 to 1.59 times, while more optimistic assumptions generate benefits in excess of costs by 8.72 to 10.48 times. The analysis here excludes some possibly significant market and non-market impacts expected from WildFireSat due to data limitations; accounting for these additional impacts would likely generate positive NPVs under even cautious impact assumptions.


Subject(s)
Cost-Benefit Analysis , Wildfires , Canada , Humans , Environmental Monitoring/methods , Environmental Monitoring/economics , Monte Carlo Method
17.
PLoS One ; 19(5): e0303605, 2024.
Article in English | MEDLINE | ID: mdl-38781265

ABSTRACT

Black ice, a phenomenon that occurs abruptly owing to freezing rain, is difficult for drivers to identify because it mirrors the color of the road. Effectively managing the unforeseen accidents caused by black ice requires predicting their probability using spatial, weather, and traffic factors and formulating appropriate countermeasures. Among these factors, weather and traffic exhibit the highest levels of uncertainty. To address these uncertainties, a study was conducted using a Monte Carlo simulation based on random values to predict the probability of black ice accidents at individual road points and to analyze their trigger factors. We numerically modeled black ice accidents, visualized the simulation results in a geographic information system (GIS), and employed a sensitivity analysis, another feature of Monte Carlo simulations, to analyze the factors that trigger black ice accidents. The Monte Carlo simulation allowed us to map black ice accident occurrences at each road point in the GIS. The average black ice accident probability was found to be 0.0058, with a standard deviation of 0.001. The sensitivity analysis identified wind speed, air temperature, and road angle as significant triggers of black ice accidents, with sensitivities of 0.354, 0.270, and 0.203, respectively. We predicted the probability of black ice accidents per road section and analyzed the primary triggers of black ice accidents. The scientific contribution of this study lies in the development of a method, beyond simple road temperature predictions, for evaluating the risk of black ice occurrences and subsequent accidents. By employing Monte Carlo simulations, the probability of black ice accidents can be predicted more accurately by decoupling meteorological and traffic factors over time. The results can serve as a reference for government agencies, including road traffic authorities, to identify accident-prone spots and devise strategies focused on the primary triggers of black ice accidents.
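The Monte Carlo sensitivity idea, sampling uncertain inputs, scoring each draw with a risk model, and ranking inputs by their correlation with the output, can be sketched as below. The logistic risk model and every coefficient and input range here are invented for illustration; they are not the study's model or its reported sensitivities.

```python
import math
import random

def black_ice_sensitivity(n_sims=20000, seed=11):
    """Toy Monte Carlo sensitivity analysis: correlate sampled inputs
    with an assumed accident-risk score."""
    rng = random.Random(seed)
    samples = {"wind": [], "temp": [], "angle": [], "risk": []}
    for _ in range(n_sims):
        wind = rng.uniform(0.0, 15.0)    # wind speed, m/s (assumed range)
        temp = rng.uniform(-10.0, 5.0)   # air temperature, deg C (assumed)
        angle = rng.uniform(0.0, 10.0)   # road slope, degrees (assumed)
        # Hypothetical risk model: windier, colder, steeper -> riskier.
        z = -4.0 + 0.3 * wind - 0.4 * temp + 0.2 * angle
        risk = 1.0 / (1.0 + math.exp(-z))
        for key, val in (("wind", wind), ("temp", temp),
                         ("angle", angle), ("risk", risk)):
            samples[key].append(val)

    def corr(xs, ys):
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        vx = sum((x - mx) ** 2 for x in xs)
        vy = sum((y - my) ** 2 for y in ys)
        return cov / math.sqrt(vx * vy)

    return {k: corr(samples[k], samples["risk"])
            for k in ("wind", "temp", "angle")}
```

The resulting correlations play the role of the sensitivity indices reported in the abstract: their magnitudes rank the trigger factors.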


Subject(s)
Geographic Information Systems , Ice , Monte Carlo Method , Models, Statistical , Humans , Accidents, Traffic/statistics & numerical data
18.
Radiol Phys Technol ; 17(2): 488-503, 2024 Jun.
Article in English | MEDLINE | ID: mdl-38696086

ABSTRACT

We proposed a new deep learning (DL) model for accurate scatter correction in digital radiography. The proposed network features a pixel-wise water equivalent path length (WEPL) map of subjects with diverse sizes and 3D inner structures. The proposed U-Net model comprises two concatenated modules: one for generating a WEPL map and the other for predicting scatter using the WEPL map as auxiliary information. First, 3D CT images were used as numerical phantoms for training and validation, generating observed and scattered images by Monte Carlo simulation and WEPL maps using Siddon's algorithm. Then, we optimised the model while avoiding overfitting. Next, we validated the proposed model's performance by comparing it with other DL models. The proposed model obtained scatter-corrected images with a peak signal-to-noise ratio of 44.24 ± 2.89 dB and a structural similarity index measure of 0.9987 ± 0.0004, both higher than those of the other DL models. Finally, scatter fractions (SFs) were compared with other DL models using an actual phantom to confirm practicality. Among the DL models, the proposed model showed the smallest deviation from the measured SF values. Furthermore, using an actual radiograph containing an acrylic object, the contrast-to-noise ratio (CNR) of the proposed model and the anti-scatter grid were compared. The CNRs of the images corrected using the proposed model are 16% and 82% higher than those of the raw and grid-applied images, respectively. The advantage of the proposed method is that no actual radiography system is required for collecting the training dataset, as the dataset is created from CT images using Monte Carlo simulation.


Subject(s)
Deep Learning , Phantoms, Imaging , Radiographic Image Enhancement , Scattering, Radiation , Water , Radiographic Image Enhancement/methods , Monte Carlo Method , Image Processing, Computer-Assisted/methods , Humans , Tomography, X-Ray Computed , Algorithms , Signal-To-Noise Ratio , Imaging, Three-Dimensional
19.
Bull Math Biol ; 86(6): 70, 2024 May 08.
Article in English | MEDLINE | ID: mdl-38717656

ABSTRACT

Practical limitations on the quality and quantity of data can limit the precision of parameter identification in mathematical models. Model-based experimental design approaches have been developed to minimise parameter uncertainty, but the majority rely on first-order approximations of model sensitivity at a local point in parameter space. Practical identifiability approaches such as profile likelihood have shown potential for quantifying parameter uncertainty beyond linear approximations. This research presents a genetic algorithm approach to optimise sample timing across various parameterisations of a demonstrative PK-PD model, with the goal of aiding experimental design. The optimisation relies on a chosen metric of parameter uncertainty based on the profile-likelihood method. Additionally, the approach considers cases where multiple parameter scenarios must be optimised simultaneously. The genetic algorithm located near-optimal sampling protocols for a wide range of sample numbers (n = 3-20) and reduced the parameter variance metric by 33-37% on average. The profile-likelihood metric also correlated well with an existing Monte Carlo-based metric (worst-case r > 0.89) while reducing computational cost by an order of magnitude. The combination of the new profile-likelihood metric and the genetic algorithm demonstrates the feasibility of accounting for the nonlinear nature of models in optimal experimental design at a reasonable computational cost. The outputs of such a process could allow experimenters either to improve parameter certainty given a fixed number of samples, or to reduce the number of samples while retaining the same level of parameter certainty.
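The general idea of evolving sample timings against an uncertainty metric can be sketched with a much simpler stand-in than the paper's setup: a one-compartment model C(t) = A·exp(-kt) scored by a local D-optimality criterion instead of the profile-likelihood metric. The model, parameter values, and GA settings below are all assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
A_TRUE, K_TRUE = 10.0, 0.3          # assumed "true" PK parameters

def d_criterion(times):
    """log-det of the Fisher information for C(t) = A * exp(-k t);
    a first-order stand-in for the paper's profile-likelihood metric."""
    t = np.asarray(times, float)
    e = np.exp(-K_TRUE * t)
    J = np.column_stack([e, -A_TRUE * t * e])   # sensitivities dC/dA, dC/dk
    sign, logdet = np.linalg.slogdet(J.T @ J)
    return logdet if sign > 0 else -np.inf

def evolve(n_samples=3, pop=40, gens=60, t_max=24.0):
    """Tiny genetic algorithm: truncation selection, mean crossover,
    Gaussian mutation over candidate sample-time vectors."""
    population = rng.uniform(0.1, t_max, size=(pop, n_samples))
    for _ in range(gens):
        fitness = np.array([d_criterion(ind) for ind in population])
        parents = population[np.argsort(fitness)[-pop // 2:]]
        pairs = parents[rng.integers(len(parents), size=(pop, 2))]
        population = pairs.mean(axis=1) + rng.normal(0.0, 0.5, (pop, n_samples))
        population = np.clip(population, 0.1, t_max)
    return np.sort(max(population, key=d_criterion))

best_times = evolve()
print(best_times)  # spread-out times: early points pin A, later ones pin k
```

Clustered samples make the sensitivity rows nearly collinear, collapsing the information determinant, so the GA naturally spreads the sampling times apart.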


Subject(s)
Algorithms , Computer Simulation , Mathematical Concepts , Models, Biological , Monte Carlo Method , Likelihood Functions , Humans , Dose-Response Relationship, Drug , Research Design/statistics & numerical data , Models, Genetic , Uncertainty
20.
Sci Rep ; 14(1): 11524, 2024 05 21.
Article in English | MEDLINE | ID: mdl-38773212

ABSTRACT

The biological mechanisms triggered by low-dose radiation exposure remain to be explored in depth. In this study, the potential mechanisms of low-dose radiation were investigated through simulations and experiments by irradiating the BEAS-2B cell line with a Cs-137 gamma-ray source. Monolayer cell population models were constructed to simulate and analyse the distribution of nucleus-specific energy within cell populations, combining the Monte Carlo method with microdosimetric analysis. Furthermore, 10 × Genomics single-cell sequencing technology was employed to capture the heterogeneity of individual cell responses to low-dose radiation within the same irradiated sample. Numerical uncertainty is present both in the specific-energy distribution of microdosimetry and in the differential gene expression of radiation cytogenetics. The distribution of nucleus-specific energy was therefore compared with the distribution of differential gene expression to guide the selection of differential genes for bioinformatics analysis. Dose inhomogeneity is pronounced at low doses, and an increase in dose corresponds to a decrease in the dispersion of the cellular specific-energy distribution. Multiple rounds of screening differential genes by microdosimetric features and statistical analysis indicate a number of potential pathways induced by low-dose exposure, and provide a novel perspective on the selection of sensitive biomarkers responsive to low-dose radiation.
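The reported trend that dispersion of specific energy shrinks as dose rises can be reproduced in a toy compound-Poisson model. In this hedged sketch (not the paper's Monte Carlo setup), track hits per nucleus are Poisson-distributed and each hit deposits an exponentially distributed specific energy; the mean single-event specific energy Z_F is an assumed value:

```python
import numpy as np

rng = np.random.default_rng(1)
Z_F = 0.5  # assumed mean specific energy per track hit (Gy)

def specific_energy_cv(dose_gy, n_cells=50_000):
    """Coefficient of variation of nucleus specific energy when hits are
    Poisson and each hit deposits an Exponential(Z_F) energy."""
    hits = rng.poisson(dose_gy / Z_F, size=n_cells)      # tracks per nucleus
    # Sum of k i.i.d. Exponential(Z_F) deposits is Gamma(k, Z_F); 0 hits -> z = 0.
    z = np.where(hits > 0, rng.gamma(np.maximum(hits, 1), Z_F), 0.0)
    return z.std() / z.mean()

for dose in (0.01, 0.1, 1.0):
    print(dose, round(specific_energy_cv(dose), 2))
# CV falls with dose: analytically CV = sqrt(2 * Z_F / D) for this model.
```

At very low doses most nuclei receive no track at all, so the cell-to-cell spread of specific energy dwarfs its mean, mirroring the dose inhomogeneity the abstract describes.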


Subject(s)
Dose-Response Relationship, Radiation , Single-Cell Analysis , Single-Cell Analysis/methods , Humans , Monte Carlo Method , Radiometry/methods , Cell Line , Gamma Rays/adverse effects