Results 1 - 13 of 13
1.
J Appl Clin Med Phys ; : e14338, 2024 Apr 12.
Article in English | MEDLINE | ID: mdl-38610118

ABSTRACT

PURPOSE: Volumetric-modulated arc therapy (VMAT) is a widely accepted treatment method for head and neck (HN) and cervical cancers; however, contour creation and plan optimization for VMAT are time-consuming processes. Our group has created an automated treatment planning tool, the Radiation Planning Assistant (RPA), which uses deep learning models to generate organs at risk (OARs) and planning structures and automates plan optimization. This study quantitatively evaluates the quality of contours generated by the RPA tool. METHODS: For 54 patients with HN cancer and 39 patients with cervical cancer, we retrospectively generated autoplans using the RPA. Autoplans were generated using deep-learning and RapidPlan models developed in-house. The autoplans were then applied to the original, physician-drawn contours, which were used as a ground truth (GT) for comparison with the autocontours (RPA). Using a "two one-sided tests" (TOST) procedure, we evaluated whether the normal tissue dose for the autocontours was equivalent to that for the ground truth within a margin, δ, that we determined based on clinical judgement. We also calculated the number of plans that met established, clinically accepted dosimetric criteria. RESULTS: For HN plans, 91.8% and 91.7% of structures met dosimetric criteria for automatic and manual contours, respectively; for cervical plans, the corresponding figures were 95.6% and 95.7%. Autocontours were equivalent to the ground truth for 71% and 75% of common DVH metrics for the HN and cervical plans, respectively. CONCLUSIONS: This study shows that dosimetrically equivalent normal tissue contours can be created for HN and cervical cancers using deep learning techniques. In general, differences between the contours did not affect whether plans passed or failed clinical dose tolerances.
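
A minimal sketch of the TOST equivalence test named above, applied to one hypothetical paired DVH metric. The patient count, dose values, the 2 Gy margin, and the helper name tost_paired are assumptions for illustration and are not values or code from the study.

import numpy as np
from scipy import stats

def tost_paired(auto, manual, delta):
    """Two one-sided tests on paired differences; equivalence margin is +/- delta."""
    d = np.asarray(auto) - np.asarray(manual)
    n = d.size
    se = d.std(ddof=1) / np.sqrt(n)
    t_lower = (d.mean() + delta) / se          # H0: mean difference <= -delta
    t_upper = (d.mean() - delta) / se          # H0: mean difference >= +delta
    p_lower = 1.0 - stats.t.cdf(t_lower, df=n - 1)
    p_upper = stats.t.cdf(t_upper, df=n - 1)
    p = max(p_lower, p_upper)                  # both one-sided tests must reject
    return p, p < 0.05

rng = np.random.default_rng(0)
manual = rng.normal(45.0, 3.0, size=54)        # hypothetical OAR mean doses (Gy)
auto = manual + rng.normal(0.2, 0.8, size=54)  # small contour-driven dose shift
print(tost_paired(auto, manual, delta=2.0))    # 2 Gy margin chosen for illustration only

Equivalence is declared only when both one-sided nulls are rejected, which is why the larger of the two p-values is reported.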

2.
BMJ Open ; 13(12): e077253, 2023 12 07.
Article in English | MEDLINE | ID: mdl-38149419

ABSTRACT

INTRODUCTION: Fifty per cent of patients with cancer require radiotherapy during their disease course; however, only 10%-40% of patients in low-income and middle-income countries (LMICs) have access to it. A shortfall in the specialised workforce has been identified as the most significant barrier to expanding radiotherapy capacity. Artificial intelligence (AI)-based software has been developed to automate both the delineation of anatomical target structures and the definition of the position, size and shape of the radiation beams. Proposed advantages include improved treatment accuracy, as well as a reduction in the time (from weeks to minutes) and human resources needed to deliver radiotherapy. METHODS: ARCHERY is a non-randomised prospective study to evaluate the quality and economic impact of AI-based automated radiotherapy treatment planning for cervical, head and neck, and prostate cancers, which are endemic in LMICs, and for which radiotherapy is the primary curative treatment modality. The sample size of 990 patients (330 for each cancer type) has been calculated based on an estimated 95% treatment plan acceptability rate. Time and cost savings will be analysed as secondary outcome measures using the time-driven activity-based costing model. The 48-month study will take place in six public sector cancer hospitals in India (n=2), Jordan (n=1), Malaysia (n=1) and South Africa (n=2) to support implementation of the software in LMICs. ETHICS AND DISSEMINATION: The study has received ethical approval from University College London (UCL) and each of the six study sites. If the study objectives are met, the AI-based software will be offered as a not-for-profit web service to public sector state hospitals in LMICs to support the expansion of high-quality radiotherapy capacity, improving access to and affordability of this key modality of cancer cure and control. Public and policy engagement plans will involve patients as key partners.


Subject(s)
Artificial Intelligence; Prostatic Neoplasms; Male; Humans; Prospective Studies; Prostatic Neoplasms/radiotherapy; Software; Radiotherapy Planning, Computer-Assisted; Observational Studies as Topic
3.
J Vis Exp ; (200)2023 10 06.
Article in English | MEDLINE | ID: mdl-37870317

ABSTRACT

Access to radiotherapy worldwide is limited. The Radiation Planning Assistant (RPA) is a fully automated, web-based tool that is being developed to offer fully automated radiotherapy treatment planning tools to clinics with limited resources. The goal is to help clinical teams scale their efforts, thus reaching more patients with cancer. The user connects to the RPA via a webpage, completes a Service Request (prescription and information about the radiotherapy targets), and uploads the patient's CT image set. The RPA offers two approaches to automated planning. In one-step planning, the system uses the Service Request and CT scan to automatically generate the necessary contours and treatment plan. In two-step planning, the user reviews and edits the automatically generated contours before the RPA continues to generate a volumetric-modulated arc therapy (VMAT) plan. The final plan is downloaded from the RPA website and imported into the user's local treatment planning system, where the dose is recalculated for the locally commissioned linac; if necessary, the plan is edited prior to approval for clinical use.


Subject(s)
Neoplasms; Radiotherapy, Intensity-Modulated; Humans; Radiotherapy, Intensity-Modulated/methods; Radiotherapy Planning, Computer-Assisted/methods; Neoplasms/diagnostic imaging; Neoplasms/radiotherapy; Radiotherapy Dosage; Internet
4.
Pilot Feasibility Stud ; 9(1): 4, 2023 Jan 09.
Article in English | MEDLINE | ID: mdl-36624548

ABSTRACT

BACKGROUND: Self-management support (SMS) forms a central pillar in the management of long-term conditions. It is firmly aligned with UK health policy, but there is a paucity of evidence exploring how it is enacted in the context of neuromuscular diseases (NMDs). Bridges is an SMS programme originally developed in stroke care. A new version of the programme (Neuromuscular Bridges) has recently been co-designed with people with lived experience of NMD and requires evaluation. The implementation of SMS is inherently complex, with potential barriers at the level of the patient, provider, and wider organisation. The success of implementing programmes can be highly dependent on context, indicating a rationale for considering implementation determinants at an early stage. This study aims to explore the feasibility of (1) delivering, (2) evaluating, and (3) implementing Neuromuscular Bridges at a specialist neuromuscular centre. METHODS: This study employs a hybrid II design underpinned by Normalisation Process Theory (NPT), which has been used prospectively to inform the implementation plan and will also inform the analysis. The feasibility of delivering, evaluating, and implementing Neuromuscular Bridges will be assessed using a single-arm pre-post design. In terms of delivery and evaluation, we will explore acceptability, demand within the service, performance of outcome measures, recruitment, and retention. Implementation strategies have been selected from a refined taxonomy of strategies, mapped to NPT, and targeted at known barriers and facilitators at the specialist centre that were identified from preliminary stakeholder engagement activities. The impact of the strategy bundle on fidelity, acceptability, appropriateness, and adoption will be evaluated using qualitative interviews, administrative data, surveys, and a notes audit. CONCLUSIONS: This study will provide valuable feasibility data on a co-designed SMS programme for people with NMDs that will be used to inform a larger implementation study, requirements for embedding it in a specialist centre, and rollout to other specialist centres. Using hybrid methodology at the feasibility stage is unusual, and this study will provide important insights into the usefulness of taking this approach at this point in the research pipeline. TRIAL REGISTRATION: ISRCTN Trial ID: ISRCTN14208138. Date registered: 18/08/2021.

5.
Nat Commun ; 12(1): 4720, 2021 08 05.
Article in English | MEDLINE | ID: mdl-34354055

ABSTRACT

Forecasting the evolution of contagion dynamics is still an open problem to which mechanistic models only offer a partial answer. To remain mathematically or computationally tractable, these models must rely on simplifying assumptions, thereby limiting the quantitative accuracy of their predictions and the complexity of the dynamics they can model. Here, we propose a complementary approach based on deep learning where effective local mechanisms governing a dynamic on a network are learned from time series data. Our graph neural network architecture makes very few assumptions about the dynamics, and we demonstrate its accuracy using different contagion dynamics of increasing complexity. By allowing simulations on arbitrary network structures, our approach makes it possible to explore the properties of the learned dynamics beyond the training data. Finally, we illustrate the applicability of our approach using real data of the COVID-19 outbreak in Spain. Our results demonstrate how deep learning offers a new and complementary perspective to build effective models of contagion dynamics on networks.
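
A much simplified stand-in for the approach described above (not the paper's graph neural network): a logistic regression is fitted to snapshots of a simulated SIS process to recover the effective local infection probability as a function of the number of infected neighbours. The network, the rates beta and mu, and the library choices are assumptions for illustration.

import numpy as np
import networkx as nx
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
G = nx.erdos_renyi_graph(200, 0.05, seed=1)
A = nx.to_numpy_array(G)
beta, mu = 0.08, 0.2                         # "true" SIS rates, hidden from the learner

def step(x):
    infected_neighbours = A @ x
    p_inf = 1 - (1 - beta) ** infected_neighbours     # S -> I probability per node
    nxt = x.copy()
    nxt[(x == 0) & (rng.random(x.size) < p_inf)] = 1
    nxt[(x == 1) & (rng.random(x.size) < mu)] = 0     # I -> S recovery
    return nxt

# Collect (infected-neighbour count, became-infected) pairs for susceptible nodes.
X, y = [], []
x = (rng.random(A.shape[0]) < 0.1).astype(int)
for _ in range(300):
    nxt = step(x)
    susceptible = x == 0
    X.extend((A @ x)[susceptible].reshape(-1, 1))
    y.extend(nxt[susceptible])
    x = nxt

model = LogisticRegression().fit(np.array(X), np.array(y))
for k in range(5):                           # learned vs true local mechanism
    print(k, model.predict_proba([[k]])[0, 1], 1 - (1 - beta) ** k)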


Subject(s)
COVID-19/epidemiology; Communicable Disease Control/methods; Deep Learning; Disease Outbreaks/prevention & control; Forecasting/methods; Humans; Models, Theoretical; SARS-CoV-2; Spain/epidemiology
6.
Phys Rev E ; 97(3-1): 032302, 2018 Mar.
Article in English | MEDLINE | ID: mdl-29776174

ABSTRACT

In binary cascade dynamics, the nodes of a graph are in one of two possible states (inactive, active), and nodes in the inactive state make an irreversible transition to the active state as soon as their precursors satisfy a predetermined condition. We introduce a set of recursive equations to compute the probability of reaching any final state, given an initial state and a specification of the transition probability function of each node. Because the naive recursive approach for solving these equations takes factorial time in the number of nodes, we also introduce an accelerated algorithm, built around a breadth-first search procedure. This algorithm solves the equations as efficiently as possible, in exponential time.
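
The following toy sketch conveys the flavour of such a computation; it is not the paper's algorithm. It enumerates the reachable active-set states of a small probabilistic cascade, sweeping them in order of active-set size (transitions only enlarge the active set, so this is a valid processing order), and pushes probability forward into the absorbing final states. The example graph, the response function F, and the retry-every-round dynamics are illustrative assumptions.

from collections import defaultdict
from itertools import combinations
import networkx as nx

G = nx.complete_graph(3)                   # tiny example graph
F = lambda m: 0.6 if m == 1 else 0.0       # P(activate | m active precursors); blocks at m >= 2

def step_distribution(active):
    """Distribution over the sets of newly activated nodes, given the active set."""
    inactive = [v for v in G if v not in active]
    p = {v: F(sum(u in active for u in G[v])) for v in inactive}
    dist = {}
    for r in range(len(inactive) + 1):
        for newly in combinations(inactive, r):
            prob = 1.0
            for v in inactive:
                prob *= p[v] if v in newly else 1 - p[v]
            if prob > 0:
                dist[frozenset(newly)] = prob
    return dist

initial = frozenset({0})
prob = defaultdict(float, {initial: 1.0})
final = defaultdict(float)
pending = defaultdict(set, {len(initial): {initial}})

for size in range(len(initial), len(G) + 1):      # sweep states by active-set size
    for S in pending[size]:
        dist = step_distribution(S)
        stay = dist.get(frozenset(), 0.0)
        if stay == 1.0:                           # nothing can ever activate: final state
            final[S] += prob[S]
            continue
        for newly, q in dist.items():
            if not newly:
                continue
            T = S | newly
            prob[T] += prob[S] * q / (1.0 - stay)     # condition on eventually leaving S
            pending[len(T)].add(T)

for S, q in sorted(final.items(), key=lambda kv: -kv[1]):
    print(sorted(S), round(q, 4))

With this response function the cascade can stall once a node has two active precursors, so the distribution over final states is non-trivial.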

7.
Phys Rev E ; 97(3-1): 032309, 2018 Mar.
Article in English | MEDLINE | ID: mdl-29776179

ABSTRACT

We present a general class of geometric network growth mechanisms by homogeneous attachment in which the links created at a given time t are distributed homogeneously between a new node and the existing nodes selected uniformly. This is achieved by creating links between nodes uniformly distributed in a homogeneous metric space according to a Fermi-Dirac connection probability with inverse temperature β and general time-dependent chemical potential μ(t). The chemical potential limits the spatial extent of newly created links. Using a hidden variable framework, we obtain an analytical expression for the degree sequence and show that μ(t) can be fixed to yield any given degree distribution, including a scale-free degree distribution. Additionally, we find that depending on the order in which nodes appear in the network (its history), the degree-degree correlations can be tuned to be assortative or disassortative. The effect of the geometry on the structure is investigated through the average clustering coefficient 〈c〉. In the thermodynamic limit, we identify a phase transition between a random regime, where 〈c〉 → 0 when β < β_c, and a geometric regime, where 〈c〉 > 0 when β > β_c.
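
One plausible reading of the connection rule, sketched on a circle (a simple homogeneous metric space): each new node links to existing nodes with probability 1/(1 + exp(β(d - μ(t)))), where d is the arc distance. The particular β and functional form of μ(t) below are assumptions chosen for illustration, not the paper's prescriptions.

import numpy as np
import networkx as nx

rng = np.random.default_rng(2)
beta = 4.0                                   # inverse temperature (assumed value)
mu = lambda t: 2.0 * np.pi / (t + 1)         # shrinking chemical potential (assumed form)

def grow(n):
    theta = rng.uniform(0, 2 * np.pi, size=n)    # node positions, uniform on the circle
    G = nx.Graph()
    G.add_node(0)
    for t in range(1, n):
        G.add_node(t)
        d = np.pi - np.abs(np.pi - np.abs(theta[t] - theta[:t]))   # arc distances to existing nodes
        p = 1.0 / (1.0 + np.exp(beta * (d - mu(t))))               # Fermi-Dirac connection probability
        for s in np.nonzero(rng.random(t) < p)[0]:
            G.add_edge(t, int(s))
    return G

G = grow(2000)
degrees = np.array([d for _, d in G.degree()])
print("mean degree:", degrees.mean(), "max degree:", degrees.max())
print("average clustering:", nx.average_clustering(G))

Increasing β sharpens the connection kernel and should raise the average clustering, in the spirit of the geometric regime described above.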

8.
Phys Rev E ; 97(2-1): 022305, 2018 Feb.
Article in English | MEDLINE | ID: mdl-29548152

ABSTRACT

We present a degree-based theoretical framework to study the susceptible-infected-susceptible (SIS) dynamics on time-varying (rewired) configuration model networks. Using this framework on a given degree distribution, we provide a detailed analysis of the stationary state using the rewiring rate to explore the whole range of the time variation of the structure relative to that of the SIS process. This analysis is suitable for the characterization of the phase transition and leads to three main contributions: (1) We obtain a self-consistent expression for the absorbing-state threshold, able to capture both collective and hub activation. (2) We recover the predictions of a number of existing approaches as limiting cases of our analysis, providing thereby a unifying point of view for the SIS dynamics on random networks. (3) We obtain bounds for the critical exponents of a number of quantities in the stationary state. This allows us to reinterpret the concept of hub-dominated phase transition. Within our framework, it appears as a heterogeneous critical phenomenon: observables for different degree classes have a different scaling with the infection rate. This phenomenon is followed by the successive activation of the degree classes beyond the epidemic threshold.
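
For reference, the sketch below integrates the standard heterogeneous (degree-based) mean-field SIS equations, d rho_k/dt = -rho_k + lambda k (1 - rho_k) Theta with Theta = sum_k k P(k) rho_k / <k>, which correspond to the annealed, fast-rewiring limit that frameworks of this kind connect to as a limiting case. The degree distribution, cutoff, and rates are assumptions for illustration.

import numpy as np

kmax = 100
k = np.arange(1, kmax + 1)
Pk = k.astype(float) ** -2.5
Pk /= Pk.sum()                       # truncated power-law degree distribution (assumed)
mean_k = (k * Pk).sum()

def stationary_prevalence(lam, steps=20000, dt=0.01):
    """Euler-integrate the degree-based mean-field SIS equations to (near) stationarity."""
    rho = np.full_like(Pk, 1e-3)     # small initial infection in every degree class
    for _ in range(steps):
        theta = (k * Pk * rho).sum() / mean_k          # probability that a neighbour is infected
        rho += dt * (-rho + lam * k * (1 - rho) * theta)
    return (Pk * rho).sum()

for lam in [0.05, 0.10, 0.15, 0.20, 0.30]:             # bracket the threshold <k>/<k^2>
    print(lam, stationary_prevalence(lam))

The prevalence stays essentially zero below the mean-field threshold <k>/<k^2> (about 0.13 for this particular truncated distribution) and grows continuously above it.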

9.
Phys Rev E ; 95(6-1): 062304, 2017 Jun.
Article in English | MEDLINE | ID: mdl-28709195

ABSTRACT

It has been shown in recent years that the stochastic block model (SBM) is sometimes undetectable in the sparse limit, i.e., that no algorithm can identify a partition correlated with the partition used to generate an instance, if the instance is sparse enough and infinitely large. In this contribution, we treat the finite case explicitly, using arguments drawn from information theory and statistics. We give a necessary condition for finite-size detectability in the general SBM. We then distinguish the concept of average detectability from the concept of instance-by-instance detectability and give explicit formulas for both definitions. Using these formulas, we prove that there exist large equivalence classes of parameters, where widely different network ensembles are equally detectable with respect to our definitions of detectability. In an extensive case study, we investigate the finite-size detectability of a simplified variant of the SBM, which encompasses a number of important models as special cases. These models include the symmetric SBM, the planted coloring model, and more exotic SBMs not previously studied. We conclude with three appendices, where we study the interplay of noise and detectability, establish a connection between our information-theoretic approach and random matrix theory, and provide proofs of some of the more technical results.

10.
Med Phys ; 44(3): 1050-1062, 2017 Mar.
Article in English | MEDLINE | ID: mdl-28112418

ABSTRACT

PURPOSE: Many radiomics features were originally developed for non-medical imaging applications, and therefore the original assumptions may need to be re-examined. In this study, we investigated the impact of slice thickness and pixel spacing (or pixel size) on radiomics features extracted from computed tomography (CT) phantom images acquired with different scanners as well as different acquisition and reconstruction parameters. The dependence of CT texture features on gray-level discretization was also evaluated. METHODS AND MATERIALS: A texture phantom composed of 10 cartridges of different materials was scanned on eight CT scanners from three manufacturers. The images were reconstructed for various slice thicknesses. For each slice thickness, the reconstruction field of view (FOV) was varied to render pixel sizes ranging from 0.39 to 0.98 mm. A fixed spherical region of interest (ROI) was contoured on the images of the shredded rubber cartridge and the 3D-printed, 20% fill, acrylonitrile butadiene styrene plastic cartridge (ABS20) for all phantom imaging sets. Radiomics features were extracted from the ROIs using an in-house program. The feature categories were shape (10), intensity (16), GLCM (24), GLZSM (11), GLRLM (11), NGTDM (5), fractal dimensions (8), and first-order wavelets (128), for a total of 213 features. Voxel-size resampling was performed to investigate the usefulness of extracting features using a suitably chosen voxel size. Acquired phantom image sets were resampled to a voxel size of 1 × 1 × 2 mm³ using linear interpolation. Image features were then extracted from the resampled and original datasets, and the absolute value of the percent coefficient of variation (%COV) for each feature was calculated. Based on the %COV values, features were classified into three groups: (1) features with large variations before and after resampling (%COV >50); (2) features with diminished variation (%COV <30) after resampling; and (3) features that originally had moderate variation (%COV <50) and were negligibly affected by resampling. Group 2 features were further studied by modifying feature definitions to include voxel size. Original and voxel-size-normalized features were used for interscanner comparisons. A subsequent analysis investigated feature dependency on gray-level discretization by extracting 51 texture features from ROIs of each of the 10 phantom cartridges using 16, 32, 64, 128, and 256 gray levels. RESULTS: Of the 213 features extracted, 150 were reproducible across voxel sizes, 42 improved significantly (%COV <30, Group 2) after resampling, and 21 had large variations before and after resampling (Group 1). Ten features improved significantly after their definitions were modified to remove the voxel-size dependency. Interscanner comparison indicated that feature variability among scanners nearly vanished for 8 of these 10 features. Furthermore, 17 of the 51 texture features were found to be dependent on the number of gray levels. These features were redefined to include the number of gray levels, which greatly reduced this dependency. CONCLUSION: Voxel-size resampling is an appropriate pre-processing step for image datasets acquired with variable voxel sizes and yields more reproducible CT features. We found that some radiomics features depend on voxel size and gray-level discretization. The introduction of normalizing factors into their definitions greatly reduced or removed these dependencies.
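
A hedged sketch of the grouping logic stated above: compute the absolute percent coefficient of variation (%COV) of each feature across acquisitions before and after voxel-size resampling, then bin the features using the 50% and 30% thresholds. The feature names and values are synthetic stand-ins rather than phantom measurements, and the boundary rules are approximated from the abstract.

import numpy as np

rng = np.random.default_rng(3)
features = ["energy", "entropy", "glcm_contrast", "busyness"]        # illustrative names only
original = {f: rng.normal(100, s, size=24) for f, s in zip(features, [60, 5, 65, 45])}
resampled = {f: rng.normal(100, s, size=24) for f, s in zip(features, [55, 4, 12, 40])}

def pct_cov(values):
    """Absolute percent coefficient of variation across acquisitions."""
    return abs(100.0 * np.std(values, ddof=1) / np.mean(values))

for f in features:
    before, after = pct_cov(original[f]), pct_cov(resampled[f])
    if before > 50 and after > 50:
        group = 1        # large variation before and after resampling
    elif before > 50 and after < 30:
        group = 2        # variation removed by resampling
    else:
        group = 3        # moderate variation, little affected by resampling
    print(f"{f:14s}  %COV before {before:5.1f}  after {after:5.1f}  -> group {group}")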


Subject(s)
Tomography, X-Ray Computed/methods; Algorithms; Phantoms, Imaging; Tomography Scanners, X-Ray Computed; Tomography, X-Ray Computed/instrumentation
12.
Article in English | MEDLINE | ID: mdl-26764746

ABSTRACT

Real complex systems are not rigidly structured; no clear rules or blueprints exist for their construction. Yet, amidst their apparent randomness, complex structural properties universally emerge. We propose that an important class of complex systems can be modeled as an organization of many embedded levels (potentially infinite in number), all of them following the same universal growth principle known as preferential attachment. We give examples of such hierarchy in real systems, for instance, in the pyramid of production entities of the film industry. More importantly, we show how real complex networks can be interpreted as a projection of our model, from which their scale independence, their clustering, their hierarchy, their fractality, and their navigability naturally emerge. Our results suggest that complex networks, viewed as growing systems, can be quite simple, and that the apparent complexity of their structure is largely a reflection of their unobserved hierarchical nature.
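
For readers unfamiliar with the growth principle invoked here, the sketch below implements preferential attachment in its textbook, single-level form: each new node links to m existing nodes chosen with probability proportional to their degree. It is not the paper's multi-level hierarchical model, and the parameters are illustrative.

import random
from collections import Counter

def preferential_attachment(n, m=2, seed=4):
    rng = random.Random(seed)
    edges = [(i, j) for i in range(m) for j in range(i)]   # small seed clique
    stubs = [v for e in edges for v in e] or list(range(m))
    for new in range(m, n):
        chosen = set()
        while len(chosen) < m:            # degree-proportional sampling via edge stubs
            chosen.add(rng.choice(stubs))
        for old in chosen:
            edges.append((new, old))
            stubs.extend([new, old])
    return edges

edges = preferential_attachment(5000)
degree = Counter(v for e in edges for v in e)
histogram = Counter(degree.values())
for k in sorted(histogram)[:10]:          # heavy-tailed: many small degrees, few large hubs
    print(k, histogram[k])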


Subject(s)
Models, Theoretical; Algorithms; Cluster Analysis; Fractals
13.
J Clin Microbiol ; 42(9): 3975-7, 2004 Sep.
Article in English | MEDLINE | ID: mdl-15364978

ABSTRACT

The Centers for Disease Control and Prevention (CDC) recommend universal screening of all pregnant women between 35 and 37 weeks of gestation for group B streptococci (GBS) by use of a selective broth medium. Recent reports suggest that Granada medium can be used for rapid and direct visual identification of GBS colonies. However, studies comparing the Granada medium method to the selective broth method are few, and while some report comparable sensitivities, others have found significant differences in detection rates between the two methods. This prospective study compared a method using Granada agar to a Todd-Hewitt broth method with subculture to blood agar in order to determine which GBS detection method is more sensitive and less labor-intensive and has a more rapid turnaround time. Detection rates for three sampling techniques (rectovaginal, vaginal only, and cervical only) were also compared. Consecutive specimens for GBS screening received over a 6-month period from 1,635 pregnant women were included. Overall, GBS was detected in 390 (23.8%) women. The Granada medium gave positive results for 348 of these women, and the selective broth gave positive results for 385, indicating sensitivities of 89.2% for the Granada medium and 98.7% for the selective broth. These findings show that the Granada medium method is less sensitive than the selective broth method and should not replace it as the only method for screening pregnant women for GBS. However, the Granada medium method reduced detection time to 1 day and also reduced the use of ancillary tests in approximately 90% of positive cases. Additionally, no significant differences were noted in the detection rates with rectovaginal, vaginal, and cervical specimens.
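
The quoted sensitivities follow directly from the counts given in the abstract, as this short calculation reproduces.

granada_positive, broth_positive, total_positive = 348, 385, 390
print(f"Granada medium sensitivity:  {100 * granada_positive / total_positive:.1f}%")   # 89.2%
print(f"Selective broth sensitivity: {100 * broth_positive / total_positive:.1f}%")     # 98.7%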


Subject(s)
Pregnancy Complications, Infectious/microbiology; Streptococcus agalactiae/isolation & purification; Vagina/microbiology; Cervix Uteri/microbiology; Culture Media; Female; Humans; Pregnancy; Sensitivity and Specificity; Specimen Handling/methods; Streptococcus agalactiae/growth & development