Results 1 - 20 of 403
1.
EJNMMI Phys ; 11(1): 62, 2024 Jul 15.
Article in English | MEDLINE | ID: mdl-39004644

ABSTRACT

BACKGROUND: The aim was to investigate the noise and bias properties of quantitative 177Lu-SPECT with respect to the number of projection angles, and the number of subsets and iterations in the OS-EM reconstruction, for different total acquisition times. METHODS: Experimental SPECT acquisition of six spheres in a NEMA body phantom filled with 177Lu was performed, using medium-energy collimators and 120 projections with 180 s per projection. Bootstrapping was applied to generate data sets representing acquisitions with 20 to 120 projections for 10 min, 20 min, and 40 min, with 32 noise realizations per setting. Monte Carlo simulations of 177Lu-DOTA-TATE in an anthropomorphic computer phantom with three tumours (2.8 mL to 40.0 mL) were performed. Projections representing 24 h and 168 h post administration were simulated, each with 32 noise realizations. Images were reconstructed using OS-EM with compensation for attenuation, scatter, and distance-dependent resolution. The numbers of subsets and iterations were varied under the constraint (number of iterations) × (number of projections) ≤ 2400. Volumes-of-interest were defined following the physical sizes of the spheres and tumours, the mean activity concentrations were estimated, and the absolute mean relative error and coefficient of variation (CV) over noise realizations were calculated. Pareto fronts were established by analysing CV versus mean relative error. RESULTS: Points on the Pareto fronts with low CV and high mean error resulted from using a low number of subsets, whilst points on the Pareto fronts associated with high CV but low mean error resulted from reconstructions with a high number of subsets. The number of projection angles had limited impact. CONCLUSIONS: For accurate estimation of the 177Lu activity concentration from SPECT images, the number of projection angles is of limited importance, whilst the total acquisition time and the numbers of subsets and iterations are important parameters.
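The Pareto-front analysis described above amounts to a non-dominated filter over (CV, mean relative error) pairs, both minimized. A minimal sketch follows; the sample points are hypothetical illustrations, not the study's measurements.

```python
def dominates(q, p):
    """q dominates p when q is no worse in both objectives and strictly better in one."""
    return q[0] <= p[0] and q[1] <= p[1] and (q[0] < p[0] or q[1] < p[1])

def pareto_front(points):
    """Non-dominated subset of (cv, mean_error) pairs, both minimized."""
    return [p for p in points if not any(dominates(q, p) for q in points)]

# Hypothetical (CV, |mean relative error|) pairs for different
# subset/iteration settings -- illustrative values only.
settings = [(0.02, 0.15), (0.05, 0.08), (0.10, 0.03), (0.06, 0.12), (0.11, 0.09)]
front = pareto_front(settings)
```

Dominated settings (here the last two points) are discarded, leaving the trade-off curve between noise and bias.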

2.
Heliyon ; 10(11): e32346, 2024 Jun 15.
Article in English | MEDLINE | ID: mdl-38961934

ABSTRACT

Ultrasonic-assisted oxidative desulfurization (UAOD) is used to lessen the environmental problems caused by sulfur emissions. The process uses immiscible polar solvents and ultrasonic waves to enhance desulfurization efficiency. Prior research focused on comparing the effectiveness of UAOD for gasoline using response surface methodology. This study evaluates desulfurization efficiency and operating costs as functions of ultrasonic power, irradiation time, and oxidant amount to determine optimal conditions. A multi-objective fuzzy optimization (MOFO) approach was used to evaluate the economic viability of UAOD for gasoline: upper and lower boundaries were identified, and desulfurization efficiency and operating costs were then optimized while accounting for uncertainty errors. The fuzzy model employed max-min aggregation to optimize the degree of satisfaction on a scale from 0 (unsatisfied) to 1 (satisfied). Optimal conditions for gasoline UAOD were found at 445.43 W ultrasonic power, 4.74 min irradiation time, and 6.73 mL oxidant, resulting in a 66.79 % satisfaction level. This yielded a 78.64 % desulfurization efficiency (YA) at an operating cost of 13.49 USD/L. Compared with results in the existing literature, gasoline desulfurization was less efficient but also less costly. The solutions provided by MOFO not only demonstrate economic viability through decreased overall operating costs and simplified process conditions, but also offer valuable insights for optimizing prospective future industrial-scale UAOD processes.
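The max-min aggregation step can be sketched as follows, assuming linear membership functions between objective bounds. The candidate operating points and the bounds here are illustrative assumptions, not the study's data.

```python
def membership_max(value, lower, upper):
    """Linear satisfaction for a maximized objective: 0 at lower bound, 1 at upper."""
    return max(0.0, min(1.0, (value - lower) / (upper - lower)))

def membership_min(value, lower, upper):
    """Linear satisfaction for a minimized objective: 1 at lower bound, 0 at upper."""
    return max(0.0, min(1.0, (upper - value) / (upper - lower)))

# Hypothetical candidates: (desulfurization efficiency %, operating cost USD/L).
candidates = [(70.0, 12.0), (78.0, 13.5), (85.0, 16.0)]
eff_lo, eff_hi = 60.0, 90.0      # assumed bounds for efficiency (maximized)
cost_lo, cost_hi = 10.0, 18.0    # assumed bounds for cost (minimized)

# Max-min aggregation: pick the candidate whose worst satisfaction is highest.
best = max(candidates,
           key=lambda c: min(membership_max(c[0], eff_lo, eff_hi),
                             membership_min(c[1], cost_lo, cost_hi)))
```

The chosen point is the compromise solution: neither objective's satisfaction is sacrificed below the level the other can support.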

3.
Bioinspir Biomim ; 2024 Jul 17.
Article in English | MEDLINE | ID: mdl-39019076

ABSTRACT

In traditional hydraulic robotics, actuators must be sized for the highest possible load, resulting in significant energy losses when operating in lower force regimes. Variable recruitment fluidic artificial muscle (FAM) bundles offer a novel bio-inspired solution to this problem. Divided into individual MUs, each with its own control valve, a variable recruitment FAM bundle uses a switching control scheme to selectively bring MUs online according to load demand. To date, every dynamic variable recruitment study in the literature has considered homogeneous bundles containing MUs of equal size. However, natural mammalian muscle MUs are heterogeneous and primarily operate based on Henneman's size principle, which states that MUs are recruited from smallest to largest. Is it better for a FAM variable recruitment bundle to operate according to this principle, or are there other recruitment orders that result in better performance? What are the appropriate criteria for switching between recruitment states for these different recruitment orders? This chapter seeks to answer these questions by performing two case studies exploring different bundle MU size distributions, analyzing the tradeoffs between tracking performance and energetics, and determining how these tradeoffs are affected by different MU recruitment order and recruitment state transition thresholds. The only difference between the two test cases is the overall force capacity (i.e., total size) of the bundle. For each test case, a Pareto frontier for different MU size distributions, recruitment orders, and recruitment state transition thresholds is constructed. The results show that there is a complex relationship between overall bundle size, MU size distributions, recruitment orders, and recruitment state transition thresholds corresponding to the best tradeoffs change along the Pareto frontier. 
Overall, these two case studies validate the use of Henneman's Size Principle as a variable recruitment strategy, but also demonstrate that it should not be the only method considered.

4.
Heliyon ; 10(11): e31843, 2024 Jun 15.
Article in English | MEDLINE | ID: mdl-38873666

ABSTRACT

This paper presents the placement and sizing of energy hubs (EHs) in electricity, gas, and heating networks. An EH is a coordinating framework for various power sources, storage devices, and responsive loads. For simultaneous modeling of economic, operation, reliability, and flexibility indices, the proposed scheme is expressed as a three-objective optimization in the form of Pareto optimization based on the sum of weighted functions. The objective functions respectively minimize the planning cost of the EHs (the total construction cost of the hubs plus their expected operating cost), the expected energy loss of the mentioned networks, and the expected energy not supplied (EENS) of these networks in the case of an N - 1 event. The problem is constrained by power flow equations, the operation and reliability constraints of these networks, the EH planning and operation model, and the flexibility constraints of the EHs. Then, to achieve a unique optimal solution in the shortest possible time, a linear approximation model is extracted for the proposed scheme. Moreover, scenario-based stochastic programming (SBSP) is employed to model uncertainties in load, energy cost, renewable power, and the accessibility of the mentioned network equipment. Finally, the obtained numerical results indicate the capability of the proposed scheme to enhance the economic and flexibility status of the EHs and to improve the reliability and operation status of the energy networks, along with achieving optimal planning and operation for the EHs.
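The weighted-sum form of Pareto optimization used above can be illustrated with a minimal sketch: each weight vector on the simplex defines a scalarized problem whose minimizer is Pareto optimal (strictly so for positive weights). The three planning alternatives and their normalized objective values are hypothetical.

```python
import itertools

# Hypothetical EH placement alternatives with three minimized objectives:
# (planning cost, expected energy loss, EENS), pre-normalized to [0, 1].
alternatives = {
    "plan_a": (0.2, 0.8, 0.5),
    "plan_b": (0.6, 0.3, 0.4),
    "plan_c": (0.9, 0.2, 0.1),
}

def weighted_sum(obj, weights):
    return sum(w * f for w, f in zip(weights, obj))

# Sweep weight vectors on a coarse simplex grid; collect each minimizer.
steps = [i / 4 for i in range(5)]
pareto_candidates = set()
for w1, w2 in itertools.product(steps, steps):
    if w1 + w2 <= 1.0:
        w = (w1, w2, 1.0 - w1 - w2)
        best = min(alternatives, key=lambda k: weighted_sum(alternatives[k], w))
        pareto_candidates.add(best)
```

Sweeping the weights trades the three objectives against each other; here every alternative wins for some weight vector, so all three are Pareto candidates.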

5.
J Gambl Stud ; 2024 Jun 24.
Article in English | MEDLINE | ID: mdl-38913215

ABSTRACT

Online gambling has grown to be a significant industry, but it faces regulatory threats because of the perception that it is heavily dependent on a small segment of its customers who gamble heavily and at a level carrying an elevated risk of harm. Employing a large multi-operator data set from Britain, which records individual transactions by some 140,000 individuals observed over one year, we are able to provide more precise estimates of the degree of concentration of revenue than previous studies. High dependence on a relatively small number of customers is shown, though there is variation from product to product in how small the group of account-holders of potential concern is. We conclude with a discussion of prospects for the industry in light of heightened awareness of gambling harm and the resulting restrictions on online gambling spending introduced or proposed by governments or regulators in several jurisdictions.

6.
bioRxiv ; 2024 Jun 01.
Article in English | MEDLINE | ID: mdl-38853830

ABSTRACT

Evolutionary models of quantitative traits often assume trade-offs between beneficial and detrimental traits, requiring modelers to specify a function linking costs to benefits. The choice of trade-off function is often consequential; functions that assume diminishing returns (accelerating costs) typically lead to a single equilibrium genotype, while decelerating costs often lead to evolutionary branching. Despite their importance, we still lack a strong theoretical foundation on which to base the choice of trade-off function. To address this gap, we explore how trade-off functions can emerge from the genetic architecture of a quantitative trait. We developed a multi-locus model of disease resistance, assuming each locus had random antagonistic pleiotropic effects on resistance and fecundity. We used this model to generate genotype landscapes and explored how additive versus epistatic genetic architectures influenced the shape of the trade-off function. Regardless of epistasis, our model consistently led to accelerating costs. We then used our genotype landscapes to build an evolutionary model of disease resistance. Unlike other models with accelerating costs, our approach often led to genetic polymorphisms at equilibrium. Our results suggest that accelerating costs are a strong null model for evolutionary trade-offs and that the eco-evolutionary conditions required for polymorphism may be more nuanced than previously believed.
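The distinction between accelerating and decelerating costs can be seen in a toy one-dimensional fitness function (a deliberate simplification, not the authors' multi-locus model): with a linear benefit of resistance r, an accelerating cost yields a single interior optimum, while a decelerating cost pushes the fitness maxima to the extremes, the situation associated with branching.

```python
def fitness(r, cost):
    """Toy model: fecundity benefit grows linearly with resistance r, minus a cost."""
    return r - cost(r)

grid = [i / 100 for i in range(101)]  # resistance values in [0, 1]

# Accelerating cost (r^2): a single interior optimum.
accel = max(grid, key=lambda r: fitness(r, lambda r: r ** 2))
# Decelerating cost (sqrt(r)): fitness is maximal only at the extremes.
decel = max(grid, key=lambda r: fitness(r, lambda r: r ** 0.5))
```

Here `accel` lands at r = 0.5 (the interior peak of r - r²), while under the decelerating cost every interior r has negative fitness and the grid search returns a boundary value, illustrating why cost curvature drives the qualitative outcome.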

7.
Sensors (Basel) ; 24(11)2024 May 30.
Article in English | MEDLINE | ID: mdl-38894328

ABSTRACT

OBJECTIVE: To address the shortcomings of manual surgical path planning for the thermal ablation of liver tumors, namely that it is time-consuming, labor-intensive, and heavily reliant on doctors' puncture experience, an automatic path-planning system for thermal ablation of liver tumors based on CT images is designed and implemented. METHODS: The system mainly includes three modules: image segmentation and three-dimensional reconstruction, automatic surgical path planning, and image information management. Through organ segmentation and three-dimensional reconstruction based on CT images, the personalized abdominal spatial anatomy of each patient is obtained, which facilitates surgical path planning. The weighted summation method based on clinical constraints and the concept of Pareto optimality are used to solve the multi-objective optimization problem, screen the optimal needle entry path, and realize automatic planning of the thermal ablation path. An image information database was established to store the information related to the surgical path. RESULTS: In discussions with clinicians, more than 78% of the paths generated by the planning system were considered effective, and the system planned paths more efficiently than the doctors did. CONCLUSION: After further improvement, the system can be used for planning thermal ablation paths for liver tumors and has clinical application value.


Subject(s)
Liver Neoplasms , Tomography, X-Ray Computed , Humans , Liver Neoplasms/diagnostic imaging , Liver Neoplasms/surgery , Liver Neoplasms/pathology , Tomography, X-Ray Computed/methods , Imaging, Three-Dimensional/methods , Ablation Techniques/methods , Algorithms , Image Processing, Computer-Assisted/methods , Surgery, Computer-Assisted/methods , Liver/surgery , Liver/diagnostic imaging
8.
Foods ; 13(9)2024 Apr 30.
Article in English | MEDLINE | ID: mdl-38731753

ABSTRACT

This study optimized the input processing factors, namely compression force, pressing speed, heating temperature, and heating time, for extracting oil from desiccated coconut medium using a vertical compression process with a maximum load of 100 kN. Samples with a pressing height of 100 mm were compressed in a vessel chamber of 60 mm diameter fitted with a plunger. The Box-Behnken design was used to generate the factor combinations for 27 experimental runs, with each input factor set at three levels. The response surface regression technique was used to determine the optimum input factors for the calculated responses: oil yield (%), oil expression efficiency (%), and energy (J). The optimum factor levels were a compression force of 65 kN, pressing speed of 5 mm min⁻¹, heating temperature of 80 °C, and heating time of 52.5 min. The predicted values of the responses were 48.48%, 78.35%, and 749.58 J. These values were validated with additional experiments producing 48.18 ± 0.45%, 77.86 ± 0.72%, and 731.36 ± 8.04 J. The percentage errors between the experimental and predicted values ranged from 0.82 ± 0.65 to 2.43 ± 1.07%, confirming the suitability of the established regression models for estimating the responses.

9.
Sci Rep ; 14(1): 10423, 2024 May 07.
Article in English | MEDLINE | ID: mdl-38710762

ABSTRACT

In this study, we present a comprehensive optimization framework employing the Multi-Objective Multi-Verse Optimization (MOMVO) algorithm for the optimal integration of Distributed Generations (DGs) and Capacitor Banks (CBs) into electrical distribution networks. Designed with the dual objectives of minimizing energy losses and voltage deviations, this framework significantly enhances the operational efficiency and reliability of the network. Rigorous simulations on the standard IEEE 33-bus and IEEE 69-bus test systems underscore the effectiveness of the MOMVO algorithm, demonstrating up to a 47% reduction in energy losses and up to a 55% improvement in voltage stability. Comparative analysis highlights MOMVO's superiority in terms of convergence speed and solution quality over leading algorithms such as the Multi-Objective Jellyfish Search (MOJS), Multi-Objective Flower Pollination Algorithm (MOFPA), and Multi-Objective Lichtenberg Algorithm (MOLA). The efficacy of the study is particularly evident in the identification of the best compromise solutions using MOMVO. For the IEEE 33 network, the application of MOMVO led to a significant 47.58% reduction in daily energy loss and enhanced voltage profile stability from 0.89 to 0.94 pu. Additionally, it realized a 36.97% decrease in the annual cost of energy losses, highlighting substantial economic benefits. For the larger IEEE 69 network, MOMVO achieved a remarkable 50.15% reduction in energy loss and improved voltage profiles from 0.89 to 0.93 pu, accompanied by a 47.59% reduction in the annual cost of energy losses. These results not only confirm the robustness of the MOMVO algorithm in optimizing technical and economic efficiencies but also underline the potential of advanced optimization techniques in facilitating the sustainable integration of renewable energy resources into existing power infrastructures. 
This research significantly contributes to the field of electrical distribution network optimization, paving the way for future advancements in renewable energy integration and optimization techniques for enhanced system efficiency, reliability, and sustainability.

10.
Waste Manag Res ; : 734242X241252914, 2024 May 24.
Article in English | MEDLINE | ID: mdl-38785075

ABSTRACT

In the area of solid waste management, transportation of the collected waste is a critical aspect considering the substantial time spent by garbage trucks on public roads. Studies have reported that transporting garbage poses challenges related to public exposure and aesthetics. This study presents a generalised bi-objective formulation for the optimal routing of garbage trucks from transfer stations to recycling sites/landfills, considering the trade-off between public exposure and aesthetic loss while constraining the operating cost. The formulation uses a novel link capacity function to account for the delay at traffic signals and the effect of the mix of trucks and cars on link performance. The proposed formulation is solved using the weighted sum and ε-constraint methods and applied to a realistic case study of the City of Chicago, USA. The Pareto front obtained for the bi-objective formulation offers diverse trade-off solutions catering to distinct performance metrics. The results highlight the disparity across the solutions; the solution (P0.95 on the Pareto front) for minimum operating cost (or travel time or distance travelled) is very different from the solution (P0.4 on the Pareto front) for aesthetic cost and public exposure. The parametric study indicated that a modest operating budget may suffice for achieving aesthetic benefits, but minimising public exposure requires a higher operating budget. Finally, the proposed framework is adaptable to various challenges pertaining to waste transportation, thereby serving as a valuable tool for evaluating policies and practices geared towards sustainability objectives.
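The ε-constraint method mentioned above can be sketched in a few lines: one objective is minimized while the other is capped at ε, and sweeping ε traces the Pareto front. The routes and their objective values here are hypothetical, not the Chicago case-study data.

```python
# Hypothetical routes: (public_exposure, aesthetic_loss), both minimized.
routes = {
    "r1": (10.0, 6.0),
    "r2": (7.0, 8.0),
    "r3": (4.0, 12.0),
}

def eps_constraint(routes, eps):
    """Minimize aesthetic loss among routes whose public exposure is <= eps."""
    feasible = {k: v for k, v in routes.items() if v[0] <= eps}
    return min(feasible, key=lambda k: feasible[k][1]) if feasible else None

# Each value of eps yields one point on the Pareto front.
front = {eps: eps_constraint(routes, eps) for eps in (4.0, 7.0, 10.0)}
```

Tight ε (strict exposure limits) selects the low-exposure but less aesthetic route; relaxing ε progressively admits cheaper, lower-loss routes, which is exactly the trade-off structure the Pareto front exposes.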

11.
Radiat Environ Biophys ; 63(2): 215-262, 2024 May.
Article in English | MEDLINE | ID: mdl-38664268

ABSTRACT

In the present research, we have developed a model-based crisp logic function statistical classifier decision support system, supplemented with treatment planning systems, for radiation oncologists treating glioblastoma multiforme (GBM). The system is based on Monte Carlo radiation transport simulation and recreates visualizations of treatment environments on mathematical anthropomorphic brain (MAB) phantoms. Energy deposition within tumour tissue and normal tissues is graded by quality audit factors, which ensure planned dose delivery to the tumour site while minimising damage to healthy tissues. The proposed methodology predicts tumour growth response to radiation therapy from a patient-specific medicine quality audit perspective. The study was validated by recreating thirty-eight patient-specific mathematical anthropomorphic brain phantoms of treatment environments, taking into consideration the density variation and composition of brain tissues. Dose computations accomplished through water phantoms and tissue-equivalent head phantoms are neither cost-effective nor customized to the patient, and are often less accurate. These drawbacks can be overcome by using the open-source Electron Gamma Shower (EGSnrc) software and clinical case reports for MAB phantom synthesis, which results in accurate dosimetry with due consideration of time factors. Considerable dose deviations occur at the tumour site for environments with intraventricular glioblastoma, haematoma, abscess, trapped air, and cranial flaps, leading to quality audit factors with a lower logic value of 0. A logic value of 1 indicates higher dose deposition within healthy tissues, including the leptomeninges, for the majority of environments, which can result in radiation-induced laceration.


Subject(s)
Brain Neoplasms , Glioblastoma , Monte Carlo Method , Glioblastoma/radiotherapy , Humans , Brain Neoplasms/radiotherapy , Phantoms, Imaging , Radiotherapy Planning, Computer-Assisted/methods , Radiation Oncologists , Decision Support Systems, Clinical , Radiotherapy Dosage
12.
Brain Sci ; 14(4)2024 Apr 14.
Article in English | MEDLINE | ID: mdl-38672031

ABSTRACT

This paper presents a novel approach to improving the detection of mild cognitive impairment (MCI) through the use of super-resolved structural magnetic resonance imaging (MRI) and optimized deep learning models. The study introduces enhancements to the perceptual quality of super-resolved 2D structural MRI images using advanced loss functions, modifications to the upscaler part of the generator, and experiments with various discriminators within a generative adversarial training setting. It empirically demonstrates the effectiveness of super-resolution in the MCI detection task, showcasing performance improvements across different state-of-the-art classification models. The paper also addresses the challenge of accurately capturing perceptual image quality, particularly when images contain checkerboard artifacts, and proposes a methodology that incorporates hyperparameter optimization through a Pareto optimal Markov blanket (POMB). This approach systematically explores the hyperparameter space, focusing on reducing overfitting and enhancing model generalizability. The research findings contribute to the field by demonstrating that super-resolution can significantly improve the quality of MRI images for MCI detection, highlighting the importance of choosing an adequate discriminator and the potential of super-resolution as a preprocessing step to boost classification model performance.

13.
Cent Eur J Oper Res ; 32(2): 507-520, 2024.
Article in English | MEDLINE | ID: mdl-38650679

ABSTRACT

In this paper, we examine the sensitivity of the results of an earlier paper which presented and analyzed a dynamic game model of a monetary union with coalitions between governments (fiscal policy makers) and a common central bank (monetary policy maker). Here we examine alternative values of the parameters of the underlying model to show how the earlier results depend on the numerical parameter values chosen, which were obtained by calibration instead of econometric estimation. We demonstrate that the main results are qualitatively the same as in the original model for plausible smaller and larger values of the parameters. For the few cases where they differ, we interpret the deviations in economic terms and illustrate the policies and their macroeconomic effects resulting from the change to the parameter under consideration for one of these cases.

14.
Sensors (Basel) ; 24(8)2024 Apr 12.
Article in English | MEDLINE | ID: mdl-38676094

ABSTRACT

Federated learning (FL) is an emerging distributed learning technique through which models can be trained using the data collected by user devices in resource-constrained situations while protecting user privacy. However, FL has three main limitations. First, the parameter server (PS), which aggregates the local models trained on local user data, is typically far from users; the large distance may burden the path links between the PS and local nodes, increasing the consumption of network and computing resources. Second, user device resources are limited, but this is not considered in the training of the local model and the transmission of the model parameters. Third, the PS-side links tend to become highly loaded as the number of participating clients increases, becoming congested owing to the large size of the model parameters. In this study, we propose a resource-efficient FL scheme. We follow the Pareto optimality concept with biased client selection to limit client participation, thereby ensuring efficient resource consumption and rapid model convergence. In addition, we propose a hierarchical structure with location-based clustering for device-to-device communication using k-means clustering. Simulation results show that, with the participation rate prate set to 0.75, the proposed scheme reduced transmitted and received network traffic by 75.89% and 78.77%, respectively, compared to the FedAvg method. It also achieves faster model convergence than other FL mechanisms, such as FedAvg and D2D-FedAvg.
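The location-based clustering step relies on standard k-means; a plain sketch follows. The deterministic seeding (first k points) and the device coordinates are illustrative assumptions, not details from the paper.

```python
import math

def kmeans(points, k, iters=20):
    """Plain k-means for location-based device clustering (deterministic seeding)."""
    centers = points[:k]  # assumption: seed with the first k points
    for _ in range(iters):
        # Assign each device to its nearest cluster center.
        clusters = [[] for _ in range(k)]
        for p in points:
            i = min(range(k), key=lambda c: math.dist(p, centers[c]))
            clusters[i].append(p)
        # Recompute each center as the mean of its assigned devices.
        centers = [
            (sum(x for x, _ in cl) / len(cl), sum(y for _, y in cl) / len(cl))
            if cl else centers[j]
            for j, cl in enumerate(clusters)
        ]
    return centers, clusters

# Two well-separated groups of hypothetical device coordinates.
devices = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0),
           (10.0, 10.0), (11.0, 10.0), (10.0, 11.0)]
centers, clusters = kmeans(devices, k=2)
```

Devices in the same geographic cluster can then exchange model updates over short device-to-device links, with only cluster-level traffic reaching the parameter server.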

15.
Int J Mol Sci ; 25(8)2024 Apr 20.
Article in English | MEDLINE | ID: mdl-38674106

ABSTRACT

The significant heterogeneity of Wilms' tumors between different patients is thought to arise from genetic and epigenetic distortions that occur during various stages of fetal kidney development in a way that is poorly understood. To address this, we characterized the heterogeneity of alternative mRNA splicing in Wilms' tumors using a publicly available RNAseq dataset of high-risk Wilms' tumors and normal kidney samples. Through Pareto task inference and cell deconvolution, we found that the tumors and normal kidney samples are organized according to progressive stages of kidney development within a triangle-shaped region in latent space, whose vertices, or "archetypes", resemble the cap mesenchyme, the nephrogenic stroma, and epithelial tubular structures of the fetal kidney. We identified a set of genes that are alternatively spliced between tumors located in different regions of latent space and found that many of these genes are associated with the epithelial-to-mesenchymal transition (EMT) and muscle development. Using motif enrichment analysis, we identified putative splicing regulators, some of which are associated with kidney development. Our findings provide new insights into the etiology of Wilms' tumors and suggest that specific splicing mechanisms in early stages of development may contribute to tumor development in different patients.


Subject(s)
Alternative Splicing , Epithelial-Mesenchymal Transition , Kidney Neoplasms , Wilms Tumor , Wilms Tumor/genetics , Wilms Tumor/pathology , Humans , Kidney Neoplasms/genetics , Kidney Neoplasms/pathology , Epithelial-Mesenchymal Transition/genetics , Gene Expression Regulation, Neoplastic , Kidney/metabolism , Kidney/pathology
16.
Environ Sci Pollut Res Int ; 31(21): 31042-31053, 2024 May.
Article in English | MEDLINE | ID: mdl-38622419

ABSTRACT

Groundwater contamination is a global concern that has detrimental effects on public health and the environment. Sustainable groundwater treatment technologies such as adsorption require attaining high removal efficiency at minimal cost. This study investigated the adsorption of arsenate from groundwater utilizing chitosan-coated bentonite (CCB) under a fixed-bed column setup. Fuzzy multi-objective optimization was applied to identify the most favorable conditions for the process variables, including volumetric flow rate, initial arsenate concentration, and CCB dosage. Empirical models were employed to examine how initial concentration, flow rate, and adsorbent dosage affect adsorption capacity at breakthrough, energy consumption, and total operational cost during optimization. The ε-constraint method was used to identify the Pareto frontier, effectively illustrating the trade-off between adsorption capacity at breakthrough and the cost of the fixed-bed system. The integration of fuzzy optimization for adsorption capacity and total operating cost utilized the global solver function in the LINGO 20 software, employing an equation derived from the Box-Behnken design and a cost equation based on energy and material usage in the fixed-bed system. The Pareto front determined boundary limits for adsorption capacity at breakthrough (ranging from 12.96 ± 0.19 to 12.34 ± 0.42 µg/g) and total operating cost (ranging from 955.83 to 1106.32 USD/kg). An overall satisfaction level of 35.46% was achieved in the fuzzy optimization process, giving a compromise solution of 12.90 µg/g for adsorption capacity at breakthrough and 1052.96 USD/kg for total operating cost. This can support a strategic decision-making approach for key stakeholders in future applications of adsorption fixed-bed systems.


Subject(s)
Arsenates , Bentonite , Chitosan , Groundwater , Water Pollutants, Chemical , Water Purification , Chitosan/chemistry , Arsenates/chemistry , Bentonite/chemistry , Adsorption , Water Pollutants, Chemical/chemistry , Groundwater/chemistry , Water Purification/methods
17.
Radiat Oncol ; 19(1): 45, 2024 Apr 08.
Article in English | MEDLINE | ID: mdl-38589961

ABSTRACT

BACKGROUND: Current automated planning solutions are calibrated using trial and error or machine learning on historical datasets. Neither method allows for the intuitive exploration of differing trade-off options during calibration, which may aid in ensuring automated solutions align with clinical preference. Pareto navigation provides this functionality and offers a potential calibration alternative. The purpose of this study was to validate an automated radiotherapy planning solution with a novel multi-dimensional Pareto navigation calibration interface across two external institutions for prostate cancer. METHODS: The implemented 'Pareto Guided Automated Planning' (PGAP) methodology was developed in RayStation using scripting and consisted of a Pareto navigation calibration interface built upon a 'Protocol Based Automatic Iterative Optimisation' planning framework. 30 previous patients were randomly selected by each institution (IA and IB), 10 for calibration and 20 for validation. Utilising the Pareto navigation interface, automated protocols were calibrated to the institutions' clinical preferences. A single automated plan (VMATAuto) was generated for each validation patient, with plan quality compared against the previously treated clinical plan (VMATClinical) both quantitatively, using a range of DVH metrics, and qualitatively, through blind review at the external institution. RESULTS: PGAP led to marked improvements across the majority of rectal dose metrics, with Dmean reduced by 3.7 Gy and 1.8 Gy for IA and IB respectively (p < 0.001). For the bladder, results were mixed, with low and intermediate dose metrics reduced for IB but increased for IA. These differences, whilst statistically significant (p < 0.05), were small and not considered clinically relevant. The reduction in rectum dose was not at the expense of PTV coverage (D98% was generally improved with VMATAuto), but was somewhat detrimental to PTV conformality.
The prioritisation of rectum over conformality was however aligned with preferences expressed during calibration and was a key driver in both institutions demonstrating a clear preference towards VMATAuto, with 31/40 considered superior to VMATClinical upon blind review. CONCLUSIONS: PGAP enabled intuitive adaptation of automated protocols to an institution's planning aims and yielded plans more congruent with the institution's clinical preference than the locally produced manual clinical plans.


Subject(s)
Prostatic Neoplasms , Radiotherapy, Intensity-Modulated , Male , Humans , Radiotherapy, Intensity-Modulated/methods , Radiotherapy Dosage , Radiotherapy Planning, Computer-Assisted/methods , Urinary Bladder , Prostatic Neoplasms/radiotherapy , Organs at Risk
18.
J Environ Manage ; 357: 120688, 2024 Apr.
Article in English | MEDLINE | ID: mdl-38552511

ABSTRACT

The strategic reduction and remediation of degraded land is a global environmental priority. This is a particular priority in the Great Barrier Reef catchment area, Australia, where gully erosion is a significant contributor to land degradation and water quality deterioration. Urgent action through the prioritisation and remediation of gully erosion sites is imperative to safeguard this UNESCO World Heritage site. In this study, we analyse a comprehensive dataset of 22,311 mapped gullies within a 3480 km² portion of the lower Burdekin Basin, northeast Australia. Utilising high-resolution lidar datasets, two independent methods, the Minimum Contemporary Estimate (MCE) and the Lifetime Average Estimate (LAE), were developed to derive relative erosion rates. These methods, employing different data processing approaches and addressing different timeframes across the gully lifetime, yield erosion rates varying by up to several orders of magnitude. Despite some expected divergence, both methods exhibit strong, positive correlations with each other and with additional validation data. There is 43% agreement between the methods for the highest-yielding 2% of gullies, although 80.5% of high-yielding gullies identified by either method are located within 1 km of each other. Importantly, distributions from both methods independently reveal that ~80% of the total volume of gully erosion in the study area is produced by only 20% of all gullies. Moreover, the top 2% of gullies generate 30% of the sediment loss, and the majority of gullies do not contribute significantly to the overall catchment sediment yield. These results underscore the opportunity to achieve significant environmental outcomes through targeted gully management by prioritising a small cohort of high-yielding gullies. Further insights and implications for management frameworks are discussed in the context of the characteristics of this cohort.
Overall, this research provides a basis for informed decision-making in addressing gully erosion and advancing environmental conservation efforts.
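The "∼80% of volume from 20% of gullies" pattern reported above is a cumulative-share calculation over ranked per-gully volumes. A minimal sketch follows; the gully volumes here are synthetic (a heavy-tailed Pareto draw), not the study's lidar-derived data, and the function name is our own:

```python
# Sketch: fraction of total erosion volume contributed by the top-ranked
# fraction of gullies, computed from per-gully volumes (synthetic data).
import numpy as np

def cumulative_share(volumes, top_fraction):
    """Share of total volume produced by the largest `top_fraction` of gullies."""
    v = np.sort(np.asarray(volumes, dtype=float))[::-1]  # largest first
    k = max(1, int(round(top_fraction * v.size)))        # size of the top cohort
    return v[:k].sum() / v.sum()

rng = np.random.default_rng(0)
volumes = rng.pareto(1.16, size=22_311)  # heavy-tailed, roughly 80/20-shaped
print(f"top 20% of gullies contribute {cumulative_share(volumes, 0.20):.0%} of volume")
```

With strongly skewed volume distributions, a small `top_fraction` captures most of the total, which is the basis for prioritising a small cohort of gullies.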


Subject(s)
Conservation of Natural Resources , Soil , Humans , Conservation of Natural Resources/methods , Water Quality , Australia
19.
Math Biosci Eng ; 21(3): 3540-3562, 2024 Feb 05.
Article in English | MEDLINE | ID: mdl-38549295

ABSTRACT

Dynamic multi-objective optimization problems have attracted wide attention because of their extensive applications. The difficulty in solving such problems lies in the Pareto set (PS) and Pareto front (PF) moving dynamically over time. A large number of efficient strategies have been put forward to deal with such problems by speeding up convergence and maintaining diversity. Prediction is a common strategy widely used in dynamic optimization environments; however, how to increase the efficiency of prediction remains a key but difficult issue. In this paper, a new prediction model is designed using the rank sums of individuals, and the position difference of individuals in the two previous adjacent environments is defined to identify the current change type. The proposed prediction strategy depends on the environment change type. To show the effectiveness of the proposed algorithm, it is compared with five state-of-the-art approaches on 20 benchmark instances of dynamic multi-objective problems. The experimental results indicate that the proposed algorithm achieves good convergence and distribution in dynamic environments.
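To make the prediction idea concrete, here is a minimal sketch of a common centroid-based baseline for relocating a population after an environment change. This is an illustrative baseline only, not the rank-sum prediction model proposed in the paper; all names are our own:

```python
# Sketch: predict the new population after an environment change by shifting
# every solution along the displacement of the population centroid between
# the two previous environments (a standard prediction baseline).
import numpy as np

def predict_population(pop_prev, pop_curr, noise=0.0, rng=None):
    """pop_prev, pop_curr: (n, d) decision-variable arrays from the two
    previous environments. Returns the predicted (n, d) population."""
    pop_prev = np.asarray(pop_prev, dtype=float)
    pop_curr = np.asarray(pop_curr, dtype=float)
    delta = pop_curr.mean(axis=0) - pop_prev.mean(axis=0)  # centroid movement
    predicted = pop_curr + delta
    if noise > 0.0:  # optional Gaussian perturbation to preserve diversity
        rng = rng or np.random.default_rng()
        predicted = predicted + rng.normal(0.0, noise, size=predicted.shape)
    return predicted
```

More elaborate strategies, such as the paper's, replace the uniform centroid shift with change-type-dependent models, since a single linear step can misfire when the PS moves nonlinearly.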

20.
Heliyon ; 10(5): e26665, 2024 Mar 15.
Article in English | MEDLINE | ID: mdl-38486727

ABSTRACT

This research introduces the Multi-Objective Liver Cancer Algorithm (MOLCA), a novel approach inspired by the growth and proliferation patterns of liver tumors. MOLCA emulates the evolutionary tendencies of liver tumors, leveraging their expansion dynamics as a model for solving multi-objective optimization problems in engineering design. The algorithm uniquely combines genetic operators with the Random Opposition-Based Learning (ROBL) strategy, optimizing both local and global search capabilities. Further enhancement is achieved through the integration of elitist non-dominated sorting (NDS), an information feedback mechanism (IFM) and Crowding Distance (CD) selection, which collectively aim to identify the Pareto-optimal front efficiently. The performance of MOLCA is rigorously assessed on a comprehensive set of standard multi-objective test benchmarks, including ZDT, DTLZ, several constrained problems (CONSTR, TNK, SRN, BNH, OSY and KITA) and real-world engineering design problems such as the brushless DC wheel motor, safety isolating transformer, helical spring, two-bar truss and welded beam. Its efficacy is benchmarked against prominent algorithms such as the non-dominated sorting grey wolf optimizer (NSGWO), multi-objective multi-verse optimization (MOMVO), the non-dominated sorting genetic algorithm (NSGA-II), the decomposition-based multi-objective evolutionary algorithm (MOEA/D) and the multi-objective marine predator algorithm (MOMPA). Quantitative analysis is conducted using the GD, IGD, SP, SD, HV and RT metrics to characterize convergence and distribution, while qualitative aspects are presented through graphical representations of the Pareto fronts. The MOLCA source code is available at: https://github.com/kanak02/MOLCA.
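Of the components named in the abstract, Crowding Distance selection is the most self-contained to illustrate. The sketch below implements the standard NSGA-II-style CD metric; it is our own minimal version under that assumption, not MOLCA's exact implementation:

```python
# Sketch: NSGA-II-style crowding distance for one non-dominated front.
# Boundary solutions receive infinite distance so they are always kept;
# interior solutions accumulate normalized neighbor gaps per objective.
import numpy as np

def crowding_distance(front):
    """front: (n, m) array of objective values for one non-dominated front."""
    front = np.asarray(front, dtype=float)
    n, m = front.shape
    dist = np.zeros(n)
    for j in range(m):
        order = np.argsort(front[:, j])                  # sort by objective j
        span = front[order[-1], j] - front[order[0], j]
        dist[order[0]] = dist[order[-1]] = np.inf        # preserve extremes
        if span == 0:
            continue
        for i in range(1, n - 1):                        # interior solutions
            gap = front[order[i + 1], j] - front[order[i - 1], j]
            dist[order[i]] += gap / span
    return dist
```

During environmental selection, solutions within the last admitted front are ranked by descending crowding distance, favoring sparsely populated regions of the objective space.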
