Results 1 - 20 of 668
1.
Molecules ; 29(13)2024 Jun 26.
Article in English | MEDLINE | ID: mdl-38998979

ABSTRACT

To reduce unwanted fat bloom in the manufacturing and storage of chocolates, detailed knowledge of the chemical composition and molecular mobility of the oils and fats they contain is required. Although the formation of fat bloom on chocolate products has been studied for many decades with regard to its prevention and reduction, questions at the molecular level remain to be answered. Chocolate products with nut-based fillings are especially prone to undesirable fat bloom. The chemical composition of fat bloom is thought to be dominated by the triacylglycerides of the chocolate matrix, which migrate to the chocolate's surface and recrystallize there. Migration of oils from the fillings into the chocolate as a driving force for fat bloom formation is an additional factor under discussion. In this work, the migration was studied and confirmed by MRI, while the chemical composition of the fat bloom was determined by NMR spectroscopy and HPLC-MS, revealing the most important triacylglycerides in the fat bloom. The combination of HPLC-MS with NMR spectroscopy at 800 MHz allows for detailed chemical structure determination. A rapid routine combining the two modalities was developed and then applied to investigate aging, the impact of chocolate composition, and the influence of hazelnut-filling processing parameters, such as the degree of roasting and grinding of the nuts or the mixing time, on fat bloom formation.


Subject(s)
Chocolate , Magnetic Resonance Spectroscopy , Chocolate/analysis , Chromatography, High Pressure Liquid/methods , Magnetic Resonance Spectroscopy/methods , Mass Spectrometry/methods , Triglycerides/analysis , Triglycerides/chemistry , Cacao/chemistry , Food Analysis/methods , Corylus/chemistry , Liquid Chromatography-Mass Spectrometry
2.
Biotechnol J ; 19(7): e2400170, 2024 Jul.
Article in English | MEDLINE | ID: mdl-39014932

ABSTRACT

Therapeutic oligonucleotides (ONs) have great potential to treat many diseases due to their ability to regulate gene expression. However, the inefficiency of standard purification techniques in separating the target sequence from molecularly similar variants is hindering the development of large-scale ON manufacturing at a reasonable cost. Multicolumn Countercurrent Solvent Gradient Purification (MCSGP) is a valuable process able to bypass the purity-yield tradeoff typical of single-column operations, and hence to make ON production more sustainable from both an economic and an environmental point of view. However, operating close to the optimum of MCSGP can be challenging, resulting in unstable process performance and a drift in product quality, especially when running a continuous process for extended periods during which process parameters such as temperature are prone to variation. In this work, we demonstrate how greater process robustness is introduced into the design and execution of MCSGP for the purification of a 20mer single-stranded DNA sequence through the implementation of UV-based dynamic control. With this novel approach, cyclic steady state was reached as early as the third cycle, and disturbances arising from fluctuations in feed quality, loading amount, and temperature were effectively compensated for, allowing stable operation close to the optimum. In response to the perturbations, the controlled process kept the standard deviation of product recovery below 3.4%, whereas for the non-controlled process it increased to 27.5%.
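The abstract does not spell out the control law, but one common ingredient of UV-based dynamic control in MCSGP is deriving window switch times from the UV chromatogram rather than from a fixed schedule. The sketch below is a minimal, hypothetical illustration of such threshold-based window detection; the function name, threshold values, and the synthetic peak are assumptions, not the authors' implementation.

```python
import numpy as np

def uv_window_times(t, uv, start_level, end_level):
    """Return the first time the UV signal rises through `start_level` and the
    later time it falls back through `end_level`. In dynamic MCSGP control,
    such crossings can trigger the switch between recycling and collection
    windows instead of fixed times (thresholds here are illustrative)."""
    t_start = t[np.where(uv >= start_level)[0][0]]
    after_peak = np.where((t > t[np.argmax(uv)]) & (uv <= end_level))[0]
    t_end = t[after_peak[0]]
    return t_start, t_end

# Toy Gaussian-like elution peak (minutes vs. mAU), purely for illustration.
t = np.linspace(0.0, 30.0, 301)
uv = 1000.0 * np.exp(-0.5 * ((t - 15.0) / 3.0) ** 2)
print(uv_window_times(t, uv, start_level=200.0, end_level=200.0))
```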


Subject(s)
Oligonucleotides , Solvents , Oligonucleotides/chemistry , Oligonucleotides/isolation & purification , Solvents/chemistry , Countercurrent Distribution/methods , Ultraviolet Rays , Temperature , DNA, Single-Stranded/chemistry , DNA, Single-Stranded/isolation & purification
3.
Mol Ther Nucleic Acids ; 35(2): 102223, 2024 Jun 11.
Article in English | MEDLINE | ID: mdl-38948330

ABSTRACT

The development of messenger RNA (mRNA) vaccines and therapeutics necessitates the production of high-quality in vitro-transcribed mRNA drug substance with specific critical quality attributes (CQAs), which are closely tied to the uniformity of the linear DNA template. Supercoiled plasmid DNA is the precursor to the linear DNA template, and the supercoiled DNA percentage is commonly regarded as a key in-process control (IPC) during the manufacturing of the linear DNA template. In this study, we investigate the influence of the supercoiled DNA percentage on key mRNA CQAs, including purity, capping efficiency, double-stranded RNA (dsRNA), and poly(A) tail distribution. Our findings reveal a significant impact of the supercoiled DNA percentage on mRNA purity and in vitro transcription yield. Notably, we observe that the impact on mRNA purity can be mitigated through oligo-dT chromatography, relaxing the tight range required for the supercoiled DNA percentage to some extent. Overall, this study provides valuable insights into IPC strategies for DNA template chemistry, manufacturing, and controls (CMC) and into process development for mRNA drug substance.

4.
Biotechnol Bioeng ; 2024 Jun 27.
Article in English | MEDLINE | ID: mdl-38938008

ABSTRACT

Ethanol production is a significant industrial bioprocess for energy. The primary objective of this study is to control the reactor temperature so as to obtain the desired product, ethanol. Advanced model-based control systems face challenges due to model-process mismatch, but reinforcement learning (RL), a class of machine learning, can help by allowing agents to learn policies directly from the environment. Hence, an RL algorithm called twin delayed deep deterministic policy gradient (TD3) is employed. The control of reactor temperature is divided into two categories, namely unconstrained and constrained control approaches. TD3 with various reward functions is tested on a nonlinear bioreactor model. The results are compared with a popular existing RL algorithm, the deep deterministic policy gradient (DDPG) algorithm, using the mean squared error (MSE) as the performance measure. In unconstrained control of the bioreactor, the TD3-based controller designed with the integral absolute error (IAE) reward yields a lower MSE of 0.22, whereas DDPG produces an MSE of 0.29. Similarly, in the constrained case, the TD3-based controller designed with the IAE reward yields a lower MSE of 0.38, whereas DDPG produces an MSE of 0.48. In addition, the TD3-trained agent successfully rejects disturbances in the input flow rate and inlet temperature, as well as a setpoint change, with better performance metrics.
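As a rough illustration of how an IAE-shaped reward can be wired into a setpoint-tracking RL problem like the one described, the Python sketch below defines per-step rewards whose episode sum approximates the negative IAE, plus a constrained variant with a soft temperature-band penalty. The band limits, penalty weight, and example trajectory are hypothetical; the paper's bioreactor model and TD3/DDPG training loop are not reproduced.

```python
import numpy as np

def iae_reward(T_reactor, T_setpoint, dt=1.0):
    """Negative integral-absolute-error contribution for one control step;
    summing this over an episode approximates -IAE."""
    return -abs(T_setpoint - T_reactor) * dt

def constrained_iae_reward(T_reactor, T_setpoint, dt=1.0,
                           T_low=25.0, T_high=35.0, penalty=10.0):
    """IAE reward with a soft penalty for leaving an allowed temperature band
    (hypothetical bounds; the paper's constraint values are not given)."""
    r = -abs(T_setpoint - T_reactor) * dt
    if T_reactor < T_low or T_reactor > T_high:
        r -= penalty
    return r

# Toy usage: score a recorded temperature trajectory against a setpoint.
trajectory = np.array([30.0, 30.8, 31.5, 31.9, 32.0, 32.1])
setpoint = 32.0
episode_return = sum(iae_reward(T, setpoint) for T in trajectory)
mse = float(np.mean((trajectory - setpoint) ** 2))
print(f"episode return (-IAE): {episode_return:.2f}, MSE: {mse:.3f}")
```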

5.
J Chromatogr A ; 1730: 465110, 2024 Jun 21.
Article in English | MEDLINE | ID: mdl-38941794

ABSTRACT

Maximizing product quality attributes by optimizing process parameters and performance attributes is a crucial aspect of bioprocess chromatography process design. Process parameters include, but are not limited to, bed height, eluate cut points, and elution pH. An under-characterized process parameter for protein A chromatography is process temperature. Here, we present a mechanistic understanding of the effects of temperature on the protein A purification of a monoclonal antibody (mAb) using a commercial chromatography resin for batch and continuous counter-current systems. A self-designed, 3D-printed heating jacket controlled the temperature of the 1 mL chromatography column during the loading, wash, elution, and cleaning-in-place (CIP) steps. Batch loading experiments at 10, 20, and 30 °C demonstrated increased dynamic binding capacity (DBC) with temperature. The experimental data were fit to mechanistic and correlation-based models that predicted the optimal operating conditions over a range of temperatures. These model-based predictions guided the development of a 3-column temperature-controlled periodic counter-current chromatography (TCPCC) process and were validated experimentally. Operating a 3-column TCPCC at 30 °C led to a 47% increase in DBC relative to 20 °C batch chromatography. The DBC increase resulted in a two-fold increase in productivity relative to the 20 °C batch. Increasing the number of columns in the TCPCC to accommodate increasing feed concentration resulted in further improvements to productivity. The feed-optimized TCPCC showed two-, three-, and four-fold increases in productivity at feed concentrations of 1, 5, and 15 mg/mL mAb, respectively. The derived and experimentally validated temperature-dependent models offer a valuable tool for optimizing both batch and continuous chromatography systems under various operating conditions.
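The abstract mentions correlation-based models of dynamic binding capacity versus temperature. A minimal sketch of fitting such an empirical correlation with SciPy is shown below; the linear form and all numeric values are illustrative assumptions, not the study's data or model.

```python
import numpy as np
from scipy.optimize import curve_fit

# Illustrative (made-up) dynamic binding capacity data at three temperatures;
# the actual values from the study are not reproduced here.
T = np.array([10.0, 20.0, 30.0])       # degrees C
dbc = np.array([38.0, 45.0, 56.0])     # g mAb per L resin (hypothetical)

def linear_dbc(T, a, b):
    """Simple empirical correlation DBC(T) = a + b*T, one possible form."""
    return a + b * T

params, _ = curve_fit(linear_dbc, T, dbc)
a, b = params
print(f"DBC(T) ~ {a:.1f} + {b:.2f}*T")
print(f"predicted DBC at 25 C: {linear_dbc(25.0, a, b):.1f} g/L")
```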

6.
Bioengineering (Basel) ; 11(6)2024 Jun 13.
Article in English | MEDLINE | ID: mdl-38927846

ABSTRACT

The continuous manufacturing of biologics offers significant advantages in terms of reducing manufacturing costs and increasing capacity, but it is not yet widely implemented by the industry due to major challenges in the automation, scheduling, process monitoring, continued process verification, and real-time control of multiple interconnected processing steps, which must be tightly controlled to produce a safe and efficacious product. The process generates a large amount of data from different sensors, analytical instruments, and offline analyses, which must be organized, stored, and analyzed for process monitoring and control without compromising accuracy. We present a case study of a cyber-physical production system (CPPS) for the continuous manufacturing of mAbs that provides an automation infrastructure for data collection and storage in a data historian, along with data management tools that enable real-time analysis of the ongoing process using multivariate algorithms. The CPPS also facilitates process control and provides support in handling deviations at the process level by allowing the continuous train to re-adjust itself via a series of interconnected surge tanks and by recommending corrective actions to the operator. Successful steady-state operation is demonstrated for 55 h with end-to-end process automation and data collection via a range of in-line and at-line sensors. Following this, a series of deviations in the downstream unit operations, including affinity capture chromatography, cation exchange chromatography, and ultrafiltration, is monitored and tracked using multivariate approaches and in-process controls. The system is in line with Industry 4.0 and smart manufacturing concepts and is the first end-to-end CPPS for the continuous manufacturing of mAbs.

7.
J Appl Stat ; 51(9): 1621-1641, 2024.
Article in English | MEDLINE | ID: mdl-38933140

ABSTRACT

This paper aims to detect anomalous changes in social network structure in real time and to offer early warnings through Phase II monitoring of social networks. First, the exponential random graph model (ERGM) is used to model social networks. Then, a test and an online monitoring technique for the ERGM are developed based on the split likelihood-ratio test, after the model and its parameters have been determined for a specific data set. The proposed approach uses pseudo-maximum likelihood estimation and the likelihood ratio to construct the test statistics, avoiding the several steps of obtaining the Markov chain Monte Carlo maximum likelihood estimate through an iterative method. A bisection algorithm for the control limit is given. Simulations on three data sets (Flobusiness, Kapferer, and Faux.mesa.high) are presented to study the performance of the procedure. Different change points and shift sizes are compared to see how they affect the average run length. A real application to the MIT Reality Mining social proximity network is used to illustrate the proposed modelling and online monitoring methods.
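To make the control-limit calibration concrete, the sketch below shows a generic bisection on the limit h so that the simulated in-control average run length (ARL) matches a target. The monitoring statistic here is a stand-in (|z| of i.i.d. normal data), not the ERGM split likelihood-ratio statistic, and the target ARL and bracketing interval are assumptions.

```python
import numpy as np

def simulate_in_control_arl(h, n_runs=1000, max_len=2000, seed=0):
    """In-control ARL for a toy chart that signals when |z| > h, z ~ N(0,1).
    In the ERGM setting, the per-network monitoring statistic would replace |z|."""
    rng = np.random.default_rng(seed)
    exceed = np.abs(rng.standard_normal((n_runs, max_len))) > h
    first = np.where(exceed.any(axis=1), exceed.argmax(axis=1) + 1, max_len)
    return float(first.mean())

def bisect_control_limit(target_arl=200.0, lo=1.0, hi=4.0, iters=25, tol=0.02):
    """Bisection on the limit: the in-control ARL increases monotonically with h."""
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        arl = simulate_in_control_arl(mid)
        if abs(arl - target_arl) / target_arl < tol:
            return mid
        if arl < target_arl:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

print("calibrated control limit:", round(bisect_control_limit(), 3))
```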

8.
Sensors (Basel) ; 24(12)2024 Jun 14.
Article in English | MEDLINE | ID: mdl-38931654

ABSTRACT

Conveyor belts serve as the primary mode of ore transportation in mineral processing plants. Feeders, which are shorter conveyors, regulate the material flow from silos onto longer conveyor belts by adjusting their velocity. This velocity manipulation is carried out by automatic controllers that gauge the material weight on the conveyor using scales. However, due to the positioning constraints of these scales, a notable delay ensues between the measurement and the adjustment of the feeder speed. This dead time poses a significant challenge in control design, which aims to prevent oscillations in material levels on the conveyor belt. This paper contributes in two key areas: first, through a simulation-based comparison of various control techniques addressing this issue across diverse scenarios; second, by implementing the Smith predictor solution in an operational plant and contrasting its performance with that of a single PID controller. The evaluation spans both the transient flow rate during setpoint step changes and a month-long assessment. The experimental results reveal a notable increase in production of 355 t/h and a substantial reduction in flow rate oscillations on the conveyor belt, evidenced by a 55% decrease in the standard deviation.
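A minimal discrete-time Smith predictor sketch is given below for a first-order-plus-dead-time (FOPDT) process, the usual structure for compensating transport delay between the feeder and the scale. The FOPDT parameters and PI gains are illustrative assumptions, not the plant model used in the paper.

```python
import numpy as np

# Illustrative FOPDT model of the feeder-to-scale path (hypothetical values):
# gain K, time constant tau [s], transport dead time L [s], sample time dt [s].
K, tau, L, dt = 1.0, 20.0, 15.0, 1.0
d = max(1, int(round(L / dt)))        # dead time in samples
a = np.exp(-dt / tau)                 # discrete pole of the first-order part

Kp, Ki = 0.8, 0.05                    # PI gains tuned on the delay-free model

def simulate(n_steps=300, setpoint=100.0):
    y = ym = integ = 0.0              # plant state, delay-free model, integral
    y_delay = [0.0] * d               # what the scale will see (delayed plant)
    ym_delay = [0.0] * d              # delayed copy of the internal model
    out = []
    for _ in range(n_steps):
        y_meas = y_delay[0]                      # delayed measurement
        feedback = y_meas + ym - ym_delay[0]     # Smith predictor correction
        e = setpoint - feedback
        integ += e * dt
        u = Kp * e + Ki * integ                  # PI acting on predicted output
        ym = a * ym + K * (1 - a) * u            # delay-free internal model
        y = a * y + K * (1 - a) * u              # "true" plant (model-matched here)
        ym_delay = ym_delay[1:] + [ym]
        y_delay = y_delay[1:] + [y]
        out.append(y_meas)
    return np.array(out)

flow = simulate()
print("measured flow after 300 s:", round(flow[-1], 1))
```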

9.
Sensors (Basel) ; 24(11)2024 May 30.
Article in English | MEDLINE | ID: mdl-38894312

ABSTRACT

To evaluate the suitability of an analytical instrument, essential figures of merit such as the limit of detection (LOD) and the limit of quantification (LOQ) can be employed. However, as the definitions known in the literature are mostly applicable to one signal per sample, estimating the LOD for substances with instruments yielding multidimensional results, such as electronic noses (eNoses), is still challenging. In this paper, we compare and present different approaches to estimate the LOD for eNoses by employing commonly used multivariate data analysis and regression techniques, including principal component analysis (PCA), principal component regression (PCR), and partial least squares regression (PLSR). These methods could subsequently be used to assess the suitability of eNoses for helping to control and steer processes in which volatiles are key process parameters. As a use case, we determined the LODs for key compounds involved in beer maturation, namely acetaldehyde, diacetyl, dimethyl sulfide, ethyl acetate, isobutanol, and 2-phenylethanol, and discussed the suitability of our eNose for that determination process. The results of the methods performed demonstrated differences of up to a factor of eight. For diacetyl, the LOD and LOQ were sufficiently low to suggest potential for monitoring via eNose.
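One common way to obtain an LOD from a multivariate calibration such as PLSR is the pseudo-univariate route: predict concentrations for blank measurements and divide 3.3 times their standard deviation by the slope of the predicted-versus-true regression. The sketch below illustrates this on synthetic eNose-like data; the data, noise level, and the 3.3/10 factors follow one common convention and are assumptions, not the specific approaches compared in the paper.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)

# Synthetic eNose-like data: 10 sensor channels responding linearly to one
# analyte concentration plus noise (purely illustrative, not the paper's data).
n_samples, n_channels = 60, 10
conc = rng.uniform(0.0, 5.0, n_samples)                  # e.g. mg/L
loadings = rng.uniform(0.5, 2.0, n_channels)
X = np.outer(conc, loadings) + rng.normal(0.0, 0.2, (n_samples, n_channels))

pls = PLSRegression(n_components=2).fit(X, conc)
conc_hat = pls.predict(X).ravel()

# Pseudo-univariate calibration: regress predicted vs. true concentration,
# then take LOD = 3.3 * s(blank predictions) / slope (one common convention).
cal = LinearRegression().fit(conc.reshape(-1, 1), conc_hat)
slope = cal.coef_[0]

blanks = rng.normal(0.0, 0.2, (30, n_channels))          # blank measurements
blank_pred = pls.predict(blanks).ravel()
lod = 3.3 * blank_pred.std(ddof=1) / slope
loq = 10.0 * blank_pred.std(ddof=1) / slope
print(f"estimated LOD ~ {lod:.2f}, LOQ ~ {loq:.2f} (same units as conc)")
```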


Subject(s)
Beer , Electronic Nose , Limit of Detection , Principal Component Analysis , Beer/analysis , Least-Squares Analysis , Volatile Organic Compounds/analysis
10.
Biotechnol Bioeng ; 2024 May 29.
Article in English | MEDLINE | ID: mdl-38812405

ABSTRACT

Reinforcement learning (RL), a subset of machine learning (ML), could optimize and control biomanufacturing processes, such as improving the production of therapeutic cells. Here, the process of CAR T-cell activation by antigen-presenting beads and their subsequent expansion is formulated in silico. The simulation is used as an environment to train RL-agents to dynamically control the number of beads in culture so as to maximize the population of robust effector cells at the end of the culture. Periodic decisions are made to add an increment of beads or to remove them completely. The simulation is designed to operate in OpenAI Gym, enabling testing of different environments, cell types, RL-agent algorithms, and state inputs to the RL-agent. RL-agent training is demonstrated with three different algorithms (PPO, A2C, and DQN), each sampling three different state input types (tabular, image, mixed); PPO-tabular performs best for this simulation environment. Using this approach, training of the RL-agent on different cell types is demonstrated, resulting in unique control strategies for each type. Sensitivity to input noise (sensor performance), the number of control step interventions, and the advantages of pre-trained RL-agents are also evaluated. Overall, we present an RL framework to maximize the population of robust effector cells in CAR T-cell therapy production.
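For orientation, a minimal custom environment with the bead add/remove action set could look like the sketch below, written against the classic OpenAI Gym API (reset/step returning 4-tuples, i.e., gym < 0.26). The cell-growth equations, reward shaping, and class name are placeholder assumptions, not the paper's simulator.

```python
import numpy as np
import gym
from gym import spaces

class BeadControlEnv(gym.Env):
    """Sketch of a CAR T-cell activation environment (not the paper's simulator):
    the agent periodically adds an increment of beads or removes them all, and is
    rewarded for the final cell count. All dynamics below are toy placeholders."""

    def __init__(self, horizon=10):
        super().__init__()
        self.horizon = horizon
        # actions: 0 = do nothing, 1 = add bead increment, 2 = remove all beads
        self.action_space = spaces.Discrete(3)
        # observation: [cells, beads, step] as a simple tabular state
        self.observation_space = spaces.Box(low=0.0, high=np.inf,
                                            shape=(3,), dtype=np.float32)

    def reset(self):
        self.cells, self.beads, self.t = 1.0, 0.0, 0
        return self._obs()

    def step(self, action):
        if action == 1:
            self.beads += 1.0
        elif action == 2:
            self.beads = 0.0
        # toy growth: beads stimulate expansion, but excess beads exhaust cells
        stimulation = self.beads / (1.0 + self.beads)
        exhaustion = 0.05 * self.beads
        self.cells = max(self.cells * (1.0 + 0.5 * stimulation - exhaustion), 0.0)
        self.t += 1
        done = self.t >= self.horizon
        reward = self.cells if done else 0.0   # reward only at end of culture
        return self._obs(), reward, done, {}

    def _obs(self):
        return np.array([self.cells, self.beads, self.t], dtype=np.float32)

# quick rollout with a random policy
env = BeadControlEnv()
obs, total, done = env.reset(), 0.0, False
while not done:
    obs, r, done, _ = env.step(env.action_space.sample())
    total += r
print("final-cell reward:", round(total, 2))
```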

11.
Biomed Phys Eng Express ; 10(4)2024 May 10.
Article in English | MEDLINE | ID: mdl-38697044

ABSTRACT

Objective. The aim of this work was to develop a Phase I control chart framework for the recently proposed multivariate risk-adjusted Hotelling's T2 chart. Although this control chart alone can identify most patients receiving extreme organ-at-risk (OAR) dose, it is restricted by underlying distributional assumptions, making it sensitive to extreme observations in the sample, as are typically found in radiotherapy plan quality data such as dose-volume histogram (DVH) points. This can lead to slightly poor-quality plans that should have been identified as out-of-control (OC) being signaled as in-control (IC). Approach. We develop a robust iterative control chart framework to identify all OC patients with abnormally high OAR dose and improve their plans via re-optimization to achieve an IC sample prior to establishing the Phase I control chart, which can then be used to monitor future treatment plans. Main Results. Eighty head-and-neck patients were used in this study. After the first iteration, P14, P67, and P68 were detected as OC for high brainstem dose, warranting re-optimization aimed at reducing brainstem dose without worsening other planning criteria. The DVH and control chart were updated after re-optimization. On the second iteration, P14, P67, and P68 were IC, but P40 was identified as OC. After re-optimizing P40's plan and updating the DVH and control chart, P40 was IC, but P14* (P14's re-optimized plan) and P62 were flagged as OC. P14* could not be re-optimized without worsening target coverage, so only P62 was re-optimized. Ultimately, a fully IC sample was achieved. Multiple iterations were needed to identify and improve all OC patients and to establish a more robust control limit for monitoring future treatment plans. Significance. The iterative procedure resulted in a fully IC sample of patients. With this sample, a more robust Phase I control chart that can monitor the OAR doses of new plans was established.
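For reference, the unadjusted Phase I Hotelling's T2 statistic underlying this type of chart can be computed as below; the risk adjustment and the distribution-based control limit used in the paper are not included, and the data and empirical 95th-percentile limit are purely illustrative.

```python
import numpy as np

def hotelling_t2(X):
    """Phase I Hotelling's T^2 statistics for a sample of multivariate
    observations (rows = patients/plans, columns = DVH-type metrics).
    Each observation is scored against the sample mean and covariance."""
    X = np.asarray(X, dtype=float)
    diff = X - X.mean(axis=0)
    S_inv = np.linalg.pinv(np.cov(X, rowvar=False))
    return np.einsum('ij,jk,ik->i', diff, S_inv, diff)

# Toy example: 80 "plans" with 3 correlated dose metrics; flag points above an
# (illustrative) empirical 95th-percentile limit, then re-examine those plans.
rng = np.random.default_rng(42)
data = rng.multivariate_normal([50.0, 30.0, 20.0],
                               [[9.0, 3.0, 1.0],
                                [3.0, 4.0, 1.0],
                                [1.0, 1.0, 2.0]], size=80)
t2 = hotelling_t2(data)
limit = np.percentile(t2, 95)
print("out-of-control indices:", np.where(t2 > limit)[0])
```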


Subject(s)
Organs at Risk , Quality Control , Radiotherapy Dosage , Radiotherapy Planning, Computer-Assisted , Humans , Organs at Risk/radiation effects , Radiotherapy Planning, Computer-Assisted/methods , Head and Neck Neoplasms/radiotherapy , Algorithms
12.
Sci Rep ; 14(1): 10512, 2024 May 07.
Article in English | MEDLINE | ID: mdl-38714824

ABSTRACT

The study presents a new, parameter-free adaptive exponentially weighted moving average (AEWMA) control chart tailored for monitoring process dispersion, using an adaptive approach for determining the smoothing constant. The chart is designed to detect shifts within anticipated ranges of process dispersion by computing the smoothing constant dynamically. To assess its effectiveness, the chart's performance is measured through run-length profiles generated from Monte Carlo simulations. A notable aspect is the incorporation of an unbiased estimator in computing the smoothing constant through the suggested function, thereby improving the chart's capability to identify different levels of increasing and decreasing shifts in process dispersion. Comparison with an established adaptive EWMA-S2 dispersion chart highlights the considerable efficiency of the proposed chart in addressing diverse magnitudes of process dispersion shifts. Additionally, the study includes an application to a real-life dataset, showcasing the practicality and user-friendly nature of the proposed chart in real-world situations.
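A generic adaptive-EWMA update, in the spirit of score-function-based AEWMA charts, is sketched below and applied to a log-variance dispersion statistic; the Huber-type score, the subgroup size, and the shift scenario are illustrative assumptions rather than the chart proposed in the study.

```python
import numpy as np

def huber_score(e, lam=0.1, k=3.0):
    """Score function that behaves like a small-lambda EWMA for small errors
    and follows the observation more closely for large errors."""
    if abs(e) <= k:
        return lam * e
    return e - np.sign(e) * (1.0 - lam) * k

def adaptive_ewma(x, z0=0.0, lam=0.1, k=3.0):
    """Adaptive EWMA sequence z_t = z_{t-1} + phi(x_t - z_{t-1})."""
    z, out = z0, []
    for xt in x:
        z = z + huber_score(xt - z, lam, k)
        out.append(z)
    return np.array(out)

# Monitoring dispersion: apply the AEWMA to ln(S^2) of subgroups of size 5
# (one common choice; the paper's exact statistic and smoothing function
# are not reproduced). A dispersion shift is injected after sample 20.
rng = np.random.default_rng(7)
subgroups = rng.normal(0.0, 1.0, (30, 5))
subgroups[20:] *= 1.8
stat = np.log(subgroups.var(axis=1, ddof=1))
z = adaptive_ewma(stat, z0=stat[:10].mean(), lam=0.1, k=1.0)
print(np.round(z, 2))
```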

13.
Sci Rep ; 14(1): 11565, 2024 May 21.
Article in English | MEDLINE | ID: mdl-38773191

ABSTRACT

This research presents a new adaptive exponentially weighted moving average control chart, based on a coefficient of variation (CV) EWMA statistic, for studying relative process variability. Monitoring the CV of a production process involves long-term observation of a process with an unstable mean. Therefore, a new modified adaptive exponentially weighted moving average (AAEWMA) CV monitoring chart using a novel function, hereafter referred to as the AAEWMA CV control chart, is proposed. The novelty of the suggested AAEWMA CV chart statistic lies in identifying infrequent changes in the process CV. A continuous function is suggested for adapting the smoothing constant of the plotting statistic according to the estimated shift size in the CV parameter values. The Monte Carlo simulation method is used to compute the run-length values, which are used to analyze efficiency. The existing AEWMA CV chart is shown to be less effective than the proposed AAEWMA CV chart. An industrial data example is used to examine the strength of the proposed AAEWMA CV chart and to clarify the implementation specifics, which are provided in the example section. The results strongly support the implementation of the proposed AAEWMA CV control chart.

14.
Sensors (Basel) ; 24(10)2024 May 15.
Article in English | MEDLINE | ID: mdl-38793986

ABSTRACT

In this paper, a dispersion of glass beads of different sizes in an ammonium nitrate solution is investigated with the aid of Raman spectroscopy. The signal losses caused by the dispersion are quantified by an additional scattered-light measurement and used to correct the measured ammonium nitrate concentration. Each individual glass bead represents an interface at which the excitation laser is deflected from its path, causing distortion in the received Raman signal. It is shown that the scattering losses measured with the scattered-light probe correlate with the loss of the Raman signal, which means that the data obtained can be used to correct the measured values. The resulting correction function considers particle sizes in the range of 2-99 µm as well as ammonium nitrate concentrations of 0-20 wt% and delivers an RMSEP of 1.952 wt%. This correction provides easier process access to dispersions that were previously difficult or impossible to measure.

15.
Polymers (Basel) ; 16(10)2024 May 18.
Article in English | MEDLINE | ID: mdl-38794625

ABSTRACT

A 1D model describing the dynamics of an injection moulding machine and the injection process is presented. The model describes an injection cylinder actuated by a dual-pump electro-hydraulic speed-variable drive and the filling, holding, and cooling phases of the injection moulding process for amorphous polymers. The model is suggested as the foundation for the design of model-based pressure controllers, e.g., for the nozzle pressure. The focus is on using material, mould, and machine properties to construct the model, making it possible to analyse and design the dynamic system prior to manufacturing hardware or conducting experiments. Both the presented model and the developed controller show good agreement with experimental results. The proposed method is general in nature and enables the design, analysis, and evaluation of the machine, material, and mould dynamics for controller design based solely on the physical properties of the system.

16.
Bioresour Technol ; 403: 130891, 2024 Jul.
Article in English | MEDLINE | ID: mdl-38788808

ABSTRACT

To reduce the cost of docosahexaenoic acid (DHA) production from Schizochytrium sp., waste Pichia pastoris was successfully used as an alternative nitrogen source to achieve high-density cultivation during the cell growth phase. However, because of the high oxygen consumption associated with high-density cultivation, it was impossible to keep both the nitrogen source and the dissolved oxygen concentration (DO) at sufficient levels; thus, two realistic control strategies, namely "DO sufficiency-nitrogen limitation" and "DO limitation-nitrogen sufficiency", were proposed. When using the "DO sufficiency-nitrogen limitation" strategy, the lowest maintenance coefficient for glucose (12.3 mg/g/h vs. 17.0 mg/g/h) and the highest activities of the related enzymes in the DHA biosynthetic routes were obtained simultaneously; thus, a maximum DHA concentration of 12.8 ± 1.2 g/L was achieved, which was 1.58-fold greater than that of the control group. Overall, two-stage feeding control of alternative nitrogen sources is an efficient strategy for industrial DHA fermentation.


Subject(s)
Docosahexaenoic Acids , Nitrogen , Stramenopiles , Docosahexaenoic Acids/metabolism , Docosahexaenoic Acids/biosynthesis , Nitrogen/metabolism , Stramenopiles/metabolism , Fermentation , Oxygen/metabolism , Glucose/metabolism , Saccharomycetales/metabolism
17.
Clin Chem Lab Med ; 2024 May 16.
Article in English | MEDLINE | ID: mdl-38748888

ABSTRACT

OBJECTIVES: Patient-based real-time quality control (PBRTQC) is an alternative quality control tool for laboratories that has gained increasing attention. Despite the progress made using various algorithms, the imbalance in data volume between in-control and out-of-control results, as well as the issue of variation, remain challenges. We propose a novel integrated framework using anomaly detection and a graph neural network, combining clinical variables and statistical algorithms, to improve the error detection performance of patient-based quality control. METHODS: The test results of three representative analytes (sodium, potassium, and calcium) and eight independent patient variables (test date, time, gender, age, department, patient type, and reference interval limits) were collected. A graph-based anomaly detection network was modeled and used to generate control limits. Proportional and random errors were simulated for performance evaluation. Five mainstream PBRTQC statistical algorithms were chosen for comparison. RESULTS: The framework of a patient-based graph anomaly detection network for real-time quality control (PGADQC) was established and proven feasible for error detection. Compared with classic PBRTQC, the PGADQC showed a more balanced performance for both positive and negative biases. For different analytes, the average number of patient samples until error detection (ANPed) of PGADQC decreased to varying degrees, with reductions reaching approximately 95 % at a small bias of 0.02, taking calcium as an example. CONCLUSIONS: The PGADQC is an effective framework for patient-based quality control that integrates statistical and artificial intelligence algorithms. It improves error detection in a data-driven fashion and provides a new approach to PBRTQC from a data science perspective.
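As a point of comparison, a classic PBRTQC algorithm such as a truncated moving average can be sketched in a few lines; the truncation range, control limits, and sodium-like values below are hypothetical, and the graph-network PGADQC itself is not reproduced.

```python
import numpy as np

def moving_average_pbrtqc(results, block=20, trunc=(130.0, 150.0),
                          limits=(138.0, 142.0)):
    """Truncated moving-average PBRTQC sketch: exclude results outside the
    truncation range, average the most recent `block` retained results, and
    signal when that average leaves the control limits. Returns the 1-based
    index of the patient result at which the chart first signals."""
    kept = []
    for i, x in enumerate(results):
        if trunc[0] <= x <= trunc[1]:
            kept.append(x)
        if len(kept) >= block and not (
                limits[0] <= np.mean(kept[-block:]) <= limits[1]):
            return i + 1
    return None

rng = np.random.default_rng(3)
sodium = rng.normal(140.0, 2.0, 200)    # illustrative patient results, mmol/L
sodium[100:] += 3.0                     # systematic error introduced at result 101
signal_at = moving_average_pbrtqc(sodium)
print("signal at result:", signal_at,
      "| results after error onset (NPed):", signal_at - 100)
```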

18.
Environ Res ; 252(Pt 4): 119133, 2024 Jul 01.
Article in English | MEDLINE | ID: mdl-38735379

ABSTRACT

Phosphorus in wastewater poses a significant environmental threat, leading to water pollution and eutrophication. However, it plays a crucial role in the water-energy-resource recovery-environment (WERE) nexus. Recovering phosphorus from wastewater can close the phosphorus loop, supporting circular economy principles by reusing it as fertilizer or in industrial applications. Despite the recognized importance of phosphorus recovery, there is a lack of analysis of the cyber-physical framework concerning the WERE nexus. Advanced methods such as automatic control, optimal process technologies, artificial intelligence (AI), and life cycle assessment (LCA) have emerged to enhance wastewater treatment plant (WWTP) operations, focusing on improving effluent quality, energy efficiency, and resource recovery and on reducing greenhouse gas (GHG) emissions. Providing insights into implementing modeling and simulation platforms, control, and optimization systems for phosphorus recovery in the WERE nexus (P-WERE) in WWTPs is therefore extremely important. This review highlights the valuable applications of AI algorithms, such as machine learning, deep learning, and explainable AI, for predicting phosphorus (P) dynamics in WWTPs. It emphasizes the importance of using AI to analyze microbial communities and to optimize WWTPs for different objectives. Additionally, it discusses the benefits of integrating mechanistic and data-driven models into plant-wide frameworks, which can enhance GHG simulation and enable simultaneous nitrogen (N) and phosphorus (P) removal. The review underscores the significance of prioritizing recovery actions to redirect phosphorus from effluent to reusable products in future considerations.


Subject(s)
Phosphorus , Waste Disposal, Fluid , Wastewater , Phosphorus/analysis , Wastewater/chemistry , Wastewater/analysis , Waste Disposal, Fluid/methods , Artificial Intelligence , Water Purification/methods , Water Pollutants, Chemical/analysis , Water Pollutants, Chemical/chemistry
19.
Polymers (Basel) ; 16(8)2024 Apr 11.
Article in English | MEDLINE | ID: mdl-38674978

ABSTRACT

Injection molding is a highly nonlinear process that is easily influenced by various external factors, which affect the stability of product quality. High-speed injection molding is required for production because of the rapid cooling characteristics of thin-walled parts, which increases manufacturing complexity. Consequently, establishing appropriate process parameters for maintaining quality stability in long-term production is challenging. This study used a hot-runner mold for a thin-walled part fitted with two external sensors, a nozzle pressure sensor and a tie-bar strain gauge, to collect data on the nozzle peak pressure, the timing of the peak pressure, the viscosity index, and the clamping force difference value. The product weight was defined as the quality indicator, and a standardized parameter optimization procedure was constructed, covering injection speed, V/P switchover point, packing, and clamping force. Finally, the optimized process parameters were applied in adaptive process control experiments using the developed control system running on a micro-controller unit (MCU). The results revealed that the control system effectively stabilized the product weight, with a variation of 0.677% and a standard deviation of 0.0178 g.

20.
Cureus ; 16(3): e57176, 2024 Mar.
Article in English | MEDLINE | ID: mdl-38681323

ABSTRACT

Hospital pharmacies are integral to the healthcare system, and evaluating the factors influencing their efficiency and service standards is imperative. This analysis offers global insights to assist in developing strategies for future enhancements. The objective is to identify the optimal Lean Six Sigma methodologies to improve the workflow and quality of hospital pharmacy services. A strategic search, aligned with the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines, encompassed an extensive range of academic databases, including Scopus, PubMed/Medline, Web of Science, and other sources, for relevant studies published from 2009 to 2023. The focus was on management tactics and studies examining outcomes, prioritizing publications reflecting the state of pharmacy operations management. The quality of the selected articles was assessed, and the results were combined and analyzed. The search yielded 1,447 studies, of which 73 met the inclusion criteria. The systematic review found a low to moderate overall risk of bias. The number of publications rose during the coronavirus disease (COVID-19) outbreak. Among the studies, research output from the United States of America represented 26% of the total. Other countries, such as Indonesia, Spain, Canada, China, Saudi Arabia, the United Arab Emirates, and the United Kingdom, also made significant contributions, accounting for 12%, 8%, 7%, 5%, 5%, 5%, and 5%, respectively. Pharmacy journals led with 26 publications, followed by healthcare/medical journals with 14; the quality category came next with 12 articles, while engineering was represented by seven journals. Studies used empirical and observational methods, focusing on practice quality enhancement. The process control plan appeared in 26 instances, and define, measure, analyze, improve, and control (DMAIC) was identified 13 times. Sort, set in order, shine, standardize, and sustain (5S) ranked third with seven occurrences. Failure mode and effects analysis (FMEA) and root cause analysis were moderately utilized, with six and four instances, respectively. Poka-Yoke (mistake-proofing measures) and value stream mapping were each counted three times. Quality improvement and workflow optimization dominated managerial strategies, in 22 (30.14%) studies each, followed by technology integration in 15 (20.55%). Cost, patient care, and staffing each featured in three (4.11%) studies, while two (2.74%) focused on inventory management. One (1.37%) study each highlighted continuing education, collaboration, and policy changes. Analysis of the 73 studies on Lean and Six Sigma in hospital pharmacy operations showed significant impacts: 26% of studies reported decreased medication turnaround time, 15% showed process efficiency improvements, and 11% each reported enhanced inventory management and bottleneck/failure mode reduction. Additionally, 9% of studies observed decreased medication errors, 8% noted increased satisfaction and cost savings, 6% identified enhancements in clinical activities, 3% improved prescription accuracy, 2% reduced workflow interruptions, and 1% reported increased knowledge. This study has also identified key strategies for service delivery improvement and the importance of quality practices and lean leadership. To the best of the author's knowledge, this research is the first in-depth analysis of Lean and Six Sigma in the hospital pharmacy domain, spanning the 15 years from 2009 to 2023.
