Results 1 - 20 of 83
1.
Heliyon ; 10(14): e34424, 2024 Jul 30.
Article in English | MEDLINE | ID: mdl-39149066

ABSTRACT

In this article, we develop a new control chart based on the Exponentially Weighted Moving Average (EWMA) statistic, termed the New Extended Exponentially Weighted Moving Average (NEEWMA) statistic, designed to detect small changes in the process mean. We derive expressions for the mean and variance of the NEEWMA statistic, ensuring an unbiased estimate of the mean, with simulation results showing lower variance than traditional EWMA charts. Evaluating performance using the Average Run Length (ARL), our analysis reveals that the NEEWMA control chart outperforms the EWMA and Extended EWMA (EEWMA) charts in swiftly detecting shifts in the process mean. The chart's operational methodology is illustrated through Monte Carlo simulations, and an example using practical data is also provided to showcase its effectiveness.
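The NEEWMA weighting scheme itself is not given in the abstract, so as a point of reference, the classical EWMA recursion and the Monte Carlo ARL evaluation it is compared against can be sketched as follows (function names are mine, and the chart constants `lam` and `L` are illustrative choices, not values from the paper):

```python
import numpy as np

def ewma_run_length(lam=0.1, L=2.7, shift=0.0, max_iter=100_000, rng=None):
    """Simulate one run length of a two-sided EWMA chart on N(shift, 1) data."""
    rng = np.random.default_rng() if rng is None else rng
    z = 0.0
    for t in range(1, max_iter + 1):
        x = rng.normal(shift, 1.0)
        z = lam * x + (1 - lam) * z  # EWMA recursion: Z_t = lam*X_t + (1-lam)*Z_{t-1}
        # exact (time-varying) standard deviation of the EWMA statistic
        sigma_z = np.sqrt(lam / (2 - lam) * (1 - (1 - lam) ** (2 * t)))
        if abs(z) > L * sigma_z:
            return t  # first out-of-control signal
    return max_iter

def average_run_length(n_sims=2000, **kwargs):
    """Monte Carlo estimate of the ARL for the given chart settings."""
    rng = np.random.default_rng(42)
    return float(np.mean([ewma_run_length(rng=rng, **kwargs) for _ in range(n_sims)]))
```

A shifted process (e.g. `shift=1.0`) should produce a much shorter average run length than the in-control case, which is the sense in which one chart "outperforms" another here.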

2.
Sci Rep ; 14(1): 13561, 2024 Jun 12.
Article in English | MEDLINE | ID: mdl-38866892

ABSTRACT

In many practical situations, information about the process distribution is partially or completely unavailable. In these instances, practitioners prefer nonparametric charts because they do not require the assumption of normality or any specific distribution. In this article, a nonparametric double homogeneously weighted moving average control chart based on the Wilcoxon signed-rank statistic is developed for monitoring the location parameter of a process. The run-length profiles of the newly developed chart are obtained using Monte Carlo simulations. Comparisons are made between the proposed chart and existing nonparametric counterparts based on various performance metrics of the run-length distribution. The extra quadratic loss is used to evaluate the overall performance of the proposed and existing charts. The newly developed scheme shows comparatively better results than its existing counterparts. For practical implementation of the suggested scheme, a real-world dataset on the inside diameter of automobile piston rings is also used.
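The double homogeneously weighted moving average smoothing is not specified in the abstract; this minimal sketch shows only the per-subgroup Wilcoxon signed-rank statistic that such charts smooth (the function name and target-median parameter are mine):

```python
import numpy as np
from scipy.stats import rankdata

def signed_rank_statistic(sample, theta0=0.0):
    """Wilcoxon signed-rank statistic for one subgroup against target median theta0."""
    d = np.asarray(sample, dtype=float) - theta0
    d = d[d != 0]                 # zero differences carry no sign information
    ranks = rankdata(np.abs(d))   # average ranks are used in case of ties
    return float(np.sum(np.sign(d) * ranks))
```

When the process is in control and symmetric about `theta0`, the statistic has mean zero, which is what makes it a natural plotting quantity for a location chart.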

3.
Sci Rep ; 14(1): 11565, 2024 May 21.
Article in English | MEDLINE | ID: mdl-38773191

ABSTRACT

This research presents a new adaptive exponentially weighted moving average control chart for the coefficient of variation (CV), designed to study relative process variability. Monitoring the CV of a production process involves long-term observation of a process with an unstable mean. We therefore propose a modified adaptive exponentially weighted moving average (AAEWMA) CV monitoring chart based on a novel adaptive function, hereafter referred to as the AAEWMA CV chart. The novelty of the suggested AAEWMA CV statistic lies in its ability to identify infrequent changes in the process CV. A continuous function adapts the smoothing constant of the plotting statistic according to the estimated size of the shift in the CV parameter. The Monte Carlo simulation method is used to compute run-length values, which are used to analyze efficiency. The proposed AAEWMA CV chart is more effective than the existing AEWMA CV chart. An industrial data example is used to examine the strength of the proposed chart and to clarify its implementation details, which are provided in the example section. The results strongly support the adoption of the proposed AAEWMA CV control chart.

4.
Sci Rep ; 14(1): 10512, 2024 May 07.
Article in English | MEDLINE | ID: mdl-38714824

ABSTRACT

The study presents a new parameter-free adaptive exponentially weighted moving average (AEWMA) control chart tailored for monitoring process dispersion, utilizing an adaptive approach for determining the smoothing constant. This chart is crafted to adeptly detect shifts within anticipated ranges in process dispersion by dynamically computing the smoothing constant. To assess its effectiveness, the chart's performance is measured through concise run-length profiles generated from Monte Carlo simulations. A notable aspect is the incorporation of an unbiased estimator in computing the smoothing constant through the suggested function, thereby improving the chart's capability to identify different levels of increasing and decreasing shifts in process dispersion. The comparison with an established adaptive EWMA-S2 dispersion chart highlights the considerable efficiency of the proposed chart in addressing diverse magnitudes of process dispersion shifts. Additionally, the study includes an application to a real-life dataset, showcasing the practicality and user-friendly nature of the proposed chart in real-world situations.

5.
Sci Rep ; 14(1): 9633, 2024 04 26.
Article in English | MEDLINE | ID: mdl-38671182

ABSTRACT

In the current study, we demonstrate the use of a quality framework to review processes for improving patient quality and safety in health care. We focus on assessing the performance of health care services, where the data are usually heterogeneous with respect to patients' health conditions. In our study, a support vector machine (SVM) regression model is used to handle the challenge of adjusting for the risk factors attached to patients. Further, the design of exponentially weighted moving average (EWMA) control charts is proposed based on the residuals obtained from the SVM regression model. Analyzing real cardiac surgery patient data, we employed the SVM method to gauge patient condition. The resulting SVM-EWMA chart revealed superior shift-detection capability and demonstrated enhanced efficacy compared with the risk-adjusted EWMA control chart.


Subject(s)
Cardiac Surgical Procedures , Support Vector Machine , Humans , Cardiac Surgical Procedures/methods , Risk Factors , Risk Adjustment/methods
6.
Sci Rep ; 14(1): 8923, 2024 Apr 18.
Article in English | MEDLINE | ID: mdl-38637650

ABSTRACT

The simultaneous monitoring of both the process mean and dispersion has gained considerable attention in statistical process control, especially when the process follows the normal distribution. This paper introduces a novel Bayesian adaptive maximum exponentially weighted moving average (Max-EWMA) control chart, designed to jointly monitor the mean and dispersion of a non-normal process. This is achieved through the utilization of the inverse response function, particularly suitable for processes conforming to a Weibull distribution. To assess the effectiveness of the proposed control chart, we employed the average run length (ARL) and the standard deviation of run length (SDRL). Subsequently, we compared the performance of our proposed control chart with that of an existing Max-EWMA control chart. Our findings suggest that the proposed control chart demonstrates a higher level of sensitivity in detecting out-of-control signals. Finally, to illustrate the effectiveness of our Bayesian Max-EWMA control chart under various Loss Functions (LFs) for a Weibull process, we present a practical case study focusing on the hard-bake process in the semiconductor manufacturing industry. This case study highlights the adaptability of the chart to different scenarios. Our results provide compelling evidence of the exceptional performance of the suggested control chart in rapidly detecting out-of-control signals during the hard-bake process, thereby significantly contributing to the improvement of process monitoring and quality control.
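The Bayesian, Weibull-adapted statistic of the paper is not specified in the abstract, but the classical Max-EWMA construction it extends can be sketched as follows (a simplified sketch under normality; `lam` and the function name are illustrative assumptions of mine):

```python
import numpy as np
from scipy.stats import norm, chi2

def max_ewma_stats(subgroups, mu0, sigma0, lam=0.1):
    """Classical Max-EWMA plotting statistic: the maximum of two EWMAs built from
    standardized subgroup means and normal-score-transformed subgroup variances."""
    u = v = 0.0
    stats = []
    for x in subgroups:
        x = np.asarray(x, dtype=float)
        n = len(x)
        z_mean = (np.mean(x) - mu0) / (sigma0 / np.sqrt(n))
        # probability-integral transform of S^2 to an approximate N(0,1) score
        z_var = norm.ppf(chi2.cdf((n - 1) * np.var(x, ddof=1) / sigma0**2, df=n - 1))
        u = lam * z_mean + (1 - lam) * u   # EWMA of the mean component
        v = lam * z_var + (1 - lam) * v    # EWMA of the dispersion component
        stats.append(max(abs(u), abs(v)))  # a single statistic monitors both
    return stats
```

Because the plotting statistic is a maximum of absolute values, one upper control limit suffices to monitor mean and dispersion jointly.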

7.
J Appl Stat ; 51(6): 1171-1190, 2024.
Article in English | MEDLINE | ID: mdl-38628443

ABSTRACT

Distribution-free or nonparametric control charts are used for monitoring process parameters when there is a lack of knowledge about the underlying distribution. In this paper, we investigate a single distribution-free triple exponentially weighted moving average control chart based on the Lepage statistic (referred to as the TL chart) for simultaneously monitoring shifts in the unknown location and scale parameters of a univariate continuous distribution. The design and implementation of the proposed chart are discussed using time-varying and steady-state control limits for the zero-state case. The run-length distribution of the TL chart is evaluated by performing Monte Carlo simulations. The performance of the proposed chart is compared to those of the existing EWMA-Lepage (EL) and DEWMA-Lepage (DL) charts. It is observed that the TL chart with a time-varying control limit is superior to its competitors, especially for small to moderate shifts in the process parameters. We also provide a real example from a manufacturing process to illustrate the application of the proposed chart.

8.
Sci Rep ; 14(1): 9948, 2024 Apr 30.
Article in English | MEDLINE | ID: mdl-38688965

ABSTRACT

This article introduces an adaptive approach within the Bayesian Max-EWMA control chart framework. Various Bayesian loss functions are used to jointly monitor deviations in the mean and variance of normally distributed processes. We propose a function-based adaptive method that selects self-adjusting weights, incorporated in the Bayesian Max-EWMA estimation of the mean and variance. This adaptive mechanism significantly enhances the effectiveness and sensitivity of the Max-EWMA chart in detecting shifts in both the mean and the dispersion. The Monte Carlo simulation technique was used to calculate run-length profiles for different parameter combinations. A comparative performance analysis with an existing chart demonstrates the proposed chart's effectiveness. A practical example from the hard-bake process in semiconductor manufacturing is presented to illustrate the chart's settings and performance. The empirical results showcase the superior performance of the adaptive Bayesian Max-EWMA control chart in identifying out-of-control signals. The chart's ability to jointly monitor the mean and variance of a process, its adaptive nature, and its Bayesian framework make it a useful and effective control chart.

9.
Sci Rep ; 14(1): 5604, 2024 03 07.
Article in English | MEDLINE | ID: mdl-38453950

ABSTRACT

Control charts are a statistical approach for monitoring cancer data that can help discover patterns, trends, and unusual deviations in cancer-related data over time. Control charts are extensively used in quality control and process management to detect deviations from expected patterns, and they can track numerous parameters in cancer data, such as incidence rates, death rates, survival time, recovery time, and other related indicators. This paper presents a composite dual exponentially weighted moving average cumulative sum (CDEC) control chart for monitoring censored recovery-time data of cancer patients. The approach seeks to detect changes in the mean recovery time of cancer patients, which usually follows Weibull lifetimes. The results are calculated using type I censored data under both known and estimated parameter conditions. We combine the conditional expected value (CEV) and conditional median (CM) approaches, which are widely used in statistical analysis to determine the central tendency of a dataset, to create an efficient control chart. The suggested chart's performance is assessed using the average run length (ARL), which evaluates how efficiently the chart can detect a change in the process mean. The CDEC chart is compared with existing control charts, and a simulation study and a real-world dataset of censored cancer-patient recovery times are used to illustrate the results. The proposed CDEC control chart is designed for monitoring when complete information about patients is not available: instead of discarding patient records, the chart can monitor patient information even when it is censored. The authors conclude that the suggested CDEC chart is more efficient than competing control charts for monitoring censored recovery-time data. Overall, this study introduces an efficient new approach for monitoring censored cancer-patient recovery-time data, which may have a significant effect on quality control and process improvement across a wide range of healthcare and medical studies.


Subject(s)
Ditiocarb/analogs & derivatives , Health Facilities , Neoplasms , Humans , Computer Simulation , Time , Quality Control , Neoplasms/diagnosis
10.
J Magn Reson Imaging ; 59(2): 522-532, 2024 Feb.
Article in English | MEDLINE | ID: mdl-37203257

ABSTRACT

BACKGROUND: Vertical run-length nonuniformity (VRLN) is a texture feature representing heterogeneity within native T1 images and reflects the extent of cardiac fibrosis. In uremic cardiomyopathy, interstitial fibrosis is the major histological alteration. The prognostic value of VRLN in patients with end-stage renal disease (ESRD) remains unclear. PURPOSE: To evaluate the prognostic value of VRLN MRI in patients with ESRD. STUDY TYPE: Prospective. POPULATION: A total of 127 ESRD patients (30 participants in the major adverse cardiac events, MACE, group). FIELD STRENGTH/SEQUENCE: 3.0 T/steady-state free precession sequence, modified Look-Locker imaging. ASSESSMENT: MRI image quality was assessed by three independent radiologists. VRLN values were measured in the myocardium on the mid-ventricular short-axis slice of T1 mapping. Left ventricular (LV) mass, LV end-diastolic and end-systolic volume, and LV global strain parameters were measured. STATISTICAL TESTS: The primary endpoint was the incidence of MACE from enrollment to January 2023. MACE is a composite endpoint consisting of all-cause mortality, acute myocardial infarction, stroke, heart failure hospitalization, and life-threatening arrhythmia. Cox proportional-hazards regression was performed to test whether VRLN independently correlated with MACE. Intraclass correlation coefficients of VRLN were calculated to evaluate intraobserver and interobserver reproducibility. The C-index was computed to examine the prognostic value of VRLN. P-values <0.05 were considered statistically significant. RESULTS: Participants were followed for a median of 26 months. VRLN, age, LV end-systolic volume index, and global longitudinal strain remained significantly associated with MACE in the multivariable model. Adding VRLN to a baseline model containing clinical and conventional cardiac MRI parameters significantly improved the accuracy of the predictive model (C-index: 0.781 for the baseline model vs. 0.814 for the model with VRLN added). DATA CONCLUSION: VRLN is a novel marker for risk stratification toward MACE in patients with ESRD, superior to native T1 mapping and LV ejection fraction. EVIDENCE LEVEL: 2. TECHNICAL EFFICACY: Stage 2.


Subject(s)
Cardiomyopathies , Kidney Failure, Chronic , Humans , Prognosis , Prospective Studies , Reproducibility of Results , Risk Factors , Magnetic Resonance Imaging , Ventricular Function, Left , Stroke Volume , Kidney Failure, Chronic/complications , Kidney Failure, Chronic/diagnostic imaging , Predictive Value of Tests , Magnetic Resonance Imaging, Cine/methods
11.
Stat Methods Med Res ; 32(12): 2299-2317, 2023 12.
Article in English | MEDLINE | ID: mdl-37881001

ABSTRACT

In recent years, with the increasing number and complexity of infectious diseases, the idea of using control charts to monitor public health and disease has been proposed. In this paper, we study multivariate control charts for monitoring a bivariate integer-valued autocorrelated process with a bivariate Poisson distribution and select the optimal control scheme by comparing the performance of the charts. Meningococcal patient events in two states of Australia serve as an example to illustrate the application of these methods. The results show that the D exponentially weighted moving average control scheme detects changes in the mean value faster, which is a significant advantage.


Subject(s)
Communicable Diseases , Meningococcal Infections , Humans , Poisson Distribution , Australia/epidemiology
12.
Comput Biol Med ; 165: 107439, 2023 10.
Article in English | MEDLINE | ID: mdl-37678135

ABSTRACT

DNA storage systems have begun to attract considerable attention as next-generation storage technologies due to their high density and longevity. However, efficient primer design for random access in synthesized DNA strands is still an open problem. Although previous studies have explored various constraints for primer design in DNA storage systems, no attention has been paid to combining weakly mutually uncorrelated codes with the maximum run length constraint. In this paper, we first propose a code design that combines weakly mutually uncorrelated codes with the maximum run length constraint. We also explore weakly mutually uncorrelated codes that satisfy the maximum run length constraint together with further constraints, such as being almost balanced and having large Hamming distance, which are also useful for random access in DNA storage systems. To guarantee that the proposed codes can be adapted to primer design with variable length, we present modified construction methods that achieve different code lengths. We then analyze the size of the proposed codes, which indicates their capacity to support primer design. Finally, we compare the codes with those of previous works to show that the proposed codes always guarantee the maximum run length constraint, which is helpful for random access in DNA storage.
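The maximum run length constraint itself is simple to state: no homopolymer run (e.g. `AAAA`) in a codeword may exceed a bound. A minimal checker can be sketched as follows (the function names and the bound `ell` are mine, not from the paper):

```python
def max_run_length(s):
    """Length of the longest run of identical consecutive symbols in a nonempty string."""
    best = cur = 1
    for a, b in zip(s, s[1:]):
        cur = cur + 1 if a == b else 1  # extend the current run or start a new one
        best = max(best, cur)
    return best

def satisfies_constraint(codeword, ell):
    """True if no homopolymer run in the DNA codeword exceeds ell symbols."""
    return max_run_length(codeword) <= ell
```

Long homopolymer runs are error-prone during DNA synthesis and sequencing, which is why code constructions filter candidate codewords through a check of this kind.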


Subject(s)
DNA , Salaries and Fringe Benefits
13.
Heliyon ; 9(6): e17602, 2023 Jun.
Article in English | MEDLINE | ID: mdl-37457815

ABSTRACT

Data stored on physical storage devices and transmitted over communication channels often contain a lot of redundant information, which can be reduced through compression techniques to conserve space and reduce transmission time. The need for adequate security measures, such as secret key control in specific techniques, raises concerns about data exposure to potential attacks. Encryption plays a vital role in safeguarding information and maintaining its confidentiality by utilizing a secret key to make the data unreadable and unalterable. The focus of this paper is to tackle the challenge of simultaneously compressing and encrypting data without affecting the efficacy of either process. The authors propose an efficient and secure compression method incorporating a secret key to accomplish this goal. Encoding input data involves scrambling it with a generated key and then transforming it through the Burrows-Wheeler Transform (BWT). Subsequently, the output from the BWT is compressed through both the Move-To-Front Transform and Run-Length Encoding. This method blends the cryptographic principles of confusion and diffusion into the compression process, enhancing its performance. The proposed technique is geared towards providing robust encryption and sufficient compression. Experimentation results show that it outperforms other techniques in terms of compression ratio. A security analysis shows that the technique is sensitive to both the secret key and the plaintext, as measured by the unicity distance. Additionally, the results of the proposed technique showed a significant improvement, with a compression ratio close to 90% after passing all the test text files.
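The key scrambling and BWT stages are specific to the paper, but the two back-end stages it names, Move-To-Front and Run-Length Encoding, can be sketched as follows (a minimal illustration; function names are mine):

```python
def move_to_front(data: bytes):
    """Move-To-Front transform: recently seen bytes map to small indices,
    so the locally repetitive output of a BWT becomes runs of small numbers."""
    table = list(range(256))
    out = []
    for b in data:
        i = table.index(b)
        out.append(i)
        table.pop(i)
        table.insert(0, b)  # move the symbol to the front of the table
    return out

def run_length_encode(seq):
    """Collapse consecutive repeats into (symbol, count) pairs."""
    if not seq:
        return []
    pairs, prev, count = [], seq[0], 1
    for s in seq[1:]:
        if s == prev:
            count += 1
        else:
            pairs.append((prev, count))
            prev, count = s, 1
    pairs.append((prev, count))
    return pairs
```

Chaining the two (`run_length_encode(move_to_front(bwt_output))`) is what compresses the long same-symbol runs that the BWT tends to produce.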

14.
ISA Trans ; 142: 335-346, 2023 Nov.
Article in English | MEDLINE | ID: mdl-37524624

ABSTRACT

Electrocardiogram (ECG) signals are commonly used to identify heart complications. These recordings generate large volumes of data that need to be stored or transferred in telemedicine applications, requiring substantial storage space and bandwidth. There is therefore strong motivation to develop efficient compression algorithms for ECG signals. In this context, this work proposes a novel compression algorithm using an adaptive tunable-Q wavelet transform (TQWT) and a modified dead-zone quantizer (DZQ). The parameters of the TQWT and the threshold values of the DZQ are selected using the proposed sparse grey wolf optimization (Sparse-GWO) algorithm. Sparse-GWO is proposed in this work to reduce the computation time of the original GWO, and it is also compared with popular algorithms such as the original GWO, particle swarm optimization (PSO), hybrid PSOGWO, and Sparse-PSO. The DZQ performs thresholding and quantization, and run-length encoding (RLE) is then used to encode the quantized coefficients. The proposed method is evaluated on the MIT-BIH arrhythmia database. Quality assessment of the reconstructed signals confirms the minimal impact of compression on the morphology of the reconstructed ECG signals. Compression performance is measured with the following evaluation metrics: percent root-mean-square difference (PRD1), compression ratio (CR), signal-to-noise ratio (SNR), and quality score (QS1); the obtained average values are 3.21%, 20.56, 30.62 dB, and 7.79, respectively.

15.
J Appl Stat ; 50(10): 2079-2107, 2023.
Article in English | MEDLINE | ID: mdl-37434629

ABSTRACT

In the present article, a double generally weighted moving average (DGWMA) control chart based on a three-parameter logarithmic transformation is proposed for monitoring process variability, namely the S2-DGWMA chart. Monte Carlo simulations are utilized to evaluate the run-length performance of the S2-DGWMA chart. In addition, a detailed comparative study is conducted to compare the performance of the S2-DGWMA chart with several well-known memory-type control charts in the literature. The comparisons indicate that the proposed chart is more efficient in detecting small shifts and more sensitive in identifying upward shifts in process variability. A real data example is given to present the implementation of the new S2-DGWMA chart.

16.
Biomed Phys Eng Express ; 9(4)2023 06 14.
Article in English | MEDLINE | ID: mdl-37279702

ABSTRACT

Background. In telecardiology, bio-signal acquisition, processing, and communication for clinical purposes occupy large storage and significant bandwidth over a communication channel. Electrocardiograph (ECG) compression with effective reproducibility is highly desired. In the present work, a compression technique for ECG signals with low distortion, using a non-decimated stationary wavelet transform with a run-length encoding scheme, is proposed. Method. A non-decimated stationary wavelet transform (NSWT) method has been developed to compress the ECG signals. The signal is subdivided into N levels with different thresholding values. Wavelet coefficients with values larger than the threshold are retained, and the remaining coefficients are suppressed. The biorthogonal (bior) wavelet is employed, as it improves the compression ratio as well as the percentage root-mean-square difference (PRD) compared with the existing method. After pre-processing, the coefficients are passed through a Savitzky-Golay filter to remove corrupted signals. The wavelet coefficients are then quantized using dead-zone quantization, which eliminates values close to zero. A run-length encoding (RLE) scheme is applied to encode these values, resulting in compressed ECG signals. Results. The presented methodology has been evaluated on the MITDB arrhythmia database, which contains 4800 ECG fragments from forty-eight clinical records. The proposed technique achieves an average compression ratio of 33.12, PRD of 1.99, NPRD of 2.53, and QS of 16.57, making it a promising approach for various applications. Conclusion. The proposed technique exhibits a high compression ratio and reduced distortion compared with the existing method.


Subject(s)
Data Compression , Wavelet Analysis , Algorithms , Data Compression/methods , Signal Processing, Computer-Assisted , Electrocardiography/methods
17.
J Appl Stat ; 50(7): 1477-1495, 2023.
Article in English | MEDLINE | ID: mdl-37197761

ABSTRACT

In competitive businesses such as insurance and telecommunications, customers can easily replace one provider with another, which leads to customer attrition. Keeping the customer attrition rate low is crucial for companies, since retaining a customer is more profitable than recruiting a new one. As a mainstream statistical process control (SPC) method, the CUSUM scheme is able to detect small, persistent shifts in customer attrition. However, customer attrition summaries are typically available on an uneven time scale (e.g. 4-week and 5-week 'business months'), which may not satisfy the assumptions of traditional CUSUM designs. This paper develops a latent CUSUM chart based on an exponential model for monitoring 'monthly' customer attrition under varying time scales. Both maximum likelihood and least squares methods are studied; the former mostly performs better, while the latter is advantageous for very small shifts. We apply a Markov chain algorithm to obtain the average run length (ARL), calibrate different combinations of parameters, and present reference tables of cutoffs. Three more complicated models are considered to test robustness against deviations from the initial model. Furthermore, a real example of monitoring monthly customer attrition from a Chinese insurance company is used to illustrate the scheme.
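The latent, exponential-model CUSUM of the paper is more involved, but the basic one-sided CUSUM recursion it builds on can be sketched as follows (the reference value `k` and decision interval `h` are illustrative assumptions, not the paper's calibrated cutoffs):

```python
def cusum_run_length(observations, k=0.5, h=4.0):
    """One-sided upper CUSUM on a sequence of (standardized) observations:
    C_t = max(0, C_{t-1} + (x_t - k)); signal at the first t with C_t > h."""
    c = 0.0
    for t, x in enumerate(observations, start=1):
        c = max(0.0, c + (x - k))  # accumulate evidence of an upward shift
        if c > h:
            return t
    return None  # no signal within the observed window
```

Because the recursion resets to zero whenever evidence falls away, small but persistent upward shifts accumulate until the decision interval `h` is crossed, which is exactly the detection behavior the abstract attributes to CUSUM schemes.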

18.
Entropy (Basel) ; 25(3)2023 Mar 02.
Article in English | MEDLINE | ID: mdl-36981333

ABSTRACT

The geometric first-order integer-valued autoregressive process (GINAR(1)) can be particularly useful for modeling relevant discrete-valued time series, namely in statistical process control. We resort to stochastic ordering to prove that the GINAR(1) process is a discrete-time Markov chain governed by a totally positive order 2 (TP2) transition matrix. Stochastic ordering is also used to compare transition matrices referring to pairs of GINAR(1) processes with different values of the marginal mean. We assess and illustrate the implications of these two stochastic ordering results, namely on the properties of the run length of geometric charts for monitoring GINAR(1) counts.

19.
Eur J Radiol Open ; 10: 100476, 2023.
Article in English | MEDLINE | ID: mdl-36793772

ABSTRACT

Purpose: To develop models based on radiomics and genomics for predicting histopathologic nuclear grade in localized clear cell renal cell carcinoma (ccRCC) and to assess whether macro-radiomics models can predict microscopic pathological changes. Method: In this multi-institutional retrospective study, a computed tomography (CT) radiomic model for nuclear grade prediction was developed. Using a genomics analysis cohort, nuclear grade-associated gene modules were identified, and a gene model was constructed based on the top 30 hub mRNAs to predict nuclear grade. Using a radiogenomic development cohort, biological pathways were enriched by hub genes and a radiogenomic map was created. Results: The four-feature SVM model predicted nuclear grade with an area under the curve (AUC) of 0.94 in the validation sets, while the five-gene model predicted nuclear grade with an AUC of 0.73 in the genomics analysis cohort. A total of five gene modules were identified as associated with nuclear grade. Radiomic features were associated with only 271 of the 603 genes in the five gene modules and with eight of the top 30 hub genes. Enrichment pathways differed between genes associated and not associated with radiomic features, the latter involving two genes of the five-gene signature in the mRNA model. Conclusion: The CT radiomics models exhibited higher predictive performance than the mRNA models. The association between radiomic features and nuclear grade-related mRNA is not universal.

20.
Stat Methods Med Res ; 32(4): 671-690, 2023 04.
Article in English | MEDLINE | ID: mdl-36788007

ABSTRACT

A useful tool that has gained popularity in quality control is the control chart, which monitors a process over time, identifies potential changes, characterizes variation, and ultimately improves the quality and performance of the process. This article introduces a new class of multivariate semiparametric control charts for monitoring multivariate mixed-type data, which comprise both continuous and discrete random variables (rvs). Our methodology leverages ideas from clustering and statistical process control to develop control charts for mixed-type data. We propose four control chart schemes based on modified versions of the KAy-means for MIxed LArge data (KAMILA) clustering algorithm, where we assume that the two existing clusters represent the reference and the test sample. The charts are semiparametric: the continuous rvs follow a distribution that belongs to the class of elliptical distributions, while the categorical rvs follow a multinomial distribution. We present the algorithmic procedures and study the characteristics of the new control charts. The performance of the proposed schemes is evaluated on the basis of the false alarm rate and the in-control average run length. Finally, we demonstrate the effectiveness and applicability of the proposed methods using real-world data.


Subject(s)
Algorithms