Results 1-20 of 798
1.
Article in English | MEDLINE | ID: mdl-38981068

ABSTRACT

Glass ceramic (GC) is the most promising material for objective lenses in extreme ultraviolet lithography, which must meet subnanometer precision requirements characterized by low values of high spatial frequency surface roughness (HSFR). However, the HSFR of GC typically degrades during ion beam figuring (IBF). Herein, a method for constructing molecular dynamics (MD) models of GC is presented, and the formation mechanisms of surface morphologies are investigated. The results indicated that the dot-like microstructure arises from the difference in erosion rate caused by the differing intrinsic properties of the ceramic phases (CPs) and glass phases (GPs). Further, the difference in the microstructure of the IBF surface under different beam angles was mainly caused by the difference between the two types of sputtering. Quantum mechanical calculations showed that the presence of interstitial atoms results in electron rearrangement and that electron localization can reduce CP stability. To obtain a homogeneous surface, the effects of beam parameters on the heterogeneous surface were systematically investigated based on the proposed MD model. A novel ion beam modification (IBM) method was then proposed and verified by TEM and GIXRD. The range of ion beam smoothing parameters that effectively converges the HSFR of the modified surface was determined through extensive experiments. Using the optimized beam parameters, an ultrathin homogeneous modified surface layer within 3 nm was obtained. The HSFR of GC smoothed by ion beam modification-assisted smoothing (IBMS) dropped from 0.348 to 0.090 nm, a 74% reduction. These results offer a deeper understanding of the morphology formation mechanisms of GC surfaces during ion beam processing and point to a new approach for achieving ultrasmooth heterostructure surfaces down to the subnanometer scale.

2.
PeerJ ; 12: e17408, 2024.
Article in English | MEDLINE | ID: mdl-38948203

ABSTRACT

Background: Over the last few decades, diabetes-related mortality risks (DRMR) have increased in Florida. Although there is evidence of geographic disparities in pre-diabetes and diabetes prevalence, little is known about disparities of DRMR in Florida. Understanding these disparities is important for guiding control programs and allocating health resources to the communities most in need. Therefore, the objective of this study was to investigate geographic disparities and temporal changes of DRMR in Florida. Methods: Retrospective mortality data for deaths that occurred from 2010 to 2019 were obtained from the Florida Department of Health. International Classification of Diseases, 10th Revision (ICD-10) codes E10-E14 were used to identify diabetes-related deaths. County-level mortality risks were computed and presented as the number of deaths per 100,000 persons. Spatial Empirical Bayesian (SEB) smoothing was performed to adjust for spatial autocorrelation and the small number problem. High-risk spatial clusters of DRMR were identified using Tango's flexible spatial scan statistic. Geographic distribution and high-risk mortality clusters were displayed using ArcGIS, whereas seasonal patterns were visualized in Excel. Results: A total of 54,684 deaths were reported during the study period. There was an increasing temporal trend as well as a seasonal pattern in diabetes mortality risks, with high risks occurring during the winter. The highest mortality risk (8.1 per 100,000 persons) was recorded during the winter of 2018, while the lowest (6.1 per 100,000 persons) was in the fall of 2010. County-level SEB-smoothed mortality risks varied by geographic location, ranging from 12.6 to 81.1 deaths per 100,000 persons. Counties in the northern and central parts of the state tended to have high mortality risks, whereas southern counties consistently showed low mortality risks. Consistent with the geographic distribution of DRMR, significant high-risk spatial clusters were also identified in the central and northern parts of Florida. Conclusion: Geographic disparities of DRMR exist in Florida, with high-risk spatial clusters observed in rural central and northern areas of the state. There is also evidence of both increasing temporal trends and winter peaks of DRMR. These findings are helpful for guiding the allocation of resources to control the disease, reduce disparities, and improve population health.
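As a rough illustration of the empirical Bayes idea behind SEB smoothing, the Python sketch below shrinks each county's raw rate toward the overall rate in proportion to the uncertainty implied by its population. This is the global (non-spatial) variant only; the study's spatial version shrinks toward a neighbourhood rate instead, and all names and numbers here are illustrative.

```python
import numpy as np

def eb_smooth_rates(deaths, population, per=100_000):
    """Empirical Bayes smoothing of small-area rates: each county's raw
    rate is shrunk toward the global mean, with shrinkage strongest for
    counties whose small populations make raw rates unstable."""
    raw = deaths / population                      # raw county rates
    theta = deaths.sum() / population.sum()        # global mean rate
    # method-of-moments estimate of between-county variance (floored at 0)
    s2 = np.average((raw - theta) ** 2, weights=population)
    prior_var = max(s2 - theta / np.mean(population), 0.0)
    # shrinkage weight: large counties mostly keep their raw rate
    w = prior_var / (prior_var + theta / population)
    return (w * raw + (1 - w) * theta) * per

deaths = np.array([12, 150, 3, 800])
pop = np.array([20_000, 350_000, 5_000, 1_200_000])
print(eb_smooth_rates(deaths, pop))   # smoothed deaths per 100,000
```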


Subjects
Diabetes Mellitus, Humans, Florida/epidemiology, Retrospective Studies, Diabetes Mellitus/mortality, Diabetes Mellitus/epidemiology, Female, Male, Bayes Theorem, Health Status Disparities, Middle Aged, Risk Factors, Seasons, Aged, Adult
3.
Sci Total Environ ; 943: 173748, 2024 Sep 15.
Article in English | MEDLINE | ID: mdl-38857793

ABSTRACT

In many coastal cities around the world, continuing water quality degradation threatens the living environment of humans and aquatic organisms. To assess and control water pollution, this study estimated the Biochemical Oxygen Demand (BOD) concentration of Hong Kong's marine waters using remote sensing and an improved machine learning (ML) method. The scheme was derived from four ML algorithms (RBF, SVR, RF, and XGB) and calibrated using a large amount (N > 1000) of in-situ BOD5 data. Based on labeled datasets with different preprocessing, i.e., the original BOD5, log10(BOD5), and label distribution smoothing (LDS), three types of models were trained and evaluated. The results highlight the superior potential of the LDS-based model to improve BOD5 estimates by handling the imbalanced training dataset. Additionally, XGB and RF outperformed RBF and SVR when the model was developed using log10(BOD5) or LDS(BOD5). Over two decades, the autumn (Sep. to Nov.) BOD5 concentration of Hong Kong marine waters shows a downward trend, with significant decreases in Deep Bay, Western Buffer, Victoria Harbour, Eastern Buffer, Junk Bay, Port Shelter, and the Tolo Harbour and Channel. Principal component analysis revealed that nutrient levels were the predominant factor in Victoria Harbour and the interior of Deep Bay, while chlorophyll-related and physical parameters dominated in Southern, Mirs Bay, Northwestern, and the outlet of Deep Bay. LDS provides a new perspective for improving ML-based water quality estimation by alleviating imbalance in the labeled dataset. Overall, remotely sensed BOD5 can offer insight into the spatial-temporal distribution of organic matter in Hong Kong coastal waters and valuable guidance for pollution control.
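The label distribution smoothing step can be sketched as follows: the empirical label histogram is convolved with a Gaussian kernel to estimate an effective label density, and each sample is weighted by the inverse of that density during training. This is a generic LDS sketch with made-up data, not the paper's calibrated pipeline.

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d

def lds_weights(labels, n_bins=50, sigma=2.0):
    """Label distribution smoothing: convolve the empirical label
    histogram with a Gaussian kernel to estimate the effective label
    density, then weight each sample by the inverse of that density."""
    hist, edges = np.histogram(labels, bins=n_bins)
    smoothed = gaussian_filter1d(hist.astype(float), sigma=sigma)
    # map each label back to its bin's smoothed density
    idx = np.clip(np.digitize(labels, edges[1:-1]), 0, n_bins - 1)
    density = smoothed[idx]
    w = 1.0 / np.sqrt(density + 1e-8)   # sqrt-inverse is a common choice
    return w / w.mean()                 # normalise to mean 1

bod5 = np.random.lognormal(mean=0.5, sigma=0.6, size=1000)  # synthetic
weights = lds_weights(bod5)  # pass as sample_weight to, e.g., XGBoost
```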


Subjects
Environmental Monitoring, Machine Learning, Seawater, Hong Kong, Environmental Monitoring/methods, Seawater/chemistry, Remote Sensing Technology, Biological Oxygen Demand Analysis, Water Pollution/statistics & numerical data, Water Pollution/analysis, Water Pollutants, Chemical/analysis
4.
Brief Bioinform ; 25(4)2024 May 23.
Article in English | MEDLINE | ID: mdl-38877886

ABSTRACT

Single-cell sequencing has revolutionized our ability to dissect the heterogeneity within tumor populations. In this study, we present LoRA-TV (Low Rank Approximation with Total Variation), a novel method for clustering tumor cells based on the read depth profiles derived from single-cell sequencing data. Traditional analysis pipelines process read depth profiles of each cell individually. By aggregating shared genomic signatures distributed among individual cells using low-rank optimization and robust smoothing, the proposed method enhances clustering performance. Results from analyses of both simulated and real data demonstrate its effectiveness compared with state-of-the-art alternatives, as supported by improvements in the adjusted Rand index and computational efficiency.
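One way to picture the combination of low-rank aggregation and robust smoothing is the toy alternation below: a truncated SVD keeps the genomic signatures shared across cells, and a total-variation subgradient step flattens within-cell noise while preserving segment boundaries. This alternation and all parameters are illustrative assumptions, not LoRA-TV's published algorithm.

```python
import numpy as np

def low_rank_tv(X, rank=5, tv_weight=1.0, n_iter=50):
    """Toy sketch: alternate a truncated-SVD low-rank projection of the
    cell-by-bin read-depth matrix with a row-wise total-variation
    subgradient step, reinforcing shared copy-number segments across
    cells while suppressing per-cell noise."""
    Z = X.astype(float).copy()
    for _ in range(n_iter):
        # low-rank step: keep the leading shared genomic signatures
        U, s, Vt = np.linalg.svd(Z, full_matrices=False)
        Z = (U[:, :rank] * s[:rank]) @ Vt[:rank]
        # TV step: one subgradient step on sum_j |z[i,j+1] - z[i,j]|
        d = np.sign(np.diff(Z, axis=1))
        grad = np.zeros_like(Z)
        grad[:, :-1] -= d
        grad[:, 1:] += d
        Z -= 0.01 * tv_weight * grad
    return Z

X = np.random.poisson(10, size=(100, 500))   # cells x genomic bins
profiles = low_rank_tv(X)  # smoothed profiles, then cluster (e.g., k-means)
```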


Subjects
Neoplasms, Single-Cell Analysis, Single-Cell Analysis/methods, Humans, Neoplasms/genetics, Neoplasms/pathology, Cluster Analysis, Algorithms, Computational Biology/methods, High-Throughput Nucleotide Sequencing/methods, Genomics/methods
5.
Sensors (Basel) ; 24(11)2024 May 23.
Article in English | MEDLINE | ID: mdl-38894126

ABSTRACT

Prefabricated construction has pioneered a new model in the construction industry, in which prefabricated component modules are produced in factories and assembled on-site by construction workers, resulting in a highly efficient and convenient production process. Within the construction industry value chain, the smoothing and roughening of precast concrete components are critical processes. Currently, these tasks are predominantly performed manually, often failing to achieve the desired level of precision. This paper presents the design and development of a robotic system for smoothing and roughening precast concrete surfaces, along with a multi-degree-of-freedom integrated intelligent end-effector for both operations. Point-to-point path planning methods are employed to achieve comprehensive path planning for both smoothing and roughening, enhancing the diversity of textural patterns using B-spline curves. In the presence of embedded obstacles, a biologically inspired neural network method is introduced for precise smoothing operation planning, and the A* algorithm is incorporated to enable the robot's escape from dead zones. Experimental validation confirms the feasibility of the entire system and the accuracy of the machining path planning methods. The experimental results demonstrate that the proposed system meets the precision requirements for smoothing and offers diversity in roughening, affirming its practicality in the precast concrete process and expanding the automation level and application scenarios of robots in prefabricated construction.
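A minimal sketch of the B-spline ingredient, assuming SciPy and made-up waypoints: a cubic spline is fitted through point-to-point waypoints with a small smoothing factor, then densely sampled to obtain a low-curvature executable path.

```python
import numpy as np
from scipy.interpolate import splprep, splev

# Illustrative point-to-point waypoints of one machining pass (metres);
# these coordinates are made up, not taken from the paper.
waypoints = [np.array([0.0, 0.1, 0.4, 0.8, 1.2, 1.5]),
             np.array([0.0, 0.3, 0.5, 0.4, 0.6, 0.9])]

# Fit a cubic B-spline; s > 0 trades exact interpolation for a
# smoother, lower-curvature path.
tck, u = splprep(waypoints, k=3, s=0.01)

# Densely sample the spline for the executable tool path, and take the
# first derivative for the feed direction along the path.
u_fine = np.linspace(0, 1, 200)
x, y = splev(u_fine, tck)
dx, dy = splev(u_fine, tck, der=1)
```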

6.
J Appl Stat ; 51(7): 1287-1317, 2024.
Article in English | MEDLINE | ID: mdl-38835826

ABSTRACT

The area of functional principal component analysis (FPCA) has seen relatively few contributions from Bayesian inference. A Bayesian method for FPCA is developed for both continuous and binary observations with sparse and irregularly spaced data. In the proposed Markov chain Monte Carlo (MCMC) method, a Gibbs sampling approach is adopted to update the different variables based on their conditional posterior distributions. In the FPCA, a set of eigenfunctions is proposed on the Stiefel manifold, and samples are drawn from a Langevin-Bingham matrix variate distribution. Penalized splines are used to model the mean trajectory and eigenfunction trajectories in generalized functional mixed models, and the proposed model is cast into a mixed-effects model framework for Bayesian inference. To determine the number of principal components, a reversible jump Markov chain Monte Carlo (RJ-MCMC) algorithm is implemented. Four different simulation settings demonstrate competitive performance against non-Bayesian approaches to FPCA. Finally, the proposed method is illustrated through an analysis of body mass index (BMI) data by gender and ethnicity.

7.
PeerJ Comput Sci ; 10: e2011, 2024.
Article in English | MEDLINE | ID: mdl-38855226

ABSTRACT

Mixup is an effective data augmentation method that generates new augmented samples from linear combinations of different original samples. However, if there is noise or there are aberrant features in the original samples, mixup may propagate them to the augmented samples, making the model over-sensitive to these outliers. To solve this problem, this paper proposes a new mixup method called AMPLIFY. This method uses the attention mechanism of the Transformer itself to reduce the influence of noise and aberrant values in the original samples on the prediction results, without adding trainable parameters and at very low computational cost, thereby avoiding the high resource consumption of common mixup methods such as Sentence Mixup. The experimental results show that, at a smaller computational cost, AMPLIFY outperforms other mixup methods in text classification tasks on seven benchmark datasets, suggesting new ways to further improve the performance of pre-trained models based on the attention mechanism, such as BERT, ALBERT, RoBERTa, and GPT. Our code can be obtained at https://github.com/kiwi-lilo/AMPLIFY.
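For orientation, the baseline idea AMPLIFY builds on — mixup applied to hidden representations — can be sketched as below; the paper's attention-based reweighting is not reproduced here, and all shapes are illustrative.

```python
import torch

def mixup_hidden(h, y, alpha=0.4):
    """Generic mixup on hidden representations: convexly combine each
    sample's hidden state and label with a randomly shuffled partner's.
    (Baseline idea only; AMPLIFY's attention-based variant differs.)"""
    lam = torch.distributions.Beta(alpha, alpha).sample().item()
    perm = torch.randperm(h.size(0))
    h_mix = lam * h + (1 - lam) * h[perm]
    y_mix = lam * y + (1 - lam) * y[perm]
    return h_mix, y_mix

h = torch.randn(32, 128, 768)   # [batch, seq_len, hidden]
y = torch.nn.functional.one_hot(torch.randint(0, 2, (32,)), 2).float()
h_mix, y_mix = mixup_hidden(h, y)
```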

8.
Stat Med ; 2024 Jun 16.
Article in English | MEDLINE | ID: mdl-38881189

ABSTRACT

In health and clinical research, medical indices (eg, BMI) are commonly used for monitoring and/or predicting health outcomes of interest. While single-index modeling can be used to construct such indices, methods for using single-index models to analyze longitudinal data with multiple correlated binary responses are underdeveloped, even though applications with such data are abundant (eg, prediction of multiple medical conditions based on longitudinally observed disease risk factors). This article aims to fill the gap by proposing a generalized single-index model that can incorporate multiple single indices and mixed effects for describing observed longitudinal data of multiple binary responses. Compared to existing methods that construct marginal models for each response, the proposed method can exploit the correlation among responses in the observed data when estimating the different single indices for predicting the response variables. Estimation of the proposed model is achieved by a local linear kernel smoothing procedure, together with methods designed specifically for estimating single-index models and traditional methods for estimating generalized linear mixed models. Numerical studies show that the proposed method is effective in the various cases considered. It is also demonstrated using a dataset from the English Longitudinal Study of Ageing.
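The local linear kernel smoothing step can be illustrated in isolation: at each evaluation point, a weighted least-squares line is fitted with Gaussian kernel weights, and its intercept is the fitted value. This sketch covers only the smoother, not the full single-index estimation procedure.

```python
import numpy as np

def local_linear(x0, x, y, h=0.5):
    """Local linear kernel smoother: at each evaluation point, fit a
    weighted least-squares line with Gaussian kernel weights and return
    its intercept (the fitted value at that point)."""
    est = np.empty_like(x0, dtype=float)
    for i, xi in enumerate(x0):
        w = np.exp(-0.5 * ((x - xi) / h) ** 2)        # kernel weights
        X = np.column_stack([np.ones_like(x), x - xi])
        W = np.diag(w)
        beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
        est[i] = beta[0]
    return est

x = np.sort(np.random.uniform(0, 10, 200))
y = np.sin(x) + np.random.normal(0, 0.3, x.size)
grid = np.linspace(0, 10, 100)
smooth = local_linear(grid, x, y)
```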

9.
Materials (Basel) ; 17(12)2024 Jun 18.
Article in English | MEDLINE | ID: mdl-38930346

ABSTRACT

Pinch milling is a new technique for machining slender, long blades that can simultaneously improve machining quality and efficiency. However, two-cutter orientation planning is a major challenge due to the irregular blade surfaces and the structural constraints of nine-axis machine tools. In this paper, a twin-tool smooth orientation determination method is proposed for thin-walled blades in pinch milling. Considering the processing status of the two cutters and the workpiece, the feasible domain of the twin-tool axis vectors and its characterization method are defined. An evaluation algorithm combining global and local optimization is proposed, and a smoothing algorithm is explored within the feasible domain along the two tool paths. Finally, a set of smoothly aligned tool orientations is generated, and the overall smoothness is nearly globally optimized. A preliminary simulation of the proposed algorithm on a turbine blade model shows that the planned tool orientations are stable, smooth, and well formed, avoiding collision interference and ultimately improving the machining accuracy of blades made of difficult-to-machine materials.

10.
Sensors (Basel) ; 24(12)2024 Jun 18.
Article in English | MEDLINE | ID: mdl-38931718

ABSTRACT

In dynamic environments, real-time trajectory planners are required to generate smooth trajectories. However, trajectory planners based on real-time sampling often produce jerky trajectories that necessitate post-processing steps for smoothing. Existing local smoothing methods may result in trajectories that collide with obstacles due to the lack of a direct connection between the smoothing process and trajectory optimization. To address this limitation, this paper proposes a novel trajectory-smoothing method that considers obstacle constraints in real time. By introducing virtual attractive forces from original trajectory points and virtual repulsive forces from obstacles, the resultant force guides the generation of smooth trajectories. This approach enables parallel execution with the trajectory-planning process and requires low computational overhead. Experimental validation in different scenarios demonstrates that the proposed method not only achieves real-time trajectory smoothing but also effectively avoids obstacles.
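A minimal sketch of the virtual-force idea, with made-up gains and geometry: each interior point is pulled toward its original position, pushed away from obstacles inside an influence radius, and nudged toward the midpoint of its neighbours. The force laws here are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def force_smooth(path, obstacles, n_iter=200, k_att=1.0, k_rep=0.5,
                 r_inf=1.0, step=0.01):
    """Iteratively move each interior trajectory point under a virtual
    attractive force toward its original position, repulsive forces
    away from nearby obstacles, and a neighbour-averaging term that
    smooths the result. Endpoints stay fixed."""
    p = path.copy()
    for _ in range(n_iter):
        f = k_att * (path - p)                  # pull toward original point
        for obs in obstacles:
            d = p - obs
            dist = np.linalg.norm(d, axis=1, keepdims=True)
            near = dist < r_inf                 # only nearby obstacles repel
            f += np.where(near, k_rep * d / (dist ** 3 + 1e-9), 0.0)
        # smoothing term: move toward the midpoint of the two neighbours
        f[1:-1] += 0.5 * (p[:-2] + p[2:] - 2 * p[1:-1])
        p[1:-1] += step * f[1:-1]
    return p

path = np.column_stack([np.linspace(0, 5, 50),
                        np.random.normal(0, 0.2, 50)])   # jerky input
obstacles = [np.array([2.5, 0.5])]
smoothed = force_smooth(path, obstacles)
```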

11.
Comput Biol Chem ; 111: 108094, 2024 Aug.
Article in English | MEDLINE | ID: mdl-38781748

ABSTRACT

DNA methylation is an important epigenetic modification involved in gene regulation. Advances in next-generation sequencing technology have enabled the retrieval of DNA methylation information at single-base resolution. However, due to the sequencing process and the limited amount of isolated DNA, DNA methylation data are often noisy and sparse, which complicates the identification of differentially methylated regions (DMRs), especially when few replicates are available. We present a varying-coefficient model for detecting DMRs from single-base-resolved methylation information. The model simultaneously smooths the methylation profiles and allows detection of DMRs, while accounting for additional covariates. The proposed model accounts for possible overdispersion by using a beta-binomial distribution. The overdispersion itself can be modeled as a function of the genomic region and explanatory variables. We illustrate the properties of the proposed model by applying it to two real-life case studies.
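The beta-binomial building block can be written down directly; the sketch below evaluates the log-likelihood of k methylated reads out of n at one site, parameterised by a mean methylation level and an overdispersion parameter. The varying-coefficient structure (smoothing the mean and overdispersion along the genome) is not reproduced here.

```python
from scipy.special import betaln, gammaln

def betabinom_logpmf(k, n, mu, phi):
    """Beta-binomial log-likelihood for k methylated reads out of n
    total reads at a CpG site, parameterised by mean methylation level
    mu in (0, 1) and overdispersion phi > 0 (larger phi = more
    overdispersion relative to a plain binomial)."""
    a, b = mu / phi, (1 - mu) / phi
    return (gammaln(n + 1) - gammaln(k + 1) - gammaln(n - k + 1)
            + betaln(k + a, n - k + b) - betaln(a, b))

# toy site: 14 methylated reads out of 20, mean level 0.6
print(betabinom_logpmf(k=14, n=20, mu=0.6, phi=0.1))
```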


Subjects
DNA Methylation, Sequence Analysis, DNA, Humans, Sequence Analysis, DNA/methods, High-Throughput Nucleotide Sequencing
12.
Appl Radiat Isot ; 210: 111361, 2024 Aug.
Article in English | MEDLINE | ID: mdl-38815446

ABSTRACT

In nuclear spectrum analysis, spectrum smoothing can remove statistical fluctuations in the spectrum, which benefits peak detection and peak area calculation. In this work, a spectrum smoothing algorithm is proposed based on a digital Sallen-Key filter with four parameters (m, n, k, D). The amplitude-frequency response curve of the Sallen-Key filter is derived and its filtering performance is analyzed. The effects of the four parameters on the shape of the smoothed spectrum are explored: D affects the counts and peak areas of the spectrum, and the peak area can be corrected by the peak area correction function S'. The parameters m, n, and k affect the peak position after smoothing, shifting it to the right; the peak position correction function P' can be used to correct it. When n > 2, the spectrum data become negative after smoothing, and when k > 2, the smoothed spectrum broadening exceeds 20%. Smoothness (R), noise smoothing factor (NSF), spectrum count ratio before and after smoothing (PER), and a comprehensive evaluation factor (Q) are used to evaluate the smoothing effect of the algorithm. The parameters of the algorithm are optimally selected: for the gamma spectra of 137Cs and 60Co, the optimal parameters are m = 1.5, n = 2, k = 2, D = 1; for the characteristic X-ray spectra of Fe and a quasi-geological sample (Ti, Mn, Fe, Ni, Cu, Zn), the optimal parameters are m = 1.1, n = 1.1, k = 1.3, D = 1. The gamma spectrum of 137Cs is smoothed and denoised using the Sallen-Key smoothing method, the Fourier transform method, the Gaussian function method, the wavelet transform method, the center-of-gravity method, and the least squares method. The results show that the Sallen-Key method has the best spectrum denoising effect (R = 0.6056) and comprehensive performance (Q = 0.6104), and can be further applied to the smoothing of nuclear spectrum data.
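For intuition, a second-order low-pass filter applied to noisy channel counts is sketched below, assuming SciPy and synthetic data. A Butterworth design stands in for the paper's digital Sallen-Key filter and its (m, n, k, D) parameters, and zero-phase filtering sidesteps the peak-position shift that the paper corrects with P'.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def smooth_spectrum(counts, cutoff=0.05):
    """Smooth channel counts with a second-order low-pass filter.
    The Sallen-Key circuit realises a second-order low-pass response;
    a Butterworth design stands in here, and filtfilt's zero-phase
    filtering avoids shifting the peak position."""
    b, a = butter(N=2, Wn=cutoff)   # cutoff as a fraction of Nyquist
    return filtfilt(b, a, counts)

channels = np.arange(1024)
peak = 400 * np.exp(-0.5 * ((channels - 512) / 15) ** 2)  # 137Cs-like peak
noisy = np.random.poisson(peak + 20).astype(float)        # Poisson noise
smoothed = smooth_spectrum(noisy)
```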

13.
Health Syst (Basingstoke) ; 13(2): 133-149, 2024.
Article in English | MEDLINE | ID: mdl-38800601

ABSTRACT

An accurate forecast of Emergency Department (ED) arrivals by hour of the day is critical to meeting patient demand. It enables planners to match ED staffing to the number of arrivals, redeploy staff, and reconfigure units. In this study, we develop a model based on Generalised Additive Models and an advanced dynamic model based on exponential smoothing to generate hourly probabilistic forecasts of ED arrivals for a prediction window of 48 hours. We compare the forecast accuracy of these models against appropriate benchmarks, including TBATS, Poisson regression, Prophet, and a simple empirical distribution. We use Root Mean Squared Error to examine point forecast accuracy and assess forecast distribution accuracy using Quantile Bias, the Pinball Score, and the Pinball Skill Score. Our results indicate that the proposed models outperform their benchmarks. Our models can also be generalised to other services, such as hospitals, ambulances, or clinical desk services.
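A plain exponential-smoothing core of such a forecaster might look like the sketch below, assuming statsmodels and synthetic arrivals with a 24-hour cycle; the paper's dynamic model and probabilistic outputs are more elaborate.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.holtwinters import ExponentialSmoothing

# synthetic hourly ED arrivals with a daily cycle (illustrative only)
rng = pd.date_range("2024-01-01", periods=24 * 60, freq="h")
lam = 8 + 4 * np.sin(2 * np.pi * rng.hour / 24)
arrivals = pd.Series(np.random.poisson(lam), index=rng)

# additive Holt-Winters with a 24-hour season: the basic
# exponential-smoothing core behind the paper's dynamic model
model = ExponentialSmoothing(arrivals, trend="add",
                             seasonal="add", seasonal_periods=24).fit()
forecast = model.forecast(48)    # 48-hour prediction window
```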

14.
Sci Rep ; 14(1): 11865, 2024 05 24.
Article in English | MEDLINE | ID: mdl-38789592

ABSTRACT

Chest X-ray (CXR) is an extensively utilized radiological modality for supporting the diagnosis of chest diseases. However, existing research approaches suffer from limitations in effectively integrating multi-scale CXR image features and are also hindered by imbalanced datasets. Therefore, there is a pressing need for further advancement in computer-aided diagnosis (CAD) of thoracic diseases. To tackle these challenges, we propose a multi-branch residual attention network (MBRANet) for thoracic disease diagnosis. MBRANet comprises three components. First, to address the inadequate extraction of spatial and positional information by convolutional layers, a novel residual structure incorporating a coordinate attention (CA) module is proposed to extract features at multiple scales. Second, based on the concept of a Feature Pyramid Network (FPN), we perform multi-scale feature fusion. Third, we propose a novel Multi-Branch Feature Classifier (MFC) that leverages the class-specific residual attention (CSRA) module for classification instead of relying solely on a fully connected layer. In addition, the designed BCEWithLabelSmoothing loss function improves generalization and mitigates class imbalance by introducing a smoothing factor. We evaluated MBRANet on the ChestX-Ray14, CheXpert, MIMIC-CXR, and IU X-Ray datasets and achieved average AUCs of 0.841, 0.895, 0.805, and 0.745, respectively. Our method outperformed state-of-the-art baselines on these benchmark datasets.
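The BCEWithLabelSmoothing idea can be sketched directly in PyTorch: hard 0/1 multi-label targets are relaxed toward 0.5 by a smoothing factor before the binary cross-entropy is computed. The paper's exact formulation may differ; eps and shapes here are illustrative.

```python
import torch
import torch.nn.functional as F

def bce_with_label_smoothing(logits, targets, eps=0.1):
    """Binary cross-entropy with label smoothing for multi-label CXR
    classification: hard 0/1 targets are relaxed toward 0.5 by eps,
    which regularises the model and softens the penalty on rare
    positive labels."""
    smoothed = targets * (1 - eps) + 0.5 * eps
    return F.binary_cross_entropy_with_logits(logits, smoothed)

logits = torch.randn(8, 14)                    # 14 thoracic disease labels
targets = torch.randint(0, 2, (8, 14)).float()
loss = bce_with_label_smoothing(logits, targets)
```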


Subjects
Radiography, Thoracic, Humans, Radiography, Thoracic/methods, Neural Networks, Computer, Thoracic Diseases/diagnostic imaging, Thoracic Diseases/diagnosis, Algorithms, Diagnosis, Computer-Assisted/methods
15.
Appl Spectrosc ; : 37028241248200, 2024 May 02.
Article in English | MEDLINE | ID: mdl-38695133

ABSTRACT

Smoothing is a widely used approach for reducing measurement noise in spectral analysis. However, it suffers from signal distortion caused by peak suppression. A locally self-adjustive smoothing method is developed that retains sharp peaks and yields less distorted signals. The proposed method uses only one parameter, which determines the global smoothness of the data, while balancing the local smoothness using the data itself. Simulations and real experiments, in comparison with existing convolution-based smoothing methods, indicate qualitatively and quantitatively improved noise reduction in practical scenarios. We also discuss parameter selection and demonstrate an application to the automated smoothing and detection of a given number of peaks from noisy measurement data.
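The abstract does not specify the estimator, but a closely related one-parameter smoother is the Whittaker smoother, sketched below: a single lambda sets global smoothness, whereas the paper's method additionally adapts smoothness locally from the data.

```python
import numpy as np
from scipy.sparse import eye, diags
from scipy.sparse.linalg import spsolve

def whittaker(y, lam=100.0):
    """Whittaker smoother: minimise ||y - z||^2 + lam * ||D2 z||^2,
    where D2 is the second-difference operator. One parameter lam
    controls global smoothness (the paper's method also adapts
    smoothness locally, which is not reproduced here)."""
    n = y.size
    D = diags([1.0, -2.0, 1.0], [0, 1, 2], shape=(n - 2, n))
    A = eye(n) + lam * (D.T @ D)
    return spsolve(A.tocsc(), y)

x = np.linspace(0, 10, 500)
signal = np.exp(-0.5 * ((x - 5) / 0.1) ** 2)          # sharp peak
noisy = signal + np.random.normal(0, 0.05, x.size)
smoothed = whittaker(noisy, lam=10.0)
```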

16.
Heliyon ; 10(7): e28199, 2024 Apr 15.
Article in English | MEDLINE | ID: mdl-38571638

ABSTRACT

In recent times, many investigators have delved into plastic waste (PW) research, both locally and internationally. Many of these studies have focused on problems related to land-based and marine-based PW management and its attendant impact on public health and the ecosystem. Hitherto, there have been few or no studies on forecasting PW quantities in developing countries (DCs). The key objective of this study is to forecast PW generation in the City of Johannesburg (CoJ), South Africa over the next three decades. The forecasts were based on historical data obtained from Statistics South Africa (StatsSA). For effective prediction and comparison, three time series models were employed: exponential smoothing (ETS), an Artificial Neural Network (ANN), and Gaussian Process Regression (GPR). The exponential-kernel GPR model performed best on overall plastic prediction, with a coefficient of determination (R2) of 0.96; however, for individual PW estimation, ANN was better, with an overall R2 of 0.93. From the results, the total PW generated in CoJ between 2021 and 2050 is forecast to be around 6.7 megatonnes, an average of 0.22 megatonnes/year. In addition, the estimated plastic composition is 17,910 tonnes of PS per year; 13,433 tonnes of PP per year; 59,440 tonnes of HDPE per year; 4,478 tonnes of PVC per year; 85,074 tonnes of PET per year; 34,590 tonnes of LDPE per year; and 8,955 tonnes of other PW per year.
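The GPR component can be sketched with scikit-learn, where a Matern kernel with nu = 0.5 is the exponential kernel; the series below is synthetic, not the StatsSA data.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern, WhiteKernel

# illustrative annual PW tonnages (megatonnes), not the StatsSA series
years = np.arange(1997, 2021).reshape(-1, 1)
pw = 0.12 + 0.004 * (years.ravel() - 1997) + np.random.normal(0, 0.01, 24)

# Matern with nu = 0.5 is the exponential kernel; WhiteKernel models
# observation noise
kernel = Matern(length_scale=10.0, nu=0.5) + WhiteKernel(1e-4)
gpr = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(years, pw)

future = np.arange(2021, 2051).reshape(-1, 1)
mean, std = gpr.predict(future, return_std=True)   # forecast to 2050
```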

17.
J Appl Stat ; 51(6): 1041-1056, 2024.
Article in English | MEDLINE | ID: mdl-38628452

ABSTRACT

Traffic pattern identification and accident evaluation are essential for improving traffic planning, road safety, and traffic management. In this paper, we establish classification and regression models to characterize the relationship between traffic flows and different time points, and we identify different traffic flow patterns using a negative binomial model with smoothing splines. The model provides mean response curves and Bayesian credible bands for traffic flows, a single index, and the log-likelihood difference for traffic flow pattern recognition. We further propose an impact measure for evaluating the influence of accidents on traffic flows based on the fitted negative binomial model. The proposed method has been successfully applied to real-world traffic flows and can be used to improve traffic management.
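A simplified version of the count model can be sketched with statsmodels: a negative binomial GLM on a B-spline basis over time of day, which approximates smoothing splines without the explicit roughness penalty. Data and parameters are illustrative.

```python
import numpy as np
import statsmodels.api as sm
from patsy import dmatrix

# illustrative hourly traffic counts over 30 days (not real data)
hours = np.tile(np.arange(24), 30)
mu = np.exp(1.5 + 0.8 * np.sin(2 * np.pi * hours / 24))
counts = np.random.poisson(mu)

# cubic B-spline basis over time of day stands in for the paper's
# smoothing splines (the explicit roughness penalty is omitted)
X = dmatrix("bs(hours, df=8, degree=3)", {"hours": hours},
            return_type="dataframe")
model = sm.GLM(counts, X,
               family=sm.families.NegativeBinomial(alpha=0.5)).fit()
fitted = model.predict(X)   # mean response curve by time of day
```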

18.
Skin Res Technol ; 30(4): e13672, 2024 Apr.
Article in English | MEDLINE | ID: mdl-38591218

ABSTRACT

BACKGROUND: Hyaluronic acid (HA) is a widely used active cosmetic ingredient. Its multiple skin care benefits are modulated by its molecular weight. Low molecular weight (LMW) HA can penetrate the skin, but high molecular weight (HMW) HA remains at the surface. Here, we assessed how vectorization of HMW HA with bentonite clay, achieved with an innovative technology, enhances its cosmetic and hydrating properties. MATERIALS AND METHODS: The two HA forms were applied to skin explants; their penetration and smoothing effects were monitored by Raman spectroscopy and scanning electron microscopy. The two forms were biochemically characterised by chromatography, enzyme sensitivity assays, and analysis of zeta potential. Cosmetic benefits, such as the smoothing effect of vectorised HA, were assessed in ex vivo experiments on skin explants. A placebo-controlled clinical study was then conducted, applying the treatments for 28 days, to analyse the final benefits in the crow's feet area. RESULTS: Raman spectroscopy revealed that native HMW HA accumulates at the surface of skin explants, whereas vectorised HMW HA was detected in deeper skin layers. The vectorisation process changed the zeta potential of HMW HA, making it more anionic, without altering the biochemical structure of native HA. In terms of cosmetic benefits, following ex vivo application of vectorised HMW HA, the skin's surface was visibly smoother. This smoothing was clinically confirmed, with a significant reduction in fine lines. CONCLUSION: The development of an innovative process for vectorising HMW HA enabled HMW HA penetration into the skin. This enhanced penetration extends the clinical benefits of this iconic cosmetic ingredient.


Subjects
Hyaluronic Acid, Skin Aging, Humans, Hyaluronic Acid/pharmacology, Hyaluronic Acid/chemistry, Clay, Molecular Weight, Skin
19.
Lifetime Data Anal ; 2024 Apr 16.
Article in English | MEDLINE | ID: mdl-38625444

ABSTRACT

In studies with time-to-event outcomes, multiple, inter-correlated, and time-varying covariates are commonly observed. It is of great interest to model their joint effects with a flexible functional form and to delineate their relative contributions to survival risk. The class of semiparametric transformation (ST) models offers flexible specifications of the intensity function and provides a general framework for accommodating nonlinear covariate effects. In this paper, we propose a partial-linear single-index (PLSI) transformation model that reduces the dimensionality of multiple covariates into a single index and provides interpretable estimates of the covariate effects. We develop an iterative algorithm using the regression spline technique to model the nonparametric single-index function for possibly nonlinear joint effects, followed by nonparametric maximum likelihood estimation. We also propose a nonparametric testing procedure to formally examine the linearity of covariate effects. We conduct Monte Carlo simulation studies to compare the PLSI transformation model with the standard ST model and apply it to NYU Langone Health de-identified electronic health record data on the mortality of hospitalized COVID-19 patients and to a Veterans Administration lung cancer trial.

20.
Stat Med ; 43(10): 2007-2042, 2024 May 10.
Article in English | MEDLINE | ID: mdl-38634309

ABSTRACT

Quantile regression, a robust alternative to linear regression, has been widely used in statistical modeling and inference. In this paper, we propose a penalized weighted convolution-type smoothed method for variable selection and robust parameter estimation in quantile regression with high-dimensional longitudinal data. The proposed method replaces the check function of unpenalized quantile regression with a twice-differentiable, smoothed loss function, and it can consistently select the important covariates using efficient gradient-based iterative algorithms when the dimension of the covariates exceeds the sample size. Moreover, the proposed method can circumvent the influence of outliers in the response variable and/or the covariates. To incorporate the correlation within each subject and enhance the accuracy of parameter estimation, a two-step weighted estimation method is also established. Furthermore, we prove the oracle properties of the proposed method under some regularity conditions. Finally, the performance of the proposed method is demonstrated through simulation studies and two real examples.
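The convolution-smoothed loss is concrete enough to sketch: with a Gaussian kernel, the smoothed check loss has gradient tau - Phi(-u/h) in the residual u, so plain gradient descent applies. The sketch below omits the paper's penalty and subject-level weights.

```python
import numpy as np
from scipy.stats import norm

def conquer(X, y, tau=0.5, h=0.5, lr=0.1, n_iter=500):
    """Convolution-smoothed quantile regression (Gaussian kernel):
    the check loss is smoothed so its derivative in the residual u is
    l'(u) = tau - Phi(-u / h), enabling plain gradient descent.
    (The paper's penalty and weighting steps are omitted here.)"""
    n, p = X.shape
    beta = np.zeros(p)
    for _ in range(n_iter):
        u = y - X @ beta
        grad = -X.T @ (tau - norm.cdf(-u / h)) / n
        beta -= lr * grad
    return beta

n, p = 500, 5
X = np.random.normal(size=(n, p))
y = X @ np.array([1.0, -0.5, 0.0, 2.0, 0.0]) + np.random.standard_t(3, n)
beta_hat = conquer(X, y, tau=0.5)   # median regression coefficients
```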


Subjects
Algorithms, Models, Statistical, Humans, Computer Simulation, Linear Models, Sample Size