Results 1 - 20 of 157
1.
Sensors (Basel) ; 24(12)2024 Jun 18.
Article in English | MEDLINE | ID: mdl-38931720

ABSTRACT

This paper addresses the critical need for advanced real-time vehicle detection methodologies in Vehicle Intelligence Systems (VIS), especially in the context of using Unmanned Aerial Vehicles (UAVs) for data acquisition in severe weather conditions, such as heavy snowfall typical of the Nordic region. Traditional vehicle detection techniques, which often rely on custom-engineered features and deterministic algorithms, fall short in adapting to diverse environmental challenges, leading to a demand for more precise and sophisticated methods. The limitations of current architectures, particularly when deployed in real-time on edge devices with restricted computational capabilities, are highlighted as significant hurdles in the development of efficient vehicle detection systems. To bridge this gap, our research focuses on the formulation of an innovative approach that combines the fractional B-spline wavelet transform with a tailored U-Net architecture, operational on a Raspberry Pi 4. This method aims to enhance vehicle detection and localization by leveraging the unique attributes of the NVD dataset, which comprises drone-captured imagery under the harsh winter conditions of northern Sweden. The dataset, featuring 8450 annotated frames with 26,313 vehicles, serves as the foundation for evaluating the proposed technique. The comparative analysis of the proposed method against state-of-the-art detectors, such as YOLO and Faster R-CNN, in both accuracy and efficiency on constrained devices, emphasizes the capability of our method to balance the trade-off between speed and accuracy, thereby broadening its utility across various domains.

2.
Sensors (Basel) ; 24(12)2024 Jun 18.
Article in English | MEDLINE | ID: mdl-38931735

ABSTRACT

Autonomous exploration in unknown environments is a fundamental problem for the practical application of unmanned ground vehicles (UGVs). However, existing exploration methods face difficulties when directly applied to UGVs due to limited sensory coverage, conservative exploration strategies, inappropriate decision frequencies, and the non-holonomic constraints of wheeled vehicles. In this paper, we present IB-PRM, a hierarchical planning method that combines Incremental B-splines with a probabilistic roadmap, which can support rapid exploration by a UGV in complex unknown environments. We define a new frontier structure that includes both information-gain guidance and a B-spline curve segment with different arrival orientations to satisfy the non-holonomic constraint characteristics of UGVs. We construct and maintain local and global graphs to generate and store filtered frontiers. By jointly solving the Traveling Salesman Problem (TSP) using these frontiers, we obtain the optimal global path traversing feasible frontiers. Finally, we optimize the global path based on the Time Elastic Band (TEB) algorithm to obtain a smooth, continuous, and feasible local trajectory. We conducted comparative experiments with existing advanced exploration methods in simulation environments of different scenarios, and the experimental results demonstrate that our method can effectively improve the efficiency of UGV exploration.
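B-spline curve segments like the frontier arcs described above are typically evaluated with de Boor's algorithm. The following is a minimal, self-contained sketch (pure Python; the knot vector and control points are hypothetical illustration data, not taken from the paper):

```python
def find_span(x, knots, degree):
    """Return the knot-span index k with knots[k] <= x < knots[k+1] (clamped)."""
    last = len(knots) - degree - 2          # index of the last control point
    if x >= knots[last + 1]:
        return last
    k = degree
    while knots[k + 1] <= x:
        k += 1
    return k

def de_boor(x, knots, ctrl, degree):
    """Evaluate a B-spline curve with the given control points at parameter x."""
    k = find_span(x, knots, degree)
    d = [ctrl[j + k - degree] for j in range(degree + 1)]
    for r in range(1, degree + 1):
        for j in range(degree, r - 1, -1):
            lo = knots[j + k - degree]
            hi = knots[j + 1 + k - r]
            alpha = (x - lo) / (hi - lo)
            d[j] = (1 - alpha) * d[j - 1] + alpha * d[j]
    return d[degree]

# A clamped cubic with no interior knots reduces to a cubic Bezier segment.
knots = [0, 0, 0, 0, 1, 1, 1, 1]
ctrl = [0.0, 1.0, 2.0, 4.0]
mid = de_boor(0.5, knots, ctrl, 3)  # Bezier midpoint (P0 + 3*P1 + 3*P2 + P3)/8 = 1.625
```

Because the example knot vector is clamped with no interior knots, the result can be checked against the de Casteljau evaluation of the equivalent Bézier curve.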

3.
Entropy (Basel) ; 26(6)2024 Jun 07.
Article in English | MEDLINE | ID: mdl-38920507

ABSTRACT

Semiparametric spatial autoregressive (SSAR) models are widely used to analyze spatial data in a variety of applications; however, heteroscedasticity is a common phenomenon in spatial data analysis. The SSAR models considered in this paper therefore allow the variance parameters to depend on the explanatory variables; such models are called heterogeneous semiparametric spatial autoregressive models. To estimate the model parameters, a Bayesian estimation method is proposed for heterogeneous SSAR models based on B-spline approximations of the nonparametric function. We then develop an efficient Markov chain Monte Carlo sampling algorithm, combining the Gibbs sampler and the Metropolis-Hastings algorithm, to generate samples from the posterior distributions and perform posterior inference. Finally, simulation studies and a real data analysis of the Boston housing data demonstrate the excellent performance of the proposed Bayesian method.
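The B-spline approximation of the nonparametric function mentioned above rests on evaluating a B-spline basis at the observed covariate values. A minimal sketch of that step via the Cox-de Boor recursion is shown below (NumPy; the knot vector and the least-squares fit are illustrative assumptions — the paper itself estimates the spline coefficients within a Bayesian MCMC scheme):

```python
import numpy as np

def bspline_basis(x, knots, degree):
    """Design matrix of B-spline basis functions (Cox-de Boor recursion).

    Returns an array of shape (len(x), len(knots) - degree - 1)."""
    x = np.asarray(x, dtype=float)
    m = len(knots) - 1
    # Degree 0: indicator functions of the knot intervals.
    B = np.array([(knots[i] <= x) & (x < knots[i + 1]) for i in range(m)],
                 dtype=float).T
    # Make the right endpoint belong to the last non-empty interval.
    last = max(i for i in range(m) if knots[i] < knots[i + 1])
    B[x == knots[-1], last] = 1.0
    for d in range(1, degree + 1):
        Bn = np.zeros((len(x), m - d))
        for i in range(m - d):
            den1 = knots[i + d] - knots[i]
            den2 = knots[i + d + 1] - knots[i + 1]
            if den1 > 0:
                Bn[:, i] += (x - knots[i]) / den1 * B[:, i]
            if den2 > 0:
                Bn[:, i] += (knots[i + d + 1] - x) / den2 * B[:, i + 1]
        B = Bn
    return B

# Clamped cubic basis on [0, 1] with interior knots at 0.25, 0.5, 0.75.
knots = [0.0] * 4 + [0.25, 0.5, 0.75] + [1.0] * 4
x = np.linspace(0.0, 1.0, 41)
B = bspline_basis(x, knots, 3)        # shape (41, 7)
# Approximate a smooth nonparametric function by least squares.
y = np.sin(2 * np.pi * x)
coef, *_ = np.linalg.lstsq(B, y, rcond=None)
```

The basis functions form a partition of unity on [0, 1], which is a quick sanity check on the construction.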

4.
Entropy (Basel) ; 26(6)2024 Jun 15.
Article in English | MEDLINE | ID: mdl-38920526

ABSTRACT

When using traditional Euler deconvolution optimization strategies, it is difficult to distinguish between anomalies and their corresponding Euler tails with only the structural index (tail solutions are distributed outside the anomaly source, forming "tail"-shaped spurious solutions, i.e., misplaced Euler solutions, which must be removed or marked). The nonparametric estimation method based on the normalized B-spline probability density (BSS) is used to separate the Euler solution clusters and mark different anomaly sources according to the similarity and density characteristics of the Euler solutions. For display purposes, the BSS needs to map the samples onto the estimation grid at the points where the density will be estimated in order to obtain the probability density distribution. However, if the size of the samples or the estimation grid is too large, this process leads to high memory consumption and excessive computation times. To address this issue, a fast linear binning approximation algorithm is introduced into the BSS to speed up the computation and save time. The sample data are quickly projected onto the estimation grid, and the discrete convolution between the grid and the density function is computed using a fast Fourier transform. A method involving multivariate B-spline probability density estimation based on the FFT (BSSFFT), in conjunction with fast linear binning approximation, is proposed in this paper. The results for two random normal distributions show the correctness of the BSS and BSSFFT algorithms, verified via comparison with the true probability density function (pdf) and Gaussian kernel smoothing estimation. The Euler solutions of two synthetic models are then analyzed using the BSS and BSSFFT algorithms; the results are consistent with their theoretical values, verifying their correctness for Euler solutions. Finally, the BSSFFT is applied to Bishop 5X data, and the numerical results show that comprehensive analysis of the 3D probability density distributions obtained by the BSSFFT algorithm from the Euler solution subset (x0, y0, z0) can effectively separate and locate adjacent anomaly sources, demonstrating strong adaptability.
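The linear-binning-plus-FFT idea can be sketched in a few lines: each sample's unit weight is split linearly between its two neighbouring grid points, and the binned counts are then convolved with the kernel through the FFT. The sketch below uses a one-dimensional Gaussian kernel as a stand-in (an assumption for illustration — the paper's BSS uses a normalized B-spline density and works in several dimensions):

```python
import numpy as np

def linear_binning(samples, grid):
    """Split each sample's unit weight linearly between its two nearest grid points."""
    counts = np.zeros(len(grid))
    h = grid[1] - grid[0]
    pos = (np.asarray(samples, dtype=float) - grid[0]) / h
    idx = np.clip(np.floor(pos).astype(int), 0, len(grid) - 2)
    frac = pos - idx
    np.add.at(counts, idx, 1.0 - frac)
    np.add.at(counts, idx + 1, frac)
    return counts

def fft_kde(samples, grid, bandwidth):
    """Gaussian kernel density estimate on a regular grid via binning + FFT convolution."""
    counts = linear_binning(samples, grid)
    h = grid[1] - grid[0]
    L = len(grid)
    ker_x = np.arange(-L + 1, L) * h                  # symmetric kernel support
    kernel = (np.exp(-0.5 * (ker_x / bandwidth) ** 2)
              / (bandwidth * np.sqrt(2 * np.pi)))
    n = L + len(kernel) - 1                           # full linear convolution size
    conv = np.fft.irfft(np.fft.rfft(counts, n) * np.fft.rfft(kernel, n), n)
    return conv[L - 1:2 * L - 1] / len(samples)       # centre part = grid points

grid = np.linspace(-5.0, 5.0, 201)
density = fft_kde(np.array([0.0]), grid, bandwidth=0.5)
```

A single sample placed exactly on a grid point should reproduce the kernel itself, and the estimate should integrate to one; both make handy checks on the binning and the FFT alignment.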

5.
Sci Rep ; 14(1): 14458, 2024 Jun 24.
Article in English | MEDLINE | ID: mdl-38914778

ABSTRACT

Unmanned aerial vehicles (UAVs) have become a focus of current research because of their practicality in various scenarios. However, current local path planning methods often produce trajectories with numerous sharp or inflection points, which are not ideal for smooth UAV flight. This paper introduces a UAV path planning approach based on distance gradients. The key improvements include generating collision-free paths using collision information from initial trajectories and obstacles; these collision-free paths are then optimized using distance-gradient information. Additionally, a trajectory time adjustment method is proposed to ensure the feasibility and safety of the trajectory while prioritizing smoothness. The limited-memory BFGS algorithm is employed to efficiently solve for optimal local paths, with the ability to quickly restart the trajectory optimization program. The effectiveness of the proposed method is validated in a Robot Operating System simulation environment, demonstrating its ability to meet trajectory planning requirements for UAVs in complex, highly dynamic unknown environments. Moreover, it surpasses traditional UAV trajectory planning methods in terms of solution speed, trajectory length, and data volume.

6.
Stat Appl Genet Mol Biol ; 23(1)2024 Jan 01.
Article in English | MEDLINE | ID: mdl-38736398

ABSTRACT

Longitudinal time-to-event analysis is a statistical method for analyzing data in which covariates are measured repeatedly. In survival studies, the risk of an event is estimated using the Cox proportional hazards model or the extended Cox model for exogenous time-dependent covariates. However, these models are inappropriate for endogenous time-dependent covariates, such as longitudinally measured biomarkers like carcinoembryonic antigen (CEA). Joint models that can simultaneously model the longitudinal covariates and the time-to-event data have been proposed as an alternative. The present study highlights the importance of the choice of baseline hazard for more accurate risk estimation. The study used colon cancer patient data to illustrate and compare four different joint models that differ in the choice of baseline hazard [piecewise-constant Gauss-Hermite (GH), piecewise-constant pseudo-adaptive GH, Weibull accelerated failure time model with GH, and B-spline GH]. We conducted a simulation study to assess model consistency with varying sample sizes (N = 100, 250, 500) and censoring proportions (20%, 50%, 70%). In the colon cancer patient data, based on the Akaike information criterion (AIC) and Bayesian information criterion (BIC), the piecewise-constant pseudo-adaptive GH model was found to be the best fitting. Despite differences in model fit, the hazards obtained from the four models were similar. The study identified composite stage as a prognostic factor for the time-to-event outcome and the longitudinal outcome CEA as a dynamic predictor of overall survival in colon cancer patients. In the simulation study, Piecewise-PH-aGH was found to be the best model, with the lowest AIC and BIC values and the highest coverage probability (CP). The bias and RMSE of all the models were competitive; however, Piecewise-PH-aGH showed the least bias and RMSE in most combinations and the shortest computation time, demonstrating its computational efficiency. This study is the first of its kind to discuss the choice of baseline hazards.


Subject(s)
Colonic Neoplasms , Proportional Hazards Models , Humans , Longitudinal Studies , Colonic Neoplasms/mortality , Colonic Neoplasms/genetics , Survival Analysis , Computer Simulation , Models, Statistical , Bayes Theorem , Carcinoembryonic Antigen/blood
7.
Heliyon ; 10(8): e29423, 2024 Apr 30.
Article in English | MEDLINE | ID: mdl-38644892

ABSTRACT

In order to improve the accuracy of stress intensity factors (SIFs) calculated by traditional boundary element methods (BEM), a multi-domain wavelet boundary element method (WBEM) is proposed. Firstly, crack-tip elements are constructed by adjusting the nodes of the B-spline wavelet element on the interval. Since the B-spline wavelet on the interval (BSWI) has excellent compact-support characteristics and is particularly suitable for describing solution domains with large gradient changes, the constructed crack-tip elements reduce the numerical oscillation effect near the crack tip. Secondly, the crack-tip elements are implemented in the WBEM, and the combination of the WBEM with multi-domain technology can effectively handle interface cracks. Thirdly, the crack-problem solving strategy based on the multi-domain WBEM can directly evaluate the SIFs of cracks. Finally, several numerical examples involving homogeneous media and bi-material models are given to verify that the proposed method is simple and highly accurate.

8.
Lifetime Data Anal ; 2024 Apr 16.
Article in English | MEDLINE | ID: mdl-38625444

ABSTRACT

In studies with time-to-event outcomes, multiple, inter-correlated, and time-varying covariates are commonly observed. It is of great interest to model their joint effects by allowing a flexible functional form and to delineate their relative contributions to survival risk. A class of semiparametric transformation (ST) models offers flexible specifications of the intensity function and can serve as a general framework to accommodate nonlinear covariate effects. In this paper, we propose a partial-linear single-index (PLSI) transformation model that reduces the dimensionality of multiple covariates into a single index and provides interpretable estimates of the covariate effects. We develop an iterative algorithm using the regression spline technique to model the nonparametric single-index function for possibly nonlinear joint effects, followed by nonparametric maximum likelihood estimation. We also propose a nonparametric testing procedure to formally examine the linearity of covariate effects. We conduct Monte Carlo simulation studies to compare the PLSI transformation model with the standard ST model and apply it to NYU Langone Health de-identified electronic health record data on COVID-19 hospitalized patients' mortality and a Veterans' Administration lung cancer trial.

9.
Magn Reson Imaging ; 110: 112-127, 2024 Jul.
Article in English | MEDLINE | ID: mdl-38615850

ABSTRACT

This study proposes a versatile and efficient optimisation method for discrete coils that induce a magnetic field through their steady currents. The prime target is gradient coils for MRI (Magnetic Resonance Imaging). The derivative (gradient) with respect to the z-coordinate of the z-component of the magnetic field, which is calculated by the Biot-Savart law in the Cartesian xyz coordinate system, is considered as the objective function. Then, the derivative of the objective function with respect to a change in coil shape is formulated according to the concept of shape optimisation. The resulting shape derivative (as well as the Biot-Savart law) is smoothly discretised with closed B-spline curves. In this case, the control points (CPs) of the curves are naturally selected as the design variables. As a consequence, the shape derivative is discretised into the sensitivities of the objective function with respect to the CPs. These sensitivities can be used to solve the present shape-optimisation problem with a gradient-based nonlinear-programming solver. The numerical examples exhibit the mathematical reliability, computational efficiency, and engineering applicability of the proposed methodology based on the shape derivative/sensitivities and the closed B-spline curves.
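As a concrete illustration of the objective's building block, the Biot-Savart law for a discretised current loop can be evaluated by summing segment contributions. The loop below is a hypothetical circular coil, not the paper's gradient-coil geometry, and the field at the centre is checked against the analytic value mu0*I/(2R):

```python
import numpy as np

MU0 = 4e-7 * np.pi  # vacuum permeability [T*m/A]

def biot_savart(loop_pts, current, obs):
    """Magnetic flux density at obs from a closed polyline carrying a steady current.

    Each segment contributes mu0*I/(4*pi) * dl x r / |r|^3, with r taken
    from the segment midpoint to the observation point (midpoint rule)."""
    B = np.zeros(3)
    for a, b in zip(loop_pts, np.roll(loop_pts, -1, axis=0)):
        dl = b - a
        r = obs - 0.5 * (a + b)
        B += MU0 * current / (4 * np.pi) * np.cross(dl, r) / np.linalg.norm(r) ** 3
    return B

R, I = 0.1, 1.0
theta = np.linspace(0.0, 2 * np.pi, 2001)[:-1]
loop = np.stack([R * np.cos(theta), R * np.sin(theta), np.zeros_like(theta)], axis=1)
Bz = biot_savart(loop, I, np.zeros(3))[2]  # analytic value: MU0 * I / (2 * R)
```

In the paper, it is the sensitivity of such field quantities with respect to the B-spline control points that drives the optimisation; this sketch only covers the forward field evaluation.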


Subject(s)
Algorithms , Equipment Design , Magnetic Resonance Imaging , Reproducibility of Results , Computer Simulation , Humans , Magnetic Fields , Computer-Aided Design
10.
Math Biosci Eng ; 21(1): 1356-1393, 2024 Jan.
Article in English | MEDLINE | ID: mdl-38303469

ABSTRACT

Many correlation analysis methods can capture a wide range of functional relationships between variables. However, the influence of uncertainty and the distribution status of the data are not considered, so the regularity information between variables is neglected and the correlation of variables that have a functional relationship but follow specific distributions cannot be well identified. Therefore, a novel correlation analysis framework for detecting associations between variables with randomness (RVCR-CA) is proposed. The new method calculates the normalized RMSE to evaluate the degree of functional relationship between variables, calculates the entropy difference to measure the degree of uncertainty in variables, and constructs a copula function to evaluate the degree of dependence between random variables with distributions. A weighted sum of these three indicators then yields the final correlation coefficient R. The proposed framework, which considers the degree of functional relationship between variables, the uncertainty in the variables, and the degree of dependence between variables with distributions, can not only measure the correlation of functionally related variables with specific distributions but also better evaluate the correlation of variables without clear functional relationships. In experiments on data with functional relationships between variables that follow specific distributions, UCI data, and synthetic data, the results show that the proposed method has more comprehensive evaluation ability and a better evaluation effect than traditional correlation analysis methods.

11.
J Anim Breed Genet ; 141(4): 365-378, 2024 Jul.
Article in English | MEDLINE | ID: mdl-38217261

ABSTRACT

The current study sought to genetically assess the lactation curve of Alpine × Beetal crossbred goats through the application of random regression models (RRM). The objective was to estimate genetic parameters of the first lactation test-day milk yield (TDMY) for devising a practical breeding strategy within the nucleus breeding programme. In order to model variations in lactation curves, 25,998 TDMY records were used in this study. For the purpose of estimating genetic parameters, orthogonal Legendre polynomials (LEG) and B-splines (BS) were examined in order to generate suitable and parsimonious models. A single-trait RRM technique was used for the analysis. The average first lactation TDMY was 1.22 ± 0.03 kg and peak yield (1.35 ± 0.02 kg) was achieved around the 7th test day (TD). The present investigation has demonstrated the superiority of the B-spline model for the genetic evaluation of Alpine × Beetal dairy goats. The optimal random regression model was identified as a quadratic B-spline function with six knots to represent the central trend. This model effectively captured the patterns of additive genetic influences, animal-specific permanent environmental effects (c2), and 22 distinct classes of (heterogeneous) residual variance. Additive variance and heritability (h2) estimates were lower in early lactation but moderate across most of the lactation studied, ranging from 0.09 ± 0.04 to 0.33 ± 0.06. The moderate heritability estimates indicate the potential for selection using favourable combinations of test days throughout the lactation period. It was also observed that a high proportion of total variance was attributed to the animal's permanent environment. Positive genetic correlations were observed for adjacent TDMY values, while the correlations became less pronounced for more distant TDMY values.
Considering better fitting of the lactation curve, the use of B-spline functions for genetic evaluation of Alpine × Beetal goats using RRM is recommended.


Subject(s)
Goats , Lactation , Animals , Goats/genetics , Goats/physiology , Lactation/genetics , Female , Breeding , Regression Analysis , Models, Genetic , Milk/metabolism , Dairying
12.
Anim Biosci ; 37(5): 817-825, 2024 May.
Article in English | MEDLINE | ID: mdl-38271977

ABSTRACT

OBJECTIVE: The aim of this study was to identify suitable polynomial regression for modeling the average growth trajectory and to estimate the relative development of the rib eye area, scrotal circumference, and morphometric measurements of Guzerat young bulls. METHODS: A total of 45 recently weaned males, aged 325.8±28.0 days and weighing 219.9±38.05 kg, were evaluated. The animals were kept on Brachiaria brizantha pastures, received multiple supplementations, and were managed under uniform conditions for 294 days, with evaluations conducted every 56 days. The average growth trajectory was adjusted using ordinary polynomials, Legendre polynomials, and quadratic B-splines. The coefficient of determination, mean absolute deviation, mean square error, the value of the restricted likelihood function, Akaike information criteria, and consistent Akaike information criteria were applied to assess the quality of the fits. For the study of allometric growth, the power model was applied. RESULTS: Ordinary polynomial and Legendre polynomial models of the fifth order provided the best fits. B-splines yielded the best fits in comparing models with the same number of parameters. Based on the restricted likelihood function, Akaike's information criterion, and consistent Akaike's information criterion, the B-splines model with six intervals described the growth trajectory of evaluated animals more smoothly and consistently. In the study of allometric growth, the evaluated traits exhibited negative heterogeneity (b<1) relative to the animals' weight (p<0.01), indicating the precocity of Guzerat cattle for weight gain on pasture. CONCLUSION: Complementary studies of growth trajectory and allometry can help identify when an animal's weight changes and thus assist in decision-making regarding management practices, nutritional requirements, and genetic selection strategies to optimize growth and animal performance.

13.
J Med Imaging (Bellingham) ; 11(1): 014004, 2024 Jan.
Article in English | MEDLINE | ID: mdl-38173655

ABSTRACT

Purpose: Optical coherence tomography has emerged as an important intracoronary imaging technique for coronary artery disease diagnosis as it produces high-resolution cross-sectional images of luminal and plaque morphology. Precise and fast lumen segmentation is essential for efficient OCT morphometric analysis. However, due to the presence of various image artifacts, including side branches, luminal blood artifacts, and complicated lesions, this remains a challenging task. Approach: Our research study proposes a rapid automatic segmentation method that utilizes nonuniform rational B-spline to connect limited pixel points and identify the edges of the OCT lumen. The proposed method suppresses image noise and accurately extracts the lumen border with a high correlation to ground truth images based on the area, minimal diameter, and maximal diameter. Results: We evaluated the method using 3300 OCT frames from 10 patients and found that it achieved favorable results. The average time taken for automatic segmentation by the proposed method is 0.17 s per frame. Additionally, the proposed method includes seamless vessel reconstruction following the lumen segmentation. Conclusions: The developed automated system provides an accurate, efficient, robust, and user-friendly platform for coronary lumen segmentation and reconstruction, which can pave the way for improved assessment of the coronary artery lumen morphology.

14.
Sensors (Basel) ; 23(21)2023 Oct 27.
Article in English | MEDLINE | ID: mdl-37960455

ABSTRACT

"Three straight and two flat" is the inevitable demand when realizing the intelligent mining of a fully mechanized mining face. To address the crucial technical issue of lacking accurate perception of the shape of the scraper conveyor during intelligent coal mining, a three-dimensional curvature sensor involving fiber Bragg grating (FBG) is used as a perceptive tool to conduct curve reconstruction research based on different local motion frames and to reconstruct the shape of the scraper conveyor. Firstly, the formation process of the 'S'-shaped bending section of the scraper conveyor during the pushing process is determined. Based on the FBG sensing principle, a mathematical model between the variation in the central wavelength and the strain and curvature is established, and the cubic B-spline interpolation method is employed to continuously process the obtained discrete curvature. Secondly, based on differential geometry, a spatial curve reconstruction algorithm based on the Frenet moving frame is derived, and the shape curve prediction interpolation model is built based on a gated recurrent unit (GRU) model, which reduces the impact of the decrease in curve reconstruction accuracy caused by damage to some grating measuring points. Finally, an experimental platform was designed and built, and sensors with curvature radii of 6 m, 7 m, and 8 m were tested. The experimental results showed that the reconstructed curve was essentially consistent with the actual shape, and the absolute error at the end was about 2 mm. The feasibility of this reconstruction algorithm in engineering has been proven, and this is of great significance in achieving shape curve perception and straightness control for scraper conveyors.

15.
BMC Res Notes ; 16(1): 282, 2023 Oct 19.
Article in English | MEDLINE | ID: mdl-37858117

ABSTRACT

This work proposes a uniformly convergent numerical scheme to solve singularly perturbed parabolic problems with large time delay and two small parameters. The approach uses the implicit Euler method for the time derivative and the exponentially fitted extended cubic B-spline for the space derivative. Extended cubic B-splines have advantages over classical B-splines: for a given value of the free parameter [Formula: see text], the solution obtained by the extended B-spline is better than that obtained by the classical B-spline. Numerical examples are presented to confirm the correspondence of the numerical method with the theoretical results. The present numerical technique converges uniformly, making it more efficient.


Subject(s)
Algorithms , Convection
16.
Sensors (Basel) ; 23(19)2023 Sep 28.
Article in English | MEDLINE | ID: mdl-37836981

ABSTRACT

To meet the real-time path planning requirements of intelligent vehicles in dynamic traffic scenarios, a path planning and evaluation method is proposed in this paper. Firstly, based on the B-spline algorithm and four-stage lane-changing theory, an obstacle avoidance path planning algorithm framework is constructed. Then, to obtain the optimal real-time path, a comprehensive real-time path evaluation mechanism that includes path safety, smoothness, and comfort is established. Finally, to verify the proposed approach, co-simulation and real vehicle testing are conducted. In the dynamic obstacle avoidance scenario simulation, the lateral acceleration, yaw angle, yaw rate, and roll angle fluctuation ranges of the ego-vehicle are ±2.39 m/s2, ±13.31°, ±13.26°/s, and ±0.938°, respectively. The results show that the proposed algorithm can generate feasible obstacle avoidance paths in real time, and the proposed evaluation mechanism can find the optimal path for the current scenario.

17.
MethodsX ; 11: 102336, 2023 Dec.
Article in English | MEDLINE | ID: mdl-37693653

ABSTRACT

The solutions of two-parameter singularly perturbed boundary value problems typically exhibit two boundary layers. Because of the presence of these layers, standard numerical methods fail to give accurate approximations. This paper introduces a numerical treatment of a class of two-parameter singularly perturbed boundary value problems whose solutions exhibit boundary layer phenomena. A graded mesh is considered to resolve the boundary layers, and a collocation method with cubic B-splines on the graded mesh is proposed and analyzed. The proposed method leads to a tri-diagonal linear system of equations. The stability and parameter-uniform convergence of the present method are examined. To verify the theoretical estimates and the efficiency of the method, several known test problems from the literature are considered. Comparisons to some existing results are made to show the better efficiency of the proposed method. Summing up:
•The present method is found to be stable and parameter-uniform convergent, and the numerical results support the theoretical findings.
•Experimental results show that the present method approximates the solution very well and has a rate of convergence of order two in the maximum norm.
•Experimental results show that the cubic B-spline collocation method on a graded mesh is more efficient than the cubic B-spline collocation method on a Shishkin mesh and some other existing methods in the literature.
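Tri-diagonal systems like the one the collocation method produces are usually solved with the Thomas algorithm (tridiagonal Gaussian elimination), which runs in O(n) time. A minimal sketch with an illustrative 3x3 system (not one of the paper's test problems) follows:

```python
def thomas_solve(sub, diag, sup, rhs):
    """Solve a tridiagonal system by the Thomas algorithm.

    sub: sub-diagonal (length n-1), diag: main diagonal (length n),
    sup: super-diagonal (length n-1), rhs: right-hand side (length n).
    Assumes the system is well-conditioned (e.g. diagonally dominant)."""
    n = len(diag)
    cp = [0.0] * (n - 1)          # modified super-diagonal
    dp = [0.0] * n                # modified right-hand side
    cp[0] = sup[0] / diag[0]
    dp[0] = rhs[0] / diag[0]
    for i in range(1, n):         # forward elimination
        m = diag[i] - sub[i - 1] * cp[i - 1]
        if i < n - 1:
            cp[i] = sup[i] / m
        dp[i] = (rhs[i] - sub[i - 1] * dp[i - 1]) / m
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):  # back substitution
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

# Example: [[2,1,0],[1,2,1],[0,1,2]] @ x = [4,8,8]  ->  x = [1, 2, 3]
x = thomas_solve([1.0, 1.0], [2.0, 2.0, 2.0], [1.0, 1.0], [4.0, 8.0, 8.0])
```

Unlike general Gaussian elimination, no pivoting is performed, which is why diagonal dominance (typical of B-spline collocation matrices) matters for stability.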

18.
Med Phys ; 50(11): 6844-6856, 2023 Nov.
Article in English | MEDLINE | ID: mdl-37750537

ABSTRACT

BACKGROUND: Peripheral arterial disease (PAD) is a chronic occlusive disease that restricts blood flow in the lower limbs, causing partial or complete blockages of the blood flow. While digital subtraction angiography (DSA) has traditionally been the preferred method for assessing blood flow in the lower limbs, advancements in wide beam Computed Tomography (CT), allowing successive acquisition at high frame rate, might enable hemodynamic measurements. PURPOSE: To quantify the arterial blood flow in stenotic below-the-knee (BTK) arteries. To this end, we propose a novel method for contrast bolus tracking and assessment of quantitative hemodynamic parameters in stenotic arteries using 4D-CT. METHODS: Fifty patients with suspected PAD underwent 4D-CT angiography in addition to the clinical run-off computed tomography angiography (CTA). From these dynamic acquisitions, the BTK arteries were segmented and the region of maximum blood flow was extracted. Time attenuation curves (TAC) were estimated using 2D spatio-temporal B-spline regression, enforcing both spatial and temporal smoothness. From these curves, quantitative hemodynamic parameters, describing the shape of the propagating contrast bolus were automatically extracted. We evaluated the robustness of the proposed TAC fitting method with respect to interphase delay and imaging noise and compared it to commonly used approaches. Finally, to illustrate the potential value of 4D-CT, we assessed the correlation between the obtained hemodynamic parameters and the presence of PAD. RESULTS: 280 out of 292 arteries were successfully segmented, with failures mainly due to a delayed contrast arrival. The proposed method led to physiologically plausible hemodynamic parameters and was significantly more robust compared to 1D temporal regression. A significant correlation between the presence of proximal stenoses and several hemodynamic parameters was found. 
CONCLUSIONS: The proposed method based on spatio-temporal bolus tracking was shown to lead to stable and physiologically plausible estimation of quantitative hemodynamic parameters, even in the case of stenotic arteries. These parameters may provide valuable information in the evaluation of PAD and contribute to its diagnosis.


Subject(s)
Computed Tomography Angiography , Four-Dimensional Computed Tomography , Humans , Computed Tomography Angiography/methods , Constriction, Pathologic/diagnostic imaging , Arteries , Hemodynamics , Lower Extremity , Angiography, Digital Subtraction
19.
Nanomaterials (Basel) ; 13(14)2023 Jul 13.
Article in English | MEDLINE | ID: mdl-37513077

ABSTRACT

We implemented a semi-empirical pseudopotential (SEP) method for calculating the band structures of graphene and graphene nanoribbons. The basis functions adopted are two-dimensional plane waves multiplied by several B-spline functions along the perpendicular direction. The SEP includes both local and non-local terms, which were parametrized to fit relevant quantities obtained from the first-principles calculations based on the density-functional theory (DFT). With only a handful of parameters, we were able to reproduce the full band structure of graphene obtained by DFT with a negligible difference. Our method is simple to use and much more efficient than the DFT calculation. We then applied this SEP method to calculate the band structures of graphene nanoribbons. By adding a simple correction term to the local pseudopotentials on the edges of the nanoribbon (which mimics the effect caused by edge creation), we again obtained band structures of the armchair nanoribbon fairly close to the results obtained by DFT. Our approach allows the simulation of optical and transport properties of realistic nanodevices made of graphene nanoribbons with very little computation effort.

20.
J Appl Stat ; 50(8): 1686-1708, 2023.
Article in English | MEDLINE | ID: mdl-37260470

ABSTRACT

Uncovering the heterogeneity in the disease progression of Alzheimer's is a key factor to disease understanding and treatment development, so that interventions can be tailored to target the subgroups that will benefit most from the treatment, which is an important goal of precision medicine. However, in practice, one top methodological challenge hindering the heterogeneity investigation is that the true subgroup membership of each individual is often unknown. In this article, we aim to identify latent subgroups of individuals who share a common disorder progress over time, to predict latent subgroup memberships, and to estimate and infer the heterogeneous trajectories among the subgroups. To achieve these goals, we apply a concave fusion learning method to conduct subgroup analysis for longitudinal trajectories of the Alzheimer's disease data. The heterogeneous trajectories are represented by subject-specific unknown functions which are approximated by B-splines. The concave fusion method can simultaneously estimate the spline coefficients and merge them together for the subjects belonging to the same subgroup to automatically identify subgroups and recover the heterogeneous trajectories. The resulting estimator of the disease trajectory of each subgroup is supported by an asymptotic distribution. It provides a sound theoretical basis for further conducting statistical inference in subgroup analysis.
