Results 1 - 20 of 120
1.
BMC Med Res Methodol ; 24(1): 141, 2024 Jun 28.
Article in English | MEDLINE | ID: mdl-38943087

ABSTRACT

BACKGROUND: On-site monitoring is a crucial component of quality control in clinical trials. However, many cast doubt on its cost-effectiveness due to various issues, such as a lack of monitoring focus that could assist in prioritizing limited resources during a site visit. Consequently, an increasing number of trial sponsors are implementing a hybrid monitoring strategy that combines on-site monitoring with centralised monitoring. One of the primary objectives of centralised monitoring, as stated in the clinical trial guidelines, is to guide and adjust the extent and frequency of on-site monitoring. Quality tolerance limits (QTLs) introduced in ICH E6(R2) and thresholds proposed by TransCelerate Biopharma are two existing approaches for achieving this objective at the trial- and site-levels, respectively. The funnel plot, as another threshold-based site-level method, overcomes the limitation of TransCelerate's method by adjusting thresholds flexibly based on site sizes. Nonetheless, neither method transparently explains why its thresholds were chosen or whether those choices are optimal in any well-defined sense. Related Bayesian monitoring methods are also lacking. METHODS: We propose a simple, transparent, and user-friendly Bayesian-based risk boundary for determining the extent and frequency of on-site monitoring both at the trial- and site-levels. We developed a four-step approach: 1) establishing risk levels for key risk indicators (KRIs) along with their corresponding monitoring actions and estimates; 2) calculating the optimal risk boundaries; 3) comparing the outcomes of KRIs against the optimal risk boundaries; and 4) providing recommendations based on the comparison results. Our method can be used to identify the optimal risk boundaries within an established risk level range and is applicable to continuous, discrete, and time-to-event endpoints. RESULTS: We evaluate the performance of the proposed risk boundaries via simulations that mimic various realistic clinical trial scenarios. The performance of the proposed risk boundaries is compared against the funnel plot using real clinical trial data. The results demonstrate the applicability and flexibility of the proposed method for clinical trial monitoring. Moreover, we identify key factors that affect the optimality and performance of the proposed risk boundaries. CONCLUSION: Given the aforementioned advantages of the proposed risk boundaries, we expect that they will benefit the clinical trial community at large, in particular in the realm of risk-based monitoring.
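
Read alongside the four-step approach above, the sketch below illustrates (under simplified assumptions, not the authors' exact formulation) how a Bayesian risk boundary might be applied to a binomial KRI: a Beta posterior gives the probability that a site's true rate exceeds a tolerated risk level, and that probability is compared against escalating boundaries to choose a monitoring action. The prior, risk level, and boundary values are illustrative assumptions.

```python
from scipy import stats

def posterior_exceedance(events, exposed, risk_level, prior_a=1.0, prior_b=1.0):
    """Posterior probability that a site's true KRI rate exceeds `risk_level`,
    assuming a binomial KRI with a Beta(prior_a, prior_b) prior."""
    posterior = stats.beta(prior_a + events, prior_b + exposed - events)
    return posterior.sf(risk_level)  # P(rate > risk_level | data)

def monitoring_action(prob, boundaries=(0.5, 0.8, 0.95)):
    """Map the exceedance probability to an escalating on-site monitoring action.
    The boundary values here are placeholders, not derived optimal boundaries."""
    low, medium, high = boundaries
    if prob >= high:
        return "trigger for-cause on-site visit"
    if prob >= medium:
        return "increase monitoring frequency"
    if prob >= low:
        return "document and continue central review"
    return "no action"

# Illustrative site: 7 protocol deviations among 40 subjects, tolerated rate 10%
p = posterior_exceedance(events=7, exposed=40, risk_level=0.10)
print(round(p, 3), "->", monitoring_action(p))
```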


Subject(s)
Bayes Theorem , Humans , Clinical Trials as Topic/methods , Quality Control , Algorithms
2.
J Dent Sci ; 19(2): 894-899, 2024 Apr.
Article in English | MEDLINE | ID: mdl-38618128

ABSTRACT

Background/purpose: History of periodontitis is a well-documented risk indicator of peri-implantitis. However, the influence of the severity of periodontitis is still unclear, especially for severe periodontitis. This study aimed to investigate the prevalence of peri-implant disease and analyze the risk indicators in patients with treated severe periodontitis. Materials and methods: A total of 182 implants from 88 patients (44 males and 44 females) with severe periodontitis and a mean follow-up period of 76.5 months were enrolled in this study. Patient and implant information, and periodontal and peri-implant conditions, were collected to evaluate the prevalence of peri-implant disease and risk indicators. Results: The prevalence of peri-implantitis was 9.1% and 6.6% at the patient-level and implant-level, respectively. The prevalence of peri-implant mucositis was 76.1% and 51.1% at the patient-level and implant-level, respectively. Risk indicators of peri-implantitis included older age (OR: 1.132), poor proximal cleaning habits (OR: 14.218), implants in the anterior area (OR: 10.36), poor periodontal disease control (OR: 12.76), high peri-implant plaque index (OR: 4.27), and keratinized tissue width (KTW) <2 mm (OR: 19.203). Conclusion: Implants in patients with severe periodontitis after periodontal treatment and maintenance show a low prevalence (9.1%) of peri-implantitis and a relatively high prevalence (76.2%) of peri-implant mucositis. Patient age, peri-implant proximal cleaning habits, implant position, periodontal disease control, peri-implant plaque index, and KTW are associated with the prevalence of peri-implantitis.

3.
Addiction ; 119(7): 1166-1167, 2024 07.
Article in English | MEDLINE | ID: mdl-38472154

Subject(s)
Video Games , Humans , Internet
4.
Int J Oral Maxillofac Implants ; 39(1): 164-172, 2024 Feb 27.
Article in English | MEDLINE | ID: mdl-38416010

ABSTRACT

PURPOSE: To report the prevalence of early implant failure and evaluate factors that contribute to the early failure of dental implants placed at a teaching clinic. The study also aims to identify risk indicators for early implant loss in order to better predict and prevent it in the future. MATERIALS AND METHODS: This retrospective study included all patients with a dental implant placed by the Section of Oral Surgery and Oral Medicine, Department of Clinical Dentistry, University of Bergen, between January 2011 and December 2018. All information was collected from operation logbooks and from patient records. A failed implant in this study was defined as an implant lost before functional loading. RESULTS: A total of 1,005 dental implants were placed in the study period, of which 54 failed early, giving an early failure rate (EFR) of 5.4%, with functional loading obtained for the remaining 94.6%. Analysis showed an increased hazard for early implant failure among smokers, men, and younger patients. With an age increase of 10 years, the risk of implant failure was reduced by 14% (hazard ratio [HRR] = 0.86, P = .037). A higher failure rate was found in anterior maxillary implants than in posterior maxillary implants (7.79% vs 3.29%, respectively; HRR = 0.47; P = .041). The probability of early failure in the posterior mandible was significantly increased compared to the posterior maxilla (HRR = 3.68, P = .005). If the first implant failed, it was more likely that the subsequent implant would also fail (HRR = 1.82). In the study, 53.4% of the placed implants were Straumann (EFR = 5.2%), 30.3% were Nobel Biocare (EFR = 7.2%), and 16.3% were Astra Tech (EFR = 2.5%). CONCLUSIONS: This study found that younger, male, and smoker patients were associated with an increase in early failure of dental implants. Significantly increased failure rates were also seen for implants placed in the mandible, and there were differences with respect to implant system. Although differences were found in early failure for both patient- and implant-related factors, the overall early failure rate (5.4%) in this study was low.


Subject(s)
Dental Implants , Surgery, Oral , Humans , Male , Child , Dental Implants/adverse effects , Retrospective Studies , Risk Factors , Mandible
5.
Saudi Dent J ; 36(1): 117-122, 2024 Jan.
Article in English | MEDLINE | ID: mdl-38375373

ABSTRACT

Background: This study investigates the prevalence, distribution and risk indicators of buccal gingival recessions (GRs) in periodontitis patients. Methods: A retrospective examination of 400 periodontitis patients' files was performed using an operating sheet. Univariate logistic regression analysis was performed to identify risk indicators of GRs. Multivariate regression analysis was conducted for selected variables with p < 0.05. Results: 354/400 (88.5%) patients had at least one GR ≥ 1 mm. The prevalence of recession type (RT) at the patient level was 0.5%, 2.25% and 85.75% for RT1, RT2 and RT3, respectively. Lower incisors were the most affected teeth (79.8%). Upper canines presented the lowest frequency (41.8%). The univariate logistic regression showed that age (SE = 0.021; 95% CI 1.01-1.10; p = 0.006), plaque index (SE = 0.50; 95% CI 1.49-10.50; p = 0.006), level of plaque control (SE = 0.529; 95% CI 0.72-0.90; p = 0.010) and periodontitis stage (SE = 0.41; 95% CI 1.41-7.07; p = 0.005) were significantly associated with the presence of GR. In the multivariate regression model, significant results were confirmed only for age (SE = 0.021; 95% CI 1.02-1.17; p = 0.006) and periodontitis stage (SE = 0.41; 95% CI 1.35-6.75; p = 0.007). Conclusion: This cross-sectional study showed a high prevalence of GRs. Lower incisors were the most affected teeth. Most patients had GRs with advanced interproximal attachment loss (RT3 GRs). Age, plaque index, level of plaque control and periodontitis stage emerged as risk indicators of GRs.
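
The two-stage analysis described in the Methods (univariate screening followed by a multivariate model restricted to variables with p < 0.05) can be sketched as follows; the data file and column names are hypothetical stand-ins for the variables named in the abstract, and the predictors are assumed to be numerically coded.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# One row per patient; hypothetical columns: gr_present (0/1), age,
# plaque_index, plaque_control, perio_stage
df = pd.read_csv("gingival_recession.csv")
candidates = ["age", "plaque_index", "plaque_control", "perio_stage"]

# Step 1: univariate screen, keeping variables with p < 0.05
selected = []
for var in candidates:
    fit = smf.logit(f"gr_present ~ {var}", data=df).fit(disp=False)
    if fit.pvalues[var] < 0.05:
        selected.append(var)

# Step 2: multivariate model restricted to the screened variables
final = smf.logit("gr_present ~ " + " + ".join(selected), data=df).fit(disp=False)
print(np.exp(final.params).round(2))      # odds ratios
print(np.exp(final.conf_int()).round(2))  # 95% confidence intervals for the ORs
```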

6.
Saudi Dent J ; 36(1): 60-65, 2024 Jan.
Article in English | MEDLINE | ID: mdl-38375380

ABSTRACT

Objectives: To determine the prevalence of primary dentition malocclusion and its linked risk indicators among a group of Saudi preschool children. Methods: A cross-sectional study was carried out on preschool children aged 3 to 5 years residing in Riyadh City, the capital of Saudi Arabia. The study sample included 709 Saudi children of both genders with complete primary dentition. Oral examination was conducted for the children to assess the anteroposterior, transverse, and vertical dimensions, arch spacing, and oral habits. Results: The prevalence of malocclusion was 59.1% among the study participants. A deep overbite and an increased overjet were found in 26.23% and 25.11%, respectively. Arch space problems were reported, including missing primate spaces in 24.12%, missing developmental spacing in 27.93%, and crowding in 14.1%. Maternal age of 25 years or younger at childbirth was associated with the child's malocclusion in the primary dentition (p = 0.03). Conclusion: The prevalence of primary dentition malocclusion among this group of Saudi preschool children was high. Increased overbite and overjet were the most prevalent occlusal discrepancies, followed by arch spacing problems. Younger maternal age at childbirth was significantly associated with the child's malocclusion. The study results can serve as a baseline for future investigations.

7.
Comput Methods Biomech Biomed Engin ; 27(3): 347-364, 2024 Mar.
Article in English | MEDLINE | ID: mdl-36880851

ABSTRACT

In this numerical study, areas of the carotid bifurcation and of a distal stenosis in the internal carotid artery are closely observed to evaluate the patient's current risk of ischemic stroke. An indicator of vessel wall defects is the stress exerted by blood on the vessel tissue, typically expressed by the amplitude of the wall shear stress vector (WSS) and its oscillatory shear index. To detect negative shear stresses corresponding to reversed flow, we perform an orientation-based shear evaluation. We investigate the longitudinal component of the wall shear vector, for which tangential vectors aligned longitudinally with the vessel are necessary. However, owing to the segmentation resolution of the patients' computed tomography angiography scans and the stenotic regions, the geometry model's mesh is non-smooth on its surface, and the automatically generated tangential vector field is discontinuous and multi-directional, making an interpretation of our orientation-based risk indicators unreliable. We improve the evaluation of longitudinal shear stress by projecting the vessel's centerline onto the surface to construct a smooth tangential field aligned longitudinally with the vessel. We validate our approach for the longitudinal WSS component and the corresponding oscillatory index by comparing them to results obtained using automatically generated tangents, in both rigid and elastic vessel modeling, and to amplitude-based indicators. The major benefit of our direction-aware longitudinal WSS evaluation for cardiovascular risk assessment is the detection of negative WSS, which indicates persistent reverse or transverse flow and cannot be detected with amplitude-based WSS.
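
A minimal sketch of the centerline-based projection described above, assuming the WSS vectors, surface normals, and closest-point centerline directions are already available as NumPy arrays (the function names and array layout are assumptions, not the authors' implementation):

```python
import numpy as np

def longitudinal_wss(wss_vectors, surface_normals, centerline_dirs):
    """Project wall shear stress vectors onto a longitudinal tangent field.

    wss_vectors:     (n, 3) WSS vectors at surface points for one time step
    surface_normals: (n, 3) outward unit normals of the vessel surface mesh
    centerline_dirs: (n, 3) unit direction of the projected centerline at the
                     closest centerline point for each surface point
    Returns the signed longitudinal WSS component; negative values indicate
    reversed (retrograde) flow at the wall.
    """
    # Remove the normal component of the centerline direction so the tangent
    # lies in the local tangent plane of the (possibly non-smooth) surface.
    normal_part = np.sum(centerline_dirs * surface_normals, axis=1, keepdims=True)
    tangents = centerline_dirs - normal_part * surface_normals
    tangents /= np.linalg.norm(tangents, axis=1, keepdims=True) + 1e-12
    return np.sum(wss_vectors * tangents, axis=1)

def oscillatory_index(signed_wss_time_series):
    """OSI-style index from signed longitudinal WSS over a cardiac cycle
    (time steps x points): 0 for purely unidirectional shear, 0.5 for
    fully oscillating shear."""
    mean_signed = np.abs(signed_wss_time_series.mean(axis=0))
    mean_abs = np.abs(signed_wss_time_series).mean(axis=0) + 1e-12
    return 0.5 * (1.0 - mean_signed / mean_abs)
```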


Subject(s)
Carotid Arteries , Models, Cardiovascular , Humans , Carotid Arteries/diagnostic imaging , Carotid Artery, Internal/diagnostic imaging , Constriction, Pathologic , Stress, Mechanical , Blood Flow Velocity , Shear Strength
8.
Behav Genet ; 54(1): 73-85, 2024 Jan.
Article in English | MEDLINE | ID: mdl-38135768

ABSTRACT

Low- and middle-income countries (LMICs) globally have undergone rapid urbanisation and changes in demography and health behaviours. In Sri Lanka, cardio-vascular disease and diabetes are now leading causes of mortality. A high prevalence of their risk factors, including hypertension, dysglycaemia and obesity, has also been observed. Diet is a key modifiable risk factor for both cardio-vascular disease and diabetes as well as their risk factors. Although typically thought of as an environmental risk factor, dietary choice has been shown to be genetically influenced, and genes associated with this behaviour correlate with metabolic risk indicators. We used structural equation model fitting to investigate the aetiology of dietary choices and cardio-metabolic phenotypes in COTASS, a population-based twin and singleton sample in Colombo, Sri Lanka. Participants completed a Food Frequency Questionnaire (N = 3934) which assessed frequency of intake of 14 food groups including meat, vegetables and dessert or sweet snacks. Anthropometric (N = 3675) and cardio-metabolic (N = 3477) phenotypes were also collected, including weight, blood pressure, cholesterol, fasting plasma glucose and triglycerides. Frequency of consumption of most food items was found to be largely environmental in origin, with both shared and non-shared environmental influences indicated. Modest genetic influences were observed for some food groups (e.g. fruits and leafy greens). Cardio-metabolic phenotypes showed moderate genetic influences, with some shared environmental influence for body mass index, blood pressure and triglycerides. Overall, shared environmental effects appeared to be more important for both dietary choices and cardio-metabolic phenotypes than in populations from the Global North.
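
For orientation, the classical twin-design logic behind such estimates can be sketched with a crude Falconer-style ACE decomposition from MZ and DZ twin correlations; the study itself used full structural equation model fitting, and the correlations below are illustrative, not COTASS estimates.

```python
def ace_from_twin_correlations(r_mz, r_dz):
    """Falconer-style ACE decomposition from MZ and DZ twin correlations.
    A: additive genetic, C: shared environment, E: non-shared environment.
    A crude approximation of structural-equation twin models; assumes
    equal environments and no dominance."""
    a2 = 2.0 * (r_mz - r_dz)   # heritability
    c2 = 2.0 * r_dz - r_mz     # shared environment
    e2 = 1.0 - r_mz            # non-shared environment (incl. measurement error)
    return {"A": round(a2, 2), "C": round(c2, 2), "E": round(e2, 2)}

# Illustrative correlations (not COTASS estimates) for a food-frequency item
print(ace_from_twin_correlations(r_mz=0.45, r_dz=0.35))
# -> {'A': 0.2, 'C': 0.25, 'E': 0.55}: largely environmental, as reported above
```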


Subject(s)
Diabetes Mellitus , Vascular Diseases , Humans , Sri Lanka/epidemiology , Obesity/genetics , Risk Factors , Triglycerides
9.
Clin Cosmet Investig Dent ; 15: 333-347, 2023.
Article in English | MEDLINE | ID: mdl-38107875

ABSTRACT

Background: Several factors are associated with coronal and root caries in older persons. The purpose of this study was to determine the experience, prevalence, and risk indicators (socioeconomic, sociodemographic, and dental variables) of coronal and root caries in older persons residing in nursing homes in Mexico. Methods: A cross-sectional study was carried out in 227 dentate participants. A convenience sample was used, in which all dentate residents were invited to participate. The dependent variables were coronal caries and root caries, which were determined through an oral clinical examination. The independent variables were sociodemographic factors, location, type of center, surfaces free of dental biofilm and calculus, surfaces with recession, retainers in contact with surfaces with recession, xerostomia, smoking, and previous use of dental services. Binary logistic regression was used in the analysis. Results: The mean age of the participants in this study was 77.7±8.8 years, and 69.2% were women. Moreover, 71.8% lived in long-term care facilities, and 48.0% lived in Mexico City. The prevalence of coronal and root caries was 67.8% and 50.7%, respectively. Being male and living in Mexico City were risk indicators for coronal caries, and with each 1% increase in biofilm-free surfaces, the risk decreased by 2%. Being widowed, having government or no social security, denture retainers, and coronal caries were risk indicators for root caries, while the utilization of dental services indicated lower risk. Conclusion: Several variables of differing nature were found to be risk indicators for coronal and root caries. Coronal caries increases the risk of root caries. Prevention should be aimed at identifying persons at higher risk, and dental care should be improved for persons living in long-term care institutions.

10.
BMC Oral Health ; 23(1): 726, 2023 10 07.
Article in English | MEDLINE | ID: mdl-37805466

ABSTRACT

BACKGROUND: Meeting the oral health needs of the increasing population of older adults presents a major challenge in dental care. Knowledge about the oral health status of the young-elderly age group is essential for the planning of future oral health education and prevention programs. The aims of the present study were therefore to investigate the caries experience among 65-year-olds in Oslo, Norway, and to explore associations between having decayed teeth and sociodemographic, behavioural, and biological factors. METHODS: A random sample of 65-year-olds in Oslo answered a questionnaire and underwent clinical and radiographic examinations (n = 457, 52% men and 48% women) at the Research Clinic, Faculty of Dentistry, University of Oslo, between February and December 2019. Primary and secondary coronal and root caries lesions, root remnants, and missing and restored teeth were recorded. Decayed teeth (DT) were defined as teeth with coronal and root caries lesions that had progressed into dentine, and root remnants, and the DMFT/S scores were calculated. RESULTS: The mean number of teeth was 25 (SD: 4) and the mean DMFT was 19.4 (SD: 4.7). Thirty-seven percent of the individuals had at least one decayed tooth (DT > 0), and the mean number of filled teeth (FT) was 16.1 (SD: 5.4). Multivariable logistic regression analysis showed that male gender (OR: 1.8, 95% CI: 1.2-2.8), basic level of education (OR: 1.9, 95% CI: 1.2-2.9), irregular dental attendance (OR: 2.2, 95% CI: 1.0-4.8), and hyposalivation (OR: 2.1, 95% CI: 1.0-4.4) were significant risk indicators for having decayed teeth (DT > 0) (p < 0.05). CONCLUSIONS: In conclusion, 65-year-olds in Oslo had a low average number of decayed and missing teeth, and a high number of restored teeth. Irregular dental attendance and hyposalivation were the strongest risk indicators for having decayed teeth. Based on the present results, it will be important to ensure access to regular dental care and to increase the emphasis on caries-preventive measures for individuals with hyposalivation in this age group.


Subject(s)
Dental Caries , Root Caries , Xerostomia , Humans , Male , Female , Aged , Dental Caries/epidemiology , Dental Caries/prevention & control , Cross-Sectional Studies , Dental Caries Susceptibility , Norway/epidemiology , DMF Index , Prevalence
11.
ESC Heart Fail ; 10(5): 2895-2902, 2023 Oct.
Article in English | MEDLINE | ID: mdl-37427483

ABSTRACT

AIMS: Early start and patient profile-oriented heart failure (HF) management has been recommended. In this post hoc analysis from the SHIFT trial, we analysed the treatment effects of ivabradine in HF patients with systolic blood pressure (SBP) < 110 mmHg, resting heart rate (RHR) ≥ 75 b.p.m., left ventricular ejection fraction (LVEF) ≤ 25%, New York Heart Association (NYHA) Class III/IV, and their combination. METHODS AND RESULTS: The SHIFT trial enrolled 6505 patients (LVEF ≤ 35% and RHR ≥ 70 b.p.m.), randomized to ivabradine or placebo on the background of guideline-defined standard care. Compared with placebo, ivabradine was associated with a similar relative risk reduction of the primary endpoint (cardiovascular death or HF hospitalization) in patients with SBP < 110 and ≥110 mmHg [hazard ratio (HR) 0.89, 95% confidence interval (CI) 0.74-1.08 vs. HR 0.80, 95% CI 0.72-0.89, P interaction = 0.34], LVEF ≤ 25% and >25% (HR 0.85, 95% CI 0.72-1.01 vs. HR 0.80, 95% CI 0.71-0.90, P interaction = 0.53), and NYHA III-IV and II (HR 0.83, 95% CI 0.74-0.94 vs. HR 0.81, 95% CI 0.69-0.94, P interaction = 0.79). The effect was more pronounced in patients with RHR ≥ 75 compared with <75 (HR 0.76, 95% CI 0.68-0.85 vs. HR 0.97, 95% CI 0.81-1.16, P interaction = 0.02). When these profiling parameters were combined, treatment with ivabradine was also associated with risk reductions comparable to those in patients with low-risk profiles for the primary endpoint (relative risk reduction 29%), cardiovascular death (11%), HF death (49%), and HF hospitalization (38%; all P values for interaction: 0.40). No safety concerns were observed between study groups. CONCLUSIONS: Our analysis shows that RHR reduction with ivabradine is effective and improves clinical outcomes in HF patients across various risk indicators, such as low SBP, high RHR, low LVEF, and high NYHA class, to a similar extent and without safety concerns.
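
Interaction P values like those reported above come from including a treatment-by-subgroup product term in the Cox model. A minimal sketch using the lifelines package, with a hypothetical data file and column names standing in for the SHIFT variables:

```python
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical per-patient data: time (years to first event or censoring),
# event (1 = cardiovascular death or HF hospitalization), ivabradine (1 = active
# treatment), rhr_ge_75 (1 = baseline resting heart rate >= 75 b.p.m.)
df = pd.read_csv("shift_like_data.csv")

cph = CoxPHFitter()
# The p-value of the ivabradine:rhr_ge_75 product term is the interaction test
cph.fit(df, duration_col="time", event_col="event",
        formula="ivabradine * rhr_ge_75")
cph.print_summary()
```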

12.
Health Qual Life Outcomes ; 21(1): 65, 2023 Jul 04.
Article in English | MEDLINE | ID: mdl-37403085

ABSTRACT

BACKGROUND: This study aimed to compare the quality of life (QoL) reported by childhood cancer survivors (CCS) drawn from a cohort of the German Childhood Cancer Registry with a representative general population sample and, within CCS, to test associations between QoL and health behavior, health risk factors, and physical illness. METHODS: CCS (N = 633, age at diagnosis M = 6.34 (SD = 4.38), age at medical assessment M = 34.92 (SD = 5.70)) and a general population sample (age-aligned; N = 975) filled out the EORTC QLQ-C30. Comparisons were performed using General linear models (GLMs) (fixed effects: sex/gender, group (CCS vs. general population); covariates: age, education level). CCS underwent an extensive medical assessment (mean time from diagnosis to assessment was 28.07 (SD = 3.21) years) including an objective diagnosis of health risk factors and physical illnesses (e.g., diabetes and cardiovascular disease). Within CCS, we tested associations between QoL and sociodemographic characteristics, health behavior, health risk factors, and physical illness. RESULTS: CCS, especially female CCS, reported both worse functional QoL and higher symptom burden than the general population. Among CCS, better total QoL was related to younger age, higher level of education, being married, and engaging in active sports. Both health risk factors (dyslipidemia and physical inactivity) and manifest physical illnesses (cardiovascular disease) were associated with lower total QoL. CONCLUSIONS: In all domains, long-term CCS reported worse QoL than the comparison sample. The negative associations with risk factors and physical illnesses indicate an urgent need for long-term surveillance and health promotion.


Subject(s)
Cancer Survivors , Cardiovascular Diseases , Neoplasms , Humans , Child , Female , Neoplasms/epidemiology , Quality of Life , Survivors , Risk Factors
13.
BMC Microbiol ; 23(1): 62, 2023 03 07.
Article in English | MEDLINE | ID: mdl-36882680

ABSTRACT

BACKGROUND: The freshwater microbiome regulates aquatic ecological functionality, nutrient cycling, and pathogenicity, and has the capacity to dissipate and regulate pollutants. Agricultural drainage ditches are ubiquitous in regions where field drainage is necessary for crop productivity and, as such, are first-line receptors of agricultural drainage and runoff. How bacterial communities in these systems respond to environmental and anthropogenic stressors is not well understood. Here, we carried out a three-year study in an agriculturally dominated river basin in eastern Ontario, Canada, to explore the spatial and temporal dynamics of the core and conditionally rare taxa (CRT) of the instream bacterial communities using a 16S rRNA gene amplicon sequencing approach. Water samples were collected from nine stream and drainage ditch sites that represented the influence of a range of upstream land uses. RESULTS: The cross-site core and CRT accounted for 5.6% of the total number of amplicon sequence variants (ASVs), yet represented, on average, over 60% of the heterogeneity of the overall bacterial community, and hence reflected well the spatial and temporal microbial dynamics in the water courses. The contribution of the core microbiome to the overall community heterogeneity reflected community stability across all sampling sites. The CRT were primarily composed of functional taxa involved in nitrogen (N) cycling and were linked to nutrient loading, water levels, and flow, particularly in the smaller agricultural drainage ditches. Both the core and the CRT were sensitive responders to changes in hydrological conditions. CONCLUSIONS: We demonstrate that core and CRT can be considered holistic tools to explore the temporal and spatial variations of the aquatic microbial community and can be used as sensitive indicators of the health and function of agriculturally dominated water courses. This approach also reduces computational complexity relative to analyzing the entire microbial community for such purposes.
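
One commonly used operational way to separate core taxa from conditionally rare taxa in an ASV table is sketched below; the prevalence and abundance thresholds are illustrative choices, not those used in the study.

```python
import pandas as pd

def core_and_crt(asv_table, core_prevalence=0.8, rare_cutoff=0.001, bloom_cutoff=0.01):
    """Split an ASV count table (rows = samples, columns = ASVs) into core and
    conditionally rare taxa (CRT).

    Core: detected in at least `core_prevalence` of samples.
    CRT : below `rare_cutoff` relative abundance in most (>=90%) samples, but
          exceeding `bloom_cutoff` in at least one sample.
    Thresholds here are illustrative, not the study's definitions.
    """
    rel = asv_table.div(asv_table.sum(axis=1), axis=0)   # relative abundances
    prevalence = (asv_table > 0).mean(axis=0)            # fraction of samples with ASV
    core = prevalence[prevalence >= core_prevalence].index

    usually_rare = (rel < rare_cutoff).mean(axis=0) >= 0.9
    sometimes_blooms = (rel > bloom_cutoff).any(axis=0)
    crt_mask = usually_rare & sometimes_blooms
    crt = crt_mask[crt_mask].index
    return core, crt
```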


Subject(s)
Agriculture , Rivers , RNA, Ribosomal, 16S/genetics , Fresh Water , Water
14.
Ther Innov Regul Sci ; 57(4): 839-848, 2023 07.
Article in English | MEDLINE | ID: mdl-36972010

ABSTRACT

Since the release of ICH E6(R2), multiple efforts have been made to interpret the requirements and suggest ways of implementing quality tolerance limits (QTLs) alongside existing risk-based quality management methodologies. While these efforts have contributed positively to developing a common understanding of QTLs, some uncertainty remains regarding implementable approaches. In this article, we review the approaches taken by some leading biopharmaceutical companies, offering recommendations on how to make QTLs most effective and what makes them ineffective, along with several case studies to illustrate these concepts. These include how best to choose QTL parameters and thresholds for a given study, how to differentiate QTLs from key risk indicators, and how QTLs relate to critical-to-quality factors and the statistical design of the trials.


Subject(s)
Biological Products , Quantitative Trait Loci , Risk Management
15.
Pathobiology ; 90(4): 241-250, 2023.
Article in English | MEDLINE | ID: mdl-36724757

ABSTRACT

INTRODUCTION: The present study aimed to analyze the clinical features and laboratory markers of patients with Delta-variant SARS-CoV-2 and explore the role of platelets in predicting the severity of Delta. METHODS: This retrospective, observational study was conducted on 863 patients with laboratory-confirmed Delta-variant SARS-CoV-2. The cases were sub-classified based on disease severity into mild (n = 304), moderate (n = 537), and severe (n = 22). A series of laboratory findings and clinical data were collected and analyzed during hospitalization. RESULTS: Of the 863 hospitalized patients with Delta, the median age was 38 years (interquartile range, 30-51 years) and 471 (54.58%) were male. The most common clinical symptoms included cough, fever, pharyngalgia, expectoration, dyspnea, fatigue, and headache, and the commonest comorbidities were hypertension and diabetes. Among the hematological variables, neutrophil count, red blood cell count, and hemoglobin were found to be statistically significant with regard to subcategories based on disease severity (p < 0.05). Among coagulation parameters, there was a statistically significant difference in D-dimer, fibrinogen, international normalized ratio, and prothrombin time (p < 0.05). Statistically significant differences were also observed in platelet markers, including platelet count, large platelet count, and plateletcrit (p < 0.05). Additionally, platelet markers and other parameters correlated strongly with disease severity. Logistic regression analysis and ROC curves showed that D-dimer was the single best marker of disease severity (p = 0.005, p < 0.0001); however, platelet count (p = 0.009, p = 0.002) and plateletcrit (p = 0.002, p = 0.001) could also predict severe disease. Platelet count was identified as an independent risk factor for severe Delta. CONCLUSION: A low platelet count may be a marker of disease severity in Delta-variant SARS-CoV-2 and may contribute to determining the severity of illness in patients infected with Delta.
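
The single-marker ROC comparison described in the Results can be sketched as follows; the data below are simulated stand-ins (not the study data), used only to show how discrimination of severe vs. non-severe cases would be quantified.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)

# Simulated stand-in cohort: 863 patients, 22 severe (illustrative only)
severe = np.zeros(863, dtype=int)
severe[:22] = 1
d_dimer = rng.lognormal(mean=severe * 0.8, sigma=0.5)    # higher in severe cases
platelet = rng.normal(loc=230 - severe * 60, scale=40)   # lower in severe cases

# Single-marker discrimination of severe vs. non-severe disease
print("D-dimer AUC:", round(roc_auc_score(severe, d_dimer), 3))
print("Platelet AUC:", round(roc_auc_score(severe, -platelet), 3))  # invert: low platelet = risk
```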


Subject(s)
COVID-19 , SARS-CoV-2 , Humans , Male , Adult , Female , COVID-19/diagnosis , Blood Platelets , Retrospective Studies
16.
Clin Oral Implants Res ; 34(5): 463-474, 2023 May.
Article in English | MEDLINE | ID: mdl-36806171

ABSTRACT

OBJECTIVE: This study aims to report the survival rate of dental implants placed in the anterior mandible of partially dentate patients and the potential risk indicators for implant failure. MATERIALS AND METHODS: Patients with single or multiple teeth in the anterior mandible restored with fixed partial implant-supported restorations were evaluated. Patient demographic data, implant placement timing and loading protocol, and biological and/or technical complications at the time of the last clinical and radiographic follow-up visit were registered. Survival rate, success rate, and potential risk indicators for implant failure were calculated. RESULTS: A total of 108 patients and 186 implants with a mean follow-up period of 5.48 years (0.1-11.34 years) were included. The 11.3-year cumulative survival rate was 90.9%. Immediate implant placement (OR = 2.75; p = .08) and immediate implant loading (OR = 8.8; p = .02*) indicated a higher risk of failure than late implant placement or loading. When both categories were combined (type 1A), an OR of 10.59 (p = .04*) for implant failure was found compared to category 4C. Implants placed following static computer-assisted implant surgery (S-CAIS) showed a lower risk of failure than freehand implant placement (OR = 0.18; 95% CI: 0.02-1.37; p = .09). CONCLUSIONS: The survival rate of implants placed in the anterior mandible was relatively low (90.9%). S-CAIS, late placement, and conventional loading are protective factors against implant failure in the anterior mandible.


Subject(s)
Dental Implants , Immediate Dental Implant Loading , Humans , Dental Implantation, Endosseous/methods , Retrospective Studies , Dental Restoration Failure , Dental Prosthesis, Implant-Supported , Dental Prosthesis Design , Immediate Dental Implant Loading/methods , Mandible/surgery , Follow-Up Studies
17.
J Clin Periodontol ; 50 Suppl 26: 77-112, 2023 06.
Article in English | MEDLINE | ID: mdl-36807599

ABSTRACT

AIM: This systematic review and meta-analysis aims to assess the efficacy of risk factor control to prevent the occurrence of peri-implant diseases (PIDs) in adult patients awaiting dental implant rehabilitation (primordial prevention) or in patients with dental implants surrounded by healthy peri-implant tissues (primary prevention). MATERIALS AND METHODS: A literature search was performed without any time limit on different databases up to August 2022. Interventional and observational studies with at least 6 months of follow-up were considered. The occurrence of peri-implant mucositis and/or peri-implantitis was the primary outcome. Pooled data analyses were performed using random-effects models according to the type of risk factor and outcome. RESULTS: Overall, 48 studies were selected. None assessed the efficacy of primordial preventive interventions for PIDs. Indirect evidence on the primary prevention of PIDs indicated that diabetic patients with dental implants and good glycaemic control have a significantly lower risk of peri-implantitis (odds ratio [OR] = 0.16; 95% confidence interval [CI]: 0.03-0.96; I2: 0%) and lower marginal bone level (MBL) changes (MD = -0.36 mm; 95% CI: -0.65 to -0.07; I2: 95%) compared to diabetic patients with poor glycaemic control. Patients attending supportive periodontal/peri-implant care (SPC) regularly have a lower risk of overall PIDs (OR = 0.42; 95% CI: 0.24-0.75; I2: 57%) and peri-implantitis compared to irregular attendees. The risk of dental implant failure (OR = 3.76; 95% CI: 1.50-9.45; I2: 0%) appears to be greater under irregular or no SPC than regular SPC. Implant sites with augmented peri-implant keratinized mucosa (PIKM) show lower peri-implant inflammation (SMD = -1.18; 95% CI: -1.85 to -0.51; I2: 69%) and lower MBL changes (MD = -0.25; 95% CI: -0.45 to -0.05; I2: 62%) compared to dental implants with PIKM deficiency. Studies on smoking cessation and oral hygiene behaviours were inconclusive. CONCLUSIONS: Within the limitations of the available evidence, the present findings indicate that in patients with diabetes, glycaemic control should be promoted to avoid peri-implantitis development. The primary prevention of peri-implantitis should involve regular SPC. PIKM augmentation procedures, where a PIKM deficiency exists, may favour the control of peri-implant inflammation and the stability of MBL. Further studies are needed to assess the impact of smoking cessation and oral hygiene behaviours, as well as the implementation of standardized primordial and primary prevention protocols for PIDs.
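
The random-effects pooling underlying summary estimates such as OR = 0.16 (95% CI 0.03-0.96) can be sketched with a DerSimonian-Laird estimator; this is a generic illustration of the approach, not the review's code, and the inputs are study-level ORs with their confidence limits.

```python
import numpy as np

def pooled_or_random_effects(odds_ratios, ci_lows, ci_highs):
    """DerSimonian-Laird random-effects pooling of study-level odds ratios,
    reconstructing per-study variances from the 95% confidence intervals."""
    y = np.log(np.asarray(odds_ratios, dtype=float))
    se = (np.log(np.asarray(ci_highs, float)) - np.log(np.asarray(ci_lows, float))) / (2 * 1.96)
    w = 1.0 / se**2

    # Fixed-effect estimate and Cochran's Q for between-study heterogeneity
    y_fixed = np.sum(w * y) / np.sum(w)
    q = np.sum(w * (y - y_fixed) ** 2)
    df = len(y) - 1
    tau2 = max(0.0, (q - df) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
    i2 = max(0.0, (q - df) / q) if q > 0 else 0.0  # heterogeneity (I^2)

    # Random-effects weights and pooled estimate on the log-odds scale
    w_star = 1.0 / (se**2 + tau2)
    y_re = np.sum(w_star * y) / np.sum(w_star)
    se_re = np.sqrt(1.0 / np.sum(w_star))
    ci = np.exp([y_re - 1.96 * se_re, y_re + 1.96 * se_re])
    return np.exp(y_re), ci, i2
```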


Subject(s)
Dental Implants , Diabetes Mellitus , Peri-Implantitis , Stomatitis , Adult , Humans , Peri-Implantitis/prevention & control , Peri-Implantitis/epidemiology , Dental Implants/adverse effects , Stomatitis/epidemiology , Inflammation , Primary Prevention
18.
Crim Behav Ment Health ; 33(1): 62-71, 2023 Feb.
Article in English | MEDLINE | ID: mdl-36715447

ABSTRACT

BACKGROUND: Research shows that the prevalence of substance use disorders among the prison population is high globally. Although prisons are highly controlled environments, access to drugs and other substances in prison remains a major problem. Yet previous research has focused mainly on the Western context, with studies generally reporting lifetime prevalence without reference to whether the disorders are manifest even within the controlled environment. AIMS: To estimate the prevalence of substance use disorders evident while in prison in Ghana and the associated risk indicators. For these purposes, substance use disorder was defined by any indication of dependency, escalating use, or socially problematic use during the 12 months of imprisonment prior to the interview. METHODS: The study involved 500 adults (443 men and 57 women) in a medium-security prison in Ghana who had served at least 1 year of a prison sentence. Participants' alcohol use disorder was assessed separately from other substance use disorders, which included cannabis, cocaine and other stimulants, using the Mini International Neuropsychiatric Interview (MINI), a structured interview and diagnostic tool for major psychiatric and substance use disorders in DSM-5 and ICD-10. RESULTS: Two percent of the 500 participants had used alcohol to the level of an alcohol use disorder, and 6% had other substance use disorders in the 12 months prior to interview and while in prison. Cannabis (4%) and stimulants (3%) were the most frequently reported substance use disorders. Logistic regression model estimates indicated that younger age, prior offending and alcohol dependence were significantly associated with such disorders in prison. CONCLUSION: In spite of efforts to prevent substance use in prison, nearly one in 10 of these prisoners were using alcohol or illicit drugs to a level indicative of a substance use disorder. Our findings suggest that prioritising brief assessment may help identify those in most need of clinical help to limit their alcohol and illicit substance use problems.


Subject(s)
Alcoholism , Illicit Drugs , Prisoners , Substance-Related Disorders , Male , Adult , Humans , Female , Prisons , Alcoholism/epidemiology , Prevalence , Substance-Related Disorders/epidemiology , Prisoners/psychology , Ethanol
19.
Environ Sci Technol ; 57(1): 852-861, 2023 01 10.
Article in English | MEDLINE | ID: mdl-36548198

ABSTRACT

Expressing temporal changes in the use of pesticides based not only on amounts (masses) but also on their toxicity for different species groups has been proposed as a sensible approach for evaluating potential environmental risks. Here, we calculated the total applied toxicity (TAT) between 1995 and 2019 for Germany, mapped it, and compared it to the US TAT and other risk indicators. Results show that the German TAT for terrestrial vertebrates decreased over time by about 20%. The TAT increased by a factor of three for fishes, largely due to insecticides, by a factor of two for soil organisms, largely due to fungicides and insecticides, and, to a lesser extent, for terrestrial plants, solely due to herbicides. Other species groups showed no trends in TAT, which for pollinators likely results from restrictions on neonicotinoid use. Many TAT trends for Germany and the US differ, partly due to different insecticide and fungicide uses. TAT, SYNOPS risk indicators, and the EU Harmonized Risk Indicators, currently used to assess the German National Action Plan's goal of reducing risks by 30% by 2023, lead to clearly different risk perceptions. Validated approaches are needed for evaluating risk quantifications at the national scale.
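
The general form of an applied-toxicity indicator of this kind weights each substance's applied mass by the inverse of a species-group toxicity threshold and sums across substances; the sketch below is illustrative only, and the example masses and thresholds are invented, not values from the study.

```python
def total_applied_toxicity(applications, toxicity_thresholds):
    """Applied-toxicity total for one species group and year: each pesticide's
    applied mass is divided by its toxicity threshold for that group, then
    summed. Illustrative form of the indicator; the study's exact endpoints,
    units, and data sources are not reproduced here."""
    tat = 0.0
    for pesticide, mass_kg in applications.items():
        threshold = toxicity_thresholds.get(pesticide)
        if threshold is None:       # skip substances without a toxicity value
            continue
        tat += mass_kg / threshold  # lower threshold (more toxic) => higher weight
    return tat

# Hypothetical example for fish: masses in kg, thresholds in mg/L (LC50-like)
applications = {"insecticide_A": 1200.0, "fungicide_B": 5400.0}
thresholds = {"insecticide_A": 0.002, "fungicide_B": 1.5}
print(total_applied_toxicity(applications, thresholds))  # -> 603600.0
```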


Subject(s)
Fungicides, Industrial , Insecticides , Pesticides , Animals , Environmental Monitoring/methods , Pesticides/toxicity , Pesticides/analysis , Agriculture/methods , Fungicides, Industrial/toxicity
20.
Ther Innov Regul Sci ; 57(2): 295-303, 2023 03.
Article in English | MEDLINE | ID: mdl-36269551

ABSTRACT

BACKGROUND: Central monitoring, which typically includes the use of key risk indicators (KRIs), aims at improving the quality of clinical research by proactively identifying and remediating emerging issues in the conduct of a clinical trial that may have an adverse impact on patient safety and/or the reliability of trial results. However, there has to date been a relative lack of direct quantitative evidence published supporting the claim that central monitoring actually leads to improved quality. MATERIAL AND METHODS: Nine commonly used KRIs were analyzed for evidence of quality improvement using data retrieved from a large central monitoring platform. A total of 212 studies comprising 1676 sites with KRI signals were used in the analysis, representing central monitoring activity from 23 different sponsor organizations. Two quality improvement metrics were assessed for each KRI, one based on a statistical score (p-value) and the other based on the KRI's observed value. RESULTS: Both KRI quality metrics showed improvement in the vast majority of sites (82.9% for the statistical score, 81.1% for the observed KRI value). Additionally, among the sites showing improvement, the statistical score and the observed KRI value improved by 66.1% and 72.4%, respectively, on average towards the study average. CONCLUSION: The results of this analysis provide clear quantitative evidence supporting the hypothesis that the use of KRIs in central monitoring leads to improved quality in clinical trial conduct and associated data across participating sites.
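
The two quality-improvement metrics can be sketched as follows, assuming a rate-type KRI: a statistical score from an exact binomial test of a site's rate against the pooled study rate, and an observed-value metric measuring how much of the gap to the study average is closed between evaluations. The function names and example numbers are assumptions for illustration, not the platform's implementation.

```python
from scipy import stats

def kri_statistical_score(site_events, site_exposure, study_rate):
    """p-value from an exact binomial test of a site's KRI rate against the
    pooled study rate: small values flag sites that deviate from the study."""
    test = stats.binomtest(site_events, site_exposure, study_rate,
                           alternative="two-sided")
    return test.pvalue

def improvement_toward_study_average(before, after, study_average):
    """Fraction of the initial gap to the study average closed between two
    consecutive evaluations of an observed KRI value."""
    gap_before = abs(before - study_average)
    gap_after = abs(after - study_average)
    if gap_before == 0:
        return 0.0
    return (gap_before - gap_after) / gap_before

# Hypothetical site: AE reporting rate 0.02/visit vs. study average 0.08/visit
print(kri_statistical_score(site_events=3, site_exposure=150, study_rate=0.08))
print(improvement_toward_study_average(before=0.02, after=0.06,
                                       study_average=0.08))  # ~0.67 of gap closed
```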


Subject(s)
Benchmarking , Humans , Reproducibility of Results