Results 1 - 20 of 32
2.
Clin Pharmacol Ther ; 114(3): 604-613, 2023 09.
Article in English | MEDLINE | ID: mdl-37342987

ABSTRACT

During the coronavirus disease 2019 (COVID-19) pandemic, the urgency for updated evidence to inform public health and clinical care placed systematic literature reviews (SLRs) at the cornerstone of research. We aimed to summarize evidence on prognostic factors for COVID-19 outcomes through published SLRs and to critically assess quality elements in the findings' interpretation. An umbrella review was conducted via electronic databases from January 2020 to April 2022. All SLRs (and meta-analyses) in English were considered. Data screening and extraction were conducted by two independent reviewers. The AMSTAR 2 tool was used to assess SLR quality. The study was registered with PROSPERO (CRD4202232576). Out of 4,564 publications, 171 SLRs were included, of which 3 were umbrella reviews. Our primary analysis included 35 SLRs published in 2022, which incorporated studies since the beginning of the pandemic. Consistent findings showed that, for adults, older age, obesity, heart disease, diabetes, and cancer were more strongly predictive of risk of hospitalization, intensive care unit admission, and mortality due to COVID-19. Male sex was associated with a higher risk of short-term adverse outcomes, but female sex was associated with a higher risk of long COVID. For children, socioeconomic determinants that may unravel COVID-19 disparities were rarely reported. This review highlights key prognostic factors of COVID-19, which can help clinicians and health officers identify high-risk groups for optimal care. Findings can also help optimize confounding adjustment and patient phenotyping in comparative effectiveness research. A living SLR approach may facilitate dissemination of new findings. This paper is endorsed by the International Society for Pharmacoepidemiology.


Subject(s)
COVID-19 , Adult , Child , Humans , Male , Female , Post-Acute COVID-19 Syndrome , Pharmacoepidemiology , Prognosis , Hospitalization
3.
Int J Epidemiol ; 51(6): 1847-1861, 2022 12 13.
Article in English | MEDLINE | ID: mdl-36172959

ABSTRACT

BACKGROUND: Cardiovascular disease (CVD) has a disproportionate effect on mortality among the poorest people. We assessed the impact on CVD and all-cause mortality of the world's largest conditional cash transfer, Brazil's Bolsa Família Programme (BFP). METHODS: We linked administrative data from the 100 Million Brazilian Cohort with BFP receipt and national mortality data. We followed individuals who applied for BFP between 1 January 2011 and 31 December 2015, until 31 December 2015. We used marginal structural models to estimate the effect of BFP on all-age and premature (30-69 years) CVD and all-cause mortality. We conducted stratified analyses by levels of material deprivation and access to healthcare. We checked the robustness of our findings by restricting the analysis to municipalities with better mortality data and by using alternative statistical methods. RESULTS: We studied 17 981 582 individuals, of whom 4 855 324 were aged 30-69 years. Three-quarters (76.2%) received BFP, with a mean follow-up post-award of 2.6 years. We detected 106 807 deaths by all causes, of which 60 893 were premature; and 23 389 CVD deaths, of which 15 292 were premature. BFP was associated with reductions in premature all-cause mortality [hazard ratio (HR) = 0.96, 95% CI = 0.94-0.98], premature CVD (HR = 0.96, 95% CI = 0.92-1.00) and all-age CVD (HR = 0.96, 95% CI = 0.93-1.00) but not all-age all-cause mortality (HR = 1.00, 95% CI = 0.98-1.02). In stratified and robustness analyses, BFP was consistently associated with mortality reductions for individuals living in the two most deprived quintiles. CONCLUSIONS: BFP appears to have a small to null effect on premature CVD and all-cause mortality in the short term; the long-term impact remains unknown.


Subject(s)
Cardiovascular Diseases , Poverty , Humans , Brazil/epidemiology
4.
PLoS One ; 17(5): e0268500, 2022.
Article in English | MEDLINE | ID: mdl-35604890

ABSTRACT

BACKGROUND: Conditional Cash Transfer Programs have been developed in Latin America in response to poverty and marked social inequalities on the continent. In Brazil, the Bolsa Familia Program (BFP) was implemented to alleviate poverty and improve living conditions, health, and education for socioeconomically vulnerable populations. However, the effect of this intervention on maternal and child health is not well understood. METHODS: We will evaluate the effect of BFP on maternal and child outcomes: 1. Birth weight; 2. Preterm birth; 3. Maternal mortality; and 4. Child growth. Dynamic retrospective cohort data from the 100 Million Brazilian Cohort (2001 to 2015) will be linked to three different databases: Live Birth Information System (2004 to 2015); Mortality Information System (2011 to 2015); and Food and Nutritional Surveillance System (2008 to 2017). The definition of exposure to the BFP varies according to the outcome studied. Those who never received the benefit until the outcome or until the end of the follow-up will be defined as not exposed. The effects of BFP on maternal and child outcomes will be estimated by a combination of propensity score-based methods and weighted logistic regressions. The analyses will be further stratified to reflect changes in the benefit entitlement before and after 2012. DISCUSSION: Harnessing a large linked administrative cohort allows us to assess the effect of the BFP on maternal and child health, while considering a wide range of explanatory and confounding variables.


Subject(s)
Child Health , Premature Birth , Brazil/epidemiology , Child , Female , Humans , Infant, Newborn , Poverty , Retrospective Studies
5.
Trials ; 23(1): 209, 2022 Mar 12.
Article in English | MEDLINE | ID: mdl-35279215

ABSTRACT

BACKGROUND: Female sex workers (FSW) in sub-Saharan Africa are disproportionately affected by HIV and are critical to engage in HIV prevention, testing and care services. We describe the design of our evaluation of the 'AMETHIST' intervention, nested within a nationally scaled programme for FSW in Zimbabwe. We hypothesise that the implementation of this intervention will result in a reduction in the risk of HIV transmission within sex work. METHODS: The AMETHIST intervention (Adapted Microplanning to Eliminate Transmission of HIV in Sex Transactions) is a risk-differentiated intervention for FSW, centred around the implementation of microplanning and self-help groups. It is designed to support uptake of, and adherence to, HIV prevention, testing and treatment behaviours among FSW. Twenty-two towns in Zimbabwe were randomised to receive either the Sisters programme (usual care) or the Sisters programme plus AMETHIST. The composite primary outcome is defined as the proportion of all FSW who are at risk of either HIV acquisition (HIV-negative and not fully protected by prevention interventions) or of HIV transmission (HIV-positive, not virally suppressed and not practising consistent condom use). The outcome will be assessed after 2 years of intervention delivery in a respondent-driven sampling survey (total n = 4400; n = 200 FSW recruited at each site). Primary analysis will use the 'RDS-II' method to estimate cluster summaries and will adapt Hayes and Moulton's '2-step' method to produce adjusted effect estimates. An in-depth process evaluation guided by our project trajectory will be undertaken. DISCUSSION: Innovative pragmatic trials are needed to generate evidence on the effectiveness of combination interventions in HIV prevention and treatment in different contexts. We describe the design and analysis of such a study. TRIAL REGISTRATION: Pan African Clinical Trials Registry PACTR202007818077777. Registered on 2 July 2020.
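The 'RDS-II' estimator named in the analysis plan weights each respondent by the inverse of her reported network size (degree). A minimal sketch, using entirely hypothetical outcome indicators and degrees rather than any trial data:

```python
# RDS-II (Volz-Heckathorn) weighted prevalence: weight each respondent
# by 1/degree, since higher-degree individuals are over-sampled in
# respondent-driven sampling.

def rds_ii(outcomes, degrees):
    """outcomes: 0/1 indicators; degrees: reported network sizes (> 0)."""
    weights = [1.0 / d for d in degrees]
    return sum(w * y for w, y in zip(weights, outcomes)) / sum(weights)

outcomes = [1, 0, 1, 0]   # hypothetical "at risk" indicators
degrees = [2, 4, 8, 4]    # hypothetical network sizes
estimate = rds_ii(outcomes, degrees)
```

Note how the low-degree respondent with outcome 1 pulls the weighted estimate above the crude 50%.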


Subject(s)
HIV Infections , Sex Workers , Female , HIV Infections/diagnosis , HIV Infections/epidemiology , HIV Infections/prevention & control , Health Services , Humans , Randomized Controlled Trials as Topic , Safe Sex , Zimbabwe/epidemiology
6.
AIDS ; 36(8): 1141-1150, 2022 07 01.
Article in English | MEDLINE | ID: mdl-35170527

ABSTRACT

OBJECTIVES: To estimate HIV incidence among female sex workers (FSW) in Zimbabwe using HIV prevalence by age and number of years since started selling sex (YSSS). DESIGN: We pooled data from FSW aged 18-39 participating in respondent-driven sampling surveys conducted in Zimbabwe between 2011 and 2017. METHODS: For each year of age, we estimated HIV prevalence (Pt) and the change in HIV prevalence from the previous age (Pt - Pt-1). We then estimated the rate of new HIV infections during that year of age, It = (Pt - Pt-1)/(1 - Pt-1), and calculated HIV incidence for 18-24 and 25-39 year-olds separately as the weighted average of It. We estimated HIV incidence for FSW 1-5 years and 6-15 years since first selling sex using the same approach, and compared HIV prevalence among FSW first selling sex at their current age with the general population. RESULTS: Among 9906 women, 50.2% were HIV positive. Based on HIV prevalence increases by age, we estimated an HIV incidence of 6.3/100 person-years at risk (pyar) (95% confidence interval [CI] 5.3, 7.6) among 18-24 year-olds, and 3.3/100 pyar (95% CI 1.3, 4.2) among 25-39 year-olds. Based on prevalence increases by YSSS, HIV incidence was 5.3/100 pyar (95% CI 4.3, 8.5) between 1 and 5 years since first selling sex, and 2.1/100 pyar (95% CI -1.3, 7.2) between 6 and 15 years. CONCLUSIONS: Our analysis is consistent with very high HIV incidence among FSW in Zimbabwe, especially among those who are young and recently started selling sex. There is a critical need to engage young entrants into sex work in interventions that reduce their HIV risk.
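The prevalence-difference estimator described in the METHODS can be sketched as follows; the prevalence values and per-age weights below are illustrative inputs, not the study's data:

```python
# It = (Pt - Pt-1) / (1 - Pt-1): among those still HIV-negative at the
# previous age, the fraction newly infected during the year of age t.

def incidence_by_age(prevalence):
    """prevalence: {age: HIV prevalence}. Returns {age: estimated rate}."""
    ages = sorted(prevalence)
    rates = {}
    for prev_age, age in zip(ages, ages[1:]):
        p_prev, p_now = prevalence[prev_age], prevalence[age]
        rates[age] = (p_now - p_prev) / (1 - p_prev)
    return rates

def weighted_average(rates, weights):
    """Weighted average of per-age rates, e.g. weighted by sample size."""
    total_w = sum(weights[a] for a in rates)
    return sum(rates[a] * weights[a] for a in rates) / total_w

prevalence = {18: 0.20, 19: 0.25, 20: 0.30}   # hypothetical prevalence by age
weights = {19: 100, 20: 80}                    # hypothetical n per age
rates = incidence_by_age(prevalence)
```

The same functions apply unchanged when the x-axis is years since first selling sex instead of age.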


Subject(s)
HIV Infections , Sex Workers , Child , Female , HIV Infections/epidemiology , Humans , Incidence , Prevalence , Zimbabwe/epidemiology
7.
BMJ Evid Based Med ; 27(2): 109-119, 2022 04.
Article in English | MEDLINE | ID: mdl-33298465

ABSTRACT

INTRODUCTION: High-quality randomised controlled trials (RCTs) provide the most reliable evidence on the comparative efficacy of new medicines. However, non-randomised studies (NRS) are increasingly recognised as a source of insights into the real-world performance of novel therapeutic products, particularly when traditional RCTs are impractical or lack generalisability. This means there is a growing need for synthesising evidence from RCTs and NRS in healthcare decision making, particularly given recent developments such as innovative study designs, digital technologies and linked databases across countries. Crucially, however, no formal framework exists to guide the integration of these data types. OBJECTIVES AND METHODS: To address this gap, we used a mixed methods approach (review of existing guidance, methodological papers, Delphi survey) to develop guidance for researchers and healthcare decision-makers on when and how best to combine evidence from NRS and RCTs to improve transparency and build confidence in the resulting summary effect estimates. RESULTS: Our framework comprises seven steps guiding the integration and interpretation of evidence from NRS and RCTs, and we offer recommendations on the most appropriate statistical approaches based on three main analytical scenarios in healthcare decision making (specifically, 'high-bar evidence' when RCTs are the preferred source of evidence, and 'medium' and 'low' when NRS are the main source of inference). CONCLUSION: Our framework augments existing guidance on assessing the quality of NRS and their compatibility with RCTs for evidence synthesis, while also highlighting potential challenges in implementing it. This manuscript received endorsement from the International Society for Pharmacoepidemiology.


Subject(s)
Delivery of Health Care , Research Design , Decision Making , Humans , Randomized Controlled Trials as Topic
9.
Health Technol Assess ; 25(66): 1-126, 2021 11.
Article in English | MEDLINE | ID: mdl-34812138

ABSTRACT

BACKGROUND: Although routine NHS data potentially include all patients, confounding limits their use for causal inference. Methods to minimise confounding in observational studies of implantable devices are required to enable the evaluation of patients with severe systemic morbidity who are excluded from many randomised controlled trials. OBJECTIVES: Stage 1 - replicate the Total or Partial Knee Arthroplasty Trial (TOPKAT), a surgical randomised controlled trial comparing unicompartmental knee replacement with total knee replacement, using propensity score and instrumental variable methods. Stage 2 - compare the risks, benefits and cost-effectiveness of unicompartmental knee replacement with those of total knee replacement surgery in patients with severe systemic morbidity who would have been ineligible for TOPKAT, using the validated methods from stage 1. DESIGN: This was a cohort study. SETTING: Data were obtained from the National Joint Registry database and linked to hospital inpatient (Hospital Episode Statistics) and patient-reported outcome data. PARTICIPANTS: Stage 1 - people undergoing unicompartmental knee replacement surgery or total knee replacement surgery who met the TOPKAT eligibility criteria. Stage 2 - participants with an American Society of Anesthesiologists grade of ≥ 3. INTERVENTION: The patients were exposed to either unicompartmental knee replacement surgery or total knee replacement surgery. MAIN OUTCOME MEASURES: The primary outcome measure was the postoperative Oxford Knee Score. The secondary outcome measures were 90-day postoperative complications (venous thromboembolism, myocardial infarction and prosthetic joint infection) and 5-year revision risk and mortality. The main outcome measures for the health economic analysis were health-related quality of life (EuroQol-5 Dimensions) and NHS hospital costs. RESULTS: In stage 1, propensity score stratification and inverse probability weighting replicated the results of TOPKAT. 
Propensity score adjustment, propensity score matching and instrumental variables did not. Stage 2 included 2256 unicompartmental knee replacement patients and 57,682 total knee replacement patients who had severe comorbidities, of whom 145 and 23,344 had linked Oxford Knee Scores, respectively. A statistically significant but clinically irrelevant difference favouring unicompartmental knee replacement was observed, with a mean postoperative Oxford Knee Score difference of < 2 points using propensity score stratification; no significant difference was observed using inverse probability weighting. Unicompartmental knee replacement more than halved the risk of venous thromboembolism [relative risk 0.33 (95% confidence interval 0.15 to 0.74) using propensity score stratification; relative risk 0.39 (95% confidence interval 0.16 to 0.96) using inverse probability weighting]. Unicompartmental knee replacement was not associated with myocardial infarction or prosthetic joint infection using either method. In the long term, unicompartmental knee replacement had double the revision risk of total knee replacement [hazard ratio 2.70 (95% confidence interval 2.15 to 3.38) using propensity score stratification; hazard ratio 2.60 (95% confidence interval 1.94 to 3.47) using inverse probability weighting], but half of the mortality [hazard ratio 0.52 (95% confidence interval 0.36 to 0.74) using propensity score stratification; insignificant effect using inverse probability weighting]. Unicompartmental knee replacement had lower costs and higher quality-adjusted life-year gains than total knee replacement for stage 2 participants. LIMITATIONS: Although some propensity score methods successfully replicated TOPKAT, unresolved confounding may have affected stage 2. Missing Oxford Knee Scores may have led to information bias. 
CONCLUSIONS: Propensity score stratification and inverse probability weighting successfully replicated TOPKAT, implying that some (but not all) propensity score methods can be used to evaluate surgical innovations and implantable medical devices using routine NHS data. Unicompartmental knee replacement was safer and more cost-effective than total knee replacement for patients with severe comorbidity and should be considered the first option for suitable patients. FUTURE WORK: Further research is required to understand the performance of propensity score methods for evaluating surgical innovations and implantable devices. TRIAL REGISTRATION: This trial is registered as EUPAS17435. FUNDING: This project was funded by the National Institute for Health Research (NIHR) Health Technology Assessment programme and will be published in full in Health Technology Assessment; Vol. 25, No. 66. See the NIHR Journals Library website for further project information.
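Inverse probability weighting, one of the two propensity score methods that replicated TOPKAT above, reweights each patient by the inverse probability of the treatment actually received so that the weighted groups resemble each other. A minimal sketch with hypothetical outcomes and propensity scores (not trial data):

```python
# Inverse probability weighting (IPW): treated patients get weight 1/ps,
# controls get 1/(1 - ps), where ps = estimated P(treatment | covariates).
# The weighted difference in means estimates the average treatment effect.

def ipw_mean_difference(records):
    """records: list of (treated: bool, outcome: float, ps: float).
    Returns the IPW-weighted mean outcome difference (treated - control)."""
    num_t = den_t = num_c = den_c = 0.0
    for treated, outcome, ps in records:
        if treated:
            w = 1.0 / ps          # up-weight treated patients unlikely to be treated
            num_t += w * outcome
            den_t += w
        else:
            w = 1.0 / (1.0 - ps)  # up-weight controls likely to be treated
            num_c += w * outcome
            den_c += w
    return num_t / den_t - num_c / den_c

# Hypothetical Oxford Knee Score-like outcomes and propensity scores:
records = [
    (True, 40.0, 0.8), (True, 38.0, 0.6),
    (False, 36.0, 0.8), (False, 37.0, 0.4),
]
effect = ipw_mean_difference(records)
```

In practice the propensity scores come from a fitted model (e.g. logistic regression on the registry covariates), and extreme scores are often trimmed to stabilise the weights.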


We compared the risks and benefits of partial and total knee replacements in NHS patients with a complex medical history who would normally be excluded from randomised trials on this topic. We used information that was collected during hospital appointments for people who had a knee replacement between 2009 and 2016. It is difficult to directly compare the two groups because each individual patient has a different medical history. We tested advanced statistical methods to account for these differences. In stage 1, we showed that some of these advanced statistical methods could replicate the results of a recently published surgical trial using routine data from the NHS. We compared patients in the trial with similar patients who were operated on in the NHS. Three of the proposed methods showed results similar to those obtained from the Total or Partial Knee Arthroplasty Trial (TOPKAT). In stage 2, we used the successful methods from stage 1 to study the risks, benefits and costs of partial and total knee replacement surgery in patients with complex medical histories. Two of the statistical methods found that patients who had a partial knee replacement had less self-reported pain and better function after surgery than patients who had a total knee replacement. All three methods found that partial knee replacement was safer, was associated with a lower risk of blood clots (a known complication of knee surgery) and had lower mortality over 5 years. However, patients who had a partial knee replacement were twice as likely as those with a total knee replacement to need a second surgery within 5 years. We found that partial knee replacements were less costly to the NHS and were associated with better overall quality of life for patients than total knee replacement.


Subject(s)
Arthroplasty, Replacement, Knee , Cohort Studies , Cost-Benefit Analysis , Humans , Propensity Score , Quality of Life , Quality-Adjusted Life Years
11.
Health Technol Assess ; 25(17): 1-106, 2021 03.
Article in English | MEDLINE | ID: mdl-33739919

ABSTRACT

BACKGROUND: Bisphosphonates are contraindicated in patients with stage 4+ chronic kidney disease. However, they are widely used to prevent fragility fractures in stage 3 chronic kidney disease, despite a lack of good-quality data on their effects. OBJECTIVES: The aims of each work package were as follows. Work package 1: to study the relationship between bisphosphonate use and chronic kidney disease progression. Work package 2: to study the association between using bisphosphonates and fracture risk. Work package 3: to determine the risks of hypocalcaemia, hypophosphataemia, acute kidney injury and upper gastrointestinal events associated with using bisphosphonates. Work package 4: to investigate the association between using bisphosphonates and changes in bone mineral density over time. DESIGN: This was a new-user cohort study design with propensity score matching. SETTING AND DATA SOURCES: Data were obtained from UK NHS primary care (Clinical Practice Research Datalink GOLD database) and linked hospital inpatient records (Hospital Episode Statistics) for work packages 1-3 and from the Danish Odense University Hospital Databases for work package 4. PARTICIPANTS: Patients registered in the data sources who had at least one measurement of estimated glomerular filtration rate of < 45 ml/minute/1.73 m2 were eligible. A second estimated glomerular filtration rate value of < 45 ml/minute/1.73 m2 within 1 year after the first was requested for work packages 1 and 3. Patients with no Hospital Episode Statistics linkage were excluded from work packages 1-3. Patients with < 1 year of run-in data before index estimated glomerular filtration rate and previous users of anti-osteoporosis medications were excluded from work packages 1-4. INTERVENTIONS/EXPOSURE: Bisphosphonate use, identified from primary care prescriptions (for work packages 1-3) or pharmacy dispensations (for work package 4), was the main exposure. 
MAIN OUTCOME MEASURES: Work package 1: chronic kidney disease progression, defined as stage worsening or starting renal replacement. Work package 2: hip fracture. Work package 3: acute kidney injury, hypocalcaemia and hypophosphataemia identified from Hospital Episode Statistics, and gastrointestinal events identified from Clinical Practice Research Datalink or Hospital Episode Statistics. Work package 4: annualised femoral neck bone mineral density percentage change. RESULTS: Bisphosphonate use was associated with an excess risk of chronic kidney disease progression (subdistribution hazard ratio 1.12, 95% confidence interval 1.02 to 1.24) in work package 1, but did not increase the probability of other safety outcomes in work package 3. The results from work package 2 suggested that bisphosphonate use increased fracture risk (hazard ratio 1.25, 95% confidence interval 1.13 to 1.39) for hip fractures, but sensitivity analyses suggested that this was related to unresolved confounding. Conversely, work package 4 suggested that bisphosphonates improved bone mineral density, with an average 2.65% (95% confidence interval 1.32% to 3.99%) greater gain in femoral neck bone mineral density per year in bisphosphonate users than in matched non-users. LIMITATIONS: Confounding by indication was a concern for the clinical effectiveness (i.e. work package 2) data. Bias analyses suggested that these findings were due to inappropriate adjustment for pre-treatment risk. Work packages 3 and 4 were based on small numbers of events and participants, respectively. CONCLUSIONS: Bisphosphonates were associated with a 12% excess risk of chronic kidney disease progression in participants with stage 3B+ chronic kidney disease. No other safety concerns were identified. Bisphosphonate therapy increased bone mineral density, but the research team failed to demonstrate antifracture effectiveness. 
FUTURE WORK: Randomised controlled trial data are needed to demonstrate antifracture efficacy in patients with stage 3B+ chronic kidney disease. More safety analyses are needed to characterise the renal toxicity of bisphosphonates in stage 3A chronic kidney disease, possibly using observational data. STUDY REGISTRATION: This study is registered as EUPAS10029. FUNDING: This project was funded by the National Institute for Health Research (NIHR) Health Technology Assessment programme and will be published in full in Health Technology Assessment; Vol. 25, No. 17. See the NIHR Journals Library website for further project information. The project was also supported by the National Institute for Health Research Biomedical Research Centre, Oxford.
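The new-user, propensity-score-matched design used across these work packages pairs each treated patient with an untreated patient who has a similar propensity score. A greedy 1:1 nearest-neighbour sketch; the scores, caliper and matching rule are illustrative assumptions, not the study's exact specification:

```python
# Greedy 1:1 nearest-neighbour propensity score matching without
# replacement, with a caliper on the maximum allowed score distance.

def match(treated_ps, control_ps, caliper=0.05):
    """Returns list of (treated_index, control_index) pairs."""
    available = dict(enumerate(control_ps))   # controls still unmatched
    pairs = []
    for i, ps in enumerate(treated_ps):
        if not available:
            break
        # Closest remaining control on the propensity score scale.
        j = min(available, key=lambda k: abs(available[k] - ps))
        if abs(available[j] - ps) <= caliper:
            pairs.append((i, j))
            del available[j]                  # match without replacement
    return pairs

treated = [0.30, 0.70]          # hypothetical propensity scores
controls = [0.28, 0.50, 0.69]
pairs = match(treated, controls)
```

Production implementations typically match on the logit of the score, randomise the order of treated patients, and check post-match covariate balance (e.g. standardised mean differences).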


RATIONALE AND AIMS: Bisphosphonates are used to prevent fractures in people with fragile bones. People with chronic kidney disease have a high risk of fracture, but the safety and effectiveness of bisphosphonates in severe chronic kidney disease are unclear. The aim of this study was to assess the benefits (e.g. bone strength improvement and fracture prevention) and the risks of unwanted effects associated with bisphosphonates for people with moderate to severe chronic kidney disease. METHODS: Anonymised primary and secondary care electronic medical records data from the UK NHS were used, as well as a Danish equivalent that included bone density scans. Anyone in these databases with a measure of reduced kidney function that suggested moderate to severe chronic kidney disease was eligible: more than 220,000 people in the UK. Over 20,000 of them used bisphosphonates. Bisphosphonate users were matched to non-users with similar age, sex and other characteristics. RESULTS: Bisphosphonate users had a 12% higher risk than non-users of their chronic kidney disease getting worse. Their risks of other side effects, such as acute kidney injuries and gastrointestinal problems, did not change. Bisphosphonate users had a 25% higher risk of fractures than non-users in the UK database, probably because the matching methods did not create similar-enough groups of users and non-users. However, bisphosphonates were found to improve bone density in the Danish database. Bone density is a proxy for bone strength, so better bone density should mean fewer fractures. CONCLUSIONS: These results suggest that bisphosphonate therapy may make moderate to severe chronic kidney disease worse. More studies are needed on how bisphosphonates affect milder chronic kidney disease. Bisphosphonates were associated with better bone strength, but it could not be demonstrated that they reduced fracture risk. 
More data are required, probably from a placebo-controlled trial, to determine whether or not bisphosphonates prevent fractures in people with moderate to severe chronic kidney disease and whether or not this is worth the risk of their chronic kidney disease worsening.


Subject(s)
Fractures, Bone , Renal Insufficiency, Chronic , Cohort Studies , Diphosphonates/adverse effects , Fractures, Bone/epidemiology , Humans , Propensity Score , Renal Insufficiency, Chronic/complications , Renal Insufficiency, Chronic/epidemiology
12.
J Bone Miner Res ; 36(5): 820-832, 2021 05.
Article in English | MEDLINE | ID: mdl-33373491

ABSTRACT

Bisphosphonates are the first-line treatment for preventing fractures in osteoporosis patients. However, their use is contraindicated, or recommended only with caution, in chronic kidney disease (CKD) patients, primarily because of a lack of information about their safety and effectiveness. We aimed to investigate the safety of oral bisphosphonates in patients with moderate to severe CKD, using primary-care electronic records from two cohorts, CPRD GOLD (1997-2016) and SIDIAP (2007-2015), in the UK and Catalonia, respectively. Both databases were linked to hospital records. SIDIAP was also linked to end-stage renal disease registry data. Patients with CKD stages 3b to 5, based on two or more estimated glomerular filtration rate measurements less than 45 mL/min/1.73 m2, aged 40 years or older were identified. New bisphosphonate users were propensity score-matched with up to five non-users to minimize confounding within this population. Our primary outcome was CKD stage worsening (estimated glomerular filtration rate [eGFR] decline or renal replacement therapy). Secondary outcomes were acute kidney injury, gastrointestinal bleeding/ulcers, and severe hypocalcemia. Hazard ratios (HRs) were estimated using Cox regression, and Fine and Gray sub-HRs were calculated for competing risks. We matched 2447 bisphosphonate users with 8931 non-users from CPRD and 1399 users with 6547 non-users from SIDIAP. Bisphosphonate use was associated with greater risk of CKD progression in CPRD (sub-HR [95% CI]: 1.14 [1.04, 1.26]) and SIDIAP (sub-HR: 1.15 [1.04, 1.27]). No risk differences were found for acute kidney injury, gastrointestinal bleeding/ulcers, or hypocalcemia. Hence, we conclude that a modest (15%) increased risk of CKD progression was identified in association with bisphosphonate use. No other safety concerns were identified. Our findings should be considered before prescribing bisphosphonates to patients with moderate to severe CKD. 
© 2020 American Society for Bone and Mineral Research (ASBMR).


Subject(s)
Osteoporosis , Renal Insufficiency, Chronic , Cohort Studies , Diphosphonates/adverse effects , Glomerular Filtration Rate , Humans , Osteoporosis/drug therapy , Osteoporosis/epidemiology , Renal Insufficiency, Chronic/complications , Renal Insufficiency, Chronic/drug therapy , Renal Insufficiency, Chronic/epidemiology , Risk Factors
13.
BMC Med Inform Decis Mak ; 20(1): 289, 2020 11 09.
Article in English | MEDLINE | ID: mdl-33167998

ABSTRACT

BACKGROUND: Record linkage is the process of identifying and combining records about the same individual from two or more different datasets. While there are many open source and commercial data linkage tools, the volume and complexity of currently available datasets for linkage pose a huge challenge; hence, an efficient linkage tool with reasonable accuracy and scalability is required. METHODS: We developed CIDACS-RL (Centre for Data and Knowledge Integration for Health - Record Linkage), a novel iterative deterministic record linkage algorithm based on a combination of indexing search and scoring algorithms (provided by Apache Lucene). We described how the algorithm works and compared its performance with four open source linkage tools (AtyImo, Febrl, FRIL and RecLink) in terms of sensitivity and positive predictive value using a gold standard dataset. We also evaluated its accuracy and scalability using a case study, and its scalability and execution time using a simulated cohort in serial (single-core) and multi-core (eight-core) computation settings. RESULTS: Overall, the CIDACS-RL algorithm had a superior performance: positive predictive value (99.93% versus AtyImo 99.30%, RecLink 99.5%, Febrl 98.86%, and FRIL 96.17%) and sensitivity (99.87% versus AtyImo 98.91%, RecLink 73.75%, Febrl 90.58%, and FRIL 74.66%). In the case study, using a ROC curve to choose the most appropriate cut-off value (0.896), the obtained metrics were: sensitivity = 92.5% (95% CI 92.07-92.99), specificity = 93.5% (95% CI 93.08-93.8) and area under the curve (AUC) = 97% (95% CI 96.97-97.35). The multi-core computation was about four times faster (150 seconds) than the serial setting (550 seconds) when using a dataset of 20 million records. CONCLUSION: The CIDACS-RL algorithm is an innovative linkage tool for huge datasets, with higher accuracy, improved scalability, and substantially shorter execution time compared to other existing linkage tools. 
In addition, CIDACS-RL can be deployed on standard computers without the need for high-speed processors and distributed infrastructures.
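The indexing-then-scoring design described above can be illustrated with a toy blocking-and-scoring linker. The real tool relies on Apache Lucene, so the block keys, field weights and cut-off handling below are hypothetical stand-ins (the 0.896 default echoes the ROC-chosen cut-off from the case study):

```python
# Toy deterministic linkage: index ("block") the right-hand dataset by a
# cheap key, then score candidate pairs only within matching blocks and
# accept those above a cut-off.

from collections import defaultdict
from difflib import SequenceMatcher

def block_key(rec):
    # Cheap index key: first letter of name + birth year.
    return (rec["name"][:1].upper(), rec["dob"][:4])

def score(a, b):
    # Weighted average of per-field similarities (weights are illustrative).
    name_sim = SequenceMatcher(None, a["name"].lower(), b["name"].lower()).ratio()
    dob_sim = 1.0 if a["dob"] == b["dob"] else 0.0
    return 0.6 * name_sim + 0.4 * dob_sim

def link(left, right, cutoff=0.896):
    index = defaultdict(list)
    for r in right:                       # build the index once
        index[block_key(r)].append(r)
    pairs = []
    for l in left:                        # compare only within blocks
        for r in index[block_key(l)]:
            if score(l, r) >= cutoff:
                pairs.append((l["id"], r["id"]))
    return pairs

left = [{"id": 1, "name": "Maria Silva", "dob": "1980-02-01"}]
right = [{"id": "A", "name": "Maria Silva", "dob": "1980-02-01"},
         {"id": "B", "name": "Joao Souza", "dob": "1975-07-09"}]
```

Blocking is what makes linkage scale: record B is never scored against the left-hand record because it lands in a different block.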


Subject(s)
Datasets as Topic , Information Storage and Retrieval , Medical Record Linkage , Algorithms , Cohort Studies , Humans , Medical Records Systems, Computerized
14.
Arch Osteoporos ; 15(1): 81, 2020 06 01.
Article in English | MEDLINE | ID: mdl-32483674

ABSTRACT

Bisphosphonates are contraindicated in moderate-to-severe chronic kidney disease patients. However, they are used to prevent fragility fractures in patients with impaired kidney function, despite a lack of evidence on their effects on bone density in these patients. We demonstrated that alendronate had a positive effect on bone in these patients. PURPOSE: This study aimed to assess the association between alendronate use and bone mineral density (BMD) change in subjects with moderate-severe chronic kidney disease (CKD). METHODS: We created a cohort of CKD stage 3B-5 patients by linking all DXA-based measurements in the Funen area, Denmark, to biochemistry, national health registries and filled prescriptions. Exposure was dispensation of alendronate, and the outcome was annualized percentage change in BMD at the femoral neck, total hip and lumbar spine. Individuals were followed from the first BMD measurement to the latest subsequent DXA measurement. Alendronate non-users were identified using incidence density sampling, and matched groups were created using propensity scores. Linear regression was used to estimate average differences in annualized BMD. RESULTS: Use of alendronate was rare in this group of patients: propensity score matching (PSM) resulted in 71 alendronate users and 142 non-users with stage 3B-5 CKD (as in the 1 year before DXA). Whilst alendronate users gained an average 1.07% femoral neck BMD per year, non-users lost an average of 1.59% per annum. The PSM mean differences in annualized BMD were +2.65% (1.32%, 3.99%), +3.01% (1.74%, 4.28%) and +2.12% (0.98%, 3.25%) at the femoral neck, total hip and spine, respectively, all in favour of alendronate users. CONCLUSION: In a real-world cohort of women with stage 3B-5 CKD, use of alendronate appears associated with a significant improvement of 2-3% per year in femoral neck, total hip and spine BMD. 
More data are needed on the anti-fracture effectiveness and safety of bisphosphonate therapy in moderate-severe CKD.


Subject(s)
Renal Insufficiency, Chronic , Alendronate/therapeutic use , Bone Density , Bone Density Conservation Agents/therapeutic use , Denmark/epidemiology , Female , Humans , Propensity Score , Renal Insufficiency, Chronic/drug therapy , Renal Insufficiency, Chronic/epidemiology
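The outcome in the study above, annualized percentage change in BMD between two DXA scans, can be sketched in a few lines. This is an illustrative stdlib-only sketch using simple linear annualization (total percentage change divided by years of follow-up); the abstract does not state the paper's exact formula, so the function name and convention here are assumptions.

```python
def annualized_pct_change(bmd_first, bmd_last, years_between_scans):
    """Annualized percentage change in BMD between two DXA scans.

    Uses simple linear annualization; the study's exact convention
    may differ (e.g. a compound-rate definition).
    """
    total_pct_change = (bmd_last - bmd_first) / bmd_first * 100.0
    return total_pct_change / years_between_scans

# Illustrative values only: a user gaining ~1.07%/year over 2.5 years
# of follow-up versus a non-user losing ~1.59%/year.
gain = annualized_pct_change(0.800, 0.8214, 2.5)
loss = annualized_pct_change(0.800, 0.7682, 2.5)
```

The BMD values and follow-up time are invented for the example; only the shape of the calculation is the point.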
15.
J Bone Miner Res ; 35(5): 894-900, 2020 05.
Article in English | MEDLINE | ID: mdl-31968134

ABSTRACT

Oral bisphosphonates (oBPs) have been associated with reduced fractures and mortality. However, their risks and benefits are unclear in patients with moderate-to-severe CKD. This study examined the association between oBPs and all-cause mortality in G3B-5D CKD. This is a population-based cohort study including all subjects aged 40+ years with an estimated glomerular filtration rate (eGFR) <45 mL/min/1.73 m2 (G3B: eGFR 30-44 mL/min/1.73 m2; G4: eGFR 15-29 mL/min/1.73 m2; G5: eGFR <15 mL/min/1.73 m2; G5D: hemodialysis) from the UK Clinical Practice Research Datalink (CPRD) and the Catalan Information System for Research in Primary Care (SIDIAP). Previous and current users of other anti-osteoporosis drugs were excluded. oBP use was modeled as a time-varying exposure to avoid immortal time bias. Treatment episodes in oBP users were created by concatenating prescriptions until patients switched or stopped therapy, were censored, or died; a washout period of 180 days was added to the end of each episode (date of last prescription + 180 days). Propensity scores (PSs) were calculated using prespecified predictors of mortality including age, gender, baseline eGFR, socioeconomic status, comorbidities, previous fracture, co-medications, and number of hospital admissions in the previous year. Cox models were used for PS adjustment before and after PS trimming (removing the first and last quintiles). In the CPRD, of 19,351 oBP users and 210,954 non-oBP users, 5234 (27%) and 85,105 (40%) deaths were recorded over 45,690 and 915,867 person-years of follow-up, respectively. oBP users had an 8% lower mortality risk compared to non-oBP users (hazard ratio [HR] 0.92; 95% CI, 0.89 to 0.95). Following PS trimming, this became nonsignificant (HR 0.98; 95% CI, 0.94 to 1.04). In the SIDIAP, of 4146 oBP users and 86,127 non-oBP users, 1330 (32%) and 36,513 (42%) died, respectively. oBPs were not associated with mortality after PS adjustment or trimming (HR 1.04; 95% CI, 0.99 to 1.10 and HR 0.95; 95% CI, 0.89 to 1.01). In this observational, population-based cohort study, oBPs were not associated with increased mortality among patients with moderate-to-severe CKD. However, further studies are needed on other effects of oBPs in CKD patients. © 2020 American Society for Bone and Mineral Research.


Subject(s)
Osteoporosis , Renal Insufficiency, Chronic , Cohort Studies , Diphosphonates/therapeutic use , Glomerular Filtration Rate , Humans , Osteoporosis/drug therapy , Osteoporosis/epidemiology , Renal Insufficiency, Chronic/drug therapy , Renal Insufficiency, Chronic/epidemiology
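The propensity-score trimming step in the study above (re-analysis after dropping the first and last quintiles of the score distribution) can be sketched with the standard library. This is a didactic illustration of that general idea, not the study's implementation; the function name and the synthetic scores are invented for the example.

```python
def trim_ps_tails(propensity_scores):
    """Drop subjects in the bottom and top quintiles of the propensity
    score distribution, keeping the middle 60% for re-analysis."""
    ordered = sorted(propensity_scores)
    n = len(ordered)
    lower_cut = ordered[int(0.2 * n)]      # 20th percentile (quintile 1 boundary)
    upper_cut = ordered[int(0.8 * n) - 1]  # 80th percentile (quintile 5 boundary)
    return [s for s in propensity_scores if lower_cut <= s <= upper_cut]

# Synthetic, deterministic "scores": a permutation of 0.000..0.999.
scores = [(i * 37 % 1000) / 1000 for i in range(1000)]
trimmed = trim_ps_tails(scores)
```

After trimming, 600 of the 1,000 subjects remain, all with scores between the 20th and 80th percentiles; Cox (or other outcome) models would then be refitted in this restricted sample.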
16.
Front Pharmacol ; 10: 973, 2019.
Article in English | MEDLINE | ID: mdl-31619986

ABSTRACT

Randomized clinical trials (RCTs) are accepted as the gold-standard approach to measuring the effects of an intervention or treatment on outcomes. They are also the designs of choice for health technology assessment (HTA). Randomization ensures comparability, in both measured and unmeasured pretreatment characteristics, of individuals assigned to treatment and control or comparator. However, even adequately powered RCTs are not always feasible for several reasons, such as cost, time, practical and ethical constraints, and limited generalizability. RCTs rely on data collected on selected, homogeneous populations under highly controlled conditions; hence, they provide evidence on the efficacy of interventions rather than on their effectiveness. Alternatively, observational studies can provide evidence on the relative effectiveness or safety of a health technology compared to one or more alternatives when provided in the setting of routine health care practice. In observational studies, however, treatment assignment is a non-random process based on an individual's baseline characteristics; hence, treatment groups may not be comparable in their pretreatment characteristics. As a result, direct comparison of outcomes between treatment groups might lead to a biased estimate of the treatment effect. Propensity score approaches have been used to achieve balance or comparability of treatment groups in terms of their measured pretreatment covariates, thereby controlling for confounding bias in estimating treatment effects. Despite the popularity of propensity score methods and recent important methodological advances, misunderstandings about their application and limitations are all too common. In this article, we present a review of propensity score methods, extended applications, recent advances, and their strengths and limitations.
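As a minimal illustration of one propensity score method of the kind reviewed above, greedy 1:1 nearest-neighbour matching on the score (without replacement, within a caliper) can be sketched as follows. This is a teaching sketch, not the review's code; the function name and caliper value are assumptions, and real analyses typically use dedicated matching packages.

```python
def nearest_neighbor_match(treated_ps, control_ps, caliper=0.05):
    """Greedy 1:1 matching: each treated subject takes the closest
    still-available control, provided the absolute difference in
    propensity scores is within the caliper.

    Returns a list of (treated_index, control_index) pairs.
    """
    available = list(enumerate(control_ps))  # (index, score) of unmatched controls
    pairs = []
    for t_idx, t_ps in enumerate(treated_ps):
        if not available:
            break
        c_idx, c_ps = min(available, key=lambda c: abs(c[1] - t_ps))
        if abs(c_ps - t_ps) <= caliper:
            pairs.append((t_idx, c_idx))
            available.remove((c_idx, c_ps))  # matching without replacement
    return pairs

# Two treated subjects, three candidate controls (scores are illustrative).
pairs = nearest_neighbor_match([0.30, 0.70], [0.31, 0.69, 0.50])
```

Greedy matching is order-dependent; optimal matching, stratification, or weighting are the common alternatives the review's taxonomy covers.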

17.
Front Pharmacol ; 10: 984, 2019.
Article in English | MEDLINE | ID: mdl-31607900

ABSTRACT

Health technology assessment (HTA) is the systematic evaluation of the properties and impacts of health technologies and interventions. In this article, we present a discussion of HTA and its evolution in Brazil, as well as a description of secondary data sources available in Brazil with potential applications for generating evidence for HTA and policy decisions. Furthermore, we highlight record linkage, ongoing record linkage initiatives in Brazil, and the main linkage tools developed and/or used with Brazilian data. Finally, we discuss the challenges and opportunities of using secondary data for research in the Brazilian context. In conclusion, we emphasize the availability of high-quality data and an open, modern attitude toward the use of data for research and policy. This is supported by a rigorous but enabling legal framework that will allow the conduct of large-scale observational studies to evaluate the clinical, economic, and social impacts of health technologies and social policies.

18.
Clin Epidemiol ; 11: 197-205, 2019.
Article in English | MEDLINE | ID: mdl-30881136

ABSTRACT

Interrupted time series (ITS) analysis is being used increasingly in epidemiology. Despite its growing popularity, there is a scarcity of guidance on power and sample size considerations within the ITS framework. The aim of this study was to assess the statistical power to detect an intervention effect under various real-life ITS scenarios. ITS datasets were created using Monte Carlo simulations to generate cumulative incidence (outcome) values over time. We generated 1,000 datasets per scenario, varying the number of time points, the average sample size per time point, the average relative reduction post intervention, the location of the intervention in the time series, and whether the reduction was mediated via 1) a slope change or 2) a step change. Performance measures included power and percentage bias. We found that sample size per time point had a large impact on power. Even in scenarios with 12 pre-intervention and 12 post-intervention time points and moderate intervention effect sizes, most analyses were underpowered if the sample size per time point was low. We conclude that various factors need to be considered collectively to ensure adequate power for an ITS study. We demonstrate a means of providing insight into the underlying sample size requirements of ordinary least squares (OLS) ITS analysis of cumulative incidence measures, based on prespecified parameters, and have developed Stata code to estimate this.
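The simulation strategy described above can be sketched end to end with the standard library: generate segmented-regression (step-change) data, fit OLS by solving the normal equations, and estimate power against an empirically derived null critical value. All parameter values below (baseline level, trend, step size, noise, number of simulations) are illustrative, not those used in the study, and the study's Stata implementation will differ in detail.

```python
import random

def fit_ols(X, y):
    """Solve the normal equations (X'X)b = X'y by Gauss-Jordan elimination."""
    k = len(X[0])
    A = [[sum(row[a] * row[b] for row in X) for b in range(k)] for a in range(k)]
    for a in range(k):
        A[a].append(sum(X[i][a] * y[i] for i in range(len(X))))  # augment with X'y
    for col in range(k):
        piv = max(range(col, k), key=lambda r: abs(A[r][col]))  # partial pivoting
        A[col], A[piv] = A[piv], A[col]
        for r in range(k):
            if r != col:
                f = A[r][col] / A[col][col]
                A[r] = [A[r][c] - f * A[col][c] for c in range(k + 1)]
    return [A[a][k] / A[a][a] for a in range(k)]

def simulated_step_estimate(n_pre, n_post, step, sd, rng):
    """One simulated ITS dataset under the model y ~ 1 + time + post;
    returns the estimated step-change coefficient."""
    X, y = [], []
    for t in range(n_pre + n_post):
        post = 1.0 if t >= n_pre else 0.0
        X.append([1.0, float(t), post])
        y.append(10.0 + 0.1 * t + step * post + rng.gauss(0.0, sd))
    return fit_ols(X, y)[2]

rng = random.Random(42)
# Empirical two-sided 5% critical value from 500 null simulations (no true step).
null_draws = sorted(abs(simulated_step_estimate(12, 12, 0.0, 1.0, rng))
                    for _ in range(500))
crit = null_draws[int(0.95 * len(null_draws))]
# Power: share of simulations under a true step of -3 exceeding that threshold.
hits = sum(abs(simulated_step_estimate(12, 12, -3.0, 1.0, rng)) > crit
           for _ in range(500))
power = hits / 500
```

Using an empirical null critical value sidesteps distributional assumptions about the OLS step coefficient; repeating the loop over a grid of step sizes and per-time-point sample sizes reproduces the kind of scenario table the study reports.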

19.
Rheumatology (Oxford) ; 58(7): 1168-1175, 2019 07 01.
Article in English | MEDLINE | ID: mdl-30649521

ABSTRACT

OBJECTIVES: Previous ecological data suggest a decline in the need for joint replacements in RA patients following the introduction of TNF inhibitor (TNFi) therapy, although patient-level data are lacking. Our primary aim was to estimate the association between TNFi use and the subsequent incidence of total hip replacement (THR) and total knee replacement. METHODS: A propensity score matched cohort was analysed using data from the British Society for Rheumatology Biologics Register for RA (2001-2016). Propensity score estimates were used to match TNFi users to similar conventional synthetic DMARD users (with replacement) using a 1:1 ratio. Weighted multivariable Cox regression was used to estimate the impact of TNFi on study outcomes. Effect modification by baseline age and disease severity was investigated. Joint replacement at other sites was also analysed. An instrumental variable sensitivity analysis was also performed. RESULTS: The matched analysis contained a total of 19 116 patient records. Overall, there was no significant association between TNFi use vs conventional synthetic DMARD use on rates of THR (hazard ratio = 0.86 [95% CI: 0.60, 1.22]), although there was significant effect modification by age (P < 0.001). TNFi was associated with a reduction in THR among those >60 years old (hazard ratio = 0.60 [CI: 0.41, 0.87]) but not in younger patients. No significant associations were found for total knee replacement or other joint replacement. CONCLUSION: Overall, no association was found between the use of TNFi and subsequent incidence of joint replacement. However, TNFi was associated with a 40% relative reduction in THR rates among older patients.


Subject(s)
Antirheumatic Agents/therapeutic use , Arthritis, Rheumatoid/drug therapy , Arthroplasty, Replacement, Hip/statistics & numerical data , Arthroplasty, Replacement, Knee/statistics & numerical data , Tumor Necrosis Factor Inhibitors/therapeutic use , Adult , Age Factors , Aged , Arthritis, Rheumatoid/epidemiology , Arthritis, Rheumatoid/surgery , Cohort Studies , Female , Follow-Up Studies , Humans , Male , Middle Aged , Registries , Sensitivity and Specificity , United Kingdom/epidemiology
20.
JBMR Plus ; 2(4): 187-194, 2018 Jul.
Article in English | MEDLINE | ID: mdl-30283902

ABSTRACT

Pharmacoepidemiology is used extensively in osteoporosis research and involves the study of the use and effects of drugs in large numbers of people. Randomized controlled trials are considered the gold standard for assessing treatment efficacy and safety. However, their results can have limited external validity when applied to patients seen in day-to-day practice. Pharmacoepidemiological studies aim to assess the effect(s) of treatments under actual practice conditions, but they are limited by data quality and completeness and by inherent bias due to confounding. Sources of information include prospectively collected (primary) data as well as readily available, routinely collected (secondary) data (eg, electronic medical records, administrative/claims databases). Although the former enable the collection of ad hoc measurements, the latter provide a unique opportunity to study large representative populations and to assess rare events at relatively low cost. Observational cohort and case-control studies, the most commonly implemented study designs in pharmacoepidemiology, each have their strengths and limitations; the choice of study design depends on the research question that needs to be answered. Despite their many advantages, observational studies also have limitations. First, missing data are a common issue in routine data, frequently dealt with using multiple imputation. Second, confounding by indication arises because of the lack of randomization; multivariable regression and more specific techniques such as propensity scores (adjustment, matching, stratification, trimming, or weighting) are used to minimize such biases. In addition, immortal time bias (a time period during which a subject is artefactually event-free by study design) and time-varying confounding (patient characteristics changing over time) are other types of bias, usually accounted for using time-dependent modeling. Finally, residual "uncontrolled" confounding is difficult to assess, and hence, to account for it, sensitivity analyses and specific methods (eg, instrumental variables) should be considered.
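The immortal time bias defined above arises when follow-up before a subject's first prescription is attributed to the treated group: the subject must survive to receive the drug, so that interval is artefactually event-free. The standard remedy, as in the oral bisphosphonate study earlier in this list, is to split each subject's follow-up at treatment start. A minimal sketch (function and variable names are illustrative):

```python
def split_person_time(entry, exit, first_rx):
    """Split one subject's follow-up into (unexposed, exposed) person-time.

    Classifying ever-users' entire follow-up as exposed would credit
    the pre-prescription ("immortal") time to treatment and bias
    results in the drug's favour; splitting at first_rx assigns each
    interval to the correct exposure state.
    """
    if first_rx is None or first_rx >= exit:
        return (exit - entry, 0.0)   # never treated during follow-up
    start = max(first_rx, entry)
    return (start - entry, exit - start)
```

A subject followed from year 0 to year 10 who starts therapy at year 4 thus contributes 4 unexposed and 6 exposed person-years, rather than 10 exposed person-years; time-dependent Cox models consume exactly this kind of split record.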
