ABSTRACT
BACKGROUND: COVID-19 vaccines effectively prevent infection and hospitalization. However, few population-based studies have compared the clinical characteristics and outcomes of patients hospitalized for COVID-19 using advanced statistical methods. Our objective was to address this evidence gap by comparing vaccinated and unvaccinated patients hospitalized for COVID-19. METHODS: This retrospective cohort included adult COVID-19 patients admitted to 27 hospitals from March 2021 to August 2022. Clinical characteristics, vaccination status, and outcomes were extracted from medical records. Vaccinated and unvaccinated patients were compared using propensity score analyses, with scores calculated from variables associated with vaccination status and/or outcomes, including pandemic wave. The vaccination effect was also assessed by covariate adjustment and by feature importance by permutation. RESULTS: Of the 3,188 patients, 1,963 (61.6%) were unvaccinated and 1,225 (38.4%) were fully vaccinated. Among these, 558 vaccinated individuals were matched with 558 unvaccinated ones. Vaccinated patients had lower rates of mortality (19.4% vs. 33.3%), invasive mechanical ventilation (IMV; 18.3% vs. 34.6%), noninvasive mechanical ventilation (NIMV; 10.6% vs. 22.0%), intensive care unit (ICU) admission (32.0% vs. 44.1%), vasoactive drug use (21.1% vs. 32.6%), dialysis (8.2% vs. 14.7%), and thromboembolic events (3.9% vs. 7.7%), as well as a shorter hospital length of stay (7.0 vs. 9.0 days); p < 0.05 for all. Risk-adjusted multivariate analysis demonstrated a significant inverse association between vaccination and in-hospital mortality (adjusted odds ratio [aOR] = 0.42, 95% confidence interval [CI]: 0.31-0.56; p < 0.001) as well as IMV (aOR = 0.40, 95% CI: 0.30-0.53; p < 0.001). These results were consistent across all analyses, including feature importance by permutation. CONCLUSION: Vaccinated patients admitted to hospital with COVID-19 had significantly lower mortality and fewer severe outcomes than unvaccinated patients during the Delta and Omicron waves. These findings have important implications for public health strategies and support the critical importance of vaccination efforts, particularly in low-income countries, where vaccination coverage remains suboptimal.
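For readers who want to reproduce the kind of permutation-based feature importance analysis mentioned above, the sketch below shows one way it could be done with scikit-learn. The file name and covariate list are illustrative assumptions, not the study's actual variables.

```python
# Hypothetical sketch of permutation-based feature importance for in-hospital mortality.
# The file name and covariate list are illustrative assumptions, not the study's variables.
import pandas as pd
from sklearn.inspection import permutation_importance
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

df = pd.read_csv("covid_cohort.csv")                      # assumed one row per patient
features = ["age", "vaccinated", "wave", "comorbidity_count"]  # illustrative covariates
X, y = df[features], df["in_hospital_death"]

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, stratify=y, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)

# Importance = mean drop in ROC AUC when each covariate is randomly shuffled.
result = permutation_importance(model, X_te, y_te, scoring="roc_auc",
                                n_repeats=50, random_state=0)
for name, imp in sorted(zip(features, result.importances_mean), key=lambda t: -t[1]):
    print(f"{name}: {imp:.3f}")
```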
Subject(s)
COVID-19 Vaccines , COVID-19 , Hospitalization , Propensity Score , SARS-CoV-2 , Vaccination , Humans , COVID-19/prevention & control , COVID-19/mortality , COVID-19/epidemiology , COVID-19 Vaccines/administration & dosage , Male , Female , Retrospective Studies , Middle Aged , Hospitalization/statistics & numerical data , Aged , Vaccination/statistics & numerical data , SARS-CoV-2/immunology , Adult , Respiration, Artificial/statistics & numerical data
ABSTRACT
Background: Using a previously unreported Peruvian registry of patients treated for early-stage non-small cell lung cancer (NSCLC), this study explored whether wedge resection and lobectomy were equivalent regarding survival and impact on radiologic-pathologic variables. Methods: This observational, analytical, longitudinal study used propensity score-matched (PSM) analysis of a single-center retrospective registry of 2,570 patients with pathologic stage I-II NSCLC who were treated with wedge resection (n=1,845) or lobectomy (n=725) during 2000-2020. After PSM, 650 cases were analyzed (wedge resection, n=325; lobectomy, n=325) through preoperative and clinical variables, including patients with ≥1 lymph node removed. Kaplan-Meier curves and multivariable Cox proportional hazard models were created for 5-year overall survival (OS), disease-free survival (DFS), and locoregional-recurrence-free survival (LRFS). Results: The principal complication was operative pain persisting >7 days, which was more frequent after lobectomy than after wedge resection (58% vs. 23%, p=0.034); hospital stays were shorter after wedge resection than after lobectomy (5.3 vs. 12.8 days, p=0.009). The 5-year OS (84.3% vs. 81.2%, p=0.09) and DFS (79.1% vs. 74.1%, p=0.07) did not differ significantly between wedge resections and lobectomies, respectively. LRFS was worse overall following wedge resection than lobectomy (79.8% vs. 91.1%, p<0.02). Nevertheless, in the PSM analysis, both groups experienced similar LRFS when the resection margin was >10 mm (90.9% vs. 87.3%, p<0.048) and ≥4 lymph nodes were removed (82.8% vs. 79.1%, p<0.011). Conclusion: Both techniques led to similar OS and DFS at 5 years; however, achieving LRFS comparable to lobectomy required wedge resection with an adequate surgical margin (>10 mm) and sufficient lymph node removal (≥4 nodes).
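The survival comparisons described above (Kaplan-Meier curves, log-rank tests, and multivariable Cox models for 5-year OS, DFS, and LRFS) could be implemented along the following lines with the lifelines package. The file and column names are assumptions for illustration, not the registry's actual analysis code.

```python
# Sketch with the lifelines package: Kaplan-Meier curves, a log-rank test, and a
# multivariable Cox model. The file and column names are assumptions for illustration.
import pandas as pd
from lifelines import CoxPHFitter, KaplanMeierFitter
from lifelines.statistics import logrank_test

df = pd.read_csv("nsclc_psm_cohort.csv")      # assumed propensity-matched cohort
wedge = df[df["lobectomy"] == 0]
lobe = df[df["lobectomy"] == 1]

km = KaplanMeierFitter().fit(wedge["os_months"], wedge["death"], label="wedge resection")
print(km.survival_function_at_times(60))      # 5-year overall survival estimate

res = logrank_test(wedge["os_months"], lobe["os_months"], wedge["death"], lobe["death"])
print(f"log-rank p = {res.p_value:.3f}")

# Multivariable Cox model for OS; covariates here are illustrative.
cph = CoxPHFitter()
cph.fit(df[["os_months", "death", "lobectomy", "age", "stage_ii",
            "margin_gt_10mm", "nodes_removed"]],
        duration_col="os_months", event_col="death")
cph.print_summary()                           # hazard ratios with 95% CIs
```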
ABSTRACT
BACKGROUND: To investigate the clinical characteristics, treatment, outcomes, and prognostic risk factors of metachronous bilateral breast carcinoma (MBBC) and provide a theoretical basis for its clinical management. METHODS: This was a retrospective study. From January 1, 2010 to March 31, 2022, a total of 23,010 patients with breast cancer underwent surgical treatment at the Breast Center of the Fourth Hospital of Hebei Medical University, including 386 patients with MBBC. Propensity score matching (PSM) was performed between MBBC patients and unilateral breast cancer (UBC) patients in a 1:1 ratio, and 210 UBC patients and 210 MBBC patients were finally matched. Clinical medical records of all patients were collected, including age of onset, family history of breast cancer, tumor size, lymph node status, TNM stage, mode of surgery, menstruation, pathological type, immunohistochemical (IHC) typing, treatment, disease-free survival (DFS), and overall survival (OS). RESULTS: The age of onset of the second primary cancer (SPC) was significantly older than that of the first primary cancer (FPC) (P = 0.024). Baseline data from MBBC patients showed that the tumor size of FPC was significantly larger than that of SPC (P = 0.043), and the proportion of PR (+) in FPC was significantly higher than that in SPC (P = 0.045). Among MBBC patients with estrogen receptor (ER)- or progesterone receptor (PR)-positive, Her-2-negative FPC, clinical characteristics and treatment results showed that the proportion of PR (+) in the drug-resistant group was significantly lower than that in the non-drug-resistant group. The 2-year OS rate of SPC in the drug-resistant group was significantly lower than that of the non-drug-resistant group (78.9% vs 100%, P < 0.05). The PSM-based comparison between MBBC and UBC patients showed that a significantly lower proportion of MBBC patients received chemotherapy for SPC compared with UBC patients (P = 0.026), and there was no significant difference in OS or DFS between the SPC course of MBBC patients and UBC patients (P > 0.05). Univariate analysis showed that high TNM stage was a risk factor for death and disease progression in MBBC patients, with the risk of death in stage III MBBC patients being about 5 times that in stage I MBBC patients (HR = 4.97, 95%CI = 1.42-17.31, P = 0.012) and the risk of disease recurrence about 3.5 times that in stage I MBBC patients (HR = 3.55, 95%CI = 1.07-11.81, P = 0.039). CONCLUSION: This study presented the clinical characteristics, treatment options, and outcomes of MBBC patients; patients with MBBC who were resistant to endocrine therapy had a worse SPC survival prognosis. The course of SPC in MBBC patients was similar to that of UBC in terms of prognosis and survival, suggesting that SPC can be treated according to the UBC treatment regimen. High TNM stage was a prognostic risk factor for SPC patients.
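A minimal sketch of 1:1 propensity score matching of the kind described above (logistic-regression propensity scores with greedy nearest-neighbour matching and a caliper) is shown below. All variable names are assumptions, and matching is done with replacement for brevity, which differs from strict 1:1 matching without replacement.

```python
# Illustrative 1:1 propensity score matching: logistic-regression propensity scores and
# greedy nearest-neighbour matching with a 0.2-SD-of-logit caliper. Variable names are
# assumptions; matching here is with replacement, a simplification of strict 1:1 matching.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

df = pd.read_csv("breast_cohort.csv")         # assumed MBBC (treated) vs UBC (control) rows
covars = ["age_at_onset", "tumor_size_mm", "tnm_stage", "er_pos", "pr_pos"]  # illustrative
ps = LogisticRegression(max_iter=1000).fit(df[covars], df["mbbc"]).predict_proba(df[covars])[:, 1]
df["ps"] = ps

treated, control = df[df["mbbc"] == 1], df[df["mbbc"] == 0]
caliper = 0.2 * np.std(np.log(ps / (1 - ps)))          # common caliper rule of thumb

nn = NearestNeighbors(n_neighbors=1).fit(control[["ps"]])
dist, idx = nn.kneighbors(treated[["ps"]])
keep = dist.ravel() <= caliper
matched = pd.concat([treated[keep], control.iloc[idx.ravel()[keep]]])
print(f"{int(keep.sum())} matched pairs")
```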
ABSTRACT
BACKGROUND AND AIMS: The association of blood transfusion with an increase in medium- and short-term mortality in specific populations has been confirmed. However, the correlation between blood transfusion and long-term mortality in the general population remains unclear. This cohort study evaluated the correlation between blood transfusion and overall and cause-specific mortality in the general American adult population. METHODS: The authors utilized 10 sets of 2-year cycle data (1999-2018) from the National Health and Nutrition Examination Survey on the outcomes of adults who did and did not receive blood transfusions. Propensity score matching (1:1) was performed based on age, sex, race, education level, marital status, poverty-income ratio, arteriosclerotic cardiovascular disease, cancer, anemia, hypertension, and diabetes status. After controlling for demographic characteristics and clinical risk factors, Cox regression analysis was performed to evaluate the correlation between blood transfusion and all-cause and cause-specific mortality. RESULTS: The study included 48,004 adult participants. Blood transfusion was associated with a 101% increase in the risk of all-cause mortality and a 165% increase in the risk of cardiovascular mortality. After propensity score matching, 6,116 pairs of cases were retained; blood transfusion was associated with an 84% increase in the risk of all-cause mortality and a 137% increase in the risk of cardiovascular mortality. The sensitivity analysis results were robust. CONCLUSIONS: In the general American population, blood transfusion significantly impacts long-term all-cause and cardiovascular mortality and may be an unacknowledged risk factor for death. Thus, effective management of blood transfusion in the general population may be beneficial.
Subject(s)
Blood Transfusion , Cardiovascular Diseases , Nutrition Surveys , Propensity Score , Humans , Male , Female , Cardiovascular Diseases/mortality , United States/epidemiology , Middle Aged , Adult , Blood Transfusion/statistics & numerical data , Blood Transfusion/mortality , Cause of Death , Risk Factors , Aged , Cohort Studies
ABSTRACT
Negative control variables are sometimes used in nonexperimental studies to detect the presence of confounding by hidden factors. A negative control outcome (NCO) is an outcome that is influenced by unobserved confounders of the exposure effects on the outcome in view, but is not causally impacted by the exposure. Tchetgen Tchetgen (2013) introduced the Control Outcome Calibration Approach (COCA) as a formal NCO counterfactual method to detect and correct for residual confounding bias. For identification, COCA treats the NCO as an error-prone proxy of the treatment-free counterfactual outcome of interest, and involves regressing the NCO on the treatment-free counterfactual, together with a rank-preserving structural model, which assumes a constant individual-level causal effect. In this work, we establish nonparametric COCA identification for the average causal effect for the treated, without requiring rank-preservation, therefore accommodating unrestricted effect heterogeneity across units. This nonparametric identification result has important practical implications, as it provides single-proxy confounding control, in contrast to recently proposed proximal causal inference, which relies for identification on a pair of confounding proxies. For COCA estimation we propose 3 separate strategies: (i) an extended propensity score approach, (ii) an outcome bridge function approach, and (iii) a doubly-robust approach. Finally, we illustrate the proposed methods in an application evaluating the causal impact of a Zika virus outbreak on birth rate in Brazil.
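To illustrate the negative control outcome logic that motivates COCA, the toy simulation below generates an unmeasured confounder U that affects both the primary outcome and the NCO while giving the exposure no causal effect on the NCO; a nonzero adjusted exposure-NCO coefficient therefore flags residual confounding. This is only a didactic sketch of the detection idea, not an implementation of the COCA estimators proposed in the paper.

```python
# Didactic simulation of the negative control outcome (NCO) idea: U confounds both the
# outcome Y and the NCO, but the exposure A has no causal effect on the NCO, so a nonzero
# adjusted A-NCO coefficient signals residual confounding. Not the COCA estimator itself.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 50_000
U = rng.normal(size=n)                                   # unmeasured confounder
X = rng.normal(size=n)                                   # measured covariate
A = rng.binomial(1, 1 / (1 + np.exp(-(0.8 * U + 0.5 * X))))   # exposure
Y = 1.0 * A + 1.0 * U + 0.5 * X + rng.normal(size=n)          # primary outcome
NCO = 1.0 * U + 0.5 * X + rng.normal(size=n)                  # negative control outcome

design = sm.add_constant(np.column_stack([A, X]))        # adjusts for X but not U
print("A -> Y   :", sm.OLS(Y, design).fit().params[1])   # inflated relative to the true 1.0
print("A -> NCO :", sm.OLS(NCO, design).fit().params[1]) # nonzero => residual confounding
```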
Subject(s)
Propensity Score , Humans , Confounding Factors, Epidemiologic , Zika Virus Infection/epidemiology , Causality , Models, Statistical , Bias , Brazil/epidemiology , Computer Simulation , Female , Pregnancy
ABSTRACT
BACKGROUND: Robot-assisted gastrectomy (RG) has been shown to be safe and feasible in the treatment of gastric cancer (GC). However, it is unclear whether RG is equivalent to laparoscopic gastrectomy (LG), especially in the Western world. Our objective was to compare the outcomes of RG and LG in GC patients. METHODS: We reviewed all gastric adenocarcinoma patients who underwent curative gastrectomy by a minimally invasive approach at our institution from 2009 to 2022. Propensity score matching (PSM) analysis was conducted to reduce selection bias. The da Vinci Si platform was used for RG. RESULTS: A total of 156 patients were eligible for inclusion (48 RG and 108 LG). Total gastrectomy was performed in 21.3% and 25% of cases in the LG and RG groups, respectively. The frequency of pTNM stage II/III was 48.1% and 54.2% in the LG and RG groups, respectively (p = 0.488). After PSM, 48 patients were matched in each group. LG and RG had a similar number of dissected lymph nodes (p = 0.759), operative time (p = 0.421), and hospital stay (p = 0.353). Blood loss was lower in the RG group (p = 0.042). The major postoperative complication rate was 16.7% for LG and 6.2% for RG (p = 0.109). The 30-day mortality rate was 2.1% and 0% for LG and RG, respectively (p = 1.0). There was no significant difference between the LG and RG groups in disease-free survival (79.6% vs. 61.2%, respectively; p = 0.155) or overall survival (75.9% vs. 65.7%, respectively; p = 0.422). CONCLUSION: RG had surgical and long-term outcomes similar to LG, with less blood loss observed in RG.
ABSTRACT
BACKGROUND: Preterm births increase mortality and morbidity during childhood and later life and are closely associated with poverty and the quality of prenatal care. Therefore, income redistribution and poverty reduction initiatives may be valuable in preventing this outcome. We assessed whether receipt of the Brazilian conditional cash transfer programme - the Bolsa Familia Programme, the largest in the world - reduces the occurrence of preterm births, including by severity category, and explored how this association differs according to prenatal care and the quality of Bolsa Familia Programme management. METHODS: A retrospective cohort study was performed involving the first live singleton births to mothers enrolled in the 100 Million Brazilian Cohort from 2004 to 2015 who had at least one child before cohort enrollment. Only the first birth during the cohort period was included, restricted to births from 2012 onward. Deterministic linkage with the Bolsa Familia Programme payroll dataset and similarity linkage with the Brazilian Live Birth Information System were performed. The exposed group consisted of newborns of mothers who received Bolsa Familia from conception to delivery. Our outcomes were infants born with a gestational age < 37 weeks: (i) all preterm births, (ii) moderate-to-late (32-36 weeks), (iii) severe (28-31 weeks), and (iv) extreme (< 28 weeks) preterm births, compared with at-term newborns. We combined propensity score-based methods and weighted logistic regressions to compare newborns of mothers who did and did not receive Bolsa Familia, controlling for socioeconomic conditions. We also estimated these effects separately according to the adequacy of prenatal care and an index of quality of Bolsa Familia Programme management. RESULTS: 1,031,053 infants were analyzed; 65.9% of the mothers were beneficiaries. The Bolsa Familia Programme was not associated with overall, moderate-to-late, or severe preterm births, but was associated with a reduction in extreme preterm births (weighted OR: 0.69; 95%CI: 0.63-0.76). This reduction was also observed among mothers receiving adequate prenatal care (weighted OR: 0.66; 95%CI: 0.59-0.74) and living in municipalities with better Bolsa Familia management (weighted OR: 0.56; 95%CI: 0.43-0.74). CONCLUSIONS: An income transfer programme for pregnant women of low socioeconomic status, conditional on attending prenatal care appointments, was associated with a reduction in extremely preterm births. Such programmes could be essential in achieving the Sustainable Development Goals.
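The propensity score weighting combined with weighted logistic regression described in the methods could look roughly like the sketch below, here using stabilized inverse-probability-of-treatment weights. The file and column names are illustrative assumptions, and proper inference would additionally require robust (sandwich) standard errors.

```python
# Rough sketch of propensity-score-weighted (stabilized IPTW) logistic regression for
# extreme preterm birth. File and column names are assumptions; robust standard errors
# would be needed for valid confidence intervals.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from sklearn.linear_model import LogisticRegression

df = pd.read_csv("linked_births.csv")                    # assumed linked cohort extract
covars = ["maternal_age", "schooling_years", "parity", "prenatal_visits"]  # illustrative
ps = LogisticRegression(max_iter=1000).fit(df[covars], df["bolsa_familia"]) \
        .predict_proba(df[covars])[:, 1]

p_treated = df["bolsa_familia"].mean()
w = np.where(df["bolsa_familia"] == 1, p_treated / ps, (1 - p_treated) / (1 - ps))

X = sm.add_constant(df[["bolsa_familia"]].astype(float))
fit = sm.GLM(df["extreme_preterm"], X, family=sm.families.Binomial(), freq_weights=w).fit()
print("weighted OR:", np.exp(fit.params["bolsa_familia"]))
```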
Subject(s)
Premature Birth , Infant, Newborn , Pregnancy , Child , Infant , Female , Humans , Retrospective Studies , Longitudinal Studies , Brazil/epidemiology , Premature Birth/epidemiology , Premature Birth/prevention & control , FertilizationABSTRACT
INTRODUCTION AND OBJECTIVES: Acute kidney injury (AKI) is prevalent and has deleterious effects on postoperative outcomes following liver transplantation (LT). The impact of nonselective beta-blockers (NSBBs) in patients with liver cirrhosis remains controversial. This study investigated the association between preoperative NSBB use and AKI after living donor LT (LDLT). PATIENTS AND METHODS: We evaluated 2,972 adult LDLT recipients between January 2012 and July 2022. The patients were divided into two groups based on preoperative NSBB use. Propensity score matching (PSM) and inverse probability of treatment weighting (IPTW) analyses were performed to evaluate the association between preoperative NSBB use and postoperative AKI. Multiple logistic regression analyses were also used to identify risk factors for AKI. RESULTS: AKI occurred in 1,721 (57.9%) patients overall. The NSBB group showed a higher incidence of AKI than the non-NSBB group (62.4% vs. 56.7%; P = 0.011). After PSM and IPTW analyses, no significant difference in the incidence of AKI was found between the two groups (odds ratio [OR] 1.13, 95% confidence interval [CI] 0.93-1.37, P = 0.230 in the PSM analysis; OR 1.20, 95% CI 0.99-1.44, P = 0.059 in the IPTW analysis). In addition, preoperative NSBB use was not associated with AKI in multivariate logistic regression analysis (OR 1.16, 95% CI 0.96-1.40, P = 0.118). CONCLUSIONS: Preoperative NSBB use was not associated with AKI after LDLT. Further studies are needed to validate our results.
Subject(s)
Acute Kidney Injury , Adrenergic beta-Antagonists , Liver Transplantation , Living Donors , Propensity Score , Humans , Acute Kidney Injury/epidemiology , Acute Kidney Injury/chemically induced , Acute Kidney Injury/diagnosis , Acute Kidney Injury/etiology , Liver Transplantation/adverse effects , Female , Male , Middle Aged , Incidence , Risk Factors , Adrenergic beta-Antagonists/therapeutic use , Adrenergic beta-Antagonists/adverse effects , Retrospective Studies , Adult , Postoperative Complications/epidemiology , Postoperative Complications/etiology , Preoperative Care/methods , Liver Cirrhosis/surgery , Liver Cirrhosis/complications , Risk Assessment
ABSTRACT
Introduction. In Colombia, only 24% of patients on the waiting list received a kidney transplant, most of them from cadaveric donors. HLA A-B-DR is considered for organ allocation, but recent evidence suggests that HLA A-B is not associated with transplant outcomes. The objective of this study was to evaluate the relevance of HLA A-B-DR to graft survival in kidney transplant recipients. Methods. Retrospective cohort study including patients who received a kidney transplant from a cadaveric donor at Colombiana de Trasplantes from 2008 to 2023. Propensity score matching (PSM) was applied to adjust covariates in comparison groups by compatibility, and the relationship of HLA A-B-DR with kidney graft survival was evaluated using the log-rank test and Cox regression. Results. A total of 1337 kidney transplant patients were identified, of whom 38.7% were women, with a median age of 47 years and a median body mass index of 23.8 kg/m2. After adjusting the covariates with PSM for the comparison groups, HLA A-B compatibility was not significantly related to graft loss, with an HR of 0.99 (95% CI 0.71-1.37) for HLA A and 0.75 (95% CI 0.55-1.02) for HLA B. Only HLA DR compatibility was significant for graft loss, with an HR of 0.67 (95% CI 0.46-0.98). Conclusion. This study suggests that HLA A-B compatibility does not significantly influence graft loss, whereas HLA DR compatibility does improve graft survival in kidney transplantation with a cadaveric donor.
Introduction. In Colombia, only 24% of patients on the waiting list received a renal transplant, most of them from cadaveric donors. HLA A-B-DR is considered for organ allocation, but recent evidence suggests that HLA A-B is not associated with transplant outcomes. The objective of this study was to evaluate the relevance of HLA A-B-DR to graft survival in kidney transplant recipients. Methods. Retrospective cohort study that included kidney transplant recipients with a cadaveric donor at Colombiana de Trasplantes from 2008 to 2023. Propensity score matching (PSM) was applied to adjust the covariates in comparison groups for compatibility, and the relationship of HLA A-B-DR with kidney graft survival was evaluated using the log-rank test and Cox regression. Results. A total of 1337 kidney transplant patients were identified; of those, 38.7% were female, with a median age of 47 years and a median BMI of 23.8 kg/m2. After adjusting the covariates with PSM for the comparison groups, HLA A-B matching was not significantly related to graft loss, with HRs of 0.99 (95% CI 0.71-1.37) for HLA A and 0.75 (95% CI 0.55-1.02) for HLA B. Only HLA DR matching was significant for graft loss, with an HR of 0.67 (95% CI 0.46-0.98). Conclusions. This study suggests that HLA A-B matching does not significantly influence graft loss, whereas HLA DR matching does improve graft survival in renal transplantation with a cadaveric donor.
Subject(s)
Humans , Kidney Transplantation , Graft Rejection , HLA Antigens , Survival Analysis , Organ Transplantation , Propensity Score
ABSTRACT
Introduction: Culture plays a fundamental role in shaping human behavior, with individualism and collectivism being key cultural dimensions. However, existing scales for measuring these constructs, such as the INDCOL scale, have demonstrated issues when applied in diverse cultural contexts. To address this, we present the translation and adaptation of the Mexican Vertical and Horizontal Individualism and Collectivism Scale (MXINDCOL) into English, aiming to identify both universal and culture-specific elements. Methods: Data were collected from 1124 participants (371 from the United States, 753 from Mexico) using the MXINDCOL and INDCOL scales. Propensity score matching was applied to balance demographic differences between the samples. Confirmatory Factor Analysis (CFA) assessed model fit, and cross-cultural measurement invariance was examined. Reliability, convergent and discriminant validity were also assessed. Results: The English-translated MXINDCOL scale demonstrated good model fit in both US and Mexican samples, outperforming the INDCOL scale. Reliability values were higher for the MXINDCOL scale compared to INDCOL. Cross-cultural measurement invariance was established, allowing for meaningful comparisons between the two cultures. US participants scored higher on vertical collectivism, while Mexican participants scored higher on horizontal collectivism and horizontal individualism. Discussion: The MXINDCOL scale offers a culturally sensitive measurement of individualism and collectivism, addressing issues found in existing scales. It provides a more accurate assessment of cultural orientations and enriches the understanding of cultural dimensions by incorporating idiosyncratic elements. Further research in diverse cultural contexts is recommended to validate and refine the scale, contributing to a more nuanced understanding of cultural variations in individualism and collectivism.
ABSTRACT
OBJECTIVE: To compare surgical versus non-surgical treatment using Propensity Score (PS) matching for patients with Oropharyngeal Squamous Cell Carcinoma (OPSCC) in an extensive database. METHODS: We retrospectively evaluated epidemiological data from 8,075 patients with OPSCC diagnosed between 2004 and 2014 and used PS matching with regression analyses to analyze possible prognostic factors for outcomes. RESULTS: Cox multiple regression analysis of survival after PS matching showed that treatment type was associated with death, with a hazard ratio of 1.753 (p<0.05) for non-surgical treatment. However, treatment type was not associated with recurrence (p>0.05). In the surgical treatment group, overall survival was 79.9% at one year, 36.4% at five years, and 20.5% at ten years; disease-free survival was 90.1%, 64.8%, and 56.0% at 1, 5, and 10 years, respectively. In the non-surgical treatment group, overall survival was 60.6% at one year, 21.8% at five years, and 12.7% at ten years; disease-free survival was 90.8%, 67.2%, and 57.8% at 1, 5, and 10 years, respectively. CONCLUSION: Patients in the surgical treatment group had better survival outcomes. Recurrence was associated with survival in OPSCC. Recurrence-free survival was similar for both treatments. LEVEL OF EVIDENCE: 2C.
ABSTRACT
INTRODUCTION: The impact of mitral regurgitation (MR) on valve-in-valve transcatheter aortic valve implantation (VIV-TAVI) in patients with failed bioprostheses remains unclear. The purpose of this study was to assess the prognostic impact of residual moderate MR following VIV-TAVI. METHODS: We retrospectively analyzed 127 patients who underwent VIV-TAVI between March 2010 and November 2021. At least moderate MR was observed in 51.2% of patients before the procedure, and MR improved in 42.1% of all patients. Patients with postoperative severe MR, previous mitral valve intervention, or death before postoperative echocardiography were excluded from further analyses. The remaining 114 subjects were divided into two groups according to the degree of postprocedural MR: none-to-mild MR (73.7%) or moderate MR (26.3%). Propensity score matching yielded 23 pairs for final comparison. RESULTS: No significant differences in early results were found between groups before or after matching. In the matched cohort, survival probabilities at one, three, and five years were 95.7% vs. 87.0%, 85.0% vs. 64.5%, and 85.0% vs. 29.0% in the none-to-mild MR group vs. the moderate MR group, respectively (log-rank P=0.035). Among survivors, patients with moderate MR had worse functional status according to New York Heart Association (NYHA) class at follow-up (P=0.006). CONCLUSION: MR is common in patients with failed aortic bioprostheses, and improvement in MR status was observed in over 40% of patients following VIV-TAVI. Residual moderate MR after VIV-TAVI was not associated with worse early outcomes; however, it was associated with increased mortality at five years of follow-up and worse NYHA class among survivors.
Subject(s)
Aortic Valve Stenosis , Heart Valve Prosthesis Implantation , Heart Valve Prosthesis , Mitral Valve Insufficiency , Transcatheter Aortic Valve Replacement , Humans , Transcatheter Aortic Valve Replacement/methods , Prognosis , Retrospective Studies , Treatment Outcome , Mitral Valve Insufficiency/diagnostic imaging , Mitral Valve Insufficiency/etiology , Mitral Valve Insufficiency/surgery , Aortic Valve Stenosis/surgery , Aortic Valve/surgery , Heart Valve Prosthesis Implantation/adverse effects , Heart Valve Prosthesis Implantation/methods , Heart Valve Prosthesis/adverse effects
ABSTRACT
INTRODUCTION: Deep sternal wound infections (DSWI) are serious and costly complications that hospital services continually strive to control and prevent. Microcosting is the most accurate approach in economic healthcare evaluation, but no studies have applied this method to compare DSWI after isolated coronary artery bypass grafting (CABG). This study aims to evaluate the incremental risk-adjusted costs of DSWI after isolated CABG. METHODS: This is a retrospective, single-center observational cohort study with propensity score matching of infected and non-infected patients to compare incremental risk-adjusted costs between groups. Data to homogenize the sample were obtained from a multicenter database, REPLICCAR II, and additional cost information was obtained from the electronic hospital system (Si3). Inflation and dollar exchange-rate variation over the study period were corrected using the General Market Price Index. Groups were compared using analysis of variance, and multiple linear regression was performed to evaluate the cost drivers related to the event. RESULTS: As expected, infections were costly; deep infection increased costs by 152% and mediastinitis by 188%. Groups differed in the costs of hospital stay, exams, medications, and multidisciplinary labor, with hospital stay being the most critical cost driver. CONCLUSION: Our results provide a detailed microcosting evaluation of the incremental costs of infections in CABG patients in São Paulo, Brazil. Hospital stay was an important cost driver, demonstrating the importance of evaluating patients' characteristics and managing risks for a faster, safer, and more effective discharge.
Subject(s)
Cardiac Surgical Procedures , Surgical Wound Infection , Humans , Retrospective Studies , Brazil/epidemiology , Coronary Artery Bypass/adverse effects , Sternum/surgery , Risk Factors
ABSTRACT
ABSTRACT Background: The advantages of laparoscopic liver resection (LLR) have led to its increasing use for the treatment of benign liver tumors (BLTs). Objective: To compare the perioperative outcomes of patients undergoing LLR with those of patients operated on with open liver resection (OLR) for BLTs, matched by propensity score matching (PSM). Material and methods: Descriptive, retrospective, and comparative study of OLR and LLR for BLTs between August 2010 and June 2021. Demographic, preoperative, intraoperative, and postoperative variables were analyzed. A 1:1 PSM was performed to avoid bias from the different covariates between the groups. Results: Of 403 hepatectomies, 82 performed for BLTs were analyzed; 36 (44%) were OLRs and 46 (56%) were LLRs. Mean age was 45 ± 14 years and 65% were women. After PSM, two groups of 28 patients each remained. In the OLR group, 5 (18%) patients required transfusions versus none in the LLR group (p = 0.01). Major complications occurred in 4 (14%) OLR patients and in none of the LLR patients (p = 0.03). Four (14%) OLR patients were reoperated versus no LLR patients (p = 0.03). Total hospital stay was significantly longer for OLR (p = 0.04). No deaths were recorded at 90 days in either group. Conclusion: LLR for BLTs is a safe and effective technique, as patients had lower transfusion requirements, fewer reoperations, fewer major complications, and shorter hospital stay than with OLR. Given these advantages, LLR could be considered the technique of choice in surgery for BLTs.
ABSTRACT Background: The advantages of laparoscopic liver resection (LLR) have increased its use for the treatment of benign liver tumors (BLTs). Objective: The aim of this study was to compare the perioperative outcomes of patients undergoing LLR with those operated on with open liver resection (OLR) for BLTs using propensity score matching (PSM). Material and methods: We conducted a descriptive, retrospective study comparing OLRs with LLRs performed between August 2010 and June 2021. Demographic, preoperative, intraoperative, and postoperative variables were analyzed. We used PSM with 1:1 matching to avoid biases of the different covariates between the groups. Results: Of 303 liver resections, 82 corresponded to BLTs and were included in the analysis; 36 (44%) were OLRs and 46 (56%) were LLRs. Mean age was 45 ± 14 years and 65% were women. After PSM, two groups of 28 patients each were constituted. Five patients (18%) in the OLR group and none in the LLR group required transfusions (p = 0.01). Major complications occurred in 4 (14%) patients in the OLR group and in no cases in the LLR group (p = 0.03). Four patients (14%) undergoing OLR required reoperation versus none with LLR (p = 0.03). Total length of hospital stay was significantly longer with OLR (p = 0.04). There were no deaths in either group within 90 days. Conclusion: LLR for BLTs is a safe and effective technique, with a lower requirement for transfusions, fewer reoperations and major complications, and a shorter length of hospital stay than OLR. Therefore, LLR could be considered the surgical technique of choice for BLTs.
ABSTRACT
We assess the impact of the Brazilian government's conditional cash transfer program, Bolsa Família, on unhealthy consumption by households, proxied by expenditures on ultra-processed food, alcohol, and tobacco products. Using machine learning techniques to improve the propensity score estimation, we analyze the intensive and extensive margin effects of participating in the program on household purchases of unhealthy products. Our results reveal that program participants spend more on food in general, but not necessarily more on unhealthy options. While we find evidence that participants have a higher probability of spending more on food away from home, they do not significantly alter their expenditures on packaged food, alcohol, or tobacco products.
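As a rough illustration of using machine learning for propensity score estimation followed by inverse-probability weighting, the sketch below fits a gradient boosting model for program participation and computes an ATT-style weighted difference in spending. The dataset and column names are hypothetical, and the estimator is a simplification of the paper's approach.

```python
# Sketch, under assumptions: gradient boosting to estimate the propensity of participating
# in Bolsa Família, then ATT-style inverse-probability weights for a spending comparison.
# Dataset and column names are hypothetical; this simplifies the paper's approach.
import numpy as np
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import cross_val_predict

df = pd.read_csv("household_survey.csv")                 # assumed household-level extract
covars = ["income_pc", "hh_size", "urban", "head_schooling", "region_code"]  # illustrative

# Out-of-fold propensity scores reduce overfitting of the machine-learning model.
ps = cross_val_predict(GradientBoostingClassifier(random_state=0),
                       df[covars], df["participant"], cv=5, method="predict_proba")[:, 1]
ps = np.clip(ps, 0.01, 0.99)                             # trim extreme scores

treated = df["participant"] == 1
w_controls = ps[~treated] / (1 - ps[~treated])           # ATT weights for controls
att = (df.loc[treated, "ultraprocessed_spend"].mean()
       - np.average(df.loc[~treated, "ultraprocessed_spend"], weights=w_controls))
print(f"ATT-style weighted spending difference: {att:.2f}")
```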
ABSTRACT
Abstract Objective The role of Primary Tumor Volume (PTV) in Nasopharyngeal Carcinoma (NPC) treated with Volumetric Modulated Arc Therapy (VMAT) is still unclear. The aim of this study was to assess the effect of PTV on prognosis prediction in nasopharyngeal carcinoma in the era of VMAT. Methods Between January 20 and November 2011, 498 consecutive NPC patients with stage I-IVA disease who received VMAT at a single center were retrospectively analyzed. Receiver Operating Characteristic (ROC) analysis was performed to determine the cut-off point of PTV. Univariate Kaplan-Meier and multivariate Cox regression analyses were used to evaluate the prognostic value of PTV. Propensity Score Matching (PSM) was used to adjust for baseline potential confounders. Results The 5-year Loco-Regional Failure-Free survival (L-FFR), Distant Failure-Free Survival (D-FFR), Disease-Free Survival (DFS) and Overall Survival (OS) rates were 90.6%, 83.7%, 71.5% and 79.3%, respectively. Before PSM, the 5-year L-FFR, D-FFR, DFS, and OS rates for NPC patients with PTV ≤ 38 mL vs. PTV > 38 mL were 94.1% vs. 90.4% (p = 0.063), 87.9% vs. 76.3% (p < 0.001), 78.5% vs. 58.5% (p < 0.001) and 86.3% vs. 66.7% (p < 0.001), respectively. Multivariate analysis showed that PTV was an independent prognostic factor for D-FFR (p = 0.034), DFS (p = 0.002) and OS (p = 0.001). PTV classification remained an independent prognostic factor for OS after PSM (HR = 2.034, p = 0.025). Conclusions PTV had a substantial impact on the prognosis of NPC patients treated with VMAT, both before and after PSM. PTV > 38 mL may be considered an indicator of the clinical stage of nasopharyngeal carcinoma. Level of evidence III.
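The ROC-based choice of the PTV cut-off could be reproduced along the lines below, using Youden's J statistic to pick the threshold; the 38 mL value reported above would emerge from the study's own data. The endpoint and column names are assumptions.

```python
# Illustrative ROC-based cut-off selection for PTV via Youden's J statistic; the 38 mL
# threshold reported in the abstract would come from the study's own data. The endpoint
# and column names are assumptions.
import pandas as pd
from sklearn.metrics import roc_auc_score, roc_curve

df = pd.read_csv("npc_vmat_cohort.csv")                  # assumed one row per patient
y = df["event_within_5y"]                                # illustrative binary endpoint
fpr, tpr, thresholds = roc_curve(y, df["ptv_ml"])

j = tpr - fpr                                            # Youden's J = sensitivity + specificity - 1
best = j.argmax()
print(f"cut-off ≈ {thresholds[best]:.1f} mL, sensitivity {tpr[best]:.2f}, "
      f"specificity {1 - fpr[best]:.2f}, AUC {roc_auc_score(y, df['ptv_ml']):.2f}")
```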
ABSTRACT
The relationship between class size and school performance has always been ambiguous, and the current literature has found no direct connection between them, especially in the Brazilian context. Therefore, this study aimed to verify whether the number of students per class influences school performance. We used microdata from the 2017 Prova Brasil. Using the propensity score matching statistical model with the nearest-neighbor matching estimator, we grouped the classes into clusters by similarity; the metric used to group the clusters was the Euclidean distance. We verified adherence of the data to the normal distribution using the Kolmogorov-Smirnov test and tested the null hypothesis of equal medians using the Wilcoxon test. All statistical analyses were performed using SPSS Statistics version 20. The results showed that the number of students per class has little influence on performance and, when an influence exists, larger classes perform better.
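The two tests named above can be mirrored in SciPy as sketched below (the original analysis used SPSS). The class-size cut-off and column names are illustrative; for two independent groups, the "Wilcoxon test" is taken here as the Wilcoxon rank-sum (Mann-Whitney U) test.

```python
# Hedged sketch mirroring the tests named above with SciPy (the original analysis used
# SPSS). The class-size cut-off and column names are illustrative; for two independent
# groups the "Wilcoxon test" is taken as the Wilcoxon rank-sum (Mann-Whitney U) test.
import pandas as pd
from scipy import stats

df = pd.read_csv("prova_brasil_2017.csv")                # assumed class-level extract
small = df.loc[df["class_size"] <= 25, "score"]          # illustrative cut-off
large = df.loc[df["class_size"] > 25, "score"]

# Kolmogorov-Smirnov test against a normal distribution with the sample's own parameters
print(stats.kstest(df["score"], "norm", args=(df["score"].mean(), df["score"].std())))

# Non-normal data: compare the two groups' medians with a rank-based test
print(stats.mannwhitneyu(small, large, alternative="two-sided"))
```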
ABSTRACT
PURPOSE: The aim of this study was to investigate whether previous abdominal surgery (PAS) affected outcomes of stage I-III colorectal cancer (CRC) patients who underwent radical resection. METHODS: Stage I-III CRC patients who underwent surgery at a single clinical center from January 2014 to December 2022 were retrospectively included in this study. Baseline characteristics and short-term outcomes were compared between the PAS group and the non-PAS group. Univariate and multivariate logistic regression analyses were used to identify risk factors for overall and major complications. Propensity score matching (PSM) at a 1:1 ratio was used to minimize selection bias between the two groups. Statistical analysis was performed using SPSS (version 22.0) software. RESULTS: A total of 5895 stage I-III CRC patients were included according to the inclusion and exclusion criteria. The PAS group had 1336 (22.7%) patients, and the non-PAS group had 4559 (77.3%) patients. After PSM, there were 1335 patients in each group, and no significant difference was found in any baseline characteristic between the two groups (P > 0.05). Regarding short-term outcomes, the PAS group had a longer operation time (P < 0.01 both before and after PSM) and more overall complications (P = 0.027 before PSM; P = 0.022 after PSM). In univariate and multivariate logistic regression analyses, PAS was an independent risk factor for overall complications (univariate analysis, P = 0.022; multivariate analysis, P = 0.029) but not for major complications (univariate analysis, P = 0.688). CONCLUSION: Stage I-III CRC patients with PAS may experience longer operation times and have a higher risk of postoperative overall complications; however, PAS did not appear to significantly affect major complications. Surgeons should take steps to improve surgical outcomes for patients with PAS.
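A minimal sketch of the multivariate logistic regression step (estimating adjusted odds ratios for overall complications) with statsmodels is shown below; the covariate list is an assumption rather than the study's actual model.

```python
# Minimal sketch of the multivariate logistic regression step with statsmodels, reporting
# adjusted odds ratios for overall complications. The covariate list is an assumption.
import numpy as np
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("crc_cohort.csv")                       # assumed stage I-III CRC patients
covars = ["previous_abdominal_surgery", "age", "bmi", "operation_time_min"]  # illustrative

X = sm.add_constant(df[covars].astype(float))
fit = sm.Logit(df["overall_complication"], X).fit(disp=0)

ci = fit.conf_int()                                      # columns 0 (lower) and 1 (upper)
table = pd.DataFrame({"OR": np.exp(fit.params),
                      "CI_low": np.exp(ci[0]),
                      "CI_high": np.exp(ci[1]),
                      "p": fit.pvalues}).drop(index="const")
print(table)
```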