Results 1 - 20 of 538
1.
Sultan Qaboos Univ Med J ; 24(2): 177-185, 2024 May.
Article in English | MEDLINE | ID: mdl-38828238

ABSTRACT

Objectives: This study aimed to estimate the door-to-balloon (DTB) time and determine the organisational-level factors that influence delayed DTB times among patients with ST-elevation myocardial infarction in Oman. Methods: A cross-sectional retrospective study was conducted on all patients who presented to the emergency department at Sultan Qaboos University Hospital and Royal Hospital, Muscat, Oman, and underwent primary percutaneous coronary interventions during 2018-2019. Results: The sample included 426 patients and the median DTB time was 142 minutes. The result of the bivariate logistic regression showed that patients who presented to the emergency department with atypical symptoms were 3 times more likely to have a delayed DTB time, when compared to patients who presented with typical symptoms (odds ratio [OR] = 3.003, 95% confidence interval [CI]: 1.409-6.400; P = 0.004). In addition, patients who presented during off-hours were 2 times more likely to have a delayed DTB time, when compared to patients who presented during regular working hours (OR = 2.291, 95% CI: 1.284-4.087; P = 0.005). Conclusion: To meet the DTB time recommendation, it is important to ensure adequate staffing during both regular and irregular working hours. Results from this study can be used as a baseline for future studies and inform strategies for improving the quality of care.
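The odds ratios above come from bivariate logistic regression; for a single binary predictor, the same OR and its Wald 95% CI can be recovered directly from a 2x2 table. A minimal sketch, using illustrative counts (not the study's data) chosen to give an OR near the reported 3.0:

```python
import math

# Hypothetical 2x2 table: rows = presentation (atypical/typical),
# columns = DTB time (delayed / not delayed). Counts are illustrative,
# not taken from the paper.
a, b = 30, 20     # atypical: delayed, not delayed
c, d = 120, 240   # typical:  delayed, not delayed

# Odds ratio = cross-product ratio of the table
odds_ratio = (a * d) / (b * c)

# Wald 95% CI: computed on the log-odds scale, then exponentiated
se = math.sqrt(1/a + 1/b + 1/c + 1/d)
ci_low = math.exp(math.log(odds_ratio) - 1.96 * se)
ci_high = math.exp(math.log(odds_ratio) + 1.96 * se)
```

Because the interval is built on the log scale, it is asymmetric around the OR, which is why published intervals such as 1.409-6.400 are not centred on the point estimate.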


Subject(s)
Emergency Service, Hospital , ST Elevation Myocardial Infarction , Time-to-Treatment , Humans , Female , Cross-Sectional Studies , Male , Retrospective Studies , ST Elevation Myocardial Infarction/therapy , Oman , Middle Aged , Time-to-Treatment/statistics & numerical data , Time-to-Treatment/standards , Aged , Emergency Service, Hospital/statistics & numerical data , Emergency Service, Hospital/organization & administration , Time Factors , Percutaneous Coronary Intervention/statistics & numerical data , Percutaneous Coronary Intervention/methods , Adult , Logistic Models
2.
Crit Care Nurse ; 44(3): 28-35, 2024 Jun 01.
Article in English | MEDLINE | ID: mdl-38821525

ABSTRACT

BACKGROUND: The mortality rate of pediatric patients who require continuous renal replacement therapy is approximately 42%, and outcomes vary considerably depending on underlying disease, illness severity, and time of dialysis initiation. Delay in the initiation of such therapy may increase mortality risk, prolong intensive care unit stay, and worsen clinical outcomes. LOCAL PROBLEM: In the pediatric intensive care unit of an urban level I trauma children's hospital, continuous renal replacement therapy initiation times and factors associated with delays in therapy were unknown. METHODS: This quality improvement process involved a retrospective review of data on patients who received continuous dialysis in the pediatric intensive care unit from January 1, 2017, to December 31, 2021. The objectives were to examine the characteristics of the children requiring continuous renal replacement therapy, therapy initiation times, and factors associated with initiation delays that might affect unit length of stay and mortality. RESULTS: During the study period, 175 patients received continuous renal replacement therapy, with an average initiation time of 11.9 hours. Statistically significant associations were found between the degree of fluid overload and mortality (P < .001) and between the presence of acute kidney injury and prolonged length of stay (P = .04). No significant association was found between therapy initiation time and unit length of stay or mortality, although the average initiation time of survivors was 5.9 hours shorter than that of nonsurvivors. CONCLUSION: Future studies are needed to assess real-time delays and to evaluate whether the implementation of a standardized initiation process decreases initiation time.


Subject(s)
Acute Kidney Injury , Continuous Renal Replacement Therapy , Critical Illness , Intensive Care Units, Pediatric , Humans , Male , Female , Retrospective Studies , Child , Acute Kidney Injury/therapy , Acute Kidney Injury/mortality , Child, Preschool , Infant , Adolescent , Length of Stay/statistics & numerical data , Time Factors , Renal Replacement Therapy , Infant, Newborn , Quality Improvement , Time-to-Treatment/standards
3.
Crit Care ; 28(1): 176, 2024 05 24.
Article in English | MEDLINE | ID: mdl-38790061

ABSTRACT

BACKGROUND: Bacteraemia is a critical condition that generally leads to substantial morbidity and mortality. It is unclear whether delayed antimicrobial therapy (and/or source control) affects prognosis or defervescence in patients with source-control-required (ScR) or source-control-unrequired (ScU) bacteraemia. METHODS: The multicenter cohort included treatment-naïve adults with bacteraemia in the emergency department. Clinical information was retrospectively obtained and etiologic pathogens were prospectively restored to accurately determine the time-to-appropriate antibiotic (TtAa). The associations of TtAa, and of time-to-source control (TtSc, for ScR bacteraemia), with 30-day crude mortality and with delayed defervescence were studied after adjusting for the independent determinants of each outcome identified by a logistic regression model. RESULTS: Of the total 5477 patients, each hour of TtAa delay was associated with an average increase of 0.2% (adjusted odds ratio [AOR], 1.002; P < 0.001) and 0.3% (AOR 1.003; P < 0.001) in mortality rates for patients having ScU (3953 patients) and ScR (1524) bacteraemia, respectively. Notably, these AORs increased to 0.4% and 0.5% for critically ill individuals. For patients experiencing ScR bacteraemia, each hour of TtSc delay was significantly associated with an average increase of 0.31% and 0.33% in mortality rates for overall and critically ill individuals, respectively. For febrile patients, each additional hour of TtAa was significantly associated with an average 0.2% and 0.3% increase in the proportion of delayed defervescence for ScU (3085 patients) and ScR (1266) bacteraemia, respectively, and 0.5% and 0.9% for critically ill individuals. For the 1266 febrile patients with ScR bacteraemia, each hour of TtSc delay was significantly associated with an average increase of 0.3% and 0.4% in the proportion of delayed defervescence for the overall population and those with critical illness, respectively.
CONCLUSIONS: Regardless of the need for source control in cases of bacteraemia, there seems to be a significant association between the prompt administration of appropriate antimicrobials and both a favourable prognosis and rapid defervescence, particularly among critically ill patients. For ScR bacteraemia, delayed source control has been identified as a determinant of unfavourable prognosis and delayed defervescence. Moreover, this association with patient survival and the speed of defervescence appears to be augmented among critically ill patients.
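The per-hour AORs quoted above compound multiplicatively on the odds scale, so small hourly effects accumulate over long delays. A quick illustration (the 24-hour horizon is an arbitrary choice for the example, not one taken from the study):

```python
# The abstract reports an adjusted OR of 1.002 per hour of TtAa delay
# for ScU bacteraemia. Per-hour odds ratios multiply, so a 24-hour
# delay corresponds to an odds ratio of 1.002 raised to the 24th power.
aor_per_hour = 1.002
cumulative_or_24h = aor_per_hour ** 24   # roughly a 5% increase in odds
```

The same compounding applies to the larger per-hour AORs reported for critically ill patients, which is one reason the authors emphasise prompt administration in that subgroup.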


Subject(s)
Bacteremia , Emergency Service, Hospital , Humans , Bacteremia/drug therapy , Bacteremia/mortality , Male , Female , Middle Aged , Emergency Service, Hospital/organization & administration , Emergency Service, Hospital/statistics & numerical data , Aged , Retrospective Studies , Adult , Anti-Bacterial Agents/therapeutic use , Time Factors , Cohort Studies , Anti-Infective Agents/therapeutic use , Time-to-Treatment/statistics & numerical data , Time-to-Treatment/standards
4.
J Trauma Nurs ; 31(3): 158-163, 2024.
Article in English | MEDLINE | ID: mdl-38742724

ABSTRACT

BACKGROUND: Early administration of antibiotics in the presence of open fractures is critical in reducing infections and later complications. Current guidelines recommend administering antibiotics within 60 min of patient arrival to the emergency department, yet trauma centers often struggle to meet this metric. OBJECTIVES: This study aims to evaluate the impact of a nurse-initiated evidence-based treatment protocol on the timeliness of antibiotic administration in pediatric patients with open fractures. METHODS: A retrospective pre-post study of patients who met the National Trauma Data Standard registry inclusion criteria for open fractures of long bones, amputations, or lawn mower injuries was performed at a Midwestern United States Level II pediatric trauma center. The time of patient arrival and time of antibiotic administration from preimplementation (2015-2020) to postimplementation (2021-2022) of the protocol were compared. Patients transferred in who received antibiotics at an outside facility were excluded. RESULTS: A total of N = 73 participants met the study inclusion criteria, of which n = 41 were in the preimplementation group and n = 32 were in the postimplementation group. Patients receiving antibiotics within 60 min of arrival increased from n = 24/41 (58.5%) preimplementation to n = 26/32 (84.4%) postimplementation (p< .05). CONCLUSIONS: Our study demonstrates that initiating evidence-based treatment orders from triage helped decrease the time from arrival to time of antibiotic administration in patients with open fractures. We sustained improvement for 24 months after the implementation of our intervention.
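The pre-post comparison of 24/41 versus 26/32 patients treated within 60 minutes can be checked with a pooled two-proportion z-test; a self-contained sketch follows (the abstract does not state which test the authors actually used):

```python
import math

# Proportions reported in the abstract: 24/41 pre- vs 26/32
# post-implementation received antibiotics within 60 minutes.
x1, n1 = 24, 41
x2, n2 = 26, 32
p1, p2 = x1 / n1, x2 / n2

# Pooled two-proportion z-test (normal approximation)
p_pool = (x1 + x2) / (n1 + n2)
se = math.sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
z = (p2 - p1) / se

# Two-sided p-value via the standard normal CDF (math.erf)
p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
```

This gives z of about 2.07 and a two-sided p of about 0.04, consistent with the reported p < .05, though with samples this small an exact test would be a defensible alternative.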


Subject(s)
Anti-Bacterial Agents , Fractures, Open , Trauma Centers , Humans , Fractures, Open/nursing , Fractures, Open/drug therapy , Retrospective Studies , Anti-Bacterial Agents/administration & dosage , Male , Child , Female , Child, Preschool , Clinical Protocols , Adolescent , Time-to-Treatment/standards , Time Factors , Midwestern United States
5.
J Tissue Viability ; 33(2): 345-354, 2024 May.
Article in English | MEDLINE | ID: mdl-38594149

ABSTRACT

OBJECTIVE: A systematic review was conducted to evaluate time delays in the management of diabetic foot and to explore the factors influencing these delays and their potential outcomes. METHODS: The researchers searched several electronic databases (PubMed, Web of Science, Cochrane Library, EMbase, CNKI, WanFang, CBM and VIP) for English and Chinese studies that examined time delays in the management pathway of diabetic foot. Two authors independently screened and extracted data, and assessed the quality of the included studies using the Newcastle-Ottawa Scale and the Agency for Healthcare Research and Quality checklist. Due to heterogeneity among the studies, descriptive analysis was performed. RESULTS: The review included 28 articles, comprising 20 cohort studies and 8 cross-sectional studies, that met the inclusion criteria. Among these, 14 were deemed of high quality. The median times from symptom onset to primary health care or specialist care varied from 3 to 46.69 days. The median delay in referral by primary care specialists ranged from 7 to 31 days, and subsequent median times to definitive treatment ranged from 6.2 to 56 days. Multiple complex factors were found to contribute to these delays, including patient demographics (older age, lower education level and income level), poor patient health-seeking behaviors (inaccurate self-treatment, incorrect recognition and interpretation of symptoms), inaccurate assessment or initial treatment by primary health professionals, complex referral pathways, and clinical characteristics of diabetic foot (number of foot ulcers, Wagner grade, and hemoglobin A1c level). Negative outcomes associated with these delays included increased risk of major amputation and mortality, decreased wound healing rate, prolonged hospital stay, and increased hospital costs. CONCLUSIONS: Time delays in the diabetic foot management pathway were both common and serious, contributing to negative health outcomes for patients with diabetic foot.
Many complex factors related to patients' health-seeking behaviors, the health system, and the clinical characteristics of diabetic foot are responsible for these delays. Therefore, it is necessary to develop new strategies for standard referral practices and to strengthen patient awareness of seeking care.


Subject(s)
Diabetic Foot , Humans , Diabetic Foot/therapy , Time Factors , Time-to-Treatment/statistics & numerical data , Time-to-Treatment/standards
6.
J Surg Res ; 298: 24-35, 2024 Jun.
Article in English | MEDLINE | ID: mdl-38552587

ABSTRACT

INTRODUCTION: Survival following emergency department thoracotomy (EDT) for patients in extremis is poor. Whether intervention in the operating room instead of EDT in select patients could lead to improved outcomes is unknown. We hypothesized that patients who underwent intervention in the operating room would have improved outcomes compared to those who underwent EDT. METHODS: We conducted a retrospective review of the Trauma Quality Improvement Program database from 2017 to 2021. All adult patients who underwent EDT, operating room thoracotomy (ORT), or sternotomy as the first form of surgical intervention within 1 h of arrival were included. Of patients without prehospital cardiac arrest, propensity score matching was utilized to create three comparable groups. The primary outcome was survival. Secondary outcomes included time to procedure. RESULTS: There were 1865 EDT patients, 835 ORT patients, and 456 sternotomy patients who met the inclusion criteria. There were 349 EDT, 344 ORT, and 408 sternotomy patients in the matched analysis. On Cox multivariate regression, there was an increased risk of mortality with EDT versus sternotomy (HR 4.64, P < 0.0001), EDT versus ORT (HR 1.65, P < 0.0001), and ORT versus sternotomy (HR 2.81, P < 0.0001). Time to procedure was shorter with EDT versus sternotomy (22 min versus 34 min, P < 0.0001) and versus ORT (22 min versus 37 min, P < 0.0001). CONCLUSIONS: There was an association between sternotomy and ORT versus EDT and improved mortality. In select patients, operative approaches rather than the traditional EDT could be considered.
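The matched analysis above rests on propensity-score matching. As a rough illustration of the general idea only (not the study's actual procedure, which is not described here), this is greedy 1:1 nearest-neighbour matching with a caliper on hypothetical, precomputed scores:

```python
# Greedy 1:1 nearest-neighbour propensity matching with a caliper.
# Scores are hypothetical, not derived from the TQIP data.
treated = {"t1": 0.31, "t2": 0.62, "t3": 0.88}
controls = {"c1": 0.30, "c2": 0.59, "c3": 0.95, "c4": 0.10}
caliper = 0.1  # maximum allowed propensity-score difference

matches = {}
available = dict(controls)
# Process treated units in score order; pair each with its closest
# still-unmatched control, provided the gap is within the caliper.
for tid, ts in sorted(treated.items(), key=lambda kv: kv[1]):
    if not available:
        break
    cid, cs = min(available.items(), key=lambda kv: abs(kv[1] - ts))
    if abs(cs - ts) <= caliper:
        matches[tid] = cid
        del available[cid]
```

Real analyses typically estimate the scores with logistic regression on baseline covariates and check post-match balance; the sketch only shows the pairing step.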


Subject(s)
Databases, Factual , Emergency Service, Hospital , Propensity Score , Quality Improvement , Sternotomy , Thoracotomy , Humans , Thoracotomy/mortality , Thoracotomy/statistics & numerical data , Female , Male , Retrospective Studies , Middle Aged , Emergency Service, Hospital/statistics & numerical data , Adult , Sternotomy/statistics & numerical data , Databases, Factual/statistics & numerical data , Aged , Time-to-Treatment/statistics & numerical data , Time-to-Treatment/standards , Operating Rooms/statistics & numerical data , Operating Rooms/organization & administration , Operating Rooms/standards
7.
Anticancer Res ; 43(11): 5025-5030, 2023 Nov.
Article in English | MEDLINE | ID: mdl-37909973

ABSTRACT

BACKGROUND/AIM: The purpose of this study was to determine socioeconomic and demographic factors which may contribute to inequities in time to treat thyroid cancer. PATIENTS AND METHODS: We used data from the National Cancer Database, 2004-2019, to conduct an analysis of thyroid cancer patients. All (434,083) patients with thyroid cancer, including papillary (395,598), follicular (23,494), medullary (7,638), and anaplastic (7,353) types were included. We compared the wait time from diagnosis to first treatment, surgery, radiotherapy, and chemotherapy for patients based on age, race, sex, location, and socioeconomic status (SES). RESULTS: A total of 434,083 patients with thyroid cancer were included. Hispanic patients had significantly longer wait times to all treatments compared to non-Hispanic patients (first treatment 33.44 vs. 20.45 days, surgery 40.06 vs. 26.49 days, radiotherapy 114.68 vs. 96.42 days, chemotherapy 92.70 vs. 58.71 days). Uninsured patients, patients at academic facilities, and patients in metropolitan areas also had the longest wait times to treatment. CONCLUSION: This study identified multiple disparities related to SES and demographics that correspond to delays in time to treatment. It is crucial that this topic is investigated further to help mitigate these incongruities in thyroid cancer care in the future.


Subject(s)
Healthcare Disparities , Thyroid Neoplasms , Treatment Delay , Humans , Databases, Factual/statistics & numerical data , Hispanic or Latino/statistics & numerical data , Radiation Oncology , Thyroid Neoplasms/diagnosis , Thyroid Neoplasms/epidemiology , Thyroid Neoplasms/ethnology , Thyroid Neoplasms/therapy , Healthcare Disparities/ethnology , Healthcare Disparities/standards , Healthcare Disparities/statistics & numerical data , Time-to-Treatment/standards , Time-to-Treatment/statistics & numerical data , Treatment Delay/standards , Treatment Delay/statistics & numerical data
8.
BMC Cancer ; 22(1): 220, 2022 Feb 28.
Article in English | MEDLINE | ID: mdl-35227226

ABSTRACT

BACKGROUND: Cancer patient pathways (CPPs) were implemented in Norway to reduce unnecessary waiting times and regional variations, and to increase the predictability of cancer care for patients. This study aimed to determine whether 70% of cancer patients started treatment within the recommended time frames, and to identify potential delays. METHODS: Patients registered with a colorectal, lung, breast, or prostate cancer diagnosis at the Cancer Registry of Norway in 2015-2016 were linked with the Norwegian Patient Registry and Statistics Norway. Adjusting for sociodemographic variables, multivariable quantile (median) regressions were used to examine the association between place of residence and median time to start of examination, treatment decision, and start of treatment. RESULTS: The study included 20 668 patients. The proportions of patients who went through the CPP within the recommended time frames were highest among colon (84%) and breast (76%) cancer patients who underwent surgery and lung cancer patients who started systemic anticancer treatment (76%), and lowest for prostate cancer patients who underwent surgery (43%). The time from treatment decision to start of treatment was the main source of delay for all cancers. Travelling outside the resident health trust prolonged waiting time and was associated with reduced odds of receiving surgery and radiotherapy for lung and rectal cancer patients, respectively. CONCLUSIONS: Achievement of the national recommendations for CPP times differed by cancer type and treatment. Identified bottlenecks in the pathway should be targeted to decrease waiting times. Further, CPP guidelines should be re-examined to determine their ongoing relevance.


Subject(s)
Critical Pathways/statistics & numerical data , Neoplasms/therapy , Patient Acceptance of Health Care/statistics & numerical data , Patient Compliance/statistics & numerical data , Time-to-Treatment/statistics & numerical data , Adult , Aged , Aged, 80 and over , Critical Pathways/standards , Female , Geography , Humans , Information Storage and Retrieval , Male , Middle Aged , Norway , Registries , Time Factors , Time-to-Treatment/standards , Waiting Lists
9.
CMAJ Open ; 10(1): E27-E34, 2022.
Article in English | MEDLINE | ID: mdl-35042692

ABSTRACT

BACKGROUND: In 2010, HIV treatment as prevention (TasP), encompassing widespread HIV testing and immediate initiation of free antiretroviral treatment (ART), was piloted under the Seek and Treat for Optimal Prevention of HIV/AIDS initiative (STOP) in British Columbia, Canada. We compared the time from HIV diagnosis to treatment initiation, and from treatment initiation to first virologic suppression, before (2005-2009) and after (2010-2016) the implementation of STOP. METHODS: In this population-based cohort study, we used longitudinal data of all people living with an HIV diagnosis in BC from 1996 to 2017. We included those aged 18 years or older who had never received ART and had received an HIV diagnosis in the 2005-2016 period. We defined the virologic suppression date as the first date of at least 2 consecutive test results within 4 months with a viral load of less than 200 copies/mL. Negative binomial regression models assessed the effect of STOP on the time to ART initiation and suppression, adjusting for confounders. All p values were 2-sided, and we set the significance level at 0.05. RESULTS: Participants who received an HIV diagnosis before STOP (n = 1601) were statistically different from those with a diagnosis after STOP (n = 1700); 81% versus 84% were men (p = 0.0187), 30% versus 15% had ever injected drugs (p < 0.0001), and 27% versus 49% had 350 CD4 cells/µL or more at diagnosis (p < 0.0001). The STOP initiative was associated with a 64% shorter time from diagnosis to treatment (adjusted mean ratio 0.36, 95% confidence interval [CI] 0.34-0.39) and a 21% shorter time from treatment to suppression (adjusted mean ratio 0.79, 95% CI 0.73-0.85). INTERPRETATION: In a population with universal health coverage, a TasP intervention was associated with shorter times from HIV diagnosis to treatment initiation, and from treatment initiation to viral suppression. 
Our results show accelerating progress toward the United Nations' 90-90-90 target of people with HIV who have a diagnosis, those who are on antiretroviral therapy and those who are virologically suppressed, and support the global expansion of TasP to accelerate the control of HIV/AIDS.
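The adjusted mean ratios reported above map onto the quoted percentage reductions by simple arithmetic, since a mean ratio below 1 shrinks the expected time proportionally:

```python
# Adjusted mean ratios from the abstract: a ratio r means the adjusted
# mean time is r times the pre-STOP mean, i.e. (1 - r) * 100 percent shorter.
ratio_diag_to_tx = 0.36   # diagnosis to treatment initiation
ratio_tx_to_supp = 0.79   # treatment initiation to virologic suppression

pct_shorter_diag_to_tx = (1 - ratio_diag_to_tx) * 100   # 64% shorter
pct_shorter_tx_to_supp = (1 - ratio_tx_to_supp) * 100   # 21% shorter
```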


Subject(s)
Antiretroviral Therapy, Highly Active , HIV Infections , Post-Exposure Prophylaxis , Preventive Health Services , Time-to-Treatment , Adult , Antiretroviral Therapy, Highly Active/methods , Antiretroviral Therapy, Highly Active/statistics & numerical data , British Columbia/epidemiology , Cohort Studies , Early Diagnosis , Female , HIV Infections/epidemiology , HIV Infections/therapy , Humans , Male , Outcome and Process Assessment, Health Care , Post-Exposure Prophylaxis/methods , Post-Exposure Prophylaxis/organization & administration , Preventive Health Services/methods , Preventive Health Services/organization & administration , Sustained Virologic Response , Time-to-Treatment/organization & administration , Time-to-Treatment/standards
10.
Am J Emerg Med ; 53: 68-72, 2022 03.
Article in English | MEDLINE | ID: mdl-34999563

ABSTRACT

OBJECTIVE: Strict control measures during the COVID-19 epidemic inevitably affected the emergency treatment of ST-segment elevation myocardial infarction (STEMI). We investigated the impact of the COVID-19 outbreak on the treatment of patients with STEMI undergoing primary PCI. METHODS: In this single-center cohort study, we selected the 6-month period after the declaration of the COVID-19 outbreak (Jan 24-July 24, 2020); STEMI patients from the same period of 2019 served as controls. In total, 246 STEMI patients who underwent primary PCI were enrolled (136 from the non-outbreak period and 110 from the outbreak period). The impact of the outbreak on the time from symptom onset to first medical contact (symptom-to-FMC) and on the door-to-balloon (D-to-B) time was investigated. The primary outcome was in-hospital major adverse cardiac events (MACE), defined as a composite of cardiac death, heart failure and malignant arrhythmia. RESULTS: Compared with the same period in 2019, there was a 19% decrease in the total number of STEMI patients undergoing primary PCI at the peak of the pandemic in 2020. The symptom-to-FMC delay was significantly longer during the outbreak period (180 [68.75, 342] vs 120 [60, 240] min, P = 0.003), and D-to-B times increased significantly (148 [115-190] vs 84 [70-120] min, P < 0.001). However, MACE rates were similar in both time periods (18.3% vs 25.7%, p = 0.168). On multivariable analysis, the outbreak was not independently associated with MACE; a history of diabetes, left main disease and age >65 years were the strongest predictors of MACE in the overall population. CONCLUSIONS: The COVID-19 pandemic was not independently associated with MACE, suggesting that active primary PCI treatment preserved high-quality standards even when challenged by a severe epidemic. CLINICAL TRIAL REGISTRATION: URL: https://ClinicalTrials.gov Unique identifier: NCT04427735.


Subject(s)
COVID-19/prevention & control , Percutaneous Coronary Intervention/statistics & numerical data , ST Elevation Myocardial Infarction/therapy , Aged , Beijing/epidemiology , COVID-19/complications , COVID-19/transmission , Cohort Studies , Female , Humans , Male , Middle Aged , Percutaneous Coronary Intervention/trends , Retrospective Studies , ST Elevation Myocardial Infarction/epidemiology , Time Factors , Time-to-Treatment/standards , Time-to-Treatment/statistics & numerical data , Treatment Outcome
11.
J Thorac Cardiovasc Surg ; 163(1): 111-119.e2, 2022 01.
Article in English | MEDLINE | ID: mdl-32327186

ABSTRACT

OBJECTIVES: To evaluate the association between low left ventricular ejection fraction (LVEF), complication rescue, and long-term survival after isolated coronary artery bypass grafting. METHODS: National cohort study of patients who underwent isolated coronary artery bypass grafting (2000-2016) using Veterans Affairs Surgical Quality Improvement Program data. Left ventricular ejection fraction was categorized as ≥35% (n = 55,877), 25%-34% (n = 3893), or <25% (n = 1707). Patients were also categorized as having had no complications, 1 complication, or more than 1 complication. The association between LVEF, complication rescue, and risk of death was evaluated with multivariable Cox regression. RESULTS: Among 61,477 patients, 6586 (10.7%) had a perioperative complication and 2056 (3.3%) had multiple complications. Relative to LVEF ≥35%, decreasing ejection fraction was associated with greater odds of complications (25%-34%, odds ratio, 1.30 [1.18-1.42]; <25%, odds ratio, 1.65 [1.43-1.92]). There was a dose-response relationship between decreasing LVEF and overall risk of death (≥35% [ref]; 25%-34%, hazard ratio, 1.46 [1.37-1.55]; <25%, hazard ratio, 1.68 [1.58-1.79]). Among patients who were rescued from complications, there were decreases in 10-year survival, regardless of LVEF. Among those rescued after multiple complications, LVEF was no longer associated with risk of death. CONCLUSIONS: While decreasing LVEF is associated with post-coronary artery bypass grafting complications, patients rescued from complications have worse long-term survival, regardless of left ventricular function. Prevention and timely treatment of complications should remain a focus of quality improvement initiatives, and future work is needed to mitigate their long-term detrimental impact on survival.


Subject(s)
Coronary Artery Bypass , Coronary Artery Disease , Long Term Adverse Effects , Postoperative Complications , Ventricular Dysfunction, Left , Coronary Artery Bypass/adverse effects , Coronary Artery Bypass/methods , Coronary Artery Disease/complications , Coronary Artery Disease/diagnosis , Coronary Artery Disease/physiopathology , Coronary Artery Disease/surgery , Early Medical Intervention/standards , Female , Humans , Long Term Adverse Effects/diagnosis , Long Term Adverse Effects/mortality , Long Term Adverse Effects/physiopathology , Long Term Adverse Effects/prevention & control , Male , Middle Aged , Postoperative Complications/diagnosis , Postoperative Complications/etiology , Postoperative Complications/physiopathology , Postoperative Complications/therapy , Preventive Health Services , Quality Improvement , Risk Assessment , Stroke Volume , Survival Analysis , Time-to-Treatment/standards , United States , United States Department of Veterans Affairs , Ventricular Dysfunction, Left/complications , Ventricular Dysfunction, Left/diagnosis , Ventricular Dysfunction, Left/physiopathology , Ventricular Dysfunction, Left/therapy
12.
J Thorac Cardiovasc Surg ; 163(1): 28-35.e1, 2022 Jan.
Article in English | MEDLINE | ID: mdl-32331819

ABSTRACT

OBJECTIVE: To examine whether there is an association between prehospital transfer distance and surgical mortality in emergency thoracic aortic surgery. METHODS: A retrospective cohort study using a national clinical database in Japan was conducted. Patients who underwent emergency thoracic aortic surgery from January 1, 2014, to December 31, 2016, were included. Patients with type B dissection were excluded. A multilevel logistic regression analysis was performed to examine the association between prehospital transfer distance and surgical mortality. In addition, an instrumental variable analysis was performed to address unmeasured confounding. RESULTS: A total of 12,004 patients underwent emergency thoracic aortic surgeries at 495 hospitals. Surgical mortality was 13.8%. The risk-adjusted mortality odds ratio for standardized distance (mean 12.8 km, standard deviation 15.2 km) was 0.94 (95% confidence interval, 0.87-1.01; P = .09). Instrumental variable analysis did not reveal a significant association between transfer distance and surgical mortality as well. CONCLUSIONS: No significant association was found between surgical mortality and prehospital transfer distance in emergency thoracic aortic surgery cases. Suspected cases of acute thoracic aortic syndrome may be transferred safely to distant high-volume hospitals.
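Because the mortality odds ratio above is reported per standardized unit of distance, it can be rescaled to a per-kilometre effect under the model's assumption that log-odds are linear in distance. A small sketch using the reported mean, SD, and OR:

```python
import math

# Reported in the abstract: mean distance 12.8 km, SD 15.2 km,
# adjusted OR 0.94 per 1-SD increase in transfer distance.
mean_km, sd_km = 12.8, 15.2
or_per_sd = 0.94

def standardize(x_km: float) -> float:
    """z-score of a transfer distance in kilometres."""
    return (x_km - mean_km) / sd_km

# Equivalent per-kilometre OR: divide the log-odds effect by the SD
or_per_km = math.exp(math.log(or_per_sd) / sd_km)   # just under 1.0
```

The per-kilometre OR is very close to 1, which matches the abstract's conclusion that distance showed no significant association with surgical mortality.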


Subject(s)
Aorta, Thoracic/surgery , Aortic Diseases , Emergency Medical Services , Thoracic Surgical Procedures , Triage , Acute Disease , Aged , Aortic Diseases/diagnosis , Aortic Diseases/mortality , Aortic Diseases/physiopathology , Aortic Diseases/surgery , Emergencies/epidemiology , Emergency Medical Services/organization & administration , Emergency Medical Services/statistics & numerical data , Female , Health Services Accessibility/statistics & numerical data , Hospitals, High-Volume , Humans , Japan , Male , Outcome and Process Assessment, Health Care , Retrospective Studies , Risk Adjustment/methods , Risk Factors , Thoracic Surgical Procedures/methods , Thoracic Surgical Procedures/mortality , Thoracic Surgical Procedures/statistics & numerical data , Time-to-Treatment/standards , Time-to-Treatment/statistics & numerical data , Triage/organization & administration , Triage/standards
13.
J Trauma Acute Care Surg ; 92(1): 21-27, 2022 01 01.
Article in English | MEDLINE | ID: mdl-34670960

ABSTRACT

BACKGROUND: Timing of extremity fracture fixation in patients with an associated major vascular injury remains controversial. Some favor temporary fracture fixation before definitive vascular repair to limit potential graft complications. Others advocate immediate revascularization to minimize ischemic time. The purpose of this study was to evaluate the timing of fracture fixation on outcomes in patients with concomitant long bone fracture and major arterial injury. METHODS: Patients with a combined long bone fracture and major arterial injury in the same extremity requiring operative repair over 11 years were identified and stratified by timing of fracture fixation. Vascular-related morbidity (rhabdomyolysis, acute kidney injury, graft failure, extremity amputation) and mortality were compared between patients who underwent fracture fixation prerevascularization (PRE) or postrevascularization (POST). RESULTS: One hundred four patients were identified: 19 PRE and 85 POST. Both groups were similar with respect to age, sex, Injury Severity Score, admission base excess, 24-hour packed red blood cells, and concomitant venous injury. The PRE group had fewer penetrating injuries (32% vs. 60%, p = 0.024) and a longer time to revascularization (9.5 vs. 5.8 hours, p = 0.0002). Although there was no difference in mortality (0% vs. 2%, p > 0.99), there were more vascular-related complications in the PRE group (58% vs. 32%, p = 0.03): specifically, rhabdomyolysis (42% vs. 19%, p = 0.029), graft failure (26% vs. 8%, p = 0.026), and extremity amputation (37% vs. 13%, p = 0.013). Multivariable logistic regression identified fracture fixation PRE as the only independent predictor of graft failure (odds ratio, 3.98; 95% confidence interval, 1.11-14.33; p = 0.03) and extremity amputation (odds ratio, 3.924; 95% confidence interval, 1.272-12.111; p = 0.017). 
CONCLUSION: Fracture fixation before revascularization contributes to increased vascular-related morbidity and was consistently identified as the only modifiable risk factor for both graft failure and extremity amputation in patients with a combined long bone fracture and major arterial injury. For these patients, delaying temporary or definitive fracture fixation until after revascularization should be the preferred approach. LEVEL OF EVIDENCE: Prognostic study, Level IV.


Subject(s)
Arteries , Extremities , Fracture Fixation , Ischemia , Multiple Trauma , Vascular Surgical Procedures , Vascular System Injuries , Adult , Amputation, Surgical/statistics & numerical data , Arteries/injuries , Arteries/surgery , Extremities/blood supply , Extremities/injuries , Extremities/surgery , Female , Fracture Fixation/adverse effects , Fracture Fixation/methods , Graft Survival , Humans , Ischemia/etiology , Ischemia/prevention & control , Male , Multiple Trauma/diagnosis , Multiple Trauma/surgery , Outcome and Process Assessment, Health Care , Prognosis , Rhabdomyolysis/diagnosis , Rhabdomyolysis/etiology , Rhabdomyolysis/prevention & control , Risk Adjustment/methods , Time-to-Treatment/standards , Time-to-Treatment/statistics & numerical data , Vascular Surgical Procedures/adverse effects , Vascular Surgical Procedures/methods , Vascular Surgical Procedures/statistics & numerical data , Vascular System Injuries/diagnosis , Vascular System Injuries/surgery , Wounds, Penetrating/diagnosis , Wounds, Penetrating/surgery
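The abstract above reports odds ratios with Wald-style 95% confidence intervals. As a minimal illustration of how such an interval is computed (here for an unadjusted odds ratio from a 2×2 table, not the study's actual multivariable model; the counts below are inferred from the reported percentages, 26% of 19 ≈ 5 and 8% of 85 ≈ 7, and are used purely for demonstration):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Unadjusted odds ratio with a Wald 95% CI from a 2x2 table:
    a = exposed with outcome,   b = exposed without,
    c = unexposed with outcome, d = unexposed without."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # standard error of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Graft failure: 5/19 in the PRE group vs. 7/85 in the POST group (inferred counts)
or_, lo, hi = odds_ratio_ci(5, 14, 7, 78)
print(f"OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

With these inferred counts the unadjusted calculation lands close to the reported estimate (OR 3.98, 95% CI 1.11-14.33), though the study itself derived its odds ratios from multivariable logistic regression.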
15.
CMAJ Open ; 9(4): E1120-E1127, 2021.
Article in English | MEDLINE | ID: mdl-34848553

ABSTRACT

BACKGROUND: Delays in cancer diagnosis have been associated with reduced survival, decreased quality of life after treatment, and suboptimal patient experience. The objective of the study was to explore the perspectives of a group of family physicians and other specialists regarding potentially avoidable delays in diagnosing cancer, and approaches that may help expedite the process. METHODS: We conducted a qualitative study using interviews with physicians practising in primary and outpatient care settings in Alberta between July and September 2019. We recruited family physicians and specialists who were in a position to discuss delays in cancer diagnosis by email via the Cancer Strategic Clinical Network and the Alberta Medical Association. We conducted semistructured interviews over the phone, and analyzed data using thematic analysis. RESULTS: Eleven family physicians and 22 other specialists (including 7 surgeons or surgical oncologists, 3 pathologists, 3 radiologists, 2 emergency physicians and 2 hematologists) participated in interviews; 22 were male (66.7%). We identified 4 main themes describing 9 factors contributing to potentially avoidable delays in diagnosis, namely the nature of primary care, initial presentation, investigation, and specialist advice and referral. We also identified 1 theme describing 3 suggestions for improvement, including system integration, standardized care pathways and a centralized advice, triage and referral support service for family physicians. INTERPRETATION: These findings suggest the need for enhanced support for family physicians, and better integration of primary and specialty care before cancer diagnosis. A multifaceted and coordinated approach to streamlining cancer diagnosis is required, with the goals of enhancing patient outcomes, reducing physician frustration and optimizing efficiency.


Subject(s)
Critical Pathways/standards , Delayed Diagnosis/prevention & control , Neoplasms , Physicians, Family/statistics & numerical data , Primary Health Care , Specialization/statistics & numerical data , Triage , Alberta/epidemiology , Delivery of Health Care, Integrated/methods , Health Services Needs and Demand , Humans , Neoplasms/diagnosis , Neoplasms/therapy , Physician's Role , Primary Health Care/methods , Primary Health Care/organization & administration , Primary Health Care/standards , Qualitative Research , Quality Improvement , Referral and Consultation/organization & administration , Time-to-Treatment/standards , Triage/organization & administration , Triage/standards
17.
Can J Surg ; 64(5): E510-E515, 2021 10.
Article in English | MEDLINE | ID: mdl-34598928

ABSTRACT

BACKGROUND: Direct oral anticoagulants (DOACs) are rapidly replacing warfarin for therapeutic anticoagulation; however, many DOACs are irreversible and may complicate bleeding in emergent situations such as hip fracture. In this setting, there is a lack of clear guidelines for the timing of surgery. The purpose of this study was to evaluate the current practices of Canadian orthopedic surgeons who manage patients with hip fracture receiving anticoagulation. METHODS: In January-March 2018, we administered a purpose-specific cross-sectional survey to all currently practising orthopedic surgeons in Canada who had performed hip fracture surgery in 2017. The survey evaluated approaches to decision-making and timing of surgery in patients with hip fracture receiving anticoagulation. RESULTS: A total of 280 surgeons representing a mix of academic and community practice, seniority and fellowship training responded. Nearly one-quarter of respondents (66 [23.4%]) were members of the Canadian Orthopaedic Trauma Society (COTS). Almost three-quarters (206 [73.6%]) felt that adequate clinical guidelines for patients with hip fracture receiving anticoagulation did not exist, and 177 (61.9%) indicated that anesthesiology or internal medicine had a greater influence on the timing of surgery than the attending surgeon. A total of 117/273 respondents (42.9%) indicated that patients taking warfarin should have immediate surgery (with or without reversal), compared to 63/270 (23.3%) for patients taking a DOAC (p < 0.001). Members of COTS were more likely than nonmembers to advocate for immediate surgery in all patients (p < 0.05). CONCLUSION: There is wide variability in Canada in the management of patients with hip fracture receiving anticoagulation. Improved multidisciplinary communication, prospectively evaluated treatment guidelines and focus on knowledge translation may add clarity to this issue. LEVEL OF EVIDENCE: IV.


Subject(s)
Anticoagulants/therapeutic use , Hip Fractures/surgery , Orthopedic Procedures/statistics & numerical data , Orthopedic Surgeons/statistics & numerical data , Practice Patterns, Physicians'/statistics & numerical data , Time-to-Treatment/statistics & numerical data , Antithrombins/therapeutic use , Canada , Cross-Sectional Studies , Humans , Orthopedic Procedures/standards , Orthopedic Surgeons/standards , Practice Guidelines as Topic/standards , Practice Patterns, Physicians'/standards , Time Factors , Time-to-Treatment/standards
18.
Cancer Treat Res Commun ; 29: 100477, 2021.
Article in English | MEDLINE | ID: mdl-34700140

ABSTRACT

BACKGROUND: Small-cell lung cancer (SCLC) is an aggressive, rapidly progressive malignancy. Thus, expedient diagnosis and treatment initiation are important. This study identifies and quantifies factors associated with delayed diagnosis and treatment initiation in patients with SCLC and compares time to treatment in SCLC with a cohort of patients with non-small cell lung cancer (NSCLC). MATERIALS AND METHODS: The study included all patients diagnosed with SCLC at a hospital in southern Norway in a ten-year period (2007-2016), and all NSCLC patients during the period 2013-2016. Total time to treatment (TTT) was defined as the number of days from date of referral due to suspicion of lung cancer to first day of treatment. Factors associated with prolonged TTT were estimated using multivariate median regression analysis. RESULTS: The median TTT and interquartile range (IQR) for the 183 patients with SCLC was 16 (10-23) days. Factors associated with delayed TTT included outpatient versus inpatient evaluation (+8.4 days), number of diagnostic procedures (+4.3 days per procedure), stage I-III versus stage IV (+3.6 days) and age (+2.1 days per 10 years). In 2013-16, TTT in SCLC was 3.5 days shorter than in the period before and less than half that of NSCLC in the same period, 15 (9-22) versus 33 (22-50) days (p = 0.001). CONCLUSION: Shorter TTT is seen with higher-stage disease, while longer TTT results from the increasing complexity of the diagnostic process and of treatment decisions in patients treated with curative intent. Knowledge of delaying factors can shorten TTT and improve clinical practice.


Subject(s)
Lung Neoplasms/therapy , Small Cell Lung Carcinoma/therapy , Time-to-Treatment/standards , Aged , Cohort Studies , Female , Humans , Lung Neoplasms/pathology , Male , Small Cell Lung Carcinoma/pathology
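The TTT figures above are medians with interquartile ranges. A minimal sketch of that descriptive step, using hypothetical waiting times and only the Python standard library (the study's median regression itself is not reproduced here):

```python
from statistics import median, quantiles

# Hypothetical referral-to-treatment times in days (not the study's data)
ttt_days = [9, 10, 12, 14, 16, 16, 18, 21, 23, 30]

med = median(ttt_days)
# quantiles(n=4) returns the three quartile cut points; the middle one is the median
q1, _, q3 = quantiles(ttt_days, n=4)
print(f"median TTT = {med} days, IQR = {q1}-{q3}")
```

The default `method='exclusive'` in `statistics.quantiles` interpolates between data points, so the quartiles need not coincide with observed values.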
19.
Proc Natl Acad Sci U S A ; 118(35)2021 08 31.
Article in English | MEDLINE | ID: mdl-34408076

ABSTRACT

Slower-than-anticipated COVID-19 vaccine production and distribution have impaired efforts to curtail the current pandemic. The standard administration schedule for most COVID-19 vaccines currently approved is two doses administered 3 to 4 wk apart. To increase the number of individuals with partial protection, some governments are considering delaying the second vaccine dose. However, the delay duration must take into account crucial factors, such as the degree of protection conferred by a single dose, the anticipated vaccine supply pipeline, and the potential emergence of more virulent COVID-19 variants. To help guide decision-making, we propose here an optimization model based on extended susceptible, exposed, infectious, and removed (SEIR) dynamics that determines the optimal delay duration between the first and second COVID-19 vaccine doses. The model assumes lenient social distancing and uses intensive care unit (ICU) admission as a key metric while selecting the optimal duration between doses vs. the standard 4-wk delay. While epistemic uncertainties apply to the interpretation of simulation outputs, we found that the delay is dependent on the vaccine mechanism of action and first-dose efficacy. For infection-blocking vaccines with first-dose efficacy ≥50%, the model predicts that the second dose can be delayed by ≥8 wk (half of the maximal delay), whereas for symptom-alleviating vaccines, the same delay is recommended only if the first-dose efficacy is ≥70%. Our model predicts that a 12-wk second-dose delay of an infection-blocking vaccine with a first-dose efficacy ≥70% could reduce ICU admissions by 400 people per million over 200 d.


Subject(s)
COVID-19 Vaccines/administration & dosage , COVID-19/prevention & control , Hospitalization/statistics & numerical data , Intensive Care Units/statistics & numerical data , SARS-CoV-2/immunology , Time-to-Treatment/standards , Vaccination/methods , Algorithms , Brazil/epidemiology , COVID-19/epidemiology , COVID-19/immunology , COVID-19 Vaccines/supply & distribution , Humans , Treatment Outcome , Vaccination/statistics & numerical data
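The optimization model above is built on extended SEIR dynamics. A minimal discrete-time SEIR sketch with toy parameters, and without the vaccination compartments or ICU accounting of the paper's extended model, might look like:

```python
def seir_step(s, e, i, r, beta=0.3, sigma=1 / 5, gamma=1 / 7, n=1_000_000):
    """Advance one day of discrete-time SEIR dynamics (toy parameters)."""
    new_exposed = beta * s * i / n   # S -> E via contact with infectious
    new_infectious = sigma * e       # E -> I after a ~5-day latent period
    new_removed = gamma * i          # I -> R after a ~7-day infectious period
    return (s - new_exposed,
            e + new_exposed - new_infectious,
            i + new_infectious - new_removed,
            r + new_removed)

# Seed 100 infectious individuals in a population of one million, run 200 days
s, e, i, r = 999_900.0, 0.0, 100.0, 0.0
peak_infectious = 0.0
for _ in range(200):
    s, e, i, r = seir_step(s, e, i, r)
    peak_infectious = max(peak_infectious, i)
print(f"peak infectious ≈ {peak_infectious:.0f}, susceptible remaining ≈ {s:.0f}")
```

With beta/gamma ≈ 2.1 > 1 the epidemic grows before burning out; each step conserves the total population, which is a useful sanity check on any SEIR implementation.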
20.
Crit Care ; 25(1): 307, 2021 08 26.
Article in English | MEDLINE | ID: mdl-34446092

ABSTRACT

Sepsis is a common consequence of infection, associated with a mortality rate > 25%. Although community-acquired sepsis is more common, hospital-acquired infection is more lethal. The most common site of infection is the lung, followed by abdominal infection, catheter-associated bloodstream infection and urinary tract infection. Gram-negative sepsis is more common than gram-positive infection, but sepsis can also be due to fungal and viral pathogens. To reduce mortality, it is necessary to give immediate, empiric, broad-spectrum therapy to those with severe sepsis and/or shock, but this approach can drive antimicrobial overuse and resistance and should be accompanied by a commitment to de-escalation and antimicrobial stewardship. Biomarkers such as procalcitonin can provide decision support for antibiotic use, may identify patients with a low likelihood of infection, and in some settings can guide the duration of antibiotic therapy. Sepsis can involve drug-resistant pathogens, and this often necessitates consideration of newer antimicrobial agents.


Subject(s)
Anti-Infective Agents/therapeutic use , Sepsis/drug therapy , Time Factors , Anti-Infective Agents/administration & dosage , Biomarkers/analysis , Biomarkers/blood , Humans , Time-to-Treatment/standards , Time-to-Treatment/statistics & numerical data