1.
J Heart Lung Transplant ; 42(10): 1378-1396, 2023 10.
Article in English | MEDLINE | ID: mdl-37127072

ABSTRACT

BACKGROUND: Some degree of ischemia is inevitable in organ transplantation, and for most, if not all, organs there is a relationship between ischemic time and transplant outcome. The contribution of ischemic time to lung injury is unclear, with conflicting recent data. In this study, we investigated the impact of ischemic time on survival after lung transplantation in a large national cohort. METHODS: We studied outcomes for 1,565 UK adult lung transplants performed over a 12-year period, for which donor, transplant, and recipient data were available from the UK Transplant Registry. Using Cox proportional hazards models adjusted for other risk factors, we examined the effect of ischemic time (defined as donor cross-clamp to recipient reperfusion) and of whether standard cardiopulmonary bypass was used. RESULTS: The median total ischemic time increased from under 5 hours in 2003 to over 6.2 hours in 2013. When cardiopulmonary bypass was used, each hour of total ischemic time increased the hazard of death by 13% (95% CI: 5%-21%) for 1-year patient survival. When implantation was performed without cardiopulmonary bypass, this association disappeared: there was no statistically significant change in mortality with increasing ischemic time. CONCLUSIONS: Within the limits of our observed range of ischemic times, avoidance of bypass may remove ischemic time as a risk factor for poor outcome. Our data add to the evidence that bypass may be harmful to the donor lung.


Subject(s)
Cardiopulmonary Bypass , Lung Transplantation , Adult , Humans , Time Factors , Ischemia , United Kingdom/epidemiology , Tissue Donors , Retrospective Studies
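The study above relies on Cox proportional hazards models; as a simpler, self-contained illustration of how censored survival data of this kind are summarised, here is a minimal Kaplan-Meier estimator in plain Python (the toy cohort and function are hypothetical, not from the study):

```python
def kaplan_meier(subjects):
    """Kaplan-Meier survival curve for right-censored data.

    subjects: list of (time, event) pairs, where event is 1 for a death
    observed at `time` and 0 for censoring at `time`.
    Returns a list of (event_time, survival_probability) steps.
    """
    subjects = sorted(subjects)
    at_risk = len(subjects)
    survival = 1.0
    curve = []
    i = 0
    while i < len(subjects):
        t = subjects[i][0]
        deaths = sum(1 for time, event in subjects if time == t and event == 1)
        exits = sum(1 for time, event in subjects if time == t)
        if deaths:
            survival *= 1 - deaths / at_risk  # conditional survival at t
            curve.append((t, survival))
        at_risk -= exits
        i += exits
    return curve

# Toy cohort: follow-up time in months, 1 = died, 0 = censored
cohort = [(2, 1), (3, 0), (4, 1), (5, 1), (6, 0)]
print(kaplan_meier(cohort))
```

A Cox model additionally relates such survival curves to covariates like ischemic time; in practice a library such as lifelines would be used rather than hand-rolled code.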
2.
Transplantation ; 107(6): 1311-1321, 2023 06 01.
Article in English | MEDLINE | ID: mdl-36728501

ABSTRACT

BACKGROUND: Deceased donor livers are prone to biliary complications, which may necessitate retransplantation, and we, and others, have suggested that these complications result from peribiliary vascular fibrin microthrombi. We sought to determine the prevalence and consequence of occult fibrin within deceased donor livers undergoing normothermic ex situ perfusion (NESLiP) and to evaluate a role for fibrinolysis. METHODS: D-dimer concentrations, products of fibrin degradation, were assayed in the perfusate of 163 livers taken after 2 h of NESLiP, including 91 that were transplanted. These were related to posttransplant outcomes. Five different fibrinolytic protocols using alteplase during NESLiP were evaluated, and the transplant outcomes of these alteplase-treated livers were reviewed. RESULTS: Perfusate D-dimer concentrations were lowest in livers recovered using in situ normothermic regional perfusion and highest in alteplase-treated livers. D-dimer release from donation after brain death livers was significantly correlated with the duration of cold ischemia. In non-alteplase-treated livers, Cox proportional hazards regression analysis showed that D-dimer levels were associated with transplant survival (P = 0.005). Treatment with alteplase and fresh frozen plasma during NESLiP was associated with significantly more D-dimer release into the perfusate and was not associated with excess bleeding postimplantation; 8 of the 9 treated livers were free of cholangiopathy, whereas the ninth had a proximal duct stricture. CONCLUSIONS: Fibrin is present in many livers during cold storage and is associated with poor posttransplant outcomes. The amount of D-dimer released after fibrinolytic treatment indicates a significant occult fibrin burden and suggests that fibrinolytic therapy during NESLiP may be a promising therapeutic intervention.


Subject(s)
Liver Transplantation , Humans , Liver Transplantation/adverse effects , Fibrin/metabolism , Organ Preservation/methods , Liver/blood supply , Perfusion/methods
3.
Transplantation ; 107(2): 438-448, 2023 02 01.
Article in English | MEDLINE | ID: mdl-35993664

ABSTRACT

BACKGROUND: We evaluated whether the use of normothermic regional perfusion (NRP) was associated with increased organ recovery and improved transplant outcomes from controlled donation after circulatory death (cDCD). METHODS: This is a retrospective analysis of UK adult cDCD donors where at least 1 abdominal organ was accepted for transplantation between January 1, 2011, and December 31, 2019. RESULTS: A mean of 3.3 organs per donor was transplanted when NRP was used, compared with 2.6 when NRP was not used. When adjusting for organ-specific donor risk profiles, the use of NRP increased the odds of all abdominal organs being transplanted by 3-fold for liver (P < 0.0001; 95% confidence interval [CI], 2.20-4.29), 1.5-fold for kidney (P = 0.12; 95% CI, 0.87-2.58), and 1.6-fold for pancreas (P = 0.0611; 95% CI, 0.98-2.64). Twelve-month liver transplant survival was superior for recipients of a cDCD NRP graft, with a 51% lower risk-adjusted hazard of transplant failure (HR = 0.494). In risk-adjusted analyses, NRP kidneys had a 35% lower chance of developing delayed graft function than non-NRP kidneys (odds ratio, 0.65; 95% CI, 0.465-0.901), and the expected 12-month estimated glomerular filtration rate was 6.3 mL/min/1.73 m2 better if abdominal NRP was used (P < 0.0001). CONCLUSIONS: The use of NRP during DCD organ recovery leads to increased organ utilization and improved transplant outcomes compared with conventional organ recovery.


Subject(s)
Liver Transplantation , Tissue and Organ Procurement , Humans , Retrospective Studies , Organ Preservation , Extracorporeal Circulation , Perfusion/adverse effects , Liver Transplantation/adverse effects , Graft Survival , Tissue Donors , Death
4.
Lancet Rheumatol ; 5(8): e461-e473, 2023 Aug.
Article in English | MEDLINE | ID: mdl-38251578

ABSTRACT

BACKGROUND: In the UK, additional COVID-19 vaccine booster doses and treatments are offered to people who are immunosuppressed to protect against severe COVID-19, but how best to choose the individuals that receive these vaccine booster doses and treatments is unclear. We investigated the association of seropositivity to SARS-CoV-2 spike protein with demographic, disease, and treatment-related characteristics after at least three COVID-19 vaccines in three cohorts of people who are immunosuppressed. METHODS: In a cross-sectional study using UK national disease registries, we identified, contacted, and recruited recipients of solid organ transplants, participants with rare autoimmune rheumatic diseases, and participants with lymphoid malignancies who were 18 years or older, resident in the UK, and who had received at least three doses of a COVID-19 vaccine. The study was open to recruitment from Dec 7, 2021, to June 26, 2022. Participants received a lateral flow immunoassay test for SARS-CoV-2 spike antibodies to complete at home, and an online questionnaire. Multivariable logistic regression was used to estimate the mutually adjusted odds of seropositivity against each characteristic. FINDINGS: Between Feb 14 and June 26, 2022, we screened 101 972 people (98 725 invited, 3247 self-enrolled) and recruited 28 411 (27·9%) to the study. 23 036 (81·1%) recruited individuals provided serological data. Of these, 9927 (43·1%) were recipients of solid organ transplants, 6516 (28·3%) had rare autoimmune rheumatic diseases, and 6593 (28·6%) had lymphoid malignancies. 10 485 (45·5%) participants were men and 12 535 (54·4%) were women (gender was not reported for 16 [<0·1%] participants), and 21 661 (94·0%) participants were of White ethnicity. The median age of participants with solid organ transplants was 60 years (IQR 50-67), with rare autoimmune rheumatic diseases was 65 years (IQR 54-73), and with lymphoid malignancy was 69 years (IQR 61-75).
Of the 23 036 participants with serological data, 6583 (28·6%) had received three vaccine doses, 14 234 (61·8%) had received four vaccine doses, and 2219 (9·6%) had received five or more vaccine doses. IgG anti-spike antibodies were undetectable in 2310 (23·3%) of 9927 patients with solid organ transplants, 922 (14·1%) of 6516 patients with rare autoimmune rheumatic diseases, and 1366 (20·7%) of 6593 patients with lymphoid malignancies. In all groups, seropositivity was associated with younger age, higher number of vaccine doses (ie, five vs three), and previous COVID-19. Immunosuppressive medication reduced the likelihood of seropositivity: the lowest odds of seropositivity were found in recipients of solid organ transplants receiving a combination of an anti-proliferative agent, a calcineurin inhibitor, and steroids, and those with rare autoimmune rheumatic diseases or lymphoid malignancies treated with anti-CD20 therapies. INTERPRETATION: Approximately one in five recipients of solid organ transplants, individuals with rare autoimmune rheumatic diseases, and individuals with lymphoid malignancies have no detectable IgG anti-spike antibodies despite three or more vaccine doses, but this proportion decreases with sequential booster doses. Choice of immunosuppressant and disease type are strongly associated with serological response. Antibody testing using lateral flow immunoassay tests could enable rapid identification of individuals who are most likely to benefit from additional COVID-19 interventions. FUNDING: UK Research and Innovation, Kidney Research UK, Blood Cancer UK, Vasculitis UK, and the Cystic Fibrosis Trust.


Subject(s)
COVID-19 , Immunization, Secondary , Neoplasms , Rheumatic Diseases , Spike Glycoprotein, Coronavirus , Male , Humans , Female , Middle Aged , COVID-19 Vaccines , Cross-Sectional Studies , Prevalence , COVID-19/epidemiology , SARS-CoV-2 , Immunoglobulin G , Antibodies, Viral , United Kingdom/epidemiology
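The FINDINGS above come from multivariable logistic regression, which reports mutually adjusted odds; as a rough sketch of the underlying quantity, the unadjusted odds ratio and its Woolf 95% CI can be computed from a 2x2 table (all counts below are invented for illustration, not the study's data):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Unadjusted odds ratio with a Woolf (log-scale) 95% confidence interval.

    2x2 table of counts:
        a = exposed, seropositive      b = exposed, seronegative
        c = unexposed, seropositive    d = unexposed, seronegative
    """
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lower = math.exp(math.log(or_) - z * se_log)
    upper = math.exp(math.log(or_) + z * se_log)
    return or_, lower, upper

# Invented counts: seropositivity by number of vaccine doses (five vs three)
or_, lower, upper = odds_ratio_ci(10, 20, 5, 40)
print(f"OR = {or_:.2f} (95% CI {lower:.2f} to {upper:.2f})")
```

A logistic regression generalises this to many covariates at once, which is what "mutually adjusted" refers to.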
5.
Ann Surg ; 274(5): 859-865, 2021 11 01.
Article in English | MEDLINE | ID: mdl-34334648

ABSTRACT

OBJECTIVE: To assess the impact of cold ischaemia time (CIT) on living donor kidney transplantation (LDKT) outcomes within the UK Living Kidney Sharing Scheme (UKLKSS) versus outside the scheme. BACKGROUND: LDKT provides the best treatment option for patients with end-stage kidney disease. Patients with an incompatible living donor still have an opportunity to be transplanted through kidney exchange programmes (KEPs). In KEPs where kidneys travel rather than donors, CIT can be prolonged. METHODS: Data from all UK adult LDKT between 2007 and 2018 were analysed. RESULTS: 9969 LDKT were performed during this period, of which 1396 (14%) were transplanted through the UKLKSS, referred to here as KEP. Median CIT was significantly different for KEP versus non-KEP (339 versus 182 minutes, P < 0.001). KEP LDKT had a higher incidence of delayed graft function (DGF) (non-KEP 2.91% versus KEP 5.73%, P < 0.0001) and lower graft function at 1 year (estimated glomerular filtration rate [eGFR], non-KEP 57.90 versus KEP 55.25 ml/min, P = 0.04) and at 5 years (eGFR, non-KEP 55.62 versus KEP 53.09 ml/min, P = 0.01), but 1- and 5-year graft survival were similar. Within KEP, prolonged CIT was associated with more DGF (3.47% versus 1.95%, P = 0.03) and lower graft function at 1 and 5 years (eGFR 55 versus 50 ml/min, P = 0.02), but had no impact on graft survival. CONCLUSION: Although CIT was longer in KEP and was associated with more DGF and lower graft function, 5-year graft survival was excellent and similar to that outside the scheme.


Subject(s)
Cold Ischemia/standards , Delayed Graft Function/prevention & control , Kidney Failure, Chronic/surgery , Kidney Transplantation/methods , Living Donors , Organ Preservation/methods , Adult , Delayed Graft Function/epidemiology , Delayed Graft Function/physiopathology , Female , Follow-Up Studies , Glomerular Filtration Rate/physiology , Graft Survival , Humans , Incidence , Male , Middle Aged , Retrospective Studies , United Kingdom/epidemiology
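Comparisons such as the DGF rates above are differences in proportions between groups; here is a minimal sketch of the pooled two-proportion z-test that underlies such P values (the counts are hypothetical, not taken from the study):

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    """z statistic for the difference between two independent proportions,
    using the pooled standard error under the null hypothesis of equal rates."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Hypothetical counts: DGF in 20/100 grafts in one group vs 10/100 in the other
z = two_proportion_z(20, 100, 10, 100)
print(f"z = {z:.3f}")
```

A |z| above about 1.96 corresponds to P < 0.05 for a two-sided test; real analyses would typically use a chi-squared test or a library routine with continuity corrections.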
6.
Animals (Basel) ; 11(5)2021 May 10.
Article in English | MEDLINE | ID: mdl-34068606

ABSTRACT

The horse-rider relationship is fundamental to ethical equestrianism wherein equine health and welfare are prioritized as core dimensions of sporting success. Equestrianism represents a unique and important form of interspecies activity in which relationships are commonly idealized as central to sporting performance but have been largely unexplored in the sport psychology literature. Horse-rider relationships warrant particular consideration in the elite sporting context, given the tension between constructions of "partnership" between horse and rider, and the pragmatic pressures of elite sport on horse and rider and their relationship. The current study examined the link between sporting performance and the horse-rider relationship in an elite equestrian sporting context. Thirty-six international elite riders from eight countries and six equestrian disciplines participated in a single in-depth interview. A social constructionist, grounded theory methodology was used to analyze this data. The horse-rider relationship was positioned in three different ways in relation to elite sporting outcomes: as pivotal to success; non-essential to success; or as antithetical to success. Participants shifted between these positions, expressing nuanced, ambivalent attitudes that reflected their sporting discipline and their personal orientation to equestrianism. Competitive success was also defined in fluid terms, with participants differentiating between intrinsic and extrinsic markers of success. These findings suggest a complex and multifaceted connection between interspecies performance and relationships in elite sport. Where strong horse-rider relationships are antithetical to performance, a threat to the welfare and ethics of equestrian sport exists. Relevant sporting governing bodies must attend to this problem to ensure the centrality of animal welfare, wellbeing, and performance longevity to equestrian sports.

7.
Autism ; 25(6): 1553-1564, 2021 08.
Article in English | MEDLINE | ID: mdl-33722069

ABSTRACT

LAY ABSTRACT: Most autism spectrum condition research addresses the neurological and biological causes of autism spectrum condition, focusing upon deficits associated with autism spectrum condition and behavioural interventions designed to minimise these deficits. Little is known about the lived experiences of adult women on the autism spectrum and how they navigate social expectations around gender, autism spectrum condition and gendered understandings of autism spectrum condition. The lived experiences of eight women on the autism spectrum will be shared here, with attention to how gendered expectations influence women's experiences of autism spectrum condition, their sense of self and well-being. Findings showed these women struggled to reconcile the expectations of others, particularly early in life. The women had difficulty conforming to stereotypical ideals of femininity, yet as they aged, they felt less need to conform, valuing their unique style and behaviours. The women also rejected deficit-oriented descriptions of autism spectrum condition generated by the medical community, preferring to focus on their strengths and unique characteristics. It is hoped this article helps psychologists and the wider community to understand and meet the needs of women on the autism spectrum.


Subject(s)
Autism Spectrum Disorder , Autistic Disorder , Adult , Aged , Female , Gender Identity , Humans , Qualitative Research
8.
Clin Transplant ; 35(5): e14261, 2021 05.
Article in English | MEDLINE | ID: mdl-33608916

ABSTRACT

BACKGROUND: We aim to evaluate practice and understand the impact of the first wave of the SARS-CoV-2 pandemic on heart transplantation in the UK. METHODS: A retrospective review of the UK Transplant Registry (UKTR) and a national survey of UK heart transplant centers have been performed. The early pandemic period is defined here as 1 March to 31 May 2020. RESULTS: There was geographic variation in the prevalence of COVID-19 across the UK. All centers reported adaptations to maintain the safety of their staff, candidate, and recipient populations. The number of donors fell by 31% during the early pandemic period. Heart utilization increased to 35%, compared to 26% during the same period of 2019. The number of heart transplants was well maintained, across all centers, with 38 performed, compared to 41 during the same period of 2019, with no change in 30-day survival. Twenty-seven heart transplant recipients with confirmed COVID-19 infection were reported during the study period. CONCLUSION: All UK heart transplant centers have successfully adapted their programs to overcome the challenges of staff redeployment and ICU and hospital resource limitation, associated with the pandemic, whilst continuing heart transplant activity. On-going evaluation of practice changes, with sharing of lessons learned, is required as the pandemic continues.


Subject(s)
COVID-19 , Heart Transplantation , Humans , Pandemics , Retrospective Studies , SARS-CoV-2 , United Kingdom/epidemiology
9.
Clin Transplant ; 35(3): e14210, 2021 03.
Article in English | MEDLINE | ID: mdl-33368697

ABSTRACT

BACKGROUND: Lung transplantation is particularly susceptible to the impact of the severe acute respiratory syndrome coronavirus-2 (SARS-CoV-2) pandemic, and evaluation of changes to practice is required to inform future decision-making. METHODS: A retrospective review of the UK Transplant Registry (UKTR) and a national survey of UK lung transplant centers have been performed. RESULTS: There was geographic variation in the prevalence of COVID-19 infection across the UK. The number of donors fell by 48% during the early pandemic period. Lung utilization fell to 10% (compared with 24% for the same period of 2019). The number of lung transplants performed fell by 77%, from 53 (March to May 2019) to 12. Seven (58%) of these were performed in a single center designated "COVID-light." The number of patients who died on the lung transplant waiting list increased, compared to the same period of 2019 (p = .0118). Twenty-six lung transplant recipients with confirmed COVID-19 infection were reported during the study period. CONCLUSION: As the pandemic continues, reviewing practice and implementing the lessons learned during this period, including the use of robust donor testing strategies and the provision of "COVID-light" hospitals, are vital in ensuring the safe continuation of our lung transplant program.


Subject(s)
COVID-19/epidemiology , Lung Transplantation , Pandemics , Registries , Tissue Donors , Transplant Recipients/statistics & numerical data , Waiting Lists , Comorbidity , Female , Humans , Lung Diseases/epidemiology , Lung Diseases/surgery , Male , Retrospective Studies , SARS-CoV-2 , United Kingdom/epidemiology
10.
Thorax ; 74(1): 60-68, 2019 01.
Article in English | MEDLINE | ID: mdl-30282722

ABSTRACT

BACKGROUND: The demand for lung transplantation vastly exceeds the availability of donor organs. This translates into long waiting times and high waiting list mortality. We set out to examine factors influencing patient outcomes from the time of listing for lung transplantation in the UK, examining for differences by patient characteristics, lung disease category and transplant centre. METHODS: Data were obtained from the UK Transplant Registry held by NHS Blood and Transplant for adult lung-only registrations between 1 January 2004 and 31 March 2014. Pretransplant and post-transplant outcomes were evaluated against lung disease category, blood group and height. RESULTS: Of the 2213 patient registrations, COPD comprised 28.4%, pulmonary fibrosis (PF) 26.2%, cystic fibrosis (CF) 25.4% and other lung pathologies 20.1%. The chance of transplantation after listing differed by the combined effect of disease category and centre (p<0.001). At 3 years postregistration, 78% of patients with COPD had been transplanted, followed by 61% of patients with CF, 59% of other lung pathology patients and 48% of patients with PF, who also had the highest waiting list mortality (37%). The chance of transplantation also differed by height, with taller patients having a greater chance of transplant (HR: 1.03, 95% CI: 1.02 to 1.04, p<0.001). Patients with blood group O had the highest waiting list mortality at 3 years postregistration compared with all other blood groups (27% vs 20%, p<0.001). CONCLUSIONS: The way donor lungs were allocated in the UK resulted in discrepancies between the risk profile and probability of lung transplantation. A new donor lung allocation scheme was introduced in 2017 to try to address these shortcomings.


Subject(s)
ABO Blood-Group System , Lung Diseases/blood , Lung Diseases/surgery , Lung Transplantation/statistics & numerical data , Waiting Lists , Allografts/supply & distribution , Body Height , Cystic Fibrosis/blood , Cystic Fibrosis/surgery , Health Care Rationing/methods , Health Facilities/statistics & numerical data , Humans , Postoperative Period , Preoperative Period , Pulmonary Disease, Chronic Obstructive/blood , Pulmonary Disease, Chronic Obstructive/surgery , Pulmonary Fibrosis/blood , Pulmonary Fibrosis/surgery , Registries , Survival Rate , Time-to-Treatment , United Kingdom/epidemiology , Waiting Lists/mortality
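The height effect above is reported as a hazard ratio per unit of height (HR 1.03, 95% CI 1.02 to 1.04); under the proportional hazards assumption, the ratio for a larger height difference is obtained by exponentiation. A small sketch (treating the unit as 1 cm is an assumption for illustration):

```python
# Hazard ratio per unit of height and its 95% CI, as reported in the abstract
hr, lo, hi = 1.03, 1.02, 1.04

# Proportional hazards: a k-unit covariate difference adds k times the
# log-hazard coefficient, so the hazard ratio is raised to the power k.
k = 10  # e.g. a 10 cm height difference, assuming the unit is 1 cm
hr_k, lo_k, hi_k = hr ** k, lo ** k, hi ** k
print(f"HR over {k} units: {hr_k:.2f} (95% CI {lo_k:.2f} to {hi_k:.2f})")
```

On these numbers a 10-unit height difference corresponds to roughly a 34% higher chance of transplantation, which makes the per-unit effect easier to interpret.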
11.
Heart ; 105(4): 291-296, 2019 02.
Article in English | MEDLINE | ID: mdl-30121631

ABSTRACT

OBJECTIVE: To study the survival and patient outcomes in a population of UK patients supported by an implantable left ventricular assist device (LVAD) as a bridge to heart transplantation. METHODS: Data on all adult patients (n=342) who received a HeartMate II or HVAD as a first long-term LVAD between January 2007 and 31 December 2013 were extracted from the UK Ventricular Assist Device (VAD) Database in November 2015. Outcomes analysed included survival on an LVAD, time to urgent listing, heart transplantation and complications, including those requiring a pump exchange. RESULTS: 112 patients were supported with the Thoratec HeartMate II and 230 with the HeartWare HVAD. Median duration of support was 534 days. During the study period, 81 patients required moving to the UK urgent waiting list for heart transplantation. Of the 342 patients, 85 (24.8%) received a heart transplant, including 63 on the urgent list. Thirty-day survival was 88.9%, while overall patient survival at 3 years from LVAD implant was 49.6%. 156 patients (46%) died during LVAD support; the most common cause of death on a VAD was a cerebrovascular accident. There was no significant difference between the two devices in any outcome. CONCLUSIONS: In a population of patients with advanced heart failure, who have a very poor prognosis, support with an implantable LVAD allowed a quarter to receive a heart transplant within a 3-year period. Overall survival of the cohort was about 50%. With improvements in technology and in post-LVAD management, it is likely that outcomes will improve further.


Subject(s)
Heart Failure , Heart Transplantation , Heart-Assist Devices/statistics & numerical data , Time-to-Treatment , Adult , Cause of Death , Databases, Factual/statistics & numerical data , Female , Heart Failure/complications , Heart Failure/mortality , Heart Failure/therapy , Heart Transplantation/methods , Heart Transplantation/statistics & numerical data , Humans , Male , Outcome and Process Assessment, Health Care , Quality Improvement/organization & administration , Survival Analysis , Time Factors , Time-to-Treatment/standards , Time-to-Treatment/statistics & numerical data , United Kingdom/epidemiology , Waiting Lists/mortality
12.
Interact Cardiovasc Thorac Surg ; 28(4): 594-601, 2019 04 01.
Article in English | MEDLINE | ID: mdl-30351360

ABSTRACT

OBJECTIVES: Left ventricular assist devices are funded in the UK exclusively as a bridge to transplant (BTT). However, patients who potentially could receive a transplant may develop reversible contraindications to transplant. Bridge to candidacy (BTC) has sometimes been controversial, given the uncertain clinical efficacy of BTC and the risk that reimbursement could be denied. We analysed the UK ventricular assist device database to understand how common BTC was and to assess patient survival rates and incidences of transplants. METHODS: We identified BTC implants in patients with pulmonary hypertension, chronic kidney disease and obesity using the UK guidelines for heart transplants. RESULTS: A total of 306 of 540 patients had complete data and 157 (51%) were identified as BTC. Overall, there was no difference in survival rates between patients designated as BTC and those designated as BTT (71.9 vs 72.9% at 1 year, respectively; P = 0.82). However, the survival rate was lower at all time points in those with an estimated glomerular filtration rate (eGFR) <40 and in patients with a body mass index (BMI) >32 up to 1 year postimplant. There were no significant differences in the incidence of transplant between patients who were BTC and BTT or for any subgroup up to 5 years. However, we noted a diverging trend towards a lower cumulative incidence of transplant for patients with a BMI >32. CONCLUSIONS: BTC is common in the UK and appears clinically effective, given that the survival rates and the incidence of transplants were comparable with those for BTT. Patients with a high BMI have a worse survival rate through to 1 year and a trend towards a lower incidence of transplant. Patients with a low eGFR also have a worse survival rate, but a similar proportion received transplants.


Subject(s)
Heart Failure/surgery , Heart Transplantation , Heart-Assist Devices , Transplant Recipients , Adult , Female , Heart Failure/mortality , Heart Failure/physiopathology , Humans , Male , Middle Aged , Survival Rate/trends , Treatment Outcome , United Kingdom/epidemiology
13.
Gut ; 67(4): 654-662, 2018 04.
Article in English | MEDLINE | ID: mdl-28148540

ABSTRACT

OBJECTIVE: Lower GI bleeding (LGIB) is a common reason for emergency hospital admission, although there is a paucity of data on presentations, interventions and outcomes. In this nationwide UK audit, we describe patient characteristics, interventions including endoscopy, radiology and surgery, as well as clinical outcomes. DESIGN: Multicentre audit of adults presenting with LGIB to UK hospitals over 2 months in 2015. Consecutive cases were prospectively enrolled by clinical teams and followed for 28 days. RESULTS: Data on 2528 cases of LGIB were provided by 143 hospitals. Most patients were elderly (median age 74 years) with major comorbidities; 29.4% were taking antiplatelets and 15.9% anticoagulants. Shock was uncommon (58/2528, 2.3%), but 666 (26.3%) received a red cell transfusion. Flexible sigmoidoscopy was the most common investigation (21.5%), but only 2.1% received endoscopic haemostasis. Use of embolisation or surgery was rare, used in 19 (0.8%) and 6 (0.2%) cases, respectively. 48% of patients underwent no inpatient investigations. The most common diagnoses were diverticular bleeding (26.4%) and benign anorectal conditions (16.7%). Median length of stay was 3 days, 13.6% of patients rebled during admission and 4.4% were readmitted with bleeding within 28 days. In-hospital mortality was 85/2528 (3.4%) and was highest in established inpatients (17.8%, p<0.0001) and in patients experiencing rebleeding (7.1%, p<0.0001). CONCLUSIONS: Patients with LGIB have a high burden of comorbidity and frequent antiplatelet or anticoagulant use. Red cell transfusion was common, but most patients were not shocked and required no endoscopic, radiological or surgical treatment. Nearly half were not investigated. In-hospital mortality was related to comorbidity, not severe haemorrhage.


Subject(s)
Blood Transfusion , Colonoscopy , Embolization, Therapeutic , Gastrointestinal Hemorrhage/diagnosis , Gastrointestinal Hemorrhage/therapy , Hemostasis, Endoscopic , Inpatients , Acute Disease , Aged , Aged, 80 and over , Colonoscopy/methods , Embolization, Therapeutic/methods , Emergencies , Female , Gastrointestinal Hemorrhage/mortality , Hemostasis, Endoscopic/methods , Humans , Length of Stay , Male , Medical Audit , Middle Aged , Patient Readmission , Prospective Studies , Risk Factors , Sigmoidoscopy/methods , Treatment Outcome , United Kingdom
14.
Health Aff (Millwood) ; 35(11): 2014-2019, 2016 11 01.
Article in English | MEDLINE | ID: mdl-27834241

ABSTRACT

Community networks that include nonprofit, public, and private organizations have formed around many health issues, such as chronic disease management and healthy living and eating. Despite the growth in the number and funding of cross-sector networks, and the expanding literature about them, there are limited data and methods for assessing their effectiveness and analyzing their designs. We addressed this gap in knowledge by analyzing the characteristics of 260 cross-sector community health networks that collectively consisted of 7,816 organizations during the period 2008-15. We found that nonprofit organizations were more prevalent than private firms or government agencies in these networks. Traditional partners in community health networks, such as hospitals, community health centers, and public health agencies, were the most trusted and valued by other members of their networks. However, nontraditional partners, such as employer or business groups and colleges or universities, reported contributing relatively high numbers of resources to their networks. Further evidence is needed to inform collaborative management processes and policies as a mechanism for building what the Robert Wood Johnson Foundation describes as a culture of health.


Subject(s)
Community Networks/organization & administration , Delivery of Health Care, Integrated/organization & administration , Government Agencies/organization & administration , Organizations, Nonprofit/organization & administration , Private Sector/organization & administration , Community-Institutional Relations/economics , Cooperative Behavior , Delivery of Health Care, Integrated/economics , Humans , Population Health , Public Health , Surveys and Questionnaires
15.
Am J Public Health ; 105(8): 1646-52, 2015 Aug.
Article in English | MEDLINE | ID: mdl-26066929

ABSTRACT

OBJECTIVES: We investigated changes in hospital participation in local public health systems and the delivery of public health activities over time and assessed the relationship between hospital participation and the scope of activities available in local public health systems. METHODS: We used longitudinal observations from the National Longitudinal Survey of Public Health Systems to examine how hospital contributions to the delivery of core public health activities varied in 1998, 2006, and 2012. We then used multivariate regression to assess the relationship between the level of hospital contributions and the overall availability of public health activities in the system. RESULTS: Hospital participation in public health activities rose from 37% in 1998 to 41% in 2006, then fell to 39% in 2012. Regression results indicated a positive association between hospital participation in public health activities and the total availability of public health services in the systems. CONCLUSIONS: Hospital collaboration plays an important role in the overall availability of public health services in local public health systems. Efforts to increase hospital participation in public health may have a positive impact on the scope of services provided and on population health in US communities.


Subject(s)
Hospitals, Urban/organization & administration , Public Health Administration/methods , Cooperative Behavior , Hospitals, Urban/statistics & numerical data , Humans , Longitudinal Studies , Public Health/methods , Public Health/statistics & numerical data , Public Health Administration/statistics & numerical data , United States
16.
Am J Public Health ; 105 Suppl 2: S280-7, 2015 Apr.
Article in English | MEDLINE | ID: mdl-25689201

ABSTRACT

OBJECTIVES: We examined public health system responses to economic shocks using longitudinal observations of public health activities implemented in US metropolitan areas from 1998 to 2012. METHODS: The National Longitudinal Survey of Public Health Systems collected data on the implementation of 20 core public health activities in a nationally representative cohort of 280 metropolitan areas in 1998, 2006, and 2012. We used generalized estimating equations to estimate how local economic shocks relate to the scope of activities implemented in communities, the mix of organizations performing them, and perceptions of the effectiveness of activities. RESULTS: Public health activities fell by nearly 5% in the average community between 2006 and 2012, with the bottom quintile of communities losing nearly 25% of their activities. Local public health delivery fell most sharply among communities experiencing the largest increases in unemployment and the largest reductions in governmental public health spending. CONCLUSIONS: Federal resources and private sector contributions failed to avert reductions in local public health protections during the recession. New financing mechanisms may be necessary to ensure equitable public health protections during economic downturns.


Subject(s)
Economic Recession/statistics & numerical data , Public Health Administration/economics , Public Health Practice/economics , Urban Population , Humans , Longitudinal Studies
17.
Am J Prev Med ; 45(6): 752-62, 2013 Dec.
Article in English | MEDLINE | ID: mdl-24237919

ABSTRACT

BACKGROUND: Research on how best to deliver efficacious public health strategies in heterogeneous community and organizational contexts remains limited. Such studies require the active engagement of public health practice settings in the design, implementation, and translation of research. Practice-based research networks (PBRNs) provide mechanisms for research engagement, but until now they have not been tested in public health settings. PURPOSE: This study uses data from participants in 14 public health PBRNs and a national comparison group of public health agencies to study processes influencing the engagement of public health settings in research implementation and translation activities. METHODS: A cross-sectional network analysis survey was fielded with participants in public health PBRNs approximately 1 year after network formation (n=357) and with a nationally representative comparison group of U.S. local health departments not participating in PBRNs (n=625). Hierarchic regression models were used to estimate how organizational attributes and PBRN network structures influence engagement in research implementation and translation activities. Data were collected in 2010-2012 and analyzed in 2012. RESULTS: Among PBRN participants, both researchers and practice agencies reported high levels of engagement in research activities. Local public health agencies participating in PBRNs were two to three times more likely than nonparticipating agencies to engage in research implementation and translation activities (p<0.05). Participants in less densely connected PBRN networks and in more peripheral locations within these networks reported higher levels of research engagement, greater perceived benefits from engagement, and greater likelihood of continued participation. CONCLUSIONS: PBRN networks can serve as effective mechanisms for facilitating research implementation and translation among public health practice settings.


Subject(s)
Community Networks/organization & administration , Health Services Research/organization & administration , Public Health , Translational Research, Biomedical/organization & administration , Cross-Sectional Studies , Humans , Public Health Practice , Regression Analysis , United States
18.
J Public Health Manag Pract ; 18(6): 485-98, 2012 Nov.
Article in English | MEDLINE | ID: mdl-23023272

ABSTRACT

BACKGROUND: Delivery system research to identify how best to organize, finance, and implement health improvement strategies has focused heavily on clinical practice settings, with relatively little attention paid to public health settings, where research is made more difficult by wide heterogeneity in settings and by limited sources of existing data and measures. This study examines the approaches used by public health practice-based research networks (PBRNs) to expand delivery system research and evidence-based practice in public health settings. METHODS: Practice-based research networks employ quasi-experimental research designs, natural experiments, and mixed-method analytic techniques to evaluate how community partnerships, economic shocks, and policy changes impact delivery processes in public health settings. In addition, network analysis methods are used to assess patterns of interaction between practitioners and researchers within PBRNs to produce and apply research findings. RESULTS: Findings from individual PBRN studies elucidate the roles of information exchange, community resources, and leadership and decision-making structures in shaping implementation outcomes in public health delivery. Network analysis of PBRNs reveals broad engagement of both practitioners and researchers in scientific inquiry, with practitioners in the periphery of these networks reporting particularly large benefits from research participation. CONCLUSIONS: Public health PBRNs provide effective mechanisms for implementing delivery system research and engaging practitioners in the process. These networks also hold promise for accelerating the translation and application of research findings into public health settings.


Subject(s)
Delivery of Health Care , Health Services Research/organization & administration , Public Health , Humans , Translational Research, Biomedical
19.
Br J Hosp Med (Lond) ; 68(12): 674-6, 2007 Dec.
Article in English | MEDLINE | ID: mdl-18186409

ABSTRACT

A survey of 764 medical undergraduates explored their preferred careers information sources. Students wished to access careers information from a variety of sources but most valued face-to-face interviews with both trained careers counsellors and senior medical colleagues.


Subject(s)
Career Choice , Information Services/statistics & numerical data , Specialization , Students, Medical/psychology , Counseling , Female , Humans , Male