2.
J Adv Nurs ; 2024 Jul 24.
Article in English | MEDLINE | ID: mdl-39046217

ABSTRACT

BACKGROUND: National health and social care standards are complex quality improvement interventions. Standards typically describe a process and/or outcome of safe, quality, person-centred care according to best evidence. Currently, there are 11 national standards that apply to diverse services in Ireland, including residential centres, acute hospitals, and rehabilitation and community inpatient healthcare services. A better understanding of contextual factors influencing implementation will inform decision-making when selecting implementation strategies to enhance the implementation of standards. AIM: To explore experiences of implementing national health and social care standards and, secondly, to identify enablers and barriers to implementation with stakeholders from across multiple levels of the health system. DESIGN: A qualitative descriptive study. METHODS: We conducted six focus groups and eight individual interviews from October to November 2021 with stakeholders at system level (n = 14), organizational level (n = 14) and individual level (n = 10). Focus groups and interviews were audio-recorded, transcribed verbatim and analysed using reflexive thematic analysis. RESULTS: Six themes were generated: (1) Top-down, bottom-up, a team approach: everybody together, we are all involved, we are all responsible; (2) Support tools: accessible tools and bite-size material pertaining to standards will support us to implement standards; (3) Empower with knowledge: increase awareness and understanding of standards, make them relatable in practice so we can make sense of them; (4) A system-wide malaise: we do not have the bandwidth to implement standards; (5) Follow the leader: we need a lead person at every level to inspire implementation; (6) A bi-directional influence: we know inspections drive quality improvements but we still feel trepidation around inspection outcomes. CONCLUSION: Key enablers identified related to teamwork, support tools, leadership and inspections.
Key barriers related to workforce issues, a lack of awareness of standards and fear of inspection outcomes. Our findings can be incorporated into strategies to support implementation of standards, ultimately for the benefit of service-users. IMPLICATIONS FOR PRACTICE: The enablers and barriers described in this study reflect the importance of organizational factors in the implementation of standards. Interdisciplinary teams can infer from these findings, which enablers and barriers apply to their own context. These findings can inform decision-making when selecting strategies that can be effective in supporting the implementation of standards. REPORTING METHOD: We have adhered to the Standards for Reporting Qualitative Research (SRQR) guidelines. PATIENT OR PUBLIC CONTRIBUTION: No patient or public contribution.

3.
Transplantation ; 2024 Jun 13.
Article in English | MEDLINE | ID: mdl-38867347

ABSTRACT

BACKGROUND: Although kidney transplantation (KT) has become the standard of care for people living with HIV (PLWH) suffering from renal failure, early experiences revealed higher-than-anticipated rejection rates compared with those observed in HIV- recipients. We assessed the cause of increased acute rejection (AR) in PLWH by performing a transcriptomic analysis of biopsy specimens, comparing HIV+ with HIV- recipients. METHODS: We analyzed 68 (34 HIV+, 34 HIV-) formalin-fixed paraffin-embedded (FFPE) renal biopsies, matched for degree of inflammation, from KT recipients with acute T cell-mediated rejection (aTCMR), borderline for aTCMR (BL), or normal findings. Gene expression was measured using the NanoString platform with a custom gene panel to assess differential gene expression (DE) and perform pathway analysis (PA). RESULTS: DE analysis revealed multiple genes with significantly increased expression in the aTCMR and BL biopsies of the HIV+ cohort relative to the HIV- cohort. PA of these genes showed enrichment of various inflammatory pathways, particularly innate immune pathways associated with Toll-like receptors. CONCLUSIONS: Upregulation of the innate immune pathways in the biopsies of PLWH with aTCMR and BL is suggestive of a unique immune response that may stem from immune dysregulation related to HIV infection. These findings suggest that these unique HIV-driven pathways may contribute in part to the increased incidence of allograft rejection after renal transplantation in PLWH.

4.
Future Healthc J ; 11(2): 100131, 2024 Jun.
Article in English | MEDLINE | ID: mdl-38751491

ABSTRACT

Background: Postgraduate leadership education is an evolving field. Locally, we have an established 'Chief Residency' programme in which centres have two to four senior trainees completing leadership duties alongside their clinical workload, supported by local directors of medical education. This is twinned with a 4-day central training programme and a peer-support network. Methods: To assess perspectives of the CR role, we adopted a qualitative case-study design using an electronic questionnaire delivered to previous chief residents between 2020 and 2023. Results were analysed using thematic analysis. Results: Trainees valued involvement in quality improvement and trainee support, demonstrating successful multi-departmental projects. Leadership education was viewed positively by all participants, but they felt further work is needed to address the legitimacy of the role locally. A proposed solution was the creation of junior doctor leadership teams to address workload and emotional challenges. Conclusion: This model provides further evidence of the value of investing in trainee leadership positions, demonstrating organisational impact. Future work will research hospital peer leadership teams.

5.
Eur Child Adolesc Psychiatry ; 33(1): 255-266, 2024 Jan.
Article in English | MEDLINE | ID: mdl-36773126

ABSTRACT

The Strengths and Difficulties Questionnaire (SDQ) consists of five sub-scales that have been used to measure internalising and externalising symptoms in children, typically by combining the sum scores of two sub-scales each, as well as pro-social behaviours. However, the different possible factorial structures that represent these symptoms have not been formally tested in a nationally representative sample of UK children. In addition, it is necessary to assess whether the SDQ is interpreted similarly across subgroups of the population. Exploratory and confirmatory factor analyses were used to test three competing structures for the parent-reported SDQ collected at age 11, the start of adolescence, in the UK Millennium Cohort Study (n = 11,519), and measurement invariance was assessed according to sex and a measure of deprivation of the area in which households lived. Internal consistency using ordinal alpha, internal convergent validity and external discriminant validity using average variance explained (AVE), and predictive validity were assessed. A five-factor model and a model with two second-order factors for internalising and externalising symptoms had better model fit than a three-factor model. For both structures, invariance was demonstrated across sex and area-level deprivation. AVE scores for the five-factor model indicated that the peer and emotional problems factors were measuring a similar construct, as were the hyperactivity and conduct factors. In the second-order model, AVE scores indicated internalising and externalising symptoms were distinct constructs. A second-order model with two factors for internalising and externalising symptoms is appropriate for use in a cohort of UK children born in 2001/02, and our finding of invariance across sex and area-level deprivation indicates that the SDQ can be used in analyses investigating differences in symptoms across subgroups of the population.


Subject(s)
Child Behavior Disorders , Parents , Child , Adolescent , Humans , Young Adult , Adult , Cohort Studies , Surveys and Questionnaires , Parents/psychology , Child Behavior Disorders/diagnosis , Psychometrics , United Kingdom
6.
Lancet Psychiatry ; 11(1): 47-55, 2024 01.
Article in English | MEDLINE | ID: mdl-38101872

ABSTRACT

BACKGROUND: Globally, more adolescents are experiencing depressive symptoms than in the past. High BMI is a risk factor for depressive symptoms, potentially acting via increased body dissatisfaction. Robust longitudinal evidence of these associations could help to inform preventive interventions, but such evidence remains scarce. We investigated the longitudinal associations between BMI at age 7 years and depressive symptoms at age 14 years (objective 1), BMI at age 7 years and body dissatisfaction at age 11 years (objective 2), and body dissatisfaction at age 11 years and depression at age 14 years (objective 3). We also investigated the extent to which body dissatisfaction mediated the association between BMI and depressive symptoms (objective 4). METHODS: This study used data from the Millennium Cohort Study, a representative longitudinal general population cohort of UK children born between Sept 1, 2000, and Jan 11, 2002. We used univariable and multivariable linear regression models to investigate the associations in objectives 1-3, adjusting for a range of child-level and family-level confounders. For mediation analyses (objective 4), we used the non-parametric g-formula. We reported stratified results in the presence of sex differences. All analyses were based on participants with complete BMI data and imputed confounders and outcomes. FINDINGS: Our sample included 13 135 participants. Of these, 6624 (50·4%) were male participants and 6511 (49·6%) were female participants; 11 096 (84·4%) were of White ethnicity and 2039 (15·6%) were from a minority ethnic background. At baseline, mean age was 7·2 years (SD 0·25, range 6·3-8·3). In multivariable models, an SD increase in BMI at age 7 years was associated with greater depressive symptoms at age 14 years (estimated regression coefficient [coeff] 0·30, 95% CI 0·17-0·43) and greater body dissatisfaction at age 11 years (coeff 0·15, 0·12-0·18).
Greater body dissatisfaction at age 11 years was associated with higher depressive symptoms at age 14 years (coeff 0·60, 0·52-0·68). All these associations were twice as large in girls as in boys. Body dissatisfaction explained 43% of the association between BMI and depression in girls. INTERPRETATION: Our findings bear relevance for interventions aimed at reducing weight in childhood and reducing body dissatisfaction. Implementation of evidence-based body image interventions and identification of drivers of weight stigma should be key public health priorities. Interventions aiming to reduce weight in childhood need to avoid increasing body dissatisfaction and should target environmental drivers of weight rather than individuals. FUNDING: Wellcome Trust; The Royal Society; Economic and Social Research Council; and the National Institute for Health and Care Research.
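The mediation estimate above ("body dissatisfaction explained 43% of the association") rests on decomposing a total effect into indirect and direct components. A minimal sketch of that arithmetic, using hypothetical effect sizes chosen only for illustration (the paper's actual g-formula estimates are not reproduced here):

```python
# Proportion mediated = indirect effect / total effect.
# All effect sizes below are hypothetical, for illustration only.

def proportion_mediated(total_effect: float, indirect_effect: float) -> float:
    """Share of the exposure-outcome effect transmitted through the mediator."""
    if total_effect == 0:
        raise ValueError("total effect must be non-zero")
    return indirect_effect / total_effect

# Hypothetical decomposition: a total effect of 0.30 SD with 0.129 SD flowing
# through the mediator gives a proportion mediated of 43%.
share = proportion_mediated(total_effect=0.30, indirect_effect=0.129)
print(f"proportion mediated: {share:.0%}")  # proportion mediated: 43%
```

In practice the indirect effect is itself estimated (here, via the non-parametric g-formula); this sketch shows only the final ratio.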


Subject(s)
Body Dissatisfaction , Humans , Male , Female , Adolescent , Child , Cohort Studies , Body Mass Index , Depression/epidemiology , United Kingdom/epidemiology , Longitudinal Studies
7.
BMJ Qual Saf ; 32(12): 750-762, 2023 12.
Article in English | MEDLINE | ID: mdl-37290917

ABSTRACT

BACKGROUND: Health and social care standards have been widely adopted as a quality improvement intervention. Standards are typically made up of evidence-based statements that describe safe, high-quality, person-centred care as an outcome or process of care delivery. They involve stakeholders at multiple levels and multiple activities across diverse services. As such, challenges exist with their implementation. Existing literature relating to standards has focused on accreditation and regulation programmes and there is limited evidence to inform implementation strategies specifically tailored to support the implementation of standards. This systematic review aimed to identify and describe the most frequently reported enablers and barriers to implementing (inter)nationally endorsed standards, in order to inform the selection of strategies that can optimise their implementation. METHODS: Database searches were conducted in Medline, CINAHL (Cumulative Index to Nursing and Allied Health Literature), SocINDEX, Google Scholar, OpenGrey and GreyNet International, complemented by manual searches of standard-setting bodies' websites and hand searching references of included studies. Primary qualitative, quantitative descriptive and mixed methods studies that reported enablers and barriers to implementing nationally or internationally endorsed standards were included. Two researchers independently screened search outcomes and conducted data extraction, methodological appraisal and CERQual (Confidence in Evidence from Reviews of Qualitative research) assessments. An inductive analysis was conducted using Sandelowski's meta-summary and measured frequency effect sizes (FES) for enablers and barriers. RESULTS: 4072 papers were retrieved initially with 35 studies ultimately included. Twenty-two thematic statements describing enablers were created from 322 descriptive findings and grouped under six themes. 
Twenty-four thematic statements describing barriers were created from 376 descriptive findings and grouped under six themes. The most prevalent enablers with CERQual assessments graded as high included: available support tools at local level (FES 55%); training courses to increase awareness and knowledge of the standards (FES 52%); and knowledge sharing and interprofessional collaborations (FES 45%). The most prevalent barriers with CERQual assessments graded as high included: a lack of knowledge of what standards are (FES 63%); staffing constraints (FES 46%); and insufficient funds (FES 43%). CONCLUSIONS: The most frequently reported enablers related to available support tools, education and shared learning. The most frequently reported barriers related to a lack of knowledge of standards, staffing issues and insufficient funds. Incorporating these findings into the selection of implementation strategies will enhance the likelihood of effective implementation of standards and, subsequently, improve safe, quality care for people using health and social care services.
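The frequency effect sizes (FES) reported above come from Sandelowski's meta-summary method, in which the FES of a finding is the proportion of included reports that contain it. A minimal sketch, using hypothetical per-finding study counts (only the total of 35 included studies is taken from the review):

```python
# Frequency effect size (FES): share of included studies reporting a finding.
# Per-finding counts below are hypothetical; only the total (35) is from the review.

def frequency_effect_size(reports_with_finding: int, total_reports: int) -> float:
    if not 0 <= reports_with_finding <= total_reports:
        raise ValueError("count must lie between 0 and total_reports")
    return reports_with_finding / total_reports

TOTAL_STUDIES = 35
hypothetical_counts = {
    "lack of knowledge of standards (barrier)": 22,
    "available support tools (enabler)": 19,
    "staffing constraints (barrier)": 16,
}
for finding, n in hypothetical_counts.items():
    print(f"{finding}: FES {frequency_effect_size(n, TOTAL_STUDIES):.0%}")
```

Because FES is a simple proportion, it weights every study equally regardless of sample size, which is why the review pairs it with CERQual confidence grading.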


Subject(s)
Delivery of Health Care , Quality of Health Care , Humans , Social Support , Quality Improvement
8.
Ann Surg Oncol ; 30(9): 5433-5442, 2023 Sep.
Article in English | MEDLINE | ID: mdl-37266808

ABSTRACT

BACKGROUND: CRS-HIPEC provides oncologic benefit in well-selected patients with peritoneal carcinomatosis; however, it is a morbid procedure. Decision tools for preoperative patient selection are limited. We developed a risk score to predict the severity of 90-day complications for cytoreductive surgery with hyperthermic intraperitoneal chemotherapy (CRS-HIPEC). PATIENTS AND METHODS: Adults who underwent CRS-HIPEC at the University of Pittsburgh Medical Center (March 2001-April 2020) were analyzed. The primary endpoint was severe complications within 90 days following CRS-HIPEC, defined using Comprehensive Complication Index (CCI) scores both as a dichotomous variable (with the cutoff determined using restricted cubic splines) and as a continuous variable. Data were divided into training and test sets. Several machine learning and traditional algorithms were considered. RESULTS: For the 1959 CRS-HIPEC procedures included, CCI ranged from 0 to 100 (median 32.0). The adjusted restricted cubic splines model defined severe complications as CCI > 61. A minimum of 20 variables was required to achieve optimal performance in any of the models. Linear regression achieved the highest area under the receiver operating characteristic curve (AUC, 0.74) and outperformed the NSQIP Surgical Risk Calculator (AUC 0.80 vs. 0.66). Factors most positively associated with severe complications included peritoneal carcinomatosis index score, symptomatic status, and undergoing pancreatectomy, while American Society of Anesthesiologists class 2, appendiceal diagnosis, and preoperative albumin were most negatively associated with severe complications. CONCLUSIONS: This study refines our ability to predict severe complications within 90 days of discharge from a hospitalization in which CRS-HIPEC was performed. This advancement is timely and relevant given the growing interest in this procedure and may have implications for patient selection, patient and referring provider comfort, and survival.
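The models above are compared by AUC, the area under the receiver operating characteristic curve. As a minimal sketch of what that metric measures, here is the rank-based (Mann-Whitney) formulation on toy data; this is not the study's data or modeling code:

```python
# Rank-based AUC: the probability that a randomly chosen positive case is
# scored above a randomly chosen negative case, with ties counting half.

def roc_auc(labels: list[int], scores: list[float]) -> float:
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    if not pos or not neg:
        raise ValueError("need at least one positive and one negative label")
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Toy example: severe complication (1) vs. not (0), with model risk scores.
labels = [0, 0, 1, 1]
scores = [0.1, 0.4, 0.35, 0.8]
print(roc_auc(labels, scores))  # 0.75
```

An AUC of 0.5 corresponds to chance-level discrimination, so the reported 0.74 for linear regression versus 0.66 for the NSQIP calculator reflects moderately better ranking of high-risk patients.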


Subject(s)
Hyperthermia, Induced , Peritoneal Neoplasms , Adult , Humans , Peritoneal Neoplasms/therapy , Combined Modality Therapy , Antineoplastic Combined Chemotherapy Protocols/adverse effects , Chemotherapy, Adjuvant , Cytoreduction Surgical Procedures/adverse effects , Judgment , Hyperthermia, Induced/adverse effects , Survival Rate , Retrospective Studies
9.
Transpl Int ; 36: 11367, 2023.
Article in English | MEDLINE | ID: mdl-37359825

ABSTRACT

Long-term success in beta-cell replacement remains limited by the toxic effects of calcineurin inhibitors (CNI) on beta-cells and renal function. We report a multi-modal approach including islet and pancreas-after-islet (PAI) transplant utilizing calcineurin-sparing immunosuppression. Ten consecutive non-uremic patients with Type 1 diabetes underwent islet transplant with immunosuppression based on belatacept (BELA; n = 5) or efalizumab (EFA; n = 5). Following islet failure, patients were considered for repeat islet infusion and/or PAI transplant. Seventy percent of patients (four EFA, three BELA) maintained insulin independence at 10 years post-islet transplant, including four patients receiving a single islet infusion and three patients undergoing PAI transplant. Sixty percent remain insulin independent at a mean follow-up of 13.3 ± 1.1 years, including one patient 9 years after discontinuing all immunosuppression for adverse events, suggesting operational tolerance. All patients who underwent repeat islet transplant experienced graft failure. Overall, patients demonstrated preserved renal function, with a mild decrease in GFR from 76.5 ± 23.1 mL/min to 50.2 ± 27.1 mL/min (p = 0.192). Patients undergoing PAI showed the greatest degree of renal impairment following initiation of CNI (56% ± 18.7% decrease in GFR). In our series, repeat islet transplant was ineffective at maintaining long-term insulin independence. PAI results in durable insulin independence but is associated with impaired renal function secondary to CNI dependence.


Subject(s)
Diabetes Mellitus, Type 1 , Islets of Langerhans Transplantation , Pancreas Transplantation , Humans , Diabetes Mellitus, Type 1/drug therapy , Diabetes Mellitus, Type 1/surgery , Insulin/therapeutic use , Calcineurin , Immunosuppression Therapy/methods , Islets of Langerhans Transplantation/methods , Calcineurin Inhibitors/therapeutic use , Immunosuppressive Agents/therapeutic use
12.
World J Surg ; 47(3): 750-758, 2023 03.
Article in English | MEDLINE | ID: mdl-36402918

ABSTRACT

BACKGROUND: Hand-assisted laparoscopic distal pancreatectomy (HALDP) is suggested to offer similar outcomes to pure laparoscopic distal pancreatectomy (LDP). However, given its longer midline incision, it is unclear whether HALDP increases the risk of postoperative hernia. Our aim was to determine the risk of postoperative incisional hernia development after HALDP. METHODS: We retrospectively collected data from patients undergoing HALDP or LDP at a single center (2012-2020). Primary endpoints were postoperative incisional hernia and operative time. All patients had a minimum of six months of follow-up. Outcomes were compared using unadjusted and multivariable regression analyses. RESULTS: Ninety-five patients who underwent laparoscopic distal pancreatectomy were retrospectively identified. Forty-one patients (43%) underwent HALDP. Patients with HALDP were older (median, 67 vs. 61 years, p = 0.02). Sex, race, body mass index (median, 27 vs. 26), receipt of neoadjuvant chemotherapy, gland texture, wound infection rates, postoperative pancreatic fistula, overall complications, and hospital length-of-stay were similar between HALDP and LDP (all p > 0.05). In unadjusted analysis, operative times were shorter for HALDP (164 vs. 276 min, p < 0.001), but after adjustment did not differ significantly (MR 0.73; 0.49-1.07, p = 0.1). The unadjusted incidence of hernia was higher with HALDP than LDP (60% vs. 24%, p = 0.004). After adjustment, HALDP was associated with increased odds of developing hernia (OR 7.52; 95% CI 1.54-36.8, p = 0.014). After propensity score matching, the odds of hernia development remained higher for HALDP (OR 4.62; 95% CI 1.28-16.65, p = 0.031). CONCLUSIONS: Compared with LDP, HALDP was associated with an increased likelihood of postoperative hernia, with insufficient evidence that HALDP shortens operative times. Our results suggest that HALDP may not be equivalent to LDP.


Subject(s)
Incisional Hernia , Laparoscopy , Pancreatic Neoplasms , Humans , Pancreatic Neoplasms/surgery , Pancreatic Neoplasms/complications , Incisional Hernia/surgery , Retrospective Studies , Treatment Outcome , Pancreatectomy/adverse effects , Pancreatectomy/methods , Postoperative Complications/epidemiology , Postoperative Complications/etiology , Postoperative Complications/surgery , Laparoscopy/methods , Operative Time , Length of Stay
13.
Int J Health Plann Manage ; 38(1): 40-52, 2023 Jan.
Article in English | MEDLINE | ID: mdl-36128602

ABSTRACT

Setting standards is a quality improvement mechanism and an important means of shaping the provision of health and social care services. Standards comprise statements describing a process or outcome of care. Setting standards is a global practice, so it would be useful to understand the definitions of standards used internationally. Therefore, the aim of this review was to examine definitions of health and social care standards used internationally and to identify similarities and differences. A targeted grey literature search of standard-setting bodies' websites and related health legislation was conducted to retrieve explicit definitions of standards. Of the 15 standard-setting bodies searched, 12 yielded definitions of standards, which were narratively synthesised. Terms that appeared in two or more of the definitions were extracted, and counts and percentages were calculated for these terms to determine their magnitude of use. The commonalities among definitions included 'quality' (n = 6, 50%), 'statements' (n = 5, 42%), 'performance' (n = 5, 42%), and 'measurable' (n = 4, 33%). The less commonly used terms were 'processes' (n = 3, 25%), 'set' (n = 3, 25%), 'evidence based' (n = 2, 17%), 'outcome' (n = 2, 17%), 'safe' (n = 2, 17%), and 'guidance' (n = 2, 17%). Explicit definitions of standards were not retrieved from health legislation documents. Standard-setting bodies develop standards in the context of the health systems in which they are implemented; some set aspirational levels of quality, while others set minimum levels. Researchers, standards developers and policy makers should be cognisant of this when comparing standards between countries.
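The counts and percentages above are straightforward term tallies across the 12 retrieved definitions. A minimal sketch of that tally, using hypothetical term sets constructed so the counts mirror the reported figures (the actual definition texts are not reproduced):

```python
# Count how many of the retrieved definitions use each term, as n and percent.
# The 12 term sets below are hypothetical stand-ins for the actual definitions.

def term_usage(definitions: list[set[str]], term: str) -> tuple[int, float]:
    n = sum(term in d for d in definitions)
    return n, 100 * n / len(definitions)

definitions = [
    {"quality", "statements", "measurable"},
    {"quality", "statements", "measurable"},
    {"quality", "statements", "measurable"},
    {"quality", "statements", "measurable"},
    {"quality", "statements", "performance"},
    {"quality", "performance"},
    {"performance"},
    {"performance"},
    {"performance"},
    {"evidence based"},
    {"outcome"},
    {"guidance"},
]

for term in ("quality", "statements", "performance", "measurable"):
    n, pct = term_usage(definitions, term)
    print(f"'{term}': n = {n} ({pct:.0f}%)")
```

With these stand-in sets the tally reproduces the review's rounded figures (e.g. 'quality': n = 6, 50%); the method is just counting and dividing by the number of definitions.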


Subject(s)
Quality Improvement , Standard of Care
14.
Transpl Int ; 35: 10855, 2022.
Article in English | MEDLINE | ID: mdl-36568142

ABSTRACT

Donation-after-circulatory-death (DCD), donation-after-brain-death (DBD), and living-donation (LD) are the three possible options for liver transplantation (LT), each with unique benefits and complication rates. We aimed to compare DCD-, DBD-, and LD-LT-specific graft survival and biliary complications (BC). We collected data on adult recipients of 138 DCD-, 3,027 DBD- and 318 LD-LTs from a single center and analyzed patient/graft survival. BC (leaks and anastomotic/non-anastomotic strictures [AS/NAS]) were analyzed in a subset of 414 patients. One- and five-year graft survival rates were 88.6%/70.0% for DCD-LT, 92.6%/79.9% for DBD-LT, and 91.7%/82.9% for LD-LT. DCD-LTs had a 1.7-/1.3-fold adjusted risk of losing their graft compared to DBD-LT and LD-LT, respectively (p < 0.010/0.403). Bile leaks were present in 10.1% of DCD-LTs, 7.2% of DBD-LTs, and 36.2% of LD-LTs (ORs, DBD/LD vs. DCD: 0.7/4.2, p = 0.402/<0.001). AS developed in 28.3% of DCD-LTs, 18.1% of DBD-LTs, and 43.5% of LD-LTs (ORs, DBD/LD vs. DCD: 0.5/1.8, p = 0.018/0.006). NAS was present in 15.2% of DCD-LTs, 1.4% of DBD-LTs, and 4.3% of LD-LTs (ORs, DBD/LD vs. DCD: 0.1/0.3, p = 0.001/0.005). LTs without BC had better graft survival than any group with BC. DCD-LT and LD-LT had excellent graft survival despite significantly higher BC rates compared to DBD-LT. DCD-LT represents a valid alternative whose importance should increase further with machine perfusion systems.


Subject(s)
Liver Transplantation , Tissue and Organ Procurement , Adult , Humans , Liver Transplantation/adverse effects , Cohort Studies , Brain Death , Living Donors , Retrospective Studies , Graft Survival , Tissue Donors , Death
15.
Perioper Med (Lond) ; 11(1): 25, 2022 Jul 12.
Article in English | MEDLINE | ID: mdl-35818058

ABSTRACT

BACKGROUND: Preventing post-operative ileus (POI) is important given its associated morbidity and increased cost of care. The authors' prior work showed that POI in patients with newly created ileostomies is associated with a post-operative day (POD) 2 net fluid balance of > +800 mL. The purpose of this study was to conduct an initial assessment of the efficacy of a pilot intervention. METHODS: This is a single-institution, pre-post-intervention, proof-of-concept study conducted on the Colorectal Surgery service at the University of California, San Francisco. The study included 58 procedures with ileostomy formation by board-certified colorectal surgeons between August 13, 2020 and June 1, 2021. The intervention included three adjustments to the standard Enhanced Recovery After Surgery protocol: addition of diuresis, delay in advancement to solid food, and earlier stoma intubation. Demographics, intraoperative factors, post-operative fluid balance, and outcomes (POI, post-procedure length of stay [LOS], hospitalization cost, and re-admissions) were compared between patients pre- and post-intervention. RESULTS: Eight (13.8%) of the 58 procedures in the intervention period were associated with POI vs. a baseline POI rate of 32.6% (p = 0.004). Compared to patients without the intervention, those with the intervention had 67% lower odds of POI (OR 0.33, 95% CI 0.15-0.73, p = 0.01). This difference remained significant when adjusted for age, gender, body mass index, procedure duration, and operative approach (adjusted OR 0.32, 95% CI 0.14-0.72, p = 0.01). Average POD2 stoma output was 0.3 L greater (1.1 L vs. 0.8 L; p < 0.001) and net fluid balance was 1.8 L lower (+0.3 L vs. +2.1 L; p < 0.00001) for these 58 cases. Average post-procedure LOS was 1.9 days lower (5.3 vs. 7.2 days, p < 0.001) and direct cost was $5561 lower ($21,652 vs. $27,213, p = 0.004), with no difference in 30-day readmissions (p = 0.43).
CONCLUSIONS: This pilot intervention shows promise for reduction in POI in patients with newly created ileostomies. Additional assessment is needed to confirm these initial findings.

17.
J Surg Res ; 276: 404-415, 2022 08.
Article in English | MEDLINE | ID: mdl-35468367

ABSTRACT

INTRODUCTION: Parathyroid allotransplantation is an emerging treatment for severe hypoparathyroidism. Ensuring the viability and functional integrity of donor parathyroid glands following procurement is essential for optimal transplantation outcomes. METHODS: Cellular viability, calcium-responsive hormone secretion, and gland xenograft survival were assessed in a series of deceased donor parathyroid glands following a two-stage procurement procedure recently developed by our group (en bloc cadaveric dissection with subsequent gland isolation after transport to the laboratory). RESULTS: Parathyroid glands resected in this manner and stored up to 48 h in 4°C University of Wisconsin (UW) media retained in vitro viability with no induction of hypoxic stress (HIF-1α) or apoptotic (caspase-3) markers. Ex vivo storage did not significantly affect parathyroid gland calcium sensing capacity, with comparable calcium EC50 values and suppression of parathyroid hormone secretion at high ambient calcium concentrations. The isolated glands engrafted readily, vascularizing rapidly in vivo following transplantation into mice. CONCLUSIONS: Parathyroid tissue retains viability, calcium-sensing capacity, and in vivo engraftment capability after en bloc cadaveric resection, ex vivo dissection, and extended cold storage.


Subject(s)
Hypoparathyroidism , Parathyroid Glands , Animals , Cadaver , Calcium/pharmacology , Humans , Mice , Parathyroid Glands/transplantation , Parathyroid Hormone , Tissue Donors
18.
BMC Med Ethics ; 23(1): 20, 2022 03 05.
Article in English | MEDLINE | ID: mdl-35248038

ABSTRACT

BACKGROUND: The Public Health Service Increased Risk designation identified organ donors at increased risk of transmitting hepatitis B, hepatitis C, and human immunodeficiency virus. Despite clear data demonstrating a low absolute risk of disease transmission from these donors, patients are hesitant to consent to receiving organs from them. We hypothesized that patients who consent to receiving offers from these donors have decreased time to transplant and decreased waitlist mortality. METHODS: We performed a single-center retrospective review of all-comers waitlisted for liver transplant from 2013 to 2019. The three competing risk events (transplant, death, and removal from the transplant list) were analyzed. 1603 patients were included, of whom 1244 (77.6%) consented to offers from increased risk donors. RESULTS: Compared to those who did not consent, those who did had 2.3 times the rate of transplant (SHR 2.29, 95% CI 1.88-2.79, p < 0.0001), with a median time to transplant of 11 months versus 14 months (p < 0.0001), as well as a 44% decrease in the rate of death on the waitlist (SHR 0.56, 95% CI 0.42-0.74, p < 0.0001). All findings remained significant after controlling for recipient age, race, gender, blood type, and MELD. Of those who did not consent, 63/359 (17.5%) received a transplant, all of which were from standard criteria donors; of those who did consent, 615/1244 (49.4%) received a transplant, of which 183/615 (29.8%) were from increased risk donors. CONCLUSIONS: The decreased rate of transplantation and increased risk of death on the waiting list among patients unwilling to accept worst-case viral transmission risks of 1/300-1/1000 suggest that this consent process may be harmful, especially when it involves "trigger" words such as HIV. The rigor of the consent process for the use of these organs was recently changed, but a broader discussion about informed consent in similar situations is important.


Subject(s)
HIV Infections , Organ Transplantation , Tissue and Organ Procurement , Humans , Informed Consent , Tissue Donors , Waiting Lists
19.
Transplant Direct ; 8(4): e1306, 2022 Apr.
Article in English | MEDLINE | ID: mdl-35310601

ABSTRACT

Parathyroid allotransplantation is a burgeoning treatment for severe hypoparathyroidism. Deceased donor parathyroid gland (PTG) procurement can be technically challenging due to the lack of normal intraoperative landmarks and exposure constraints in the neck of organ donors. In this study, we assessed standard 4-gland exposure in situ and en bloc surgical techniques for PTG procurement, and ex vivo near-infrared autofluorescence (NIRAF) imaging for identification of PTGs during organ recovery. Methods: Research tissue consent was obtained from organ donors or donor families for PTG procurement. All donors were normocalcemic, brain-dead, solid organ donors between 18 and 65 y of age. PTGs were procured initially using a standard 4-gland exposure technique in situ and subsequently using a novel en bloc resection technique after systemic organ preservation flushing. Parathyroid tissue was stored at 4 °C in University of Wisconsin solution for up to 48 h post-procurement. A Fluoptics Fluobeam NIRAF camera and ImageJ software were utilized for quantification of the NIRAF signal. Results: Thirty-one brain-dead deceased donor PTG procurements were performed by abdominal transplant surgeons. In the initial 8 deceased donors, a mean of 1.75 glands (±1.48 SD) per donor were recovered using the 4-gland in situ technique. Implementation of combined en bloc resection with ex vivo NIRAF imaging in 23 consecutive donors yielded a mean of 3.60 glands (±0.4 SD) recovered per donor (P < 0.0001). Quantification of the NIRAF integrated density signal demonstrated a greater than 1-log difference between PTG (2.13 × 10⁵ pixels) and surrounding anterior neck structures (1.9 × 10⁴ pixels; P < 0.0001). PTGs maintained a distinct NIRAF signal from the time of recovery (1.88 × 10⁵ pixels) up to 48 h post-procurement (1.55 × 10⁵ pixels) in organ preservation cold storage (P = 0.34).
Conclusions: The use of an en bloc surgical technique with ex vivo NIRAF imaging significantly enhances the identification and recovery of PTG from deceased donors.

20.
J Interpers Violence ; 37(5-6): 2218-2241, 2022 03.
Article in English | MEDLINE | ID: mdl-32639853

ABSTRACT

Previous research has demonstrated a graded relationship between the number of Adverse Childhood Experiences reported (an ACE score) and child outcomes. However, ACE scores lack specificity and ignore the patterning of adversities, which is informative for interventions. The aim of the present study was to explore the clustering of ACEs and whether this clustering differs by gender or is predicted by poverty. Data on 8,572 participants of the Avon Longitudinal Study of Parents and Children (ALSPAC) were used. ALSPAC is a regionally representative prenatal cohort of children born between 1991 and 1992 in the Avon region of South-West England. ACEs included parental divorce, death of a close family member, interparental violence, parental mental health problems, parental alcohol misuse, parental drug use, parental convictions, and sexual, emotional, and physical abuse, between birth and 19 years. Latent class analysis was used to derive ACE clusters, and associations between poverty, gender, and the derived classes were tested using multinomial logistic regression. Five latent classes were identified: "Low ACEs" (55%), "Parental separation and mother's mental health problems" (18%), "Parental mental health problems, convictions and separation" (15%), "Abuse and mental health problems" (6%), and "Poly adversity" (6%). Death of a close family member and sexual abuse did not cluster with other adversities. The clustering did not differ by gender. Poverty was strongly related to both individual ACEs and the clusters. These findings demonstrate that ACEs cluster in specific patterns and that poverty is strongly related to this clustering. Therefore, reducing child poverty might be one strategy for reducing ACEs.


Subject(s)
Adverse Childhood Experiences , Child , Cluster Analysis , Divorce , Humans , Longitudinal Studies , Parents , Poverty