Results 1 - 20 of 32
1.
Kidney Int Rep ; 9(6): 1571-1573, 2024 Jun.
Article in English | MEDLINE | ID: mdl-38899210
2.
Clin Transplant ; 38(1): e15157, 2024 01.
Article in English | MEDLINE | ID: mdl-37792310

ABSTRACT

INTRODUCTION: Self-reported measures of immunosuppression adherence have been largely examined in research settings. METHODS: In this single center study of 610 kidney transplant recipients, we examined whether a voluntary, non-anonymous self-report measure could identify non-adherence in a routine clinic setting and how patients perceived such a measure. Non-adherence was measured using the Basel Assessment of Adherence to Immunosuppressive Medications Scale (BAASIS) and patient perception was elicited using a customized questionnaire. RESULTS: Non-responders to the survey (15%) were younger, more likely to be black, and less likely to have had a pre-emptive transplant. Among complete responders (n = 485), 38% reported non-adherence, with non-adherent patients being younger (54 y vs. 60 y; p = .01), less likely to have been on dialysis pre-transplant (59% vs. 68%; p = .04), further out from transplant (37 vs. 22 months; p < .001), and more likely to have had a rejection in the preceding year (8% vs. 3%; p = .02). Self-reported non-adherence was associated with higher calcineurin inhibitor intra-patient variability (27.4% vs. 24.5%; p = .02), but not with donor-specific antibody detection (27.8% vs. 21.2%, p = .15). Of patients providing feedback (n = 500), the majority felt comfortable reporting adherence (92%), felt that the survey was relevant to their visit (71%), and felt that the survey did not interfere with their clinic visit (88%). CONCLUSION: A self-reported questionnaire during clinic visits identified immunosuppression non-adherence in a significant proportion of patients and was well received by patients. Integrating self-report measures into routine post-transplant care may enable early identification of non-adherence.


Subject(s)
Kidney Transplantation , Humans , Self Report , Immunosuppressive Agents/therapeutic use , Surveys and Questionnaires , Immunosuppression Therapy , Transplant Recipients , Medication Adherence
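The calcineurin inhibitor intra-patient variability (IPV) compared above is commonly quantified as the coefficient of variation of repeated trough levels. A minimal sketch of that calculation, assuming tacrolimus troughs in ng/mL; the values and the 30% review threshold mentioned in the comment are illustrative, not from the study.

```python
import statistics

def intrapatient_variability(trough_levels):
    """Coefficient of variation (%) of calcineurin inhibitor trough levels."""
    mean = statistics.mean(trough_levels)
    sd = statistics.stdev(trough_levels)
    return 100.0 * sd / mean

# Hypothetical tacrolimus troughs (ng/mL) for one recipient over several visits
troughs = [7.8, 5.2, 9.1, 6.4, 8.3]
ipv = intrapatient_variability(troughs)
print(f"Intra-patient variability: {ipv:.1f}%")  # e.g., flag for review if above a chosen threshold such as 30%
```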
4.
Am Surg ; 89(4): 1286-1289, 2023 Apr.
Article in English | MEDLINE | ID: mdl-33631945

ABSTRACT

Enteric hyperoxaluria (EH) is a known complication of Roux-en-Y gastric bypass (RYGB) and can lead to nephrolithiasis, oxalate-induced nephropathy, and end-stage renal disease. Recurrent EH-induced renal impairment has been reported after kidney transplantation and may lead to allograft loss. EH occurs in up to one quarter of patients following malabsorption-based bariatric operations. We present a report of medically refractory EH in a renal transplant recipient with allograft dysfunction that was successfully managed with reversal of RYGB. The patient developed renal failure 7 years following gastric bypass requiring renal transplant. Following an uneventful living donor kidney transplant, the patient developed recurrent subacute allograft dysfunction. A diagnosis of oxalate nephropathy was made based on biopsy findings of renal tubular calcium oxalate deposition in conjunction with elevated serum oxalate levels and elevated 24-hr urinary oxalate excretion. Progressive renal failure ensued despite medical management. The patient underwent reversal of her RYGB, which resulted in recovery of allograft function. This report highlights an under-recognized, potentially treatable cause of renal allograft failure in patients with underlying gastrointestinal pathology or history of bariatric surgery and proposes a strategy for management of patients with persistent hyperoxaluria based on a review of the literature.


Subject(s)
Gastric Bypass , Hyperoxaluria , Kidney Transplantation , Renal Insufficiency , Humans , Female , Gastric Bypass/adverse effects , Kidney Transplantation/adverse effects , Calcium Oxalate/urine , Oxalates , Hyperoxaluria/surgery , Hyperoxaluria/complications , Allografts
5.
J Am Soc Nephrol ; 34(1): 26-39, 2023 01 01.
Article in English | MEDLINE | ID: mdl-36302599

ABSTRACT

BACKGROUND: In March 2021, the United States implemented a new kidney allocation system (KAS250) for deceased donor kidney transplantation (DDKT), which eliminated donation service area-based allocation and replaced it with a system based on the distance from the donor hospital to the transplant center, within or outside a radius of 250 nautical miles. The effect of this policy on kidney discards and logistics is unknown. METHODS: We examined discards, donor-recipient characteristics, cold ischemia time (CIT), and delayed graft function (DGF) during the first 9 months of KAS250 compared with a pre-KAS250 cohort from the preceding 2 years. Changes in discards and CIT after the onset of COVID-19 and the implementation of KAS250 were evaluated using an interrupted time-series model. Changes in allocation practices (biopsy, machine perfusion, and virtual cross-match) were also evaluated. RESULTS: The post-KAS250 period saw a two-fold increase in kidneys imported from nonlocal organ procurement organizations (OPOs) and a higher proportion of recipients with calculated panel reactive antibody (cPRA) 81%-98% (12% versus 8%; P < 0.001) and of recipients with >5 years of pretransplant dialysis (35% versus 33%; P < 0.001). CIT increased (by a mean of 2 hours), including among local OPO kidneys. DGF was similar on adjusted analysis. Discards after KAS250 did not immediately change, but we observed a statistically significant increase over time that was independent of donor quality. Machine perfusion use decreased, whereas reliance on virtual cross-match increased, which was associated with shorter CIT. CONCLUSIONS: Early trends after KAS250 show an increase in transplant access for patients with cPRA >80% and those with longer dialysis duration, but this was accompanied by an increase in CIT and a suggestion of worsening kidney discards.


Subject(s)
COVID-19 , Kidney Transplantation , Tissue and Organ Procurement , Humans , United States , Kidney , Tissue Donors , Antibodies , Graft Survival , Delayed Graft Function/epidemiology
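The discard analysis above relies on an interrupted time-series model around the KAS250 start date. A minimal segmented-regression sketch with statsmodels on made-up monthly data; the variable names, simulated discard rates, and the single month-24 breakpoint are assumptions (the study additionally models the onset of COVID-19).

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical monthly discard rates; 'post' = 1 after the assumed policy go-live month
rng = np.random.default_rng(0)
months = np.arange(36)                        # 36 months of observation
post = (months >= 24).astype(int)             # policy assumed to start at month 24
time_since = np.where(post == 1, months - 24, 0)
discard_rate = 18 + 0.05 * months + 1.0 * post + 0.15 * time_since + rng.normal(0, 0.5, 36)

df = pd.DataFrame({"month": months, "post": post, "time_since": time_since,
                   "discard_rate": discard_rate})

# 'post' captures the immediate level change, 'time_since' the slope change after the intervention
model = smf.ols("discard_rate ~ month + post + time_since", data=df).fit()
print(model.summary().tables[1])
```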
6.
Hum Immunol ; 84(3): 214-223, 2023 Mar.
Article in English | MEDLINE | ID: mdl-36581507

ABSTRACT

Virtual crossmatch (VXM) is used as an alternative to or in conjunction with a cell-based physical crossmatch (PXM) for assessing HLA (human leukocyte antigen) compatibility prior to deceased donor kidney transplantation (DDKT). Data on practice patterns and perceptions regarding VXM use in the US are limited. We performed a survey of US HLA directors and transplant surgeons regarding HLA testing and crossmatch strategies. 53 (56%) HLA directors and 68 surgeons (representing ~23% of US transplant centers) completed the survey. Both groups agreed that VXM could reduce cold ischemia time (CIT) and costs and improve allocation efficiency. VXM use increased following the 2021 kidney allocation change. Reducing CIT was the primary reason for favoring VXM over PXM. Preference for VXM decreased as candidates' panel reactive antibodies increased. Regulations, program policies, and limitations of HLA technology were cited as important reasons for preferring PXM over VXM. Surgeons reported similar perceptions, but findings are limited by the low response rate. Finally, half the labs reported lacking specific protocols for VXM use. In conclusion, improved HLA technology and protocols, along with changes to institutional procedures and policy regulations, are needed for safer expansion of VXM in DDKT.


Subject(s)
Kidney Transplantation , Humans , United States , Kidney Transplantation/methods , Blood Grouping and Crossmatching , Histocompatibility Testing/methods , Kidney , HLA Antigens , Histocompatibility , Graft Rejection
7.
Front Robot AI ; 9: 898075, 2022.
Article in English | MEDLINE | ID: mdl-35783023

ABSTRACT

Tactile sensing for robotics is achieved through a variety of mechanisms, including magnetic, optical-tactile, and conductive fluid. Currently, fluid-based sensors have struck the right balance of anthropomorphic size and shape and accuracy of tactile response measurement. However, this design is plagued by a low signal-to-noise ratio (SNR), because the fluid-based sensing mechanism damps the measured values in ways that are hard to model. To this end, we present a spatio-temporal gradient representation of the data obtained from fluid-based tactile sensors, inspired by neuromorphic principles of event-based sensing. We present a novel algorithm (GradTac) that converts discrete data points from spatial tactile sensors into spatio-temporal surfaces and tracks tactile contours across these surfaces. Processing the tactile data in the proposed spatio-temporal domain is robust, less susceptible to the inherent noise of fluid-based sensors, and allows more accurate tracking of regions of touch than using the raw data. We successfully evaluate and demonstrate the efficacy of GradTac in many real-world experiments performed using the Shadow Dexterous Hand equipped with BioTac SP sensors. Specifically, we use it for tracking tactile input across the sensor's surface, measuring relative forces, detecting linear and rotational slip, and edge tracking. We also release an accompanying task-agnostic dataset for the BioTac SP, which we hope will provide a resource to compare and quantify various novel approaches, and motivate further research.
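To illustrate the general idea of a spatio-temporal gradient representation over tactile readings (not the authors' GradTac implementation), here is a minimal numpy sketch: per-taxel time series are stacked into a time x taxel array, lightly smoothed, and differentiated along both axes. The array shape, smoothing window, and contact threshold are all assumptions.

```python
import numpy as np

# Hypothetical BioTac-style readings: 200 time steps x 19 taxels (electrodes)
rng = np.random.default_rng(1)
tactile = np.cumsum(rng.normal(0, 0.1, size=(200, 19)), axis=0)  # slowly drifting signals

# Simple temporal smoothing to suppress fluid-sensor noise (moving average over 5 samples)
kernel = np.ones(5) / 5.0
smoothed = np.apply_along_axis(lambda x: np.convolve(x, kernel, mode="same"), 0, tactile)

# Spatio-temporal gradients: d/dt along axis 0 and d/d(taxel index) along axis 1
dt_grad, dx_grad = np.gradient(smoothed)

# A crude "contact contour": taxels whose temporal gradient magnitude exceeds a threshold
contact_mask = np.abs(dt_grad) > 0.05
print("Frames with any contact activity:", int(contact_mask.any(axis=1).sum()))
```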

8.
Clin Transplant ; 36(9): e14776, 2022 09.
Article in English | MEDLINE | ID: mdl-35821617

ABSTRACT

BACKGROUND: In kidney transplantation, delayed graft function (DGF) is associated with increased morbidity and a higher risk of graft failure. Prior research suggests that chronic hypotension increases DGF risk, but the relationship of preoperative blood pressure to DGF is unclear. METHODS: In this single center study of adult deceased donor kidney transplant recipients transplanted between 2015 and 2019, we evaluated whether preoperative mean arterial pressure (MAP) affected DGF risk. Additionally, we investigated whether the risk of DGF was moderated by certain donor and recipient characteristics. For recipient characteristics associated with increased DGF risk and preoperative MAP, we performed a mediation analysis to estimate the proportion of DGF risk mediated through preoperative MAP. RESULTS: Among 562 deceased donor kidney recipients, DGF risk decreased as preoperative MAP increased, with a 2% lower risk per 1 mm Hg increase in MAP. This association was similar across subgroups, with no statistically significant interaction between preoperative MAP and donor (donation after circulatory death) or recipient characteristics (diabetes, body mass index, and use of anti-hypertensive medications). Preoperative MAP was negatively correlated with recipient BMI and duration of pre-transplant dialysis. On mediation analysis, MAP accounted for 12% and 16% of the DGF risk associated with recipient BMI and pre-transplant dialysis duration, respectively. CONCLUSION: In deceased donor kidney transplantation, each 1 mm Hg increase in preoperative MAP was associated with a 2% lower DGF risk. Preoperative MAP was influenced by recipient BMI and dialysis duration, and likely contributes to some of the elevated DGF risk associated with obesity and long dialysis vintage.


Subject(s)
Delayed Graft Function , Kidney Transplantation , Adult , Antihypertensive Agents , Blood Pressure , Delayed Graft Function/etiology , Graft Rejection/etiology , Graft Survival , Humans , Kidney , Kidney Transplantation/adverse effects , Retrospective Studies , Risk Factors , Tissue Donors
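The mediation analysis described above estimates what share of the BMI-associated DGF risk operates through preoperative MAP. A minimal sketch of one common approach (the difference method on simulated data); the coefficients, cohort size, and variable names are illustrative, and a formal analysis would use a dedicated mediation package with bootstrap confidence intervals.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated cohort: higher BMI -> lower preoperative MAP -> higher DGF risk (illustrative only)
rng = np.random.default_rng(2)
n = 562
bmi = rng.normal(29, 5, n)
map_pre = 95 - 0.4 * (bmi - 29) + rng.normal(0, 8, n)            # mediator
logit = -1.0 + 0.03 * (bmi - 29) - 0.02 * (map_pre - 95)
dgf = rng.binomial(1, 1 / (1 + np.exp(-logit)))
df = pd.DataFrame({"bmi": bmi, "map_pre": map_pre, "dgf": dgf})

# Path a: does BMI predict the mediator (preoperative MAP)?
path_a = smf.ols("map_pre ~ bmi", data=df).fit()

# Difference method: total effect vs direct effect of BMI on DGF
total = smf.logit("dgf ~ bmi", data=df).fit(disp=0).params["bmi"]
direct = smf.logit("dgf ~ bmi + map_pre", data=df).fit(disp=0).params["bmi"]
print(f"Approximate proportion of BMI effect mediated by MAP: {(total - direct) / total:.0%}")
```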
9.
Kidney360 ; 3(3): 426-434, 2022 03 31.
Article in English | MEDLINE | ID: mdl-35582179

ABSTRACT

Background: Investigations of health-related quality of life (HRQoL) in AKI have been limited in number, size, and domains assessed. We surveyed AKI survivors to describe the range of HRQoL AKI-related experiences and examined potential differences in AKI effects by sex and age at the AKI episode. Methods: AKI survivors among American Association of Kidney Patients members completed an anonymous online survey in September 2020. We assessed: (1) sociodemographic characteristics; (2) effects of AKI (physical, emotional, and social); and (3) perceptions about interactions with health care providers, using quantitative and qualitative items. Results: Respondents were 124 adult AKI survivors. Eighty-four percent reported that the AKI episode was very/extremely impactful on physical/emotional health. Fifty-seven percent reported being very/extremely concerned about AKI effects on work, and 67% were concerned about AKI effects on family. Only 52% of respondents rated medical team communication as very/extremely good. Individuals aged 22-65 years at the AKI episode were more likely than younger/older counterparts to rate the AKI episode as highly impactful overall (90% versus 63% of younger and 75% of older individuals; P=0.04), more impactful on family (78% versus 50% and 46%; P=0.008), and more impactful on work (74% versus 38% and 10%; P<0.001). Limitations of this work include convenience sampling, retrospective data collection, and unknown AKI severity. Conclusions: These findings are a critical step forward in understanding the range of AKI experiences/consequences. Future research should incorporate more comprehensive HRQoL measures, and health care professionals should consider providing more information in their patient communication about AKI and follow-up.


Subject(s)
Acute Kidney Injury/psychology , Patient Reported Outcome Measures , Quality of Life , Survivors/psychology , Acute Kidney Injury/epidemiology , Adult , Age Factors , Aged , Health Impact Assessment , Humans , Middle Aged , Quality of Life/psychology , Retrospective Studies , Sex Factors , United States/epidemiology , Young Adult
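The age-group comparisons above are the kind of proportion comparison typically tested with a chi-square test on a contingency table. A minimal scipy sketch; the cell counts below are illustrative, not the survey's data.

```python
from scipy.stats import chi2_contingency

# Rows: rated the AKI episode "very/extremely impactful" yes/no
# Columns: age <22, 22-65, >65 years at AKI episode (illustrative counts)
table = [
    [5, 63, 18],   # highly impactful
    [3, 7, 6],     # not highly impactful
]
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.3f}")
```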
10.
Transpl Int ; 35: 10094, 2022.
Article in English | MEDLINE | ID: mdl-35368641

ABSTRACT

Anti-HLA donor specific antibody (DSA) detection post kidney transplant has been associated with adverse outcomes, though the impact of early DSA screening on stable patients remains unclear. We analyzed the impact of DSA detected through screening of stable patients in the 1st year (n = 736) on subsequent estimated glomerular filtration rate (eGFR), death censored graft survival (DCGS), and graft failure (graft loss including return to dialysis or re-transplant, patient death, or eGFR < 20 ml/min at last follow up). Patients were grouped using 1st year screening into DSA+ (Class I, II; n = 131) or DSA- (n = 605). The DSA+ group had more DR mismatch (p = 0.02), was more sensitized (cPRA ≥90%, p = 0.002), was less often Caucasian (p = 0.04), and had fewer pre-emptive (p = 0.04) and more deceased donor transplants (p = 0.03). DSA+ patients had similar eGFR (54.8 vs. 53.8 ml/min/1.73 m2, p = 0.56), DCGS (91% vs. 94%, p = 0.30), and graft failure free survival (76% vs. 82%, p = 0.11). DSA timing and type did not impact survival. Among those with a protocol biopsy (n = 515), DSA detected on 1st year screening was a predictor of graft failure on multivariate analysis (HR 1.91, 95% CI 1.03-3.55, p = 0.04). Overall, early DSA detection in stable patients was an independent risk factor for graft failure, though only among those who underwent a protocol biopsy.


Subject(s)
Kidney Transplantation , Graft Rejection , HLA Antigens , Humans , Kidney Transplantation/adverse effects , Tissue Donors , Transplant Recipients
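The multivariate hazard ratio reported above is the kind of estimate produced by a Cox proportional hazards model. A minimal sketch with the lifelines package on simulated follow-up data; the covariates are placeholders, not the study's adjustment set, and the simulated effect sizes are arbitrary.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

# Simulated time-to-graft-failure data with administrative censoring (illustrative only)
rng = np.random.default_rng(3)
n = 515
dsa_pos = rng.binomial(1, 0.25, n)
deceased_donor = rng.binomial(1, 0.6, n)
hazard = 0.01 * np.exp(0.65 * dsa_pos + 0.2 * deceased_donor)
time = rng.exponential(1 / hazard)
event = (time < 84).astype(int)               # censor at 84 months
time = np.minimum(time, 84)

df = pd.DataFrame({"time": time, "event": event,
                   "dsa_pos": dsa_pos, "deceased_donor": deceased_donor})
cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="event")
cph.print_summary()  # exp(coef) column gives the hazard ratio for each covariate
```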
11.
Transplant Direct ; 8(1): e1256, 2022 Jan.
Article in English | MEDLINE | ID: mdl-34912945

ABSTRACT

Barriers to medication adherence may differ from barriers in other domains of adherence. In this study, we assessed the association of pre-kidney transplantation (KT) factors with nonadherent behaviors in 3 different domains post-KT. METHODS: We conducted a prospective cohort study with patient interviews at the initial KT evaluation (baseline nonadherence predictors spanning sociodemographic, condition-related, health system, and patient-related psychosocial factors) and at ≈6 mo post-KT (adherence outcomes: medications, healthcare follow-up, and lifestyle behavior). All patients who underwent KT at our institution and completed the ≈6-mo follow-up interview were included in the study. We assessed nonadherence in 3 different domains using continuous composite measures derived from the Health Habit Survey. We built multiple linear and logistic regression models, adjusting for baseline characteristics, to predict adherence outcomes. RESULTS: We included 173 participants. Black race (mean difference in adherence score: -0.72; 95% confidence interval [CI], -1.12 to -0.32) and higher income (mean difference: -0.34; 95% CI, -0.67 to -0.02) predicted lower medication adherence. Experience of racial discrimination predicted lower adherence (odds ratio, 0.31; 95% CI, 0.12-0.76) and an internal locus of control predicted better adherence (odds ratio, 1.46; 95% CI, 1.06-2.03) to healthcare follow-up. In the lifestyle domain, higher education (mean difference: 0.75; 95% CI, 0.21-1.29) and lower body mass index (mean difference: -0.08; 95% CI, -0.13 to -0.03) predicted better adherence to dietary recommendations, but no risk factors predicted exercise adherence. CONCLUSIONS: Different nonadherence behaviors may stem from different motivations and risk factors (eg, clinic nonattendance due to experiencing racial discrimination). Thus, adherence interventions should be individualized to target at-risk populations (eg, bias-reduction training for medical staff to improve patient adherence to clinic visits).
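The study reports mean differences from linear models (for the continuous adherence composites) and odds ratios from logistic models (for binary outcomes). A minimal statsmodels sketch of both model types on simulated data; the predictors and column names are placeholders, not the Health Habit Survey variables.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated baseline predictors and 6-month adherence outcomes (illustrative only)
rng = np.random.default_rng(4)
n = 173
df = pd.DataFrame({
    "age": rng.normal(52, 13, n),
    "internal_locus": rng.binomial(1, 0.5, n),
    "med_adherence_score": rng.normal(0, 1, n),      # continuous composite score
    "followup_adherent": rng.binomial(1, 0.7, n),    # binary healthcare follow-up adherence
})

# Linear model: coefficients are adjusted mean differences in the adherence score
linear = smf.ols("med_adherence_score ~ age + internal_locus", data=df).fit()
print(linear.params)

# Logistic model: exponentiated coefficients are adjusted odds ratios
logistic = smf.logit("followup_adherent ~ age + internal_locus", data=df).fit(disp=0)
print(np.exp(logistic.params))
```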

12.
PLoS One ; 16(8): e0254115, 2021.
Article in English | MEDLINE | ID: mdl-34437548

ABSTRACT

Due to the donor shortage, kidney transplants (KTs) from donors with acute kidney injury (AKI) are expanding. Although previous studies comparing clinical outcomes between AKI and non-AKI donors in KTs have shown comparable results, data from high-volume analyses of KT outcomes with AKI donors are limited. This study aimed to analyze the selection trends of AKI donors and investigate the impact of AKI on graft failure using United States cohort data. We analyzed a total of 52,757 KTs collected in the Scientific Registry of Transplant Recipients (SRTR) from 2010 to 2015. The sample included 4,962 (9.4%) KTs with AKI donors (creatinine ≥ 2 mg/dL). Clinical characteristics of AKI and non-AKI donors were analyzed and outcomes of both groups were compared. We also analyzed risk factors for graft failure in AKI donor KTs. Although the incidence of delayed graft function was higher in recipients of AKI donors compared to non-AKI donors, graft and patient survival were not significantly different between the two groups. We found that donor hypertension, cold ischemia time, African American donor race, and high KDPI were risk factors for graft failure in AKI donor KTs. KTs from deceased donors with AKI showed comparable outcomes. Thus, donors with AKI should be considered more actively to expand the donor pool. Caution is still needed when donors have additional risk factors for graft failure.


Subject(s)
Acute Kidney Injury , Donor Selection , Graft Rejection/mortality , Kidney Transplantation/mortality , Registries , Acute Kidney Injury/mortality , Acute Kidney Injury/surgery , Adult , Female , Humans , Male , Retrospective Studies , Risk Management
13.
Kidney Int ; 100(3): 660-671, 2021 09.
Article in English | MEDLINE | ID: mdl-33940109

ABSTRACT

For assessing human leukocyte antigen compatibility in deceased donor kidney transplantation, virtual crossmatch is used as an alternative to physical crossmatch and has the potential to reduce cold ischemia time. The 2014 United States kidney allocation system prioritized highly sensitized candidates but led to increased shipping of kidneys. Using data from the Scientific Registry of Transplant Recipients, we evaluated changes in virtual crossmatch use with the new allocation policy and the impact of virtual crossmatch use on cold ischemia time and transplant outcomes. This was a retrospective cohort study of adult deceased donor kidney recipients in the United States (2011-2018) transplanted using either a virtual (n = 9,632) or a physical (n = 71,839) crossmatch. Before the allocation change, only 9% of transplants were performed relying on a virtual crossmatch. After the 2014 allocation change, this increased by 2.4%/year, so that 18% of transplants in 2018 were performed with just a virtual crossmatch. There was significant variation in virtual crossmatch use among transplant regions (range 0.7-36%), and higher use was noted among large volume centers. Compared to physical crossmatches, virtual crossmatches were significantly associated with shorter cold ischemia times (mean 15.0 vs 16.5 hours) and similar death-censored graft loss and mortality (both hazard ratios 0.99) at a median follow-up of 2.9 years. Thus, our results show that virtual crossmatch is an attractive strategy for shortening cold ischemia time without negatively impacting transplant outcomes. Hence, strategies to optimize use and reduce practice variation may allow for maximizing benefits from virtual crossmatch.


Subject(s)
Cold Ischemia , Kidney Transplantation , Adult , Graft Survival , Histocompatibility Testing , Humans , Kidney , Kidney Transplantation/adverse effects , Retrospective Studies , Tissue Donors , United States
14.
Am J Kidney Dis ; 77(6): 833-856, 2021 06.
Article in English | MEDLINE | ID: mdl-33745779

ABSTRACT

Evaluation of patients for kidney transplant candidacy is a comprehensive process that involves a detailed assessment of medical and surgical issues, psychosocial factors, and patients' physical and cognitive abilities, with an aim of balancing the benefits of transplantation against the potential risks of surgery and long-term immunosuppression. There is considerable variability among transplant centers in their approach to evaluation and decision-making regarding transplant candidacy. The 2020 KDIGO (Kidney Disease: Improving Global Outcomes) clinical practice guideline on the evaluation and management of candidates for kidney transplantation provides practice recommendations that can serve as a useful reference guide to transplant professionals. The guideline, covering a broad range of topics, was developed by an international group of experts from transplantation and nephrology through a review of the literature published through May 2019. A work group of US transplant nephrologists convened by NKF-KDOQI (National Kidney Foundation-Kidney Disease Outcomes Quality Initiative) chose key topics for this commentary with a goal of presenting a broad discussion to the US transplant community. Each section of this article has a summary of the key KDIGO guideline recommendations, followed by a brief commentary on the recommendations, their clinical utility, and potential implementation challenges. The KDOQI work group agrees broadly with the KDIGO recommendations but also recognizes and highlights the decision-making challenges that arise from the lack of high-quality evidence and the need to balance equity with utility of organ transplantation.


Subject(s)
Kidney Transplantation , Patient Selection , Practice Guidelines as Topic , Renal Insufficiency, Chronic/surgery , Humans
15.
Clin Transplant ; 35(5): e14259, 2021 05.
Article in English | MEDLINE | ID: mdl-33605490

ABSTRACT

Kidney transplant recipients with high-risk cytomegalovirus (CMV) serostatus (seropositive donor to seronegative recipient) are at risk for late-onset CMV after cessation of antiviral prophylaxis. We report findings from a strategy of bimonthly (every 2 weeks) CMV screening for late-onset CMV. This is a single-center retrospective cohort study of 70 high-risk CMV kidney transplant recipients transplanted between June 2016 and September 2018. Patients were monitored at 6-12 months post-transplantation for late-onset CMV using bimonthly CMV nucleic acid testing (NAT). Adherence to screening and its correlation with CMV-related hospitalizations were assessed. Failure to prevent CMV-related hospitalization was classified into three categories (non-adherence to CMV testing, rapid CMV progression, and health system failure). Twenty-one (30%) patients developed CMV DNAemia, of whom 10 (14%) required hospitalization. Reasons for CMV-related hospitalization despite screening were (i) screening non-adherence (50%), (ii) rapid progression (40%), and (iii) health system failure (10%). Adherence to screening was associated with lower viral counts at diagnosis (r = -.44, p = .049) and a trend towards lower risk of CMV-related hospitalization (OR: 0.97 per 1% increase in adherence; 95% CI: 0.94-1.00; p = .06). Bimonthly monitoring for late-onset CMV allows for early CMV detection and may lower CMV-related hospitalization.


Subject(s)
Cytomegalovirus Infections , Kidney Transplantation , Antiviral Agents/therapeutic use , Cytomegalovirus , Cytomegalovirus Infections/drug therapy , Humans , Retrospective Studies , Transplant Recipients
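The CMV monitoring study above reports a correlation between screening adherence and viral load at detection (r = -.44) and an odds ratio per 1% increase in adherence. A minimal sketch of both computations on simulated data; the cohort size matches the abstract but all values, variable names, and effect sizes are illustrative.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from scipy.stats import pearsonr

# Simulated cohort of high-risk recipients (illustrative values only)
rng = np.random.default_rng(5)
n = 70
adherence_pct = rng.uniform(40, 100, n)          # % of scheduled bimonthly CMV NAT tests completed
log10_viral_load = 4.5 - 0.02 * adherence_pct + rng.normal(0, 0.5, n)
p_hosp = 1 / (1 + np.exp(0.5 + 0.03 * (adherence_pct - 70)))
hospitalized = rng.binomial(1, p_hosp)

# Correlation between screening adherence and viral load at detection
r, p = pearsonr(adherence_pct, log10_viral_load)
print(f"r = {r:.2f}, p = {p:.3f}")

# Odds ratio for CMV-related hospitalization per 1% increase in adherence
df = pd.DataFrame({"adherence_pct": adherence_pct, "hospitalized": hospitalized})
fit = smf.logit("hospitalized ~ adherence_pct", data=df).fit(disp=0)
print(f"OR per 1% adherence: {np.exp(fit.params['adherence_pct']):.3f}")
```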
17.
Am J Transplant ; 21(1): 186-197, 2021 01.
Article in English | MEDLINE | ID: mdl-32558153

ABSTRACT

Subclinical rejection (SCR) screening in kidney transplantation (KT) using protocol biopsies and noninvasive biomarkers has not been evaluated from an economic perspective. We assessed cost-effectiveness from the health sector perspective of SCR screening in the first year after KT using a Markov model that compared no screening with screening using protocol biopsy or biomarker at 3 months, 12 months, 3 and 12 months, or 3, 6, and 12 months. We used 12% subclinical cellular rejection and 3% subclinical antibody-mediated rejection (SC-ABMR) for the base-case cohort. Results favored 1-time screening at peak SCR incidence rather than repeated screening. Screening 2 or 3 times was favored only with age <35 years and with high SC-ABMR incidence. Compared to biomarkers, protocol biopsy yielded more quality-adjusted life years (QALYs) at lower cost. A 12-month biopsy cost $13,318/QALY for the base-case cohort. Screening for cellular rejection in the absence of SC-ABMR was less cost-effective, with the 12-month biopsy costing $46,370/QALY. Screening was less cost-effective in patients >60 years. Using a biomarker twice or thrice was cost-effective only if the biomarker cost was <$700. In conclusion, in KT, screening for SCR more than once during the first year is not economically reasonable. Screening with protocol biopsy was favored over biomarkers.


Subject(s)
Kidney Transplantation , Adult , Antibodies , Biomarkers , Biopsy , Graft Rejection/diagnosis , Graft Rejection/etiology , Humans
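The analysis above uses a Markov model and reports results as incremental cost per QALY. A minimal cohort-simulation sketch of that kind of calculation; the states, transition probabilities, costs, utilities, and the assumed effect of screening are all placeholders, not the study's inputs.

```python
import numpy as np

# Toy 3-state monthly Markov model: functioning graft, graft loss (dialysis), death
P = np.array([
    [0.995, 0.004, 0.001],   # from functioning graft
    [0.000, 0.985, 0.015],   # from dialysis
    [0.000, 0.000, 1.000],   # death is absorbing
])
monthly_cost = np.array([1500.0, 7000.0, 0.0])        # per state, USD (illustrative)
monthly_qaly = np.array([0.85, 0.60, 0.0]) / 12.0     # annual utilities converted to per-month QALYs

def run_cohort(p_matrix, horizon_months=120, discount_annual=0.03):
    dist = np.array([1.0, 0.0, 0.0])                  # everyone starts with a functioning graft
    cost = qaly = 0.0
    for m in range(horizon_months):
        d = (1 + discount_annual) ** (-(m / 12.0))    # discounting
        cost += d * dist @ monthly_cost
        qaly += d * dist @ monthly_qaly
        dist = dist @ p_matrix
    return cost, qaly

# "Screening" strategy assumed to slightly reduce graft loss, at an added per-month cost
P_screen = P.copy()
P_screen[0] = [0.996, 0.003, 0.001]
cost0, qaly0 = run_cohort(P)
cost1, qaly1 = run_cohort(P_screen)
cost1 += 120 * 50.0                                   # added screening cost over the horizon (illustrative)
print(f"ICER: ${(cost1 - cost0) / (qaly1 - qaly0):,.0f} per QALY")
```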
19.
Crit Care Med ; 48(2): e87-e97, 2020 02.
Article in English | MEDLINE | ID: mdl-31939807

ABSTRACT

OBJECTIVES: To assess the attitudes of practitioners with respect to net ultrafiltration prescription and practice among critically ill patients with acute kidney injury treated with renal replacement therapy. DESIGN: Multinational internet-assisted survey. SETTING: Critical care practitioners involved with 14 societies in 80 countries. MEASUREMENTS AND MAIN RESULTS: Of 2,567 practitioners who initiated the survey, 1,569 (61.1%) completed it. Most practitioners were intensivists (72.7%) with a median duration of 13.2 years of practice (interquartile range, 7.2-22.0 yr). Two-thirds of practitioners (71.0%; regional range, 55.0-95.5%) reported using continuous renal replacement therapy with a median net ultrafiltration rate prescription of 80.0 mL/hr (interquartile range, 49.0-111.0 mL/hr) for hemodynamically unstable patients and a maximal rate of 299.0 mL/hr (interquartile range, 200.0-365.0 mL/hr) for hemodynamically stable patients, with regional variation. Only a third of practitioners (31.5%; range, 13.7-47.8%) assessed hourly net fluid balance during continuous renal replacement therapy. Hemodynamic instability was reported in 20% (range, 20-38%) of patients, in response to which practitioners decreased the rate of fluid removal (70.3%); started or increased the dose of a vasopressor (51.5%); completely stopped fluid removal (35.8%); or administered a fluid bolus (31.6%), with significant regional variation. Compared with physicians, nurses were more likely to report patient intolerance of net ultrafiltration (73.4% vs 81.3%; p = 0.002), frequent interruptions (40.4% vs 54.5%; p < 0.001), and unavailability of trained staff (11.9% vs 15.6%; p = 0.04), whereas physicians more often reported unavailability of dialysis machines (14.3% vs 6.1%; p < 0.001) and costs associated with treatment (12.1% vs 3.0%; p < 0.001) as barriers, with significant regional variation. CONCLUSIONS: Our study provides new knowledge about the presence and extent of international practice variation in net ultrafiltration. We also identified barriers and specific targets for quality improvement initiatives. Our data reflect the need for evidence-based practice guidelines for net ultrafiltration.


Subject(s)
Acute Kidney Injury/therapy , Continuous Renal Replacement Therapy/methods , Critical Care/methods , Critical Illness/therapy , Personnel, Hospital/statistics & numerical data , Continuous Renal Replacement Therapy/adverse effects , Humans , Ultrafiltration
20.
J Orthop Case Rep ; 10(7): 34-38, 2020 Oct.
Article in English | MEDLINE | ID: mdl-33585313

ABSTRACT

INTRODUCTION: Traumatic spine injury is one of the leading causes of morbidity and mortality in trauma patients. Open surgical procedures are associated with increased blood loss, surgical trauma, and a longer recovery period. The goal of minimally invasive surgery (MIS) is to minimize the iatrogenic trauma caused by open surgery. CASE REPORT: A 39-year-old female patient presented to us with complaints of severe back pain following a fall from a height of ten feet 1 day earlier. She was diagnosed with an L1 burst fracture and was managed by indirect fracture reduction and posterior instrumented stabilization from D12 to L2 by MIS. She presented again with complaints of back pain 3 months after the index surgery. Neurology was intact, and ESR and quantitative CRP were normal. X-ray showed downward and outward displacement of the left connecting rod, with the pedicle screws in situ. CONCLUSION: Minimally invasive spine surgery is associated with a steep learning curve and technical challenges. Mechanical complications associated with implants should always be kept in mind while planning the surgery.
