1.
Clin Res Cardiol ; 2024 Apr 08.
Article in English | MEDLINE | ID: mdl-38587564

ABSTRACT

BACKGROUND AND AIMS: Candidate selection for lung transplantation (LuTx) is pivotal to ensure individual patient benefit as well as optimal donor organ allocation. The impact of coronary artery disease (CAD) on post-transplant outcomes remains controversial. We provide comprehensive data on the relevance of CAD for short- and long-term outcomes following LuTx and identify risk factors for mortality. METHODS: We retrospectively analyzed all adult patients (≥ 18 years) undergoing primary and isolated LuTx between January 2000 and August 2021 at the LMU University Hospital transplant center. Using 1:1 propensity score matching, 98 corresponding pairs of LuTx patients with and without relevant CAD were identified. RESULTS: Among 1,003 patients having undergone LuTx, 104 (10.4%) had relevant CAD at baseline. There were no significant differences in in-hospital mortality (8.2% vs. 8.2%, p > 0.999) or in overall survival (HR 0.90, 95%CI [0.61, 1.32], p = 0.800) between matched CAD and non-CAD patients. Similarly, cardiovascular events such as myocardial infarction (7.1% CAD vs. 2.0% non-CAD, p = 0.170), revascularization by percutaneous coronary intervention (5.1% vs. 1.0%, p = 0.212), and stroke (2.0% vs. 6.1%, p = 0.279) did not differ statistically between the matched groups. Death from cardiovascular causes occurred in 7.1% of the CAD group and 2.0% of the non-CAD group (p = 0.078). Cox regression analysis identified age at transplantation (HR 1.02, 95%CI [1.01, 1.04], p < 0.001), elevated bilirubin (HR 1.33, 95%CI [1.15, 1.54], p < 0.001), obstructive lung disease (HR 1.43, 95%CI [1.01, 2.02], p = 0.041), decreased forced vital capacity (HR 0.99, 95%CI [0.99, 1.00], p = 0.042), necessity of reoperation (HR 3.51, 95%CI [2.97, 4.14], p < 0.001) and earlier transplantation date (HR 0.97, 95%CI [0.95, 0.99], p = 0.001) as risk factors for all-cause mortality, but not relevant CAD (HR 0.96, 95%CI [0.71, 1.29], p = 0.788). Double lung transplantation was associated with lower all-cause mortality (HR 0.65, 95%CI [0.52, 0.80], p < 0.001), but higher in-hospital mortality (OR 2.04, 95%CI [1.04, 4.01], p = 0.039). CONCLUSION: In this cohort, relevant CAD was not associated with worse outcomes and should therefore not be considered a contraindication for LuTx. Nonetheless, cardiovascular events in CAD patients highlight the necessity of controlling cardiovascular risk factors and of structured cardiac follow-up.
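
For illustration, a minimal sketch of the 1:1 propensity-score matching step described above, using synthetic data and invented covariates (not the study's actual variables or code):

```python
# Hypothetical 1:1 nearest-neighbour propensity-score matching (greedy, without
# replacement). Synthetic data; covariates and sizes are illustrative only.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 1000
X = np.column_stack([rng.normal(55, 12, n),   # age (assumed covariate)
                     rng.normal(25, 4, n)])   # BMI (assumed covariate)
cad = rng.binomial(1, 0.1, n)                 # ~10% with relevant CAD, as in the cohort

# Propensity score: modelled probability of CAD given baseline covariates.
ps = LogisticRegression().fit(X, cad).predict_proba(X)[:, 1]

# Greedy nearest-neighbour matching on the propensity score.
controls = list(np.where(cad == 0)[0])
pairs = []
for t in np.where(cad == 1)[0]:
    j = min(controls, key=lambda c: abs(ps[c] - ps[t]))
    pairs.append((t, j))
    controls.remove(j)
print(f"{len(pairs)} matched pairs")
```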

2.
Transpl Int ; 36: 11506, 2023.
Article in English | MEDLINE | ID: mdl-37799668

ABSTRACT

Prolonged mechanical ventilation (PMV) after lung transplantation poses several risks, including higher tracheostomy rates and increased in-hospital mortality. Mechanical power (MP) of artificial ventilation unifies the ventilatory variables that determine gas exchange and may be related to allograft function following transplant, affecting ventilator weaning. We retrospectively analyzed consecutive double lung transplant recipients at a national transplant center, ventilated through endotracheal tubes upon ICU admission, excluding those receiving extracorporeal support. MP and derived indexes assessed up to 36 h after transplant were correlated with invasive ventilation duration using Spearman's coefficient, and we conducted receiver operating characteristic (ROC) curve analysis to evaluate the accuracy in predicting PMV (>72 h), expressed as area under the ROC curve (AUROC). PMV occurred in 82 (35%) out of 237 cases. MP was significantly correlated with invasive ventilation duration (Spearman's ρ = 0.252 [95% CI 0.129-0.369], p < 0.01), with power density (MP normalized to lung-thorax compliance) demonstrating the strongest correlation (ρ = 0.452 [0.345-0.548], p < 0.01) and enhancing PMV prediction (AUROC 0.78 [95% CI 0.72-0.83], p < 0.01) compared to MP (AUROC 0.66 [0.60-0.72], p < 0.01). Mechanical power density may help identify patients at risk for PMV after double lung transplantation.
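
As a rough illustration of the normalization and discrimination analysis described above, a sketch with synthetic data (units, variable names, and effect sizes are assumptions, not the study's data):

```python
# Sketch: mechanical power (MP) normalized to lung-thorax compliance
# ("power density") and its discrimination for prolonged ventilation (>72 h).
import numpy as np
from scipy.stats import spearmanr
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
n = 237
compliance = rng.normal(45, 10, n).clip(15)   # mL/cmH2O (assumed)
mp = rng.normal(15, 4, n).clip(3)             # J/min (assumed)
power_density = mp / compliance               # MP per unit compliance

# Synthetic ventilation times that depend on power density plus noise.
vent_hours = 200 * power_density + rng.normal(0, 20, n)
pmv = (vent_hours > 72).astype(int)           # prolonged mechanical ventilation

rho, p = spearmanr(power_density, vent_hours)
print(f"Spearman rho = {rho:.2f} (p = {p:.1e})")
print(f"AUROC for PMV = {roc_auc_score(pmv, power_density):.2f}")
```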


Subject(s)
Lung Transplantation , Respiration, Artificial , Humans , Retrospective Studies , Time Factors , Ventilator Weaning , Lung
3.
Blood Purif ; 52(11-12): 849-856, 2023.
Article in English | MEDLINE | ID: mdl-37820591

ABSTRACT

INTRODUCTION: Hyperbilirubinemia is often the first evidence of a liver disorder, and over one-third of all patients in intensive care units (ICU) show elevated bilirubin concentrations. In critically ill patients, high serum bilirubin concentrations are correlated with a poor outcome. Therapies to lower bilirubin concentrations are often merely symptomatic, and their effect on patient outcome has hardly been evaluated. Therefore, this study investigated whether the extracorporeal elimination of bilirubin with the cytokine adsorber CytoSorb® (CS) reduces mortality in patients with hyperbilirubinemia. METHODS: Patients with bilirubin concentrations >10 mg/dL in the ICU were screened for evaluation from 2018 to 2020. Patients older than 18 years receiving kidney replacement therapy were included. Patients with continuously decreasing bilirubin concentrations after liver transplantation or treatment with other liver support systems (i.e., Molecular Adsorbents Recirculating System [MARS®], Advanced Organ Support [ADVOS]) were excluded. CS therapy was used in clinical routine and was indicated by the treating physicians. Statistical analysis was performed with IBM SPSS Statistics utilizing a multivariate model. The primary outcome measure was the effect of CS on 30-day mortality. RESULTS: Data from 82 patients (mean Simplified Acute Physiology Score [SAPS] II: 74 points, mean bilirubin: 18 mg/dL, mean lactate: 3.7 mmol/L) were analyzed. There were no significant differences between patients with and without CS treatment. The multivariate model showed no significant effect of CS therapy (p = 0.402) on 30-day mortality. Likewise, no significant effect of bilirubin concentration (p = 0.274) or Model for End-Stage Liver Disease score (p = 0.928) on 30-day mortality could be shown. In contrast, lactate concentration (p = 0.001, b = 0.044) and SAPS II (p = 0.025, b = 0.008) had a significant impact on 30-day mortality. CONCLUSION: The use of CS in patients with hyperbilirubinemia did not result in a significant reduction in 30-day mortality. Randomized controlled studies with mortality as the primary outcome measure are needed to justify its use.


Subject(s)
Bilirubin , End Stage Liver Disease , Humans , Critical Illness/therapy , Cytokines , Severity of Illness Index , Hyperbilirubinemia/therapy , Lactates , Retrospective Studies
4.
Ren Fail ; 45(2): 2259231, 2023.
Article in English | MEDLINE | ID: mdl-37728069

ABSTRACT

Severe rhabdomyolysis frequently results in acute kidney injury (AKI) due to myoglobin accumulation, necessitating kidney replacement therapy (KRT). The present study investigated whether the application of CytoSorb® (CS) led to an increased rate of kidney recovery in patients receiving KRT due to severe rhabdomyolysis. Adult patients with a myoglobin concentration >10,000 ng/ml and KRT were included from 2014 to 2021. Exclusion criteria were chronic kidney disease and CS treatment before study inclusion. Groups 1 and 2 were defined as KRT with and without CS, respectively. The primary outcome parameter was independence from KRT after 30 days. Propensity score (PS) matching was performed (predictors: myoglobin, SAPS-II, and age), and the chi-squared test was used. Thirty-five pairs were matched (mean age: 57 vs. 56 years; mean myoglobin: 27,218 vs. 26,872 ng/ml; mean SAPS-II: 77 vs. 76). The probability of kidney recovery was significantly (p = .04) higher in group 1 (31.4 vs. 11.4%, mean difference: 20.0%, odds ratio (OR): 3.6). Considering only patients who survived 30 days, kidney recovery was also significantly (p = .03) higher in patients treated with CS (61.1 vs. 23.5%, mean difference: 37.6%, OR: 5.1). In conclusion, the use of CS might positively affect renal recovery in patients with severe rhabdomyolysis. A prospective randomized controlled trial is needed to confirm this hypothesis.
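
The reported odds ratio follows directly from the matched-group proportions; a quick arithmetic check (counts reconstructed from the percentages and n = 35 per group):

```python
# Reconstruct the odds ratio for kidney recovery from the matched groups.
n = 35
rec_cs = round(0.314 * n)      # 11 recoveries with CytoSorb
rec_ctrl = round(0.114 * n)    # 4 recoveries without
odds_ratio = (rec_cs / (n - rec_cs)) / (rec_ctrl / (n - rec_ctrl))
print(f"OR = {odds_ratio:.1f}")  # ~3.6, matching the abstract
```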


Subject(s)
Critical Illness , Rhabdomyolysis , Adult , Humans , Middle Aged , Propensity Score , Critical Illness/therapy , Myoglobin , Prospective Studies , Kidney , Rhabdomyolysis/complications
5.
J Clin Med ; 12(15)2023 Jul 29.
Article in English | MEDLINE | ID: mdl-37568398

ABSTRACT

Immunosuppressants and antifibrotics are currently used to treat patients with various interstitial lung diseases, who may ultimately undergo lung transplantation (LTx). This retrospective study aimed to evaluate the potential effects of the therapeutic regimen on the perioperative course in patients with idiopathic pulmonary fibrosis (IPF) or progressive pulmonary fibrosis (PPF) undergoing LTx. All patients with IPF and PPF undergoing LTx between January 2014 and December 2021 were included. We retrospectively screened for previous use of immunosuppressants and antifibrotic therapy and analyzed perioperative courses, short-term outcomes, and safety. In total, 286 patients with a diagnosis of IPF or PPF were analyzed. According to the treatment regimen before LTx, the study cohort was divided into four groups and compared. No differences in postoperative complications were observed between antifibrotic monotherapy and combined antifibrotic and immunosuppressive therapy. The duration of mechanical ventilation was shorter in patients receiving antifibrotics prior to LTx. Pretreatment with antifibrotic monotherapy or a combination of antifibrotic drugs with immunosuppressive therapy, lower body mass index (BMI), and lower blood loss were independently associated with primary graft dysfunction grades 0-3 at 72 hours after LTx (p < 0.001). Finally, patients with antifibrotic monotherapy developed significantly fewer de novo donor-specific antibodies (DSA) (p = 0.009). Higher intraoperative blood loss, etiology of interstitial lung disease (ILD), and older age were independently associated with shorter survival after LTx. Use of antifibrotic monotherapy or a combination of antifibrotic drugs with immunosuppressive therapy in IPF/PPF patients undergoing LTx proved to be safe and might lead to beneficial effects after LTx.

6.
Am J Infect Control ; 51(10): 1167-1171, 2023 10.
Article in English | MEDLINE | ID: mdl-37044262

ABSTRACT

BACKGROUND: Hand disinfection (HD) is known to be the single most effective prevention measure against nosocomial infections, but the compliance rate (CR) remains low. The aim of this study was to determine the incidence of HD opportunities and the CR during the treatment of critically ill patients. One special focus was on glove usage, to determine whether gloves were substituted for HD. METHODS: This is a single-blinded direct observation of employees of an intensive care unit. One specially trained observer recorded all hand hygiene indications, performed HDs, and glove use behavior over a period of 21 eight-hour shifts. RESULTS: Over a period of 168 hours, 2,036 HDs should have been performed during the care of one intensive care unit patient. In total, only 690 HDs occurred, resulting in a CR of 33.9%. For the nurses, there was an HD opportunity every 6 minutes on average, around the clock. About 17% of the total working time would be required for 100% correct hand hygiene. Donning or changing of gloves took place in 38.2% of all indications for HD. CONCLUSIONS: Our results show that HD opportunities occur at high frequency during the treatment of critically ill patients. Compliance with HD remains too low, even if a 100% CR seems unachievable. Improvements should focus on aseptic procedures, which combine the lowest CR with the highest procedural risk for the patient. Healthcare personnel (HCP) use gloves when an HD opportunity occurs. Implementing glove disinfection strategies in daily routine might help optimize patient care.
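
The overall compliance rate follows directly from the observed counts:

```python
# Compliance rate (CR) = performed hand disinfections / indicated opportunities.
opportunities, performed = 2036, 690
print(f"CR = {performed / opportunities:.1%}")  # 33.9%
```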


Subject(s)
Cross Infection , Hand Hygiene , Humans , Critical Illness , Cross Infection/prevention & control , Guideline Adherence , Hand Disinfection/methods , Hand Hygiene/methods , Infection Control/methods , Intensive Care Units
7.
HLA ; 102(3): 331-342, 2023 09.
Article in English | MEDLINE | ID: mdl-37068792

ABSTRACT

Molecular matching is a new approach for virtual histocompatibility testing in organ transplantation. The aim of our study was to analyze whether the risk for de novo donor-specific HLA antibodies (dnDSA) after lung transplantation (LTX) can be predicted by molecular matching algorithms (MMA) and their combination. In this retrospective study we included 183 patients undergoing LTX at our center from 2012-2020. We monitored dnDSA development for 1 year. Eplet mismatches (epMM) were calculated using HLAMatchmaker, and highly immunogenic eplets were identified based on their ElliPro scores. PIRCHE-II scores were calculated using the PIRCHE-II algorithm (5- and 11-loci). We compared epMM and PIRCHE-II scores between patients with and without dnDSA using the t-test and used ROC curves to determine optimal cut-off values to categorize patients into four groups. We used logistic regression with AIC to compare the predictive value of PIRCHE-II, epMM, and their combination. In total, 28.4% of patients developed dnDSA (n = 52): 12.5% class I dnDSA (n = 23), 24.6% class II dnDSA (n = 45), and 8.7% both class I and class II dnDSA (n = 16). Mean epMMs (p = 0.005), mean highly immunogenic epMMs (p = 0.003), and PIRCHE-II (11-loci) scores (p = 0.01) were higher in patients with class II dnDSA than in those without. Patients with highly immunogenic epMMs above 30.5 and PIRCHE-II 11-loci scores above 560.0 were more likely to develop dnDSA (31.1% vs. 14.8%, p = 0.03). The logistic regression model including the grouping variable showed the best predictive value. MMA can support clinicians in identifying patients at higher or lower risk of developing class II dnDSA and might be a helpful tool for immunological risk assessment in LTX patients.
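
A minimal sketch of the four-group categorization by the two reported cut-offs (cut-off values from the abstract; the data and score distributions below are synthetic):

```python
# Dichotomize two molecular-matching scores and form four risk groups.
import numpy as np

rng = np.random.default_rng(2)
ep_mm = rng.normal(28, 8, 183)        # highly immunogenic eplet mismatches (synthetic)
pirche = rng.normal(520, 120, 183)    # PIRCHE-II 11-loci scores (synthetic)

group = (ep_mm > 30.5).astype(int) * 2 + (pirche > 560.0).astype(int)  # 0..3
for g in range(4):
    print(f"group {g}: n = {(group == g).sum()}")
```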


Subject(s)
Kidney Transplantation , Lung Transplantation , Humans , Retrospective Studies , Graft Rejection , Alleles , Antibodies , Histocompatibility Testing , HLA Antigens , Tissue Donors , Isoantibodies
8.
Oral Dis ; 2023 Mar 20.
Article in English | MEDLINE | ID: mdl-36939725

ABSTRACT

INTRODUCTION: Poor oral hygiene can cause infections and inflammatory diseases. Data on its impact on outcome after lung transplantation (LuTX) are scarce. Most transplant centers have individual standards regarding dental care, as there is no clinical guideline. This study's objective was to assess the dental status of LuTX-listed patients and determine its effect on postoperative outcome. METHODS: Two hundred patients who underwent LuTX from 2014 to 2019 were selected. Collected data comprised LuTX indication, periodontal status, and number of carious teeth/fillings. A preoperative panoramic dental X-ray and a dentist's consultative clarification were mandatory. RESULTS: 63.5% had carious dental status, which differed significantly by TX indication (p < 0.001; ILD: 41.7% vs. CF: 3.1% of all patients with carious teeth). Mean age at the time of LuTX also differed significantly within these groups. Neither preoperative carious dental status nor periodontitis nor bone loss significantly worsened post-LuTX survival. No evidence was found that any of these resulted in a greater number of deaths of infectious etiology. CONCLUSION: This study shows that carious dental status, periodontitis, and bone loss do not affect post-TX survival. However, the literature indicates that they can cause systemic/pulmonary infections that worsen post-LuTX survival. Given the absence of standardized guidelines on dental care and LuTX, we strongly recommend further research in this field.

9.
Ultraschall Med ; 44(5): 537-543, 2023 Oct.
Article in English | MEDLINE | ID: mdl-36854384

ABSTRACT

PURPOSE: The aim of the study was to evaluate whether the quantification of B-lines via lung ultrasound after lung transplantation is feasible and correlates with the diagnosis of primary graft dysfunction. METHODS: Following lung transplantation, patients underwent daily lung ultrasound on postoperative days 1-3. B-lines were quantified by an ultrasound score based on the number of single and confluent B-lines per intercostal space, using a four-region protocol. The ultrasound score was correlated with the diagnosis of primary graft dysfunction. Furthermore, correlation analyses and receiver operating characteristic analyses taking into account the ultrasound score, chest radiographs, and the PaO2/FiO2 ratio were performed. RESULTS: A total of 32 patients (91 ultrasound measurements) were included, of whom 10 were diagnosed with primary graft dysfunction. The median B-line score was 5 [IQR: 4, 8]. There was a significant correlation between the B-line score and the diagnosis of primary graft dysfunction (r = 0.59, p < 0.001). A significant correlation was also seen between chest X-rays and primary graft dysfunction (r = 0.34, p = 0.008), but the B-line score was superior to chest X-rays in diagnosing primary graft dysfunction in the receiver operating characteristic curves, with an area under the curve of 0.921 versus 0.708. There was a significant negative correlation between the B-line score and the PaO2/FiO2 ratio (r = -0.41, p < 0.001), but not between chest X-rays and the PaO2/FiO2 ratio (r = -0.14, p = 0.279). CONCLUSION: The appearance of B-lines correlated well with primary graft dysfunction and outperformed chest radiographs.
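
A sketch of the AUROC comparison between the two predictors, with synthetic scores (the B-line scoring itself is described above; the effect sizes here are invented):

```python
# Compare B-line score vs. chest X-ray score as predictors of primary graft
# dysfunction (PGD) by area under the ROC curve. Synthetic data only.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(3)
n = 91                                    # number of ultrasound measurements
pgd = rng.binomial(1, 0.3, n)
b_line = rng.poisson(4, n) + 6 * pgd      # ultrasound score, inflated by PGD
cxr = rng.normal(0, 1, n) + 0.8 * pgd     # radiographic score, weaker signal

print(f"AUROC B-line score: {roc_auc_score(pgd, b_line):.2f}")
print(f"AUROC chest X-ray:  {roc_auc_score(pgd, cxr):.2f}")
```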


Subject(s)
Lung Transplantation , Primary Graft Dysfunction , Respiratory Distress Syndrome , Humans , Primary Graft Dysfunction/diagnostic imaging , Lung/diagnostic imaging , Ultrasonography , Lung Transplantation/adverse effects
10.
Clin Transplant ; 37(1): e14850, 2023 01.
Article in English | MEDLINE | ID: mdl-36398875

ABSTRACT

INTRODUCTION: Posterior reversible encephalopathy syndrome (PRES) is a rare neurologic complication that can occur under immunosuppressive therapy with calcineurin inhibitors (CNI) after organ transplantation. METHODS: We retrospectively reviewed the medical records of 545 patients who underwent lung transplantation between 2012 and 2019. Within this group, we identified 30 patients with neurological symptoms typical of PRES and compared the characteristics of patients who were diagnosed with PRES (n = 11) to those who were not (n = 19). RESULTS: The incidence of PRES after lung transplantation was 2%. Notably, 73% of the patients with PRES were female, and the mean age was 39.2 years. Seizure (82% vs. 21%, p = .002) was the most common neurological presentation. The risk of developing PRES was significantly associated with age (OR = .92, p < .0001) and cystic fibrosis (CF) (OR = 10.1, p < .0001). Creatinine level (1.9 vs. 1.1 mg/dl, p = .047) and tacrolimus trough level (19.4 vs. 16.5 ng/ml, p = .048) within 1 week prior to neurological symptoms were significantly higher in patients with PRES. CONCLUSION: Renal insufficiency and high tacrolimus levels are associated with PRES. The immunosuppressive drug should be changed after a confirmed PRES diagnosis, or immediately in cases of severe neurological dysfunction, to improve neurological outcomes and minimize the risk of early allograft rejection.


Subject(s)
Lung Transplantation , Posterior Leukoencephalopathy Syndrome , Humans , Female , Adult , Male , Tacrolimus/adverse effects , Posterior Leukoencephalopathy Syndrome/diagnosis , Posterior Leukoencephalopathy Syndrome/etiology , Retrospective Studies , Lung Transplantation/adverse effects , Risk Factors
11.
Pharmaceutics ; 14(9)2022 Sep 10.
Article in English | MEDLINE | ID: mdl-36145667

ABSTRACT

Voriconazole (VRC) is used as a first-line antifungal agent against invasive aspergillosis. Model-based approaches might optimize VRC therapy. This study aimed to investigate the predictive performance of pharmacokinetic models of VRC without pharmacogenetic information, to assess their suitability for model-informed precision dosing. Seven population pharmacokinetic (PopPK) models were selected from a systematic literature review. A total of 66 measured VRC plasma concentrations from 33 critically ill patients were employed for the analysis. The second measurement per patient was used to calculate relative bias (rBias), mean error (ME), relative root mean squared error (rRMSE), and mean absolute error (MAE) (i) based only on patient characteristics and dosing history (a priori) and (ii) integrating the first measured concentration to predict the second concentration (Bayesian forecasting). The a priori rBias/ME and rRMSE/MAE varied substantially between the models, ranging from -15.4 to 124.6%/-0.70 to 8.01 mg/L and from 89.3 to 139.1%/1.45 to 8.11 mg/L, respectively. The integration of the first TDM sample improved the predictive performance of all models, with the model by Chen showing the best predictive performance (rRMSE: 85.0%; rBias: 4.0%). Our study revealed a certain degree of imprecision for all investigated models, so their use alone is not recommendable. Models with higher performance would be necessary for clinical use.
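
For reference, the named performance metrics in their common form (the paper's exact definitions may differ slightly; the concentrations below are illustrative, not study data):

```python
# Common definitions of rBias, ME, rRMSE, and MAE for PK model evaluation.
import numpy as np

def performance(pred, obs):
    err = pred - obs
    rel = err / obs
    return {"rBias (%)": 100 * rel.mean(),
            "ME (mg/L)": err.mean(),
            "rRMSE (%)": 100 * np.sqrt((rel ** 2).mean()),
            "MAE (mg/L)": np.abs(err).mean()}

obs = np.array([2.1, 4.5, 1.8, 6.0])    # measured VRC concentrations (made up)
pred = np.array([2.6, 3.9, 2.4, 7.1])   # model predictions (made up)
print(performance(pred, obs))
```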

13.
Intensive Care Med ; 48(9): 1165-1175, 2022 09.
Article in English | MEDLINE | ID: mdl-35953676

ABSTRACT

PURPOSE: This case-control study investigated the long-term evolution of multidrug-resistant bacteria (MDRB) over a 5-year period associated with the use of selective oropharyngeal decontamination (SOD) in the intensive care unit (ICU). In addition, effects on health care-associated infections and ICU mortality were analysed. METHODS: We investigated patients undergoing mechanical ventilation > 48 h in 11 adult ICUs located at 3 campuses of a university hospital. Routinely and electronically recorded administrative, clinical, and microbiological data served as the basis for the analysis. We analysed differences in the rates and incidence densities (ID, cases per 1,000 patient-days) of MDRB associated with SOD use in all patients and stratified by patient origin (outpatient or inpatient). After propensity score matching, health care-associated infections and ICU mortality were compared. RESULTS: 5,034 patients were eligible for the study; 1,694 of them were not given SOD. There were no differences in the incidence density of MDRB when SOD was used, except for more vancomycin-resistant Enterococcus faecium (0.72/1,000 days vs. 0.31/1,000 days, p < 0.01) and fewer ESBL-producing Klebsiella pneumoniae (0.22/1,000 days vs. 0.56/1,000 days, p < 0.01). After propensity score matching, SOD was associated with lower incidence rates of ventilator-associated pneumonia and death in the ICU, but not with ICU-acquired bacteremia or urinary tract infection. CONCLUSIONS: Comparison of ICU-acquired MDRB over a 5-year period revealed no differences in incidence density, except for a lower rate of ESBL-producing Klebsiella pneumoniae and a higher rate of vancomycin-resistant Enterococcus faecium with SOD. Incidence rates of ventilator-associated pneumonia and death in the ICU were lower in patients receiving SOD.
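
Incidence density as used above is a simple rate; for example:

```python
# Incidence density (ID): cases per 1,000 patient-days.
def incidence_density(cases: int, patient_days: int) -> float:
    return 1000 * cases / patient_days

# Illustrative counts only (the abstract reports rates, not raw numbers):
print(incidence_density(36, 50_000))  # 0.72 per 1,000 patient-days
```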


Subject(s)
Cross Infection , Pneumonia, Ventilator-Associated , Adult , Anti-Bacterial Agents/pharmacology , Anti-Bacterial Agents/therapeutic use , Bacteria , Case-Control Studies , Cross Infection/drug therapy , Cross Infection/epidemiology , Cross Infection/prevention & control , Decontamination , Humans , Intensive Care Units , Pneumonia, Ventilator-Associated/drug therapy , Pneumonia, Ventilator-Associated/epidemiology , Pneumonia, Ventilator-Associated/prevention & control , Vancomycin
14.
J Crit Care ; 71: 154100, 2022 10.
Article in English | MEDLINE | ID: mdl-35780622

ABSTRACT

RATIONALE: The concentration-time profile of linezolid varies considerably in critically ill patients. The question of interest was whether the site of infection influences linezolid serum concentrations. METHODS: Sixty-eight critically ill patients treated with linezolid were included. The concentration-time profile for linezolid was determined using maximum a posteriori predictions. A trough concentration (Cmin) between 2 and 10 mg/L was defined as the target. A generalized linear model (GLM) was established to evaluate potential covariates. RESULTS: The indications for linezolid therapy were, in descending order: peritonitis (38.2%), pneumonia (25.0%), infectious acute respiratory distress syndrome (ARDS) (19.1%), and other non-pulmonary infections (17.7%). 27.2% and 7.9% of Cmin values were subtherapeutic and toxic, respectively. In the GLM, ARDS (mean: -2.1 mg/L, CI: -3.0 to -1.2 mg/L) and pneumonia (mean: -2.2 mg/L, CI: -2.8 to -1.6 mg/L) were significant (p < 0.001) determinants of Cmin. Patients with ARDS (mean: 2.3 mg/L, 51.2% subtherapeutic, 0.0% toxic) and pneumonia (mean: 3.5 mg/L, 41.5% subtherapeutic, 7.7% toxic) had significantly (p < 0.001) lower Cmin than those with peritonitis (mean: 5.5 mg/L, 14.4% subtherapeutic, 9.3% toxic) and other non-pulmonary infections (mean: 5.2 mg/L, 3.3% subtherapeutic, 16.5% toxic). CONCLUSION: Linezolid serum concentrations are reduced in patients with pulmonary infections. Future studies should investigate whether different linezolid thresholds are needed in those patients due to linezolid pooling in patients' lungs.
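
The target window translates into a simple classification of each trough level; a minimal sketch:

```python
# Classify linezolid trough concentrations against the 2-10 mg/L target window.
def classify_cmin(cmin_mg_l: float) -> str:
    if cmin_mg_l < 2:
        return "subtherapeutic"
    if cmin_mg_l > 10:
        return "toxic"
    return "therapeutic"

for c in (1.4, 3.5, 5.5, 11.2):   # example troughs (made up)
    print(c, classify_cmin(c))
```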


Subject(s)
Peritonitis , Pneumonia , Respiratory Distress Syndrome , Anti-Bacterial Agents , Critical Illness , Humans , Linezolid/therapeutic use , Pneumonia/drug therapy , Respiratory Distress Syndrome/drug therapy
15.
Ann Intensive Care ; 12(1): 44, 2022 May 23.
Article in English | MEDLINE | ID: mdl-35599248

ABSTRACT

BACKGROUND: Hemadsorption of cytokines is used in critically ill patients with sepsis or septic shock. Concerns have been raised that the cytokine adsorber CytoSorb® unintentionally adsorbs vancomycin. This study aimed to quantify vancomycin elimination by CytoSorb®. METHODS: Critically ill patients with sepsis or septic shock receiving continuous renal replacement therapy and CytoSorb® treatment during a prospective observational study were included in the analysis. Vancomycin pharmacokinetics was characterized using population pharmacokinetic modeling. Adsorption of vancomycin by CytoSorb® was investigated as a linear or saturable process. The final model was used to derive dosing recommendations based on stochastic simulations. RESULTS: 20 CytoSorb® treatments in 7 patients (160 serum samples, 24 of them during CytoSorb® treatment; all vancomycin given as continuous infusion) were included in the study. A classical one-compartment model, including the effluent flow rate of the continuous hemodialysis as a linear covariate on clearance, best described the measured concentrations (without CytoSorb®). Significant adsorption with a linear decrease during CytoSorb® treatment was identified (p < 0.0001), revealing a maximum increase in vancomycin clearance of 291% (immediately after CytoSorb® installation) and a maximum adsorption capacity of 572 mg. For a representative patient of our cohort, a reduction of the 24-h area under the curve (AUC) by 93 mg·h/L during CytoSorb® treatment was observed. The additional administration of 500 mg vancomycin over 2 h during CytoSorb® attenuated the effect, leaving a negligible reduction of the AUC of 4 mg·h/L. CONCLUSION: We recommend the infusion of 500 mg vancomycin over 2 h during CytoSorb® treatment to avoid subtherapeutic concentrations. Trial registration: NCT03985605. Registered 14 June 2019, https://clinicaltrials.gov/ct2/show/NCT03985605.
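
A toy simulation of the described adsorption effect: a one-compartment model under continuous infusion, where CytoSorb® contributes an extra clearance that declines linearly to zero as the adsorber saturates. All parameter values are illustrative assumptions, not the study's fitted model:

```python
# One-compartment model with continuous infusion and a transient extra
# clearance term for the adsorber (Euler integration). Illustrative only.
import numpy as np

dt, t_end = 0.01, 24.0            # h
V, cl_base = 50.0, 3.0            # volume (L) and baseline clearance (L/h), assumed
rate = 2000 / 24                  # mg/h: continuous infusion of 2 g/day, assumed
cl_cs0 = 2.91 * cl_base           # peak extra clearance (+291%, per the abstract)
t_sat = 12.0                      # assumed time to adsorber saturation (h)

def auc_24h(with_cytosorb: bool) -> float:
    c, auc = 20.0, 0.0            # start at 20 mg/L (assumed)
    for t in np.arange(0.0, t_end, dt):
        cl_cs = max(cl_cs0 * (1 - t / t_sat), 0.0) if with_cytosorb else 0.0
        c += (rate - (cl_base + cl_cs) * c) / V * dt
        auc += c * dt
    return auc

print(f"AUC loss ≈ {auc_24h(False) - auc_24h(True):.0f} mg·h/L")
```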

16.
Anaesthesist ; 71(5): 333-339, 2022 05.
Article in German | MEDLINE | ID: mdl-35397669

ABSTRACT

The controversy surrounding ventilation in coronavirus disease 2019 (COVID-19) continues. Early in the pandemic it was postulated that the high intensive care unit (ICU) mortality may have been due to intubating too early. As the pandemic progressed, recommendations changed and the use of noninvasive respiratory support (NIRS) increased; however, this did not result in a clear reduction in ICU mortality. Furthermore, large studies on optimal ventilation in COVID-19 are lacking. This review article summarizes the pathophysiological basis, the current state of the science, and the impact of different treatment modalities on outcome. Potential factors that could undermine the benefits of noninvasive respiratory support are discussed. The authors attempt to provide guidance on the difficult question: when is the right time to intubate?


Subject(s)
COVID-19 , Noninvasive Ventilation , Respiratory Insufficiency , Humans , Intensive Care Units , Pandemics , Respiration, Artificial , Respiratory Insufficiency/therapy
17.
Respiration ; 101(7): 638-645, 2022.
Article in English | MEDLINE | ID: mdl-35354156

ABSTRACT

BACKGROUND: The long-term outcome of lung transplantation (LTx) recipients is limited by chronic lung allograft dysfunction (CLAD). In this setting of new-onset respiratory failure, the amount of oxygenated hemoglobin (OxyHem; hemoglobin (Hb) concentration × fractional oxygen saturation) may provide valuable information. OBJECTIVE: We hypothesized that OxyHem predicts survival of LTx recipients at the onset of CLAD. METHODS: Data from 292 LTx recipients with CLAD were analyzed. After excluding patients with missing data or supplemental oxygen, the final population comprised 218 patients. The relationship between survival upon CLAD and OxyHem was analyzed by Cox regression analyses and ROC curves. RESULTS: Among the 218 patients (102 males, 116 females), 128 (58.7%) died, with a median survival time after CLAD onset of 1,156 days. Survival was significantly associated with type of transplantation, time to CLAD, CLAD stage at onset, and OxyHem, which was superior to Hb or oxygen saturation alone. The risk of death after CLAD increased by 14% per 1 g/dL reduction in OxyHem, and values below 11 g/dL corresponded to an 80% increase in mortality risk. CONCLUSION: OxyHem was thus identified as an independent predictor of mortality after CLAD onset. Whether it is useful for supporting therapeutic decisions, and potentially for home monitoring in the surveillance of lung transplant recipients, has to be studied further.
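
The OxyHem definition given above is a one-line computation; for example, around the reported 11 g/dL risk threshold:

```python
# OxyHem = hemoglobin concentration (g/dL) × fractional oxygen saturation.
def oxyhem(hb_g_dl: float, spo2_percent: float) -> float:
    return hb_g_dl * spo2_percent / 100

print(oxyhem(13.0, 92))  # 11.96 g/dL -> above the 11 g/dL risk threshold
print(oxyhem(12.0, 88))  # 10.56 g/dL -> below it
```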


Subject(s)
Lung Transplantation , Allografts , Female , Hemoglobins , Humans , Lung , Lung Transplantation/adverse effects , Male , ROC Curve , Retrospective Studies
18.
J Crit Care ; 69: 154016, 2022 06.
Article in English | MEDLINE | ID: mdl-35279494

ABSTRACT

PURPOSE: To advance a transition towards indication-based chest radiograph (CXR) ordering in intensive care units (ICUs) without compromising patient safety. MATERIALS AND METHODS: Single-center prospective cohort study with a retrospective reference group including 857 ICU patients. The routine group (n = 415) received CXRs at the discretion of the ICU physician, the restrictive group (n = 442) only if specified by an indication catalogue. Documented data included the number of CXRs per day and CXR radiation dose as primary outcomes, as well as re-intubation and re-admission rates, hours of mechanical ventilation, and ICU length of stay. RESULTS: CXR numbers were reduced in the restrictive group (964 CXRs in 2,479 days vs. 1,281 CXRs in 2,318 days), and the median radiation attributed to CXR per patient was significantly lower in the restrictive group (0.068 vs. 0.076 Gy·cm², p = 0.003). For patients staying ≥24 h, the median number of CXRs per day was significantly reduced in the restrictive group (0.41 (IQR 0.21-0.61) vs. 0.55 (IQR 0.34-0.83), p < 0.001). Survival analysis proved non-inferiority. Secondary outcome parameters did not differ significantly between the groups. The CXR reduction was significant even for patients in the most critical conditions. CONCLUSIONS: A substantial reduction of the number of CXRs in ICUs was feasible and safe using an indication catalogue, thereby improving resource management. TRIAL REGISTRATION: DRKS00015621, German Clinical Trials Register.


Subject(s)
Intensive Care Units , Radiography, Thoracic , Humans , Prospective Studies , Radiography , Retrospective Studies
19.
Transplant Proc ; 54(6): 1504-1516, 2022.
Article in English | MEDLINE | ID: mdl-35120764

ABSTRACT

BACKGROUND: COVID-19 causes a wide range of symptoms, with a particularly high risk of severe respiratory failure and death in patients with predisposing risk factors such as advanced age or obesity. Recipients of solid organ transplants, and in particular lung transplants, are more susceptible to viral infection owing to immunosuppressive medication. As little is known about SARS-CoV-2 infection in these patients, this study was undertaken to describe outcomes and potential management strategies in COVID-19 infection early after lung transplantation. METHODS: We describe the incidence and outcome of COVID-19 in a cohort of recent lung transplant recipients in Munich. Six of 186 patients who underwent lung transplantation between March 2019 and March 2021 developed COVID-19 within the first year after transplantation. We documented the clinical course and laboratory changes for all patients, showing differences in the severity of COVID-19 and in outcomes. RESULTS: Three of the 6 SARS-CoV-2 infections were hospital-acquired, in patients still receiving inpatient treatment after lung transplantation. All patients were symptomatic. One patient did not receive antiviral therapy, remdesivir was prescribed in 4 patients, and the remaining patient received remdesivir, bamlanivimab, and convalescent plasma. CONCLUSIONS: COVID-19 does not appear to cause milder disease in lung transplant recipients compared with the general population. Immunosuppression is potentially responsible for the delayed formation of antibodies and their premature loss. Several comorbidities and a generally poor preoperative condition were associated with an extended hospital stay.


Subject(s)
COVID-19 , Antibodies, Monoclonal, Humanized , Antibodies, Neutralizing , Antiviral Agents/therapeutic use , COVID-19/therapy , Humans , Immunization, Passive , Lung , SARS-CoV-2 , Transplant Recipients , COVID-19 Serotherapy
20.
Infection ; 50(5): 1111-1120, 2022 Oct.
Article in English | MEDLINE | ID: mdl-35182354

ABSTRACT

PURPOSE: Duodenal involvement in COVID-19 is poorly studied. The aim was to describe the clinical and histopathological characteristics of critically ill COVID-19 patients suffering from severe duodenitis causing significant bleeding and/or gastrointestinal dysmotility. METHODS: In 51 critically ill patients suffering from SARS-CoV-2 pneumonia, severe upper intestinal bleeding and/or gastric feeding intolerance were indications for upper gastrointestinal endoscopy. Duodenitis was diagnosed according to macroscopic signs and mucosal biopsies. Immunohistochemistry was performed to detect virus-specific protein and ACE2. In situ hybridization was applied to confirm viral replication. RESULTS: Nine of 51 critically ill patients (18%) suffering from SARS-CoV-2 pneumonia had developed upper GI bleeding complications and/or high gastric reflux. Five of them presented with minor and four (44%) with severe duodenitis. In two patients, erosions had caused severe gastrointestinal bleeding requiring packed red blood cell (PRBC) transfusions. Immunohistochemical staining for SARS-CoV-2 spike protein was positive inside duodenal enterocytes in three of the four patients suffering from severe duodenitis. Viral replication was confirmed by in situ hybridization. CONCLUSION: Our data suggest that about 8% of critically ill COVID-19 patients may develop severe duodenitis, presumably associated with direct infection of the duodenal enterocytes by SARS-CoV-2. The clinical consequences of severe bleeding and/or upper gastrointestinal dysmotility seem to be underestimated.


Subject(s)
COVID-19 , Duodenitis , Angiotensin-Converting Enzyme 2 , COVID-19/complications , Critical Illness , Humans , Infant, Newborn , SARS-CoV-2 , Spike Glycoprotein, Coronavirus , Tropism