Results 1 - 20 of 53
1.
Rev Esp Quimioter ; 34 Suppl 1: 41-43, 2021 Sep.
Article in English | MEDLINE | ID: mdl-34598424

ABSTRACT

Cefiderocol is a novel catechol-substituted siderophore cephalosporin that binds extracellular free iron and uses bacterial active iron transport channels to penetrate the periplasmic space of Gram-negative bacteria (GNB). Cefiderocol overcomes many resistance mechanisms of these bacteria. It is approved for the treatment of complicated urinary tract infections, hospital-acquired bacterial pneumonia and ventilator-associated bacterial pneumonia in adults with limited treatment options, based on clinical data from the APEKS-cUTI, APEKS-NP and CREDIBLE-CR trials. In the CREDIBLE-CR trial, higher all-cause mortality was observed in the group of patients who received cefiderocol, especially those with severe infections due to Acinetobacter spp. Further phase III clinical studies are needed to evaluate cefiderocol's efficacy in the treatment of serious infections.


Subject(s)
Anti-Bacterial Agents , Gram-Negative Bacterial Infections , Adult , Anti-Bacterial Agents/pharmacology , Anti-Bacterial Agents/therapeutic use , Cephalosporins/pharmacology , Cephalosporins/therapeutic use , Drug Resistance, Multiple, Bacterial , Gram-Negative Bacteria , Gram-Negative Bacterial Infections/drug therapy , Humans , Cefiderocol
2.
Mol Cell Neurosci ; 105: 103497, 2020 06.
Article in English | MEDLINE | ID: mdl-32353527

ABSTRACT

Various animal models have been employed to understand the pathogenic mechanisms of neuropathic pain. Nitric oxide (NO) is an important molecule in nociceptive transmission and is involved in neuropathic pain; however, its mechanistic actions remain unclear. The aim of this study was to better understand the involvement of the neuronal and inducible isoforms of nitric oxide synthase (nNOS and iNOS) in neuropathic pain induced by chronic constriction injury (CCI) of the sciatic nerve in rats. We evaluated pain sensitivity (mechanical withdrawal thresholds with the Randall-Selitto and von Frey tests, and thermal withdrawal thresholds with the Hargreaves test) prior to CCI surgery, 14 days post CCI, and after intrathecal injections of selective nNOS or iNOS inhibitors. We also evaluated the distribution of the NOS isozymes in the spinal cord and dorsal root ganglia (DRG) by immunohistochemistry, the synthesis of iNOS and nNOS by Western blot, and NO production using the fluorescent probe DAF-2 DA. Our results showed a higher number of nNOS- and iNOS-positive neurons in the spinal cord and DRG of CCI rats compared with sham rats, and a reduction in CCI rats treated with the selective inhibitors compared with non-treated groups. Western blot results also indicated reduced expression of nNOS and iNOS after treatment with the selective inhibitors. Furthermore, both inhibitors reduced CCI-evoked mechanical and thermal withdrawal thresholds, but only the nNOS inhibitor efficiently lowered mechanical withdrawal thresholds in the von Frey test. In addition, we observed higher NO production in the spinal cord and DRG of injured rats compared with the control group. Our study shows that nNOS may strongly modulate nociceptive transmission in rats with neuropathic pain, whereas iNOS may only partially participate in the development of nociceptive responses. Thus, drugs targeting nNOS may represent a potential therapeutic strategy for neuropathic pain.


Subject(s)
Ganglia, Spinal/metabolism , Neuralgia/metabolism , Nitric Oxide/metabolism , Sciatic Nerve/metabolism , Animals , Hyperalgesia/drug therapy , Male , Nitric Oxide Synthase Type II/metabolism , Rats, Wistar , Spinal Cord/metabolism
3.
J Dairy Sci ; 103(5): 4672-4680, 2020 May.
Article in English | MEDLINE | ID: mdl-32173018

ABSTRACT

The weaning process may cause intense stress for dairy calves, even when low volumes of liquid diet are fed. Management tools that increase the intake of solid feeds, such as gradual weaning, can provide better physiological and metabolic conditions through improved ruminal development, leading to better adaptation to ruminant metabolism and helping to mitigate stress. The objective of this study was to evaluate the effects of 2 weaning protocols and 2 levels of concentrate intake on performance and on physiological and behavioral variables related to stress in dairy calves. Thirty-six newborn male Holstein calves were used in a randomized block design with a 2 × 2 factorial arrangement: 2 weaning strategies, abrupt or gradual, and 2 levels of concentrate intake at 5 wk of age, high (>350 g/d) or low (≤350 g/d). Calves were managed identically until 5 wk of age and then grouped according to concentrate intake. Statistical analyses were performed using the MIXED procedure of SAS software (SAS Institute Inc., Cary, NC); no significant interaction was observed between the studied factors (weaning method and starter intake level), so each factor was considered separately, along with its interaction with age. The highest dry matter intake and β-hydroxybutyrate concentrations were recorded for animals with a high level of starter intake, independent of weaning method. Structural growth (cm/wk) and average daily gain were superior for calves with high starter intake, with no effect of weaning method. The gradual weaning protocol increased the time spent eating starter, regardless of the level of concentrate intake. Even animals with low concentrate intake that were weaned abruptly showed cortisol and acid-soluble glycoprotein concentrations within normal physiological ranges. Apparently, factors other than the milk supply affect the starter intake level of calves in a conventional feeding program. The adoption of gradual weaning was not effective in improving performance when a calf had low intake 3 wk before weaning was complete, but it reduced vocalization on d 2 postweaning.
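As an illustrative aside (not part of the paper), the 2 × 2 factorial randomized block analysis described above, which the authors ran with the SAS MIXED procedure, could be sketched along these lines in Python with statsmodels; the data frame, column names, and synthetic values below are hypothetical.

```python
# Illustrative sketch only, not the authors' SAS code: a 2 x 2 factorial model with a
# random block effect, specified in statsmodels as a rough analogue of PROC MIXED.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
n = 36  # 36 calves, as in the study design
df = pd.DataFrame({
    "weaning": np.repeat(["abrupt", "gradual"], n // 2),
    "intake": np.tile(["high", "low"], n // 2),
    "block": np.repeat([f"b{i}" for i in range(9)], 4),
    "adg": rng.normal(0.65, 0.08, n),  # average daily gain, kg/d (synthetic)
})

# Fixed effects: weaning method, starter-intake level and their interaction;
# random intercept for the birth-weight/date-of-birth block.
model = smf.mixedlm("adg ~ weaning * intake", data=df, groups=df["block"])
print(model.fit().summary())
```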


Subject(s)
Animal Feed/analysis , Behavior, Animal , Cattle/physiology , Eating , Milk Substitutes/metabolism , Weaning , 3-Hydroxybutyric Acid/blood , Animals , Cattle/growth & development , Diet/veterinary , Male , Random Allocation , Rumen/physiology , Stress, Physiological , Vocalization, Animal
4.
Open Forum Infect Dis ; 6(6): ofz180, 2019 Jun.
Article in English | MEDLINE | ID: mdl-31198815

ABSTRACT

BACKGROUND: We analyzed the prevalence, etiology, and risk factors of culture-positive preservation fluid and their impact on the management of solid organ transplant recipients. METHODS: From July 2015 to March 2017, 622 episodes of adult solid organ transplantation at 7 university hospitals in Spain were prospectively included in the study. RESULTS: The prevalence of culture-positive preservation fluid was 62.5% (389/622). Nevertheless, in only 25.2% (98/389) of cases were the isolates considered "high risk" for pathogenicity. In the multivariate regression analysis, advanced donor age was the main factor associated with culture-positive preservation fluid for high-risk microorganisms. Preemptive antibiotic therapy was given in 19.8% (77/389) of cases. The incidence rate of preservation fluid-related infection was 1.3% (5 recipients); none of these patients had received preemptive therapy. Solid organ transplant (SOT) recipients with high-risk culture-positive preservation fluid who received preemptive antibiotic therapy presented both a lower cumulative incidence of infection and lower rates of acute rejection and graft loss compared with those who did not have high-risk culture-positive preservation fluid. After adjusting for age, sex, type of transplant, and prior graft rejection, preemptive antibiotic therapy remained a significant protective factor for 90-day infection. CONCLUSIONS: Routine culture of preservation fluid may be considered a tool that provides information about contamination of the transplanted organ. Preemptive therapy for SOT recipients with high-risk culture-positive preservation fluid may be useful to avoid preservation fluid-related infections and to improve outcomes in terms of infection, graft loss, and graft rejection in transplant patients.
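As an illustrative aside, the prevalence figures quoted above are simple binomial proportions; a minimal sketch of how they (and confidence intervals, which the abstract does not report) could be computed is shown below.

```python
# Illustrative arithmetic only: reproducing the proportions quoted in the abstract and
# adding Wilson 95% confidence intervals, which are not reported in the abstract itself.
from statsmodels.stats.proportion import proportion_confint

for label, count, nobs in [
    ("culture-positive preservation fluid", 389, 622),
    ("high-risk isolates among positive cultures", 98, 389),
    ("preemptive antibiotic therapy given", 77, 389),
]:
    low, high = proportion_confint(count, nobs, alpha=0.05, method="wilson")
    print(f"{label}: {count}/{nobs} = {count / nobs:.1%} (95% CI {low:.1%}-{high:.1%})")
```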

7.
Clin Microbiol Infect ; 24 Suppl 2: S53-S70, 2018 Jun.
Article in English | MEDLINE | ID: mdl-29454849

ABSTRACT

BACKGROUND: The present review is part of the European Society of Clinical Microbiology and Infectious Diseases (ESCMID) Study Group for Infections in Compromised Hosts (ESGICH) Consensus Document on the safety of targeted and biologic therapies. AIMS: To review, from an infectious diseases perspective, the safety profile of therapies targeting different intracellular signaling pathways and to suggest preventive recommendations. SOURCES: Computer-based Medline searches with MeSH terms pertaining to each agent or therapeutic family. CONTENT: Although BCR-ABL tyrosine kinase inhibitors modestly increase the overall risk of infection, dasatinib has been associated with cytomegalovirus and hepatitis B virus reactivation. BRAF/MEK kinase inhibitors do not significantly affect infection susceptibility. The effect of Bruton tyrosine kinase inhibitors (ibrutinib) among patients with B-cell malignancies is difficult to distinguish from that of previous immunosuppression. However, cases of Pneumocystis jirovecii pneumonia (PCP), invasive fungal infection and progressive multifocal leukoencephalopathy have been occasionally reported. Because phosphatidylinositol-3-kinase inhibitors (idelalisib) may predispose to opportunistic infections, anti-Pneumocystis prophylaxis and prevention strategies for cytomegalovirus are recommended. No increased rates of infection have been observed with venetoclax (antiapoptotic protein Bcl-2 inhibitor). Therapy with Janus kinase inhibitors markedly increases the incidence of infection. Pretreatment screening for chronic hepatitis B virus and latent tuberculosis infection must be performed, and anti-Pneumocystis prophylaxis should be considered for patients with additional risk factors. Cancer patients receiving mTOR inhibitors face an increased incidence of overall infection, especially those with additional risk factors (prior therapies or delayed wound healing). IMPLICATIONS: Specific preventive approaches are warranted in view of the increased risk of infection associated with some of the reviewed agents.


Subject(s)
Biological Therapy/adverse effects , Communicable Diseases/therapy , Protein-Tyrosine Kinases/antagonists & inhibitors , Signal Transduction/drug effects , TOR Serine-Threonine Kinases/antagonists & inhibitors , Biological Therapy/methods , Clinical Trials as Topic , Humans , Immunocompromised Host , Janus Kinase Inhibitors/adverse effects , Janus Kinase Inhibitors/therapeutic use , Protein Kinase Inhibitors/adverse effects , Protein Kinase Inhibitors/therapeutic use
8.
Clin Microbiol Infect ; 24 Suppl 2: S21-S40, 2018 Jun.
Article in English | MEDLINE | ID: mdl-29447987

ABSTRACT

BACKGROUND: The present review is part of the ESCMID Study Group for Infections in Compromised Hosts (ESGICH) Consensus Document on the safety of targeted and biological therapies. AIMS: To review, from an infectious diseases perspective, the safety profile of agents targeting interleukins, immunoglobulins and complement factors, and to suggest preventive recommendations. SOURCES: Computer-based MEDLINE searches with MeSH terms pertaining to each agent or therapeutic family. CONTENT: Patients receiving interleukin-1 (IL-1)-targeted (anakinra, canakinumab or rilonacept) or IL-5-targeted (mepolizumab) agents have a moderate risk of infection, and no specific prevention strategies are recommended. The use of IL-6/IL-6 receptor-targeted agents (tocilizumab and siltuximab) is associated with a risk increase similar to that observed with anti-tumour necrosis factor-α agents. IL-12/23-targeted agents (ustekinumab) do not seem to pose a meaningful risk of infection, although screening for latent tuberculosis infection may be considered and antiviral prophylaxis should be given to hepatitis B surface antigen-positive patients. Therapy with IL-17-targeted agents (secukinumab, brodalumab and ixekizumab) may result in the development of mild-to-moderate mucocutaneous candidiasis. Pre-treatment screening for Strongyloides stercoralis and other geohelminths should be considered in patients from endemic areas who are receiving IgE-targeted agents (omalizumab). C5-targeted agents (eculizumab) are associated with a markedly increased risk of infection due to encapsulated bacteria, particularly Neisseria spp. Meningococcal vaccination and chemoprophylaxis must be administered 2-4 weeks before initiating eculizumab. Patients with high-risk behaviours and their partners should also be screened for gonococcal infection. IMPLICATIONS: Preventive strategies are particularly encouraged to minimize the occurrence of neisserial infection associated with eculizumab.


Subject(s)
Antibodies, Monoclonal, Humanized/adverse effects , Biological Therapy/adverse effects , Communicable Diseases/therapy , Complement System Proteins/drug effects , Immunoglobulins/drug effects , Interleukins/antagonists & inhibitors , Molecular Targeted Therapy/adverse effects , Antibodies, Monoclonal/administration & dosage , Antibodies, Monoclonal/adverse effects , Antibodies, Monoclonal/therapeutic use , Antibodies, Monoclonal, Humanized/administration & dosage , Antibodies, Monoclonal, Humanized/therapeutic use , Clinical Trials as Topic , Communicable Disease Control , Communicable Diseases/immunology , Dermatologic Agents/administration & dosage , Dermatologic Agents/adverse effects , Humans , Immunocompromised Host , Interleukin-17/antagonists & inhibitors , Interleukins/immunology , Meningococcal Vaccines/administration & dosage
9.
Transplant Rev (Orlando) ; 32(1): 36-57, 2018 01.
Article in English | MEDLINE | ID: mdl-28811074

ABSTRACT

Solid organ transplant (SOT) recipients are especially at risk of developing infections by multidrug-resistant (MDR) Gram-negative bacilli (GNB), as they are frequently exposed to antibiotics and the healthcare setting and are regularly subjected to invasive procedures. Nevertheless, no recommendations concerning prevention and treatment are available. A panel of experts reviewed the available evidence; this document summarizes their recommendations: (1) it is important to characterize the isolate's phenotypic and genotypic resistance profile; (2) overall, donor colonization should not constitute a contraindication to transplantation, although kidney and lung grafts with active infection should be avoided; (3) recipient colonization is associated with an increased risk of infection but is not a contraindication to transplantation; (4) different surgical prophylaxis regimens are not recommended for patients colonized with carbapenem-resistant GNB; (5) timely detection of carriers, contact isolation precautions, hand hygiene compliance and antibiotic control policies are important preventive measures; (6) there are not sufficient data to recommend intestinal decolonization; (7) colonized lung transplant recipients could benefit from prophylactic inhaled antibiotics, especially against Pseudomonas aeruginosa; (8) colonized SOT recipients should receive empirical treatment that includes active antibiotics, and directed therapy should be adjusted according to susceptibility results and the severity of the infection.


Subject(s)
Anti-Bacterial Agents/therapeutic use , Disease Management , Drug Resistance, Multiple , Gram-Negative Bacterial Infections , Organ Transplantation , Tissue Donors , Transplant Recipients , Gram-Negative Bacterial Infections/drug therapy , Gram-Negative Bacterial Infections/etiology , Gram-Negative Bacterial Infections/microbiology , Humans , Postoperative Complications
10.
Clin Microbiol Infect ; 24(2): 192-198, 2018 Feb.
Article in English | MEDLINE | ID: mdl-28652112

ABSTRACT

OBJECTIVES: To assess the risk factors for development of late-onset invasive pulmonary aspergillosis (IPA) after kidney transplantation (KT). METHODS: We performed a multinational case-control study that retrospectively recruited 112 KT recipients diagnosed with IPA between 2000 and 2013. Controls were matched (1:1 ratio) by centre and date of transplantation. Immunosuppression-related events (IREs) included the occurrence of non-ventilator-associated pneumonia, tuberculosis, cytomegalovirus disease, and/or de novo malignancy. RESULTS: We identified 61 cases of late (>180 days after transplantation) IPA from 24 participating centres (accounting for 54.5% (61/112) of all cases included in the overall study). Most diagnoses (54.1% (33/61)) were established within the first 36 post-transplant months, although five cases occurred more than 10 years after transplantation. Overall mortality among cases was 47.5% (29/61). Compared with controls, cases were significantly older (p 0.010) and more likely to have pre-transplant chronic obstructive pulmonary disease (p 0.001) and a diagnosis of bloodstream infection (p 0.016) and IRE (p <0.001) within the 6 months prior to the onset of late IPA. After multivariate adjustment, previous occurrence of IRE (OR 19.26; 95% CI 2.07-179.46; p 0.009) was identified as an independent risk factor for late IPA. CONCLUSION: More than half of IPA cases after KT occur beyond the sixth month, with some of them presenting very late. Late IPA entails a poor prognosis. We identified some risk factors that could help the clinician to delimit the subgroup of KT recipients at the highest risk for late IPA.
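As an illustrative aside, the odds ratios above come from a multivariate model fitted by the authors; the sketch below only shows, with invented counts, how an unadjusted odds ratio and its confidence interval are obtained from a 2 × 2 exposure table.

```python
# Illustrative sketch only: an unadjusted odds ratio from a hypothetical 2 x 2 table
# (rows: IRE in the prior 6 months yes/no; columns: late-IPA case / matched control).
# The counts are invented; the ORs in the abstract are multivariate-adjusted estimates.
import numpy as np
import statsmodels.api as sm

table = sm.stats.Table2x2(np.array([[30, 5],
                                    [31, 56]]))
low, high = table.oddsratio_confint()
print(f"unadjusted OR = {table.oddsratio:.2f} (95% CI {low:.2f}-{high:.2f})")
```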


Subject(s)
Invasive Pulmonary Aspergillosis/etiology , Kidney Transplantation/adverse effects , Case-Control Studies , Female , Global Health/statistics & numerical data , Humans , Male , Middle Aged , Retrospective Studies , Risk Factors , Time Factors
11.
J Biol Regul Homeost Agents ; 31(2): 309-319, 2017.
Article in English | MEDLINE | ID: mdl-28685530

ABSTRACT

Chronic constriction injury (CCI) simulates the symptoms of chronic nerve compression, which is characterized by allodynia and hyperalgesia. Nerve growth factor (NGF) is released after nerve injury by immune and Schwann cells and is transported retrogradely to the dorsal root ganglion (DRG), resulting in increased synthesis of substance P (SP) and the triggering of neuropathic pain. Here we performed a long-term evaluation of allodynia and hyperalgesia in a CCI model and evaluated the effects of NGF and SP on the peripheral and central nervous systems. Most previous studies have shown deficits and molecular changes 14 days after surgery; however, the longer-term effects have not been evaluated. We performed the Randall-Selitto, von Frey, Hargreaves, and acetone tests over the entire 56 days post-surgery. Several of these deficits increased from 14 to 56 days after CCI, and we measured a steady increase in NGF levels in the DRG and spinal cord over the course of the experiment. In contrast, SP optical density remained elevated in DRG tissue from 14 to 56 days after CCI, whereas in the spinal cord it was significantly increased only at 56 days post-surgery. We thus performed a long-term evaluation of the symptoms associated with CCI and measured the associated molecular changes. Moreover, by characterizing the behavioral signature of this model, our work supports future studies.


Subject(s)
Hyperalgesia/metabolism , Nerve Growth Factor/metabolism , Sciatic Nerve/injuries , Sciatic Nerve/metabolism , Substance P/metabolism , Animals , Chronic Disease , Ganglia, Spinal/metabolism , Ganglia, Spinal/pathology , Ganglia, Spinal/physiopathology , Hyperalgesia/pathology , Hyperalgesia/physiopathology , Male , Rats , Rats, Wistar , Sciatic Nerve/pathology , Sciatic Nerve/physiopathology , Spinal Cord/metabolism , Spinal Cord/pathology , Spinal Cord/physiopathology
12.
J Dairy Sci ; 100(6): 4448-4456, 2017 Jun.
Article in English | MEDLINE | ID: mdl-28365119

ABSTRACT

The objective of this study was to evaluate the effects of different liquid-feeding systems using a medium crude protein milk replacer on performance, rumen, and blood parameters. Thirty newborn Holstein calves were blocked according to birth weight and date of birth, and randomly distributed to different liquid-feeding systems: conventional (4 L/d), intensive (8 L/d), or step-up/step-down (wk 1, 4 L/d; wk 2 to 6, 8 L/d; wk 7 and 8, 4 L/d). The commercial milk replacer (12.5% solids, 20.2% crude protein, 15.6% fat) was fed twice daily (0700 and 1700 h) until calves were weaned, at 8 wk of age. Calves were individually housed in wood hutches, with free access to water and starter concentrate, and to hay only after weaning. They were followed through 10 wk of age. Milk replacer and starter intake were inversely affected by feeding system. After weaning, starter intake and hay intake were similar among feeding systems. Total dry matter intake was higher during the liquid-feeding period for calves on the intensive system compared to calves on the conventional system, but conventional feeding resulted in the highest dry matter intake after weaning. Feed efficiency was similar among feeding systems before and after weaning. Average body weight and daily gain were not affected by feeding system before or after weaning. During liquid feeding, diarrhea occurrence was lower for calves on the conventional system; however, when calves on the step-up/step-down system were fed lower volumes of liquid feed, diarrhea occurrence was similar to that of calves on the conventional system. Plasma concentrations of β-hydroxybutyrate were higher for calves on the conventional system, reflecting starter intake. Rumen pH, short-chain fatty acids, and N-NH3 were not affected by feeding system. Feeding higher volumes of milk replacer with a medium crude protein content had no beneficial effect on the performance of calves up to 10 wk of age.


Subject(s)
Animal Feed , Eating , Rumen/chemistry , 3-Hydroxybutyric Acid/blood , Age Factors , Animal Husbandry/methods , Animals , Animals, Newborn , Body Weight , Cattle , Cattle Diseases/etiology , Diarrhea/veterinary , Fatty Acids, Volatile/blood , Milk , Random Allocation , Weaning
13.
Transpl Infect Dis ; 18(4): 552-65, 2016 Aug.
Article in English | MEDLINE | ID: mdl-27260953

ABSTRACT

BACKGROUND: Monitoring of peripheral blood lymphocyte subpopulation (PBLS) counts might be useful for estimating the risk of infection after liver transplantation (LT). METHODS: We prospectively measured total lymphocyte and PBLS counts at baseline and at post-transplant months 1 and 6 in 92 LT recipients. PBLS were enumerated by single-platform 6-color flow cytometry technology. Areas under receiver operating characteristic (ROC) curves were used to evaluate the accuracy of different PBLS for predicting cytomegalovirus (CMV) disease and overall opportunistic infection (OI). Adjusted hazard ratios (aHRs) for both outcomes were estimated by Cox regression. RESULTS: After a median follow-up of 730.0 days, 29 patients (31.5%) developed 38 episodes of OI (including 22 episodes of CMV disease). The counts of CD3+, CD4+, and CD8+ T cells and of CD56+CD16+ natural killer (NK) cells at month 1 were significantly lower in patients who subsequently developed OI. The NK cell count was the best predictive parameter (area under the ROC curve for predicting CMV disease: 0.78; P-value = 0.001). Patients with an NK cell count <0.050 × 10³ cells/µL had higher cumulative incidences of CMV disease (P-value = 0.001) and overall OI (P-value <0.001). In the multivariate models, an NK cell count <0.050 × 10³ cells/µL at month 1 post-transplantation remained an independent risk factor for CMV disease (aHR: 5.54; P-value = 0.003) and overall OI (aHR: 7.56; P-value <0.001). CONCLUSION: The post-transplant kinetics of NK cell counts may be used as a simple and affordable proxy for cell-mediated immunity status in LT recipients and for their associated risk of OI.
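As an illustrative aside, the area-under-the-ROC-curve analysis described above can be sketched as follows with scikit-learn; the data are synthetic and only mimic the "lower count, higher risk" direction of the NK cell marker.

```python
# Illustrative sketch only (synthetic data): AUC of a marker for which lower values
# indicate higher risk, as for the month-1 NK cell count in the study.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
cmv_disease = rng.integers(0, 2, 92)                       # 1 = developed CMV disease
nk_counts = rng.gamma(2.0, 0.05, 92) - 0.04 * cmv_disease  # x10^3 cells/uL (synthetic)
nk_counts = nk_counts.clip(min=0.001)

# Lower counts should map to higher predicted risk, so negate the marker before scoring.
auc = roc_auc_score(cmv_disease, -nk_counts)
print(f"AUC = {auc:.2f}  (the study reports 0.78 for CMV disease)")
```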


Subject(s)
Cytomegalovirus Infections/blood , Killer Cells, Natural/immunology , Liver Transplantation/adverse effects , Lymphocyte Subsets/immunology , Monitoring, Immunologic/methods , Opportunistic Infections/blood , Aged , Cytomegalovirus/isolation & purification , Cytomegalovirus Infections/epidemiology , Cytomegalovirus Infections/immunology , Cytomegalovirus Infections/virology , Female , Flow Cytometry , Follow-Up Studies , Humans , Immunity, Cellular , Lymphocyte Count/economics , Male , Middle Aged , Opportunistic Infections/epidemiology , Opportunistic Infections/immunology , Opportunistic Infections/microbiology , Predictive Value of Tests , Prospective Studies , Risk Factors
14.
Transpl Infect Dis ; 18(3): 431-41, 2016 Jun.
Article in English | MEDLINE | ID: mdl-27061510

ABSTRACT

BACKGROUND: Recent studies suggest that Epstein-Barr virus DNAemia (EBVd) may act as a surrogate marker of post-transplant immunosuppression. This hypothesis has not been tested so far in lung transplant (LT) recipients. METHODS: We included 63 patients undergoing lung transplantation at our center between October 2008 and May 2013. Whole blood EBVd was systematically assessed by real-time polymerase chain reaction assay on a quarterly basis. The occurrence of late complications (overall and opportunistic infection [OI] and chronic lung allograft dysfunction [CLAD]) was analyzed according to the detection of EBVd within the first 6 months post transplantation. RESULTS: Any EBVd was detected in 30 (47.6%) patients. Peak EBVd was higher in patients with late overall infection (2.23 vs. 1.73 log10 copies/mL; P = 0.026) and late OI (2.39 vs. 1.74 log10 copies/mL; P = 0.004). The areas under receiver operating characteristic curves for predicting both events were 0.806 and 0.871 respectively. The presence of an EBVd ≥2 log10 copies/mL during the first 6 months post transplantation was associated with a higher risk of late OI (adjusted hazard ratio [aHR] 7.92; 95% confidence interval [CI] 2.10-29.85; P = 0.002). Patients with detectable EBVd during the first 6 months also had lower CLAD-free survival (P = 0.035), although this association did not remain statistically significant in the multivariate analysis (aHR 1.26; 95% CI 0.87-5.29; P = 0.099). CONCLUSIONS: Although preliminary in nature, our results suggest that the detection of EBVd within the first 6 months after transplantation is associated with the subsequent occurrence of late OI in LT recipients.


Subject(s)
Epstein-Barr Virus Infections/diagnosis , Herpesvirus 4, Human/isolation & purification , Lung Transplantation/adverse effects , Opportunistic Infections/etiology , Postoperative Complications/etiology , Adult , Aged , Biomarkers/blood , Cohort Studies , DNA, Viral/blood , Epstein-Barr Virus Infections/virology , Female , Follow-Up Studies , Herpesvirus 4, Human/genetics , Humans , Immunosuppression Therapy , Incidence , Male , Middle Aged , Real-Time Polymerase Chain Reaction , Retrospective Studies , Viremia
15.
Am J Transplant ; 16(11): 3220-3234, 2016 11.
Article in English | MEDLINE | ID: mdl-27105907

ABSTRACT

The prognostic factors and optimal therapy for invasive pulmonary aspergillosis (IPA) after kidney transplantation (KT) remain poorly studied. In this multinational retrospective study, we included 112 recipients diagnosed with probable (75.0% of cases) or proven (25.0%) IPA between 2000 and 2013. The median interval from transplantation to diagnosis was 230 days. Cough, fever, and expectoration were the most common symptoms at presentation. Bilateral pulmonary involvement was observed in 63.6% of cases. Positivity rates for the galactomannan assay in serum and bronchoalveolar lavage samples were 61.3% and 57.1%, respectively. Aspergillus fumigatus was the most commonly identified species. Six- and 12-week survival rates were 68.8% and 60.7%, respectively, and 22.1% of survivors experienced graft loss. Occurrence of IPA within the first 6 months (hazard ratio [HR]: 2.29; p-value = 0.027) and bilateral involvement at diagnosis (HR: 3.00; p-value = 0.017) were independent predictors of 6-week all-cause mortality, whereas the initial use of a voriconazole-based regimen showed a protective effect (HR: 0.34; p-value = 0.007). The administration of antifungal combination therapy had no apparent impact on outcome. In conclusion, IPA entails a dismal prognosis among KT recipients. Maintaining a low threshold of clinical suspicion is key to achieving a prompt diagnosis and initiating voriconazole therapy.
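As an illustrative aside, hazard ratios like those above come from a Cox proportional hazards model; a minimal sketch with the lifelines library is shown below, using synthetic data and invented column names.

```python
# Illustrative sketch only (synthetic data, invented column names): a Cox proportional
# hazards model of the kind used to estimate the hazard ratios reported above.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(7)
n = 112  # cohort size in the study
df = pd.DataFrame({
    "weeks": rng.exponential(8, n).round(1) + 0.1,  # follow-up time (synthetic)
    "died": rng.integers(0, 2, n),                  # 1 = death during follow-up
    "ipa_first_6mo": rng.integers(0, 2, n),         # IPA within 6 months of transplant
    "bilateral": rng.integers(0, 2, n),             # bilateral involvement at diagnosis
    "voriconazole_first": rng.integers(0, 2, n),    # initial voriconazole-based regimen
})

cph = CoxPHFitter()
cph.fit(df, duration_col="weeks", event_col="died")
cph.print_summary()  # hazard ratios are exp(coef), with 95% confidence intervals
```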


Subject(s)
Graft Rejection/mortality , Invasive Pulmonary Aspergillosis/mortality , Kidney Failure, Chronic/complications , Kidney Transplantation/mortality , Postoperative Complications/mortality , Aspergillus , Female , Follow-Up Studies , Glomerular Filtration Rate , Graft Rejection/etiology , Graft Rejection/pathology , Graft Survival , Humans , International Agencies , Invasive Pulmonary Aspergillosis/etiology , Invasive Pulmonary Aspergillosis/pathology , Kidney Failure, Chronic/surgery , Kidney Function Tests , Kidney Transplantation/adverse effects , Male , Middle Aged , Prognosis , Retrospective Studies , Risk Factors , Survival Rate , Transplant Recipients
16.
J Infect ; 72(6): 738-744, 2016 06.
Article in English | MEDLINE | ID: mdl-27025205

ABSTRACT

BACKGROUND: The role of galactomannan (GM) in serum or bronchoalveolar lavage fluid (BALF) for the diagnosis of invasive pulmonary aspergillosis (IPA) has been extensively evaluated in hematological patients; however, its performance in non-hematological patients is not well established. METHODS: We performed a multicenter retrospective study in 3 university hospitals in Madrid, Spain, between 2010 and 2014. The study population comprised patients with chronic obstructive pulmonary disease (COPD) and patients with immunosuppressive conditions in whom IPA was suspected and for whom BALF GM was available. Patients with hematological disorders were excluded. RESULTS: A total of 188 patients (35 with COPD and 153 with immunosuppressive conditions) were analyzed, and 31 cases of IPA (proven or probable) were identified. The global sensitivity of BALF GM (optical density index [ODI] ≥ 1.0) was 77.4%; sensitivity was higher in patients with immunosuppressive conditions than in patients with COPD (81.8% vs 66.7%; p = 0.38). In COPD patients, the best performance was obtained with BALF GM at an ODI ≥ 0.5, although sensitivity (88.9%) was the same as that of BALF fungal culture (88.9%). The sensitivity of GM in serum was very poor in both populations (36.4% and 11.6%, respectively). CONCLUSIONS: In the present series, the diagnostic performance of BALF GM for IPA was good in non-hematological patients, especially in those with immunosuppressive conditions.
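As an illustrative aside, the sensitivities quoted above follow from the usual definition, true positives divided by all true cases; the tiny sketch below uses a case breakdown back-calculated from the reported 77.4%, which is approximate and not taken from the paper.

```python
# Illustrative helper only: sensitivity of a diagnostic test such as BALF galactomannan.
def sensitivity(true_positives: int, false_negatives: int) -> float:
    """TP / (TP + FN): the share of proven/probable IPA cases detected by the test."""
    return true_positives / (true_positives + false_negatives)

# Hypothetical breakdown: a 77.4% global sensitivity over 31 IPA cases corresponds to
# roughly 24 detected and 7 missed cases (24/31 = 0.774); the abstract gives only the %.
print(f"{sensitivity(24, 7):.1%}")
```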


Subject(s)
Bronchoalveolar Lavage Fluid/chemistry , Immunocompromised Host , Invasive Pulmonary Aspergillosis/diagnosis , Mannans/analysis , Adult , Bronchoalveolar Lavage Fluid/microbiology , Female , Galactose/analogs & derivatives , Humans , Invasive Pulmonary Aspergillosis/microbiology , Male , Mannans/chemistry , Mannans/isolation & purification , Middle Aged , Neutropenia , Pulmonary Disease, Chronic Obstructive , Retrospective Studies , Sensitivity and Specificity , Spain , Young Adult
17.
Asian-Australas J Anim Sci ; 29(7): 971-8, 2016 Jul.
Article in English | MEDLINE | ID: mdl-26954149

ABSTRACT

The aim of this study was to evaluate the effect of replacing corn grain with sugar cane molasses (MO) or glucose syrup (GS) in the starter concentrate on the performance and metabolism of dairy calves. Thirty-six individually housed Holstein male calves were blocked according to weight and date of birth and assigned to one of four starter feed treatments during an 8-week study: i) starter containing 65% corn with no MO or GS (0MO); ii) starter containing 60% corn and 5% MO (5MO); iii) starter containing 55% corn and 10% MO (10MO); and iv) starter containing 60% corn and 5% GS (5GS). Animals received 4 L of milk replacer daily (20% crude protein, 16% ether extract, 12.5% solids), divided into two meals (0700 and 1700 h). Starter and water were provided ad libitum. Starter intake and fecal score were monitored daily until animals were eight weeks old. Body weight and body measurements (withers height, hip width and heart girth) were recorded weekly before the morning feeding. From the second week of age, blood samples were collected weekly, 2 h after the morning feeding, for glucose, β-hydroxybutyrate and lactate determination. Ruminal fluid was collected at 4, 6, and 8 weeks of age using an oro-ruminal probe and a suction pump for determination of pH and short-chain fatty acids (SCFA). At the end of the eighth week, animals were harvested to evaluate the development of the proximal digestive tract. The composition of the starter did not affect (p>0.05) concentrate intake, weight gain, fecal score, blood parameters, or rumen development. However, treatment 5MO showed higher (p<0.05) total concentrations of SCFA, acetate and propionate than 0MO, and these treatments did not differ from 10MO and 5GS (p>0.05). Thus, the replacement of corn with 5% or 10% sugar cane molasses or 5% GS in the starter concentrate did not affect performance; however, it had some positive effects on rumen fermentation, which may be beneficial for calves with a developing rumen.

18.
Am J Transplant ; 16(7): 2148-57, 2016 07.
Article in English | MEDLINE | ID: mdl-26813515

ABSTRACT

Risk factors for invasive pulmonary aspergillosis (IPA) after kidney transplantation have been poorly explored. We performed a multinational case-control study that included 51 kidney transplant (KT) recipients diagnosed with early (first 180 posttransplant days) IPA at 19 institutions between 2000 and 2013. Control recipients were matched (1:1 ratio) by center and date of transplantation. Overall mortality among cases was 60.8%, and 25.0% of living recipients experienced graft loss. A pretransplant diagnosis of chronic obstructive pulmonary disease (COPD; odds ratio [OR]: 9.96; 95% confidence interval [CI]: 1.09-90.58; p = 0.041) and delayed graft function (OR: 3.40; 95% CI: 1.08-10.73; p = 0.037) were identified as independent risk factors for IPA among the variables already available in the immediate peritransplant period. The development of bloodstream infection (OR: 18.76; 95% CI: 1.04-339.37; p = 0.047) and acute graft rejection (OR: 40.73; 95% CI: 3.63-456.98; p = 0.003) within the 3 mo prior to the diagnosis of IPA acted as risk factors during the subsequent period. In conclusion, pretransplant COPD, impaired graft function and the occurrence of serious posttransplant infections may be useful to identify KT recipients at the highest risk of early IPA. Future studies should explore the potential benefit of antimold prophylaxis in this group.


Subject(s)
Delayed Graft Function/etiology , Graft Rejection/etiology , Invasive Pulmonary Aspergillosis/etiology , Kidney Failure, Chronic/surgery , Kidney Transplantation/adverse effects , Case-Control Studies , Delayed Graft Function/pathology , Female , Follow-Up Studies , Glomerular Filtration Rate , Graft Rejection/pathology , Graft Survival , Humans , Invasive Pulmonary Aspergillosis/pathology , Kidney Function Tests , Male , Middle Aged , Prognosis , Risk Factors , Transplant Recipients
19.
Am J Transplant ; 16(3): 951-9, 2016 Mar.
Article in English | MEDLINE | ID: mdl-26560685

ABSTRACT

The microbiological spectrum and outcome of infectious complications following small bowel transplantation (SBT) have not been thoroughly characterized. We performed a retrospective analysis of all patients undergoing SBT from 2004 to 2013 in Spain. Sixty-nine patients underwent a total of 87 SBT procedures (65 pediatric, 22 adult). The median follow-up was 867 days. Overall, infection developed after 81 of the 87 transplants (93.1%), with a total of 263 episodes (incidence rate: 2.81 episodes per 1000 transplant-days) and no significant differences between the adult and pediatric populations. Most infections were bacterial (47.5%). Despite universal prophylaxis, cytomegalovirus disease occurred after 22 transplants (25.3%), mainly in the form of enteritis. Specifically, 54 episodes of opportunistic infection (OI) occurred in 35 transplant patients. Infection was the major cause of mortality (17 of 24 deaths). Multivariate analysis identified retransplantation (hazard ratio [HR]: 2.21; 95% confidence interval [CI]: 1.02-4.80; p = 0.046) and posttransplant renal replacement therapy (RRT; HR: 4.19; 95% CI: 1.40-12.60; p = 0.011) as risk factors for OI. RRT was also a risk factor for invasive fungal disease (IFD; HR: 24.90; 95% CI: 5.35-115.91; p < 0.001). In conclusion, infection is the most frequent complication and the leading cause of death following SBT. Posttransplant RRT and retransplantation identify those recipients at high risk of developing OI and IFD.
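As an illustrative aside, the incidence rate quoted above is an incidence density (episodes per unit of time at risk); the sketch below back-calculates the approximate total follow-up time from the reported figures, since it is not stated directly in the abstract.

```python
# Illustrative arithmetic only: incidence density = episodes / time at risk. The total
# transplant-days below is back-calculated from the quoted figures (263 episodes at
# 2.81 per 1000 transplant-days) and is therefore approximate, not reported directly.
episodes = 263
total_transplant_days = round(episodes / 2.81 * 1000)  # ~93,600 transplant-days
rate = 1000 * episodes / total_transplant_days
print(f"{rate:.2f} episodes per 1000 transplant-days over ~{total_transplant_days} days")
```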


Subject(s)
Graft Rejection/microbiology , Intestinal Diseases/surgery , Intestine, Small/transplantation , Mycoses/microbiology , Opportunistic Infections/microbiology , Postoperative Complications , Adult , Child , Child, Preschool , Female , Follow-Up Studies , Graft Rejection/epidemiology , Graft Survival , Humans , Incidence , Intestinal Diseases/complications , Intestinal Diseases/microbiology , Male , Mycoses/epidemiology , Opportunistic Infections/epidemiology , Prognosis , Retrospective Studies , Risk Factors , Young Adult
20.
Transpl Infect Dis ; 17(3): 418-23, 2015 Jun.
Article in English | MEDLINE | ID: mdl-25816889

ABSTRACT

Infections caused by Mycobacterium abscessus are emerging in immunosuppressed patients, such as solid organ transplant recipients. We report the first case, to our knowledge, of vertebral osteomyelitis caused by M. abscessus in a heart transplant recipient and review the risk factors, manifestations, and therapeutic approaches to this uncommon disease.


Subject(s)
Heart Transplantation/adverse effects , Mycobacterium/isolation & purification , Osteomyelitis/complications , Respiratory Tract Infections/complications , Aged , Anti-Bacterial Agents/therapeutic use , Clarithromycin/therapeutic use , Female , Humans , Immunocompromised Host , Mycobacterium/classification , Osteomyelitis/diagnostic imaging , Osteomyelitis/drug therapy , Osteomyelitis/microbiology , Respiratory Tract Infections/diagnostic imaging , Respiratory Tract Infections/drug therapy , Respiratory Tract Infections/microbiology , Risk Factors , Treatment Outcome