Results 1 - 10 of 10
1.
Rev Esp Quimioter ; 34 Suppl 1: 49-51, 2021 Sep.
Article in English | MEDLINE | ID: mdl-34598427

ABSTRACT

The diagnosis of SARS-CoV-2 infection is based on the use of nucleic acid amplification tests (NAAT), especially rRT-PCR, which also allows rapid identification of variants of concern. However, its use in patient follow-up, and the correlation between Ct values and viral viability, remain controversial.
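The Ct value mentioned above is an indirect, assay-dependent proxy for viral load. A minimal sketch of the usual standard-curve conversion, assuming a hypothetical linear calibration (the slope and intercept below are illustrative assumptions, not values from the cited article):

```python
def ct_to_log10_copies(ct, slope=-3.32, intercept=40.0):
    """Estimate log10 copies/mL from an rRT-PCR Ct value via a linear
    standard curve: Ct = intercept + slope * log10(copies).

    A slope of about -3.32 corresponds to ~100% amplification efficiency
    (a 10-fold dilution shifts Ct by log2(10) ≈ 3.32 cycles).
    """
    return (ct - intercept) / slope

# Lower Ct means higher estimated load, but the calibration is
# assay-specific, and none of this measures whether the virus is viable.
print(ct_to_log10_copies(23.4))  # ≈ 5.0 log10 copies/mL (high load)
print(ct_to_log10_copies(36.7))  # ≈ 1.0 log10 copies/mL (low load)
```

The conversion says nothing about infectivity, which is one reason the Ct-viability correlation discussed in the abstract is controversial.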


Subject(s)
COVID-19 , SARS-CoV-2 , COVID-19 Testing , Humans , Pathology, Molecular
2.
Transpl Infect Dis ; 18(3): 431-41, 2016 Jun.
Article in English | MEDLINE | ID: mdl-27061510

ABSTRACT

BACKGROUND: Recent studies suggest that Epstein-Barr virus DNAemia (EBVd) may act as a surrogate marker of post-transplant immunosuppression. This hypothesis has not been tested so far in lung transplant (LT) recipients. METHODS: We included 63 patients undergoing lung transplantation at our center between October 2008 and May 2013. Whole-blood EBVd was systematically assessed by real-time polymerase chain reaction assay on a quarterly basis. The occurrence of late complications (overall and opportunistic infection [OI] and chronic lung allograft dysfunction [CLAD]) was analyzed according to the detection of EBVd within the first 6 months post transplantation. RESULTS: EBVd was detected in 30 (47.6%) patients. Peak EBVd was higher in patients with late overall infection (2.23 vs. 1.73 log10 copies/mL; P = 0.026) and late OI (2.39 vs. 1.74 log10 copies/mL; P = 0.004). The areas under the receiver operating characteristic curves for predicting these two events were 0.806 and 0.871, respectively. The presence of an EBVd ≥2 log10 copies/mL during the first 6 months post transplantation was associated with a higher risk of late OI (adjusted hazard ratio [aHR] 7.92; 95% confidence interval [CI] 2.10-29.85; P = 0.002). Patients with detectable EBVd during the first 6 months also had lower CLAD-free survival (P = 0.035), although this association did not remain statistically significant in the multivariate analysis (aHR 1.26; 95% CI 0.87-5.29; P = 0.099). CONCLUSIONS: Although preliminary in nature, our results suggest that the detection of EBVd within the first 6 months after transplantation is associated with the subsequent occurrence of late OI in LT recipients.
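The AUC values reported above have a simple probabilistic reading: the probability that a randomly chosen case has a higher marker value than a randomly chosen non-case. A minimal sketch of that Mann-Whitney computation, with invented peak EBVd values (not the cohort's data):

```python
def roc_auc(cases, controls):
    """ROC AUC via the Mann-Whitney interpretation:
    AUC = P(case value > control value) + 0.5 * P(tie)."""
    wins = ties = 0
    for c in cases:
        for k in controls:
            if c > k:
                wins += 1
            elif c == k:
                ties += 1
    return (wins + 0.5 * ties) / (len(cases) * len(controls))

# Hypothetical peak EBVd, log10 copies/mL
late_oi    = [2.4, 2.6, 2.2, 2.9]   # patients who developed late OI
no_late_oi = [1.7, 1.9, 1.5, 2.3]   # patients who did not

print(round(roc_auc(late_oi, no_late_oi), 3))  # → 0.938
```

Perfect separation of the two groups would give 1.0; an uninformative marker gives 0.5, so values like the study's 0.806 and 0.871 indicate good but imperfect discrimination.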


Subject(s)
Epstein-Barr Virus Infections/diagnosis , Herpesvirus 4, Human/isolation & purification , Lung Transplantation/adverse effects , Opportunistic Infections/etiology , Postoperative Complications/etiology , Adult , Aged , Biomarkers/blood , Cohort Studies , DNA, Viral/blood , Epstein-Barr Virus Infections/virology , Female , Follow-Up Studies , Herpesvirus 4, Human/genetics , Humans , Immunosuppression Therapy , Incidence , Male , Middle Aged , Real-Time Polymerase Chain Reaction , Retrospective Studies , Viremia
3.
Transplant Proc ; 47(1): 57-61, 2015.
Article in English | MEDLINE | ID: mdl-25645770

ABSTRACT

BACKGROUND: Mammalian target of rapamycin inhibitors (mTOR-i) have been proposed as possible immunosuppressants of choice in BK virus nephropathy (BKN) because of their antiviral capacity. On this basis, in 2007 our Service adopted a protocol for conversion from calcineurin inhibitors to everolimus (EVE)-based, calcineurin inhibitor-free therapy in patients diagnosed with BKN. METHODS: A prospective, single-center case series study was performed. Fifteen cases of BKN were diagnosed from 2007 to the end of 2010. According to our protocol, immunosuppressant treatment was modified in 9 of these patients, with discontinuation of mycophenolate and conversion from tacrolimus to EVE. RESULTS: The renal function achieved by our patients after transplantation was excellent, with a mean serum creatinine (sCr) of 1.16 ± 0.2 mg/dL. The evolution of renal function after BKN diagnosis and conversion to mTOR-i was favorable in all patients: sCr at diagnosis was 1.85 ± 0.22 mg/dL, sCr at the time of conversion to EVE was 2 ± 0.21 mg/dL, and final sCr at the end of follow-up was 1.6 ± 0.39 mg/dL (P = .05). BK viremia became negative in 5 of our patients and decreased by more than 95% in the remaining 4. None of the patients had an acute rejection episode after the change of immunosuppressant. CONCLUSIONS: Conversion to mTOR-i-based therapy could provide an added benefit in BKN and could be an effective strategy for decreasing viremia and increasing graft survival in selected patients.


Subject(s)
BK Virus , Immunosuppressive Agents/therapeutic use , Kidney Diseases/therapy , Kidney Transplantation , Polyomavirus Infections/prevention & control , Sirolimus/analogs & derivatives , Adult , Calcineurin Inhibitors , Everolimus , Female , Graft Survival , Humans , Immunosuppression Therapy , Kidney Diseases/diagnosis , Kidney Diseases/epidemiology , Male , Middle Aged , Polyomavirus Infections/diagnosis , Polyomavirus Infections/epidemiology , Prospective Studies , Sirolimus/therapeutic use , Tacrolimus/therapeutic use , Viral Load , Viremia/diagnosis , Viremia/etiology , Viremia/prevention & control
4.
J Orthop Res ; 14(4): 663-7, 1996 Jul.
Article in English | MEDLINE | ID: mdl-8764878

ABSTRACT

Experiments were performed on 120 rabbits to compare the probability of infection after bone surgery without an implant, with polymethylmethacrylate, and with autografts. Staphylococcus aureus phage type 94/96, isolated from a case of human osteomyelitis, was instilled into the intramedullary cavity after reaming of the femoral canal and before insertion of the implant. The 50% infective dose (ID50) was determined for each group for comparison. The bacterial concentration required to produce infection in femora without an implant was half that necessary in femora implanted with polymethylmethacrylate. Bone grafts required a bacterial concentration nine times lower than that necessary to infect femora containing polymethylmethacrylate and four times lower than that required to infect femora without an implant. These results confirm that susceptibility to infection in orthopaedic surgery is not only material dependent but also bacteria dependent.
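The group comparisons above reduce to ratios of 50% endpoints. A minimal sketch of estimating an ID50 by linear interpolation between dose levels and expressing a group difference as a fold change, using invented dose-response data (the article's actual endpoint method, e.g. Reed-Muench, may differ):

```python
def id50_log10(log_doses, infected_fraction):
    """Return the log10 dose at which 50% of animals are infected,
    interpolating linearly between the two bracketing dose levels.
    Assumes infected_fraction is non-decreasing with dose."""
    pairs = list(zip(log_doses, infected_fraction))
    for (d0, p0), (d1, p1) in zip(pairs, pairs[1:]):
        if p0 <= 0.5 <= p1:
            return d0 + (0.5 - p0) * (d1 - d0) / (p1 - p0)
    raise ValueError("50% endpoint not bracketed by the data")

# Hypothetical groups: log10 CFU of inoculum vs. fraction of femora infected
no_implant = id50_log10([2, 3, 4, 5], [0.0, 0.50, 0.90, 1.0])   # ID50 = 3.0
with_pmma  = id50_log10([2, 3, 4, 5], [0.0, 0.25, 0.75, 1.0])   # ID50 = 3.5

# Fold difference in inoculum needed: ≈3.2-fold fewer bacteria required
# in the no-implant group under these invented numbers.
print(round(10 ** (with_pmma - no_implant), 2))
```

A lower ID50 means the group is more susceptible, which is how the abstract's "times lower" comparisons should be read.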


Subject(s)
Bone and Bones/microbiology , Staphylococcal Infections/physiopathology , Staphylococcus aureus , Wound Infection , Animals , Bacteriological Techniques , Bone Transplantation , Humans , Male , Polymethacrylic Acids , Prostheses and Implants , Rabbits , Transplantation, Autologous , Wound Healing/physiology
5.
Injury ; 27 Suppl 3: SC34-7, 1996.
Article in English | MEDLINE | ID: mdl-9039352

ABSTRACT

The influence of the localization and size of orthopaedic implants on infection has been analyzed extensively, but the influence of implant shape and chemical composition has rarely been studied, and the influence of the surface has been described in only a single report. Several experimental studies have tried to compare the incidence of infection for different materials. PMMA usually appears as the implant material most prone to causing infection, while titanium (Ti) and cobalt-chromium (CoCr) are the materials most resistant to infection. For polished-surface cylinders implanted in rabbit femora, producing a clinical infection required 40 times more inoculum than for porous CoCr implants, and 2.5 times more inoculum than for porous Ti implants.


Subject(s)
Prostheses and Implants/adverse effects , Prosthesis-Related Infections/etiology , Surgical Wound Infection/etiology , Animals , Chromium Alloys/adverse effects , Cricetinae , Dogs , Durapatite/adverse effects , Methylmethacrylates/adverse effects , Polyethylenes/adverse effects , Rabbits , Stainless Steel/adverse effects , Surface Properties , Titanium/adverse effects
6.
J Bone Joint Surg Br ; 76(5): 717-20, 1994 Sep.
Article in English | MEDLINE | ID: mdl-8083257

ABSTRACT

We implanted cylinders of cobalt-chrome or titanium, with smooth or porous surfaces, into rabbit bones which had been inoculated with suspensions of Staphylococcus aureus in various doses. The bacterial concentration required to produce infection of porous-coated titanium implants was 2.5 times smaller than that necessary to infect implants with polished surfaces. Porous-coated cobalt-chromium implants required bacterial concentrations that were 40 times smaller than those needed to infect implants with polished surfaces, and 15 times smaller than those required to infect porous-coated titanium implants. The other advantages and disadvantages of the various implants, such as improved osseointegration, larger ion-release surfaces, surface wear and relative stiffness, must be weighed against the higher infection rates in the porous-coated implants, and particularly in the cobalt-chromium implants.


Subject(s)
Chromium Alloys , Hip Prosthesis/adverse effects , Prostheses and Implants , Prosthesis-Related Infections/microbiology , Staphylococcal Infections/microbiology , Titanium , Animals , Double-Blind Method , Equipment Contamination , False Negative Reactions , False Positive Reactions , Male , Models, Biological , Prosthesis Design , Prosthesis-Related Infections/epidemiology , Rabbits , Random Allocation , Reproducibility of Results , Staphylococcal Infections/epidemiology , Surface Properties
8.
Enferm Infecc Microbiol Clin ; 8(9): 553-9, 1990 Nov.
Article in Spanish | MEDLINE | ID: mdl-2099856

ABSTRACT

We evaluated 44 cases of Serratia marcescens bacteremia (SB). Most occurred in surgical services (57%) and the ICU (34%). On one occasion, cases developed as an epidemic outbreak. SB developed mainly in patients with underlying diseases (neoplasia in 32%, heart disease in 16%, chronic bronchitis in 14%, and miscellaneous in 20%) in whom some invasive procedure had been carried out (98%). The most common complication was septic shock. In 17 cases the infection was polymicrobial. The most common serogroup was O:5 (41%). Of the strains, 98% were resistant to cephalothin, 78% to ampicillin, and 29% to tobramycin. The mortality rate was 39%, and the most common cause of death was septic shock. The factors that adversely influenced prognosis were, in order of decreasing importance: leukocytosis, thrombocytopenia, associated gram-positive infection, age older than 65 years, "non-typable" serogroup, unknown portal of entry, epidemic case, and septic shock.


Subject(s)
Cross Infection/epidemiology , Enterobacteriaceae Infections/epidemiology , Sepsis/epidemiology , Serratia marcescens , Adult , Aged , Aged, 80 and over , Anti-Bacterial Agents/therapeutic use , Cross Infection/microbiology , Cross Infection/mortality , Disease Outbreaks , Drug Resistance, Microbial , Enterobacteriaceae Infections/drug therapy , Enterobacteriaceae Infections/mortality , Female , Humans , Male , Middle Aged , Prognosis , Retrospective Studies , Risk Factors , Sepsis/drug therapy , Sepsis/microbiology , Sepsis/mortality , Serratia marcescens/drug effects , Serratia marcescens/isolation & purification , Shock, Septic/etiology , Shock, Septic/mortality