Results 1 - 7 of 7
1.
Transplant Proc ; 43(8): 2871-4, 2011 Oct.
Article in English | MEDLINE | ID: mdl-21996176

ABSTRACT

BACKGROUND: As the disparity between the number of available organ donors and the number of patients awaiting transplantation grows, different strategies have been proposed to extend the donor pool. Patients who develop acute kidney injury (AKI) during an intensive care unit (ICU) stay are often considered as donors, but the long-term outcomes of such high-risk kidney transplantations are unknown. We analyzed renal function and outcomes over 5 years for kidney grafts recovered from deceased donors diagnosed with AKI. MATERIALS AND METHODS: We collected data on 61 deceased kidney donors, identified in 1 ICU, and 120 kidney graft recipients who underwent transplantation between January 1999 and December 2006. Donors were stratified according to the RIFLE classification, based on changes in their creatinine and urine output between ICU admission and organ procurement. Recipient kidney graft function (eGFR), calculated with the MDRD (Modification of Diet in Renal Disease) equation, was estimated every 6 months. RESULTS: Among the 61 donors, 10 (16.4%) developed AKI: 7 classified as "risk," 2 as "injury," and 1 as "failure." The mean follow-up of kidney graft recipients was 49 ± 18 months. The long-term risk of graft loss was significantly higher for kidneys recovered from donors with AKI (27.8% vs 7.1%; P = .02; log-rank = 0.07), and their excretory function was worse over the whole follow-up period. CONCLUSION: Recipients of kidney grafts obtained from donors with AKI showed a higher risk of graft loss and worse excretory function on long-term follow-up.
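For context, the MDRD equation mentioned above can be sketched as a short function. The abstract does not state which coefficient set was used, so the common IDMS-traceable 4-variable version (constant 175) is assumed here purely as an illustration.

```python
def mdrd_egfr(creatinine_mg_dl, age_years, female=False, black=False):
    """Estimate GFR (ml/min/1.73 m^2) with the 4-variable MDRD equation.

    Assumes the IDMS-traceable coefficient set (175); the abstract does
    not specify which version of the equation the study applied.
    """
    egfr = 175.0 * creatinine_mg_dl ** -1.154 * age_years ** -0.203
    if female:
        egfr *= 0.742   # standard MDRD sex adjustment
    if black:
        egfr *= 1.212   # standard MDRD race adjustment
    return egfr
```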


Subject(s)
Acute Kidney Injury , Kidney Transplantation/adverse effects , Tissue Donors , Tissue and Organ Procurement , Acute Kidney Injury/diagnosis , Acute Kidney Injury/mortality , Adult , Cause of Death , Critical Care , Female , Humans , Kidney Transplantation/physiology , Male , Middle Aged , Prognosis , Retrospective Studies , Risk Factors
2.
Inflamm Res ; 53(8): 338-43, 2004 Aug.
Article in English | MEDLINE | ID: mdl-15316663

ABSTRACT

RATIONALE: IL-10, the main anti-inflammatory cytokine, may play a pivotal role in the cerebral inflammation implicated in the development of brain edema and secondary brain damage after injury. AIM OF THE STUDY: 1) To determine the absolute serum IL-10 level and its pattern in critically ill patients with traumatic and non-traumatic acute brain injury. 2) To assess the prognostic value of serum IL-10 in these patients. MATERIALS AND METHODS: Serum IL-10 levels in 46 adults (multi-profile ICU, teaching hospital) with traumatic brain injury (TBI, N = 18), nontraumatic intracranial hemorrhage (SAH, N = 11), or polytrauma with concomitant brain injury (POL, N = 17) were measured by ELISA. The relationship of IL-10 to initial diagnosis, clinical state, outcome, and risk of developing infection was evaluated. RESULTS: IL-10 was detectable in the serum of all but one patient on ICU admission (56.6 ± 91.9 pg/ml; mean ± SD). No statistically significant differences in IL-10 were found between the TBI, SAH, and POL groups, or between survivors and non-survivors, on any day. No correlation between IL-10 and GCS or SAPS II was seen. A significant fall in serum IL-10 during the first 4 days after injury was observed in all subgroups. Patients with an initial serum IL-10 below 77 pg/ml were at significantly higher risk of developing any infection within the first week after injury. CONCLUSIONS: After acute brain injury, serum IL-10 in adults is detectable independent of the type of CNS lesion. Its systemic release is strongly individualized. Serum IL-10 on ICU admission may have some prognostic value for predicting the development of infection in patients with CNS lesions.


Subject(s)
Intensive Care Units , Interleukin-10/blood , Trauma, Nervous System/blood , Adult , Biomarkers , Female , Humans , Male , Middle Aged , Reference Values , Trauma Severity Indices , Trauma, Nervous System/mortality
3.
Endoscopy ; 36(6): 508-14, 2004 Jun.
Article in English | MEDLINE | ID: mdl-15202047

ABSTRACT

BACKGROUND AND STUDY AIM: Colonoscopy is a common gastroenterological procedure for investigation of the bowel. Its main side effects are pain during the investigation, cardiovascular complications, and, very rarely, even death. The aim of this study was to compare the continuous fluctuation of heart rate variability (HRV) components during colonoscopy under normal conditions, analgesia/sedation, and total intravenous anesthesia. PATIENTS AND METHODS: 37 consecutive patients (aged 35 - 65) were randomly allocated to three groups: no sedation (control group 1), analgesia/sedation (group 2), and total intravenous anesthesia (group 3). Holter electrocardiography and subsequent frequency-domain analysis were undertaken. The low-frequency (LF, 0.04 - 0.15 Hz) and high-frequency (HF, 0.15 - 0.40 Hz) components were estimated by spectral analysis in the usual way. Normalized units (nu) were calculated from the following equations: LFnu = LF/(LF + HF) and HFnu = HF/(LF + HF). RESULTS: Groups 2 and 3 had a significantly lower HFnu and higher LFnu than group 1 essentially throughout the procedure. One-way analysis of variance and the t-test confirmed that the differences were significant when the colonoscope reached the splenic flexure, as were the LF/HF balances at the splenic flexure, hepatic flexure, and cecum. The percentage change in LF/HF was also analyzed: in group 3 the mean change exceeded 136 % when the colonoscope reached the sigmoid flexure, significantly higher than in the other two groups. CONCLUSION: Most changes in HRV components occurred during colonoscopy of the left side of the bowel. Analgesia/sedation and total intravenous anesthesia increased HRV by increasing the LF component.
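The normalized-unit equations quoted in the abstract (LFnu = LF/(LF + HF), HFnu = HF/(LF + HF)) translate directly into code; this minimal sketch takes the LF and HF band powers as already-computed inputs rather than deriving them from an ECG recording.

```python
def normalized_units(lf_power, hf_power):
    """Return (LFnu, HFnu, LF/HF ratio) from spectral band powers,
    per the equations given in the study:
    LFnu = LF/(LF + HF), HFnu = HF/(LF + HF)."""
    total = lf_power + hf_power
    if total <= 0 or hf_power <= 0:
        raise ValueError("band powers must be positive")
    return lf_power / total, hf_power / total, lf_power / hf_power
```

Note that LFnu + HFnu is always 1 by construction, so a rise in the LF component necessarily appears as a fall in HFnu, as in the reported results.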


Subject(s)
Colonoscopy , Heart Rate/physiology , Sympathetic Nervous System/physiology , Vagus Nerve/physiology , Adult , Aged , Analgesics, Opioid/therapeutic use , Anesthetics, Intravenous/administration & dosage , Cecum/anatomy & histology , Colon, Ascending/anatomy & histology , Colon, Descending/anatomy & histology , Colon, Transverse/anatomy & histology , Conscious Sedation , Electrocardiography, Ambulatory , Female , Fentanyl/administration & dosage , Humans , Male , Midazolam/administration & dosage , Middle Aged , Piperidines/therapeutic use , Propofol/administration & dosage , Remifentanil
4.
Med Pr ; 52(5): 361-8, 2001.
Article in Polish | MEDLINE | ID: mdl-11828851

ABSTRACT

The building of physiologically based toxicokinetic (PBTK) models relies on simulation languages such as the Advanced Continuous Simulation Language (ACSL). The aim of this study was to develop the principles of constructing a fundamental model and models dedicated to chemicals found in the work environment, e.g. trimethylbenzene (TMB) isomers present in the petrochemical, paint and lacquer, and related industries. The fundamental model is based on four main compartments (fat tissue, richly perfused tissue, slowly perfused tissue, and liver) and six auxiliary compartments (lungs, venous blood, arterial blood, body weight, inhaled air, and exhaled air). The basic element of the PBTK model comprises blocks containing definitions of variables and constants, supplemented by command, calculated, transferred, and resulting parameters. Models dedicated to various chemicals and organisms are built by suitable modification of the fundamental model. All sets of command parameter values for the organism, chemical, and simulation are written in text files and loaded before or during the simulation. Empirical data obtained in experiments with volunteers are used in a similar way. A specimen dedicated model was built for 1,2,3-TMB (hemimellitene). Excretion rate data for 2,3-dimethylbenzoic acid (2,3-DMBA), a hemimellitene metabolite, obtained from an experiment in which volunteers were exposed to hemimellitene at 25 and 100 mg/m3, were compared with the results of the computer simulation. A high convergence of the compared values was obtained. Simulations were also made for exposure periods of one week and one month. The results confirmed the experiment-based recommendations on the assessment of occupational exposure. The application of the new physiologically based toxicokinetic models makes it possible to forecast toxic chemical (or metabolite) concentrations corresponding to the concentrations of those chemicals in the workplace atmosphere.
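The compartment scheme described above (four main tissue compartments fed by arterial blood, with gas exchange in the lungs) can be sketched as a flow-limited model integrated with a simple Euler loop. All parameter values below are invented placeholders, not those of the published ACSL model, and metabolism is reduced to first-order hepatic clearance for illustration.

```python
# Illustrative flow-limited PBTK skeleton with the four main compartments
# named in the abstract (fat, richly perfused, slowly perfused, liver).
# Parameter values are hypothetical placeholders, NOT the published ones.

TISSUES = {            # (blood flow L/h, volume L, tissue:blood partition)
    "fat":   (5.0,  10.0, 50.0),
    "rich":  (44.0,  5.0,  5.0),
    "slow":  (15.0, 40.0,  3.0),
    "liver": (19.0,  1.5,  5.0),
}
QP, QC = 300.0, 83.0   # alveolar ventilation, cardiac output (L/h)
PB = 60.0              # blood:air partition coefficient
KMET = 2.0             # first-order hepatic metabolism rate (1/h)

def simulate(conc_inhaled, hours, dt=0.001):
    """Euler integration; returns venous blood concentration at the end."""
    amounts = {name: 0.0 for name in TISSUES}  # amount in each tissue (mg)
    cv = 0.0                                   # mixed venous concentration
    for _ in range(int(hours / dt)):
        # steady-state alveolar exchange gives the arterial concentration
        ca = (QC * cv + QP * conc_inhaled) / (QC + QP / PB)
        cvt = {}
        for name, (q, v, p) in TISSUES.items():
            cvt[name] = (amounts[name] / v) / p   # venous conc leaving tissue
            dadt = q * (ca - cvt[name])
            if name == "liver":
                dadt -= KMET * amounts[name]      # hepatic metabolism
            amounts[name] += dadt * dt
        cv = sum(q * cvt[n] for n, (q, v, p) in TISSUES.items()) / QC
    return cv
```

This is the generic structure such models share; the published model additionally tracks the auxiliary compartments (lungs, blood pools, body weight, inhaled and exhaled air) and uses saturable metabolism fitted to the volunteer data.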


Subject(s)
Computer Simulation , Hazardous Substances/analysis , Models, Biological , Software , Toxicology/methods , Body Fluid Compartments , Humans , Nonlinear Dynamics , Programming Languages
5.
Med Pr ; 51(5): 447-56, 2000.
Article in Polish | MEDLINE | ID: mdl-11199174

ABSTRACT

For the description of the processes of absorption, excretion, or elimination of chemicals, open one- or two-compartment models have been used thus far. The latter consist mainly of a fast (central) and a slow (peripheral) compartment. These toxicological studies were based on the assumption that the underlying processes follow first-order kinetics. However, the absorption, elimination, or excretion of toxic chemicals are in fact much more complicated processes that should be described using, e.g., physiologically based toxicokinetic (PBTK) models, which cover physiological, biochemical, and metabolic parameters, as well as allometric calibration of selected parameters for interspecies extrapolations and in vitro/in vivo extrapolations of metabolic parameters. Simulation languages, e.g. ACSL (Advanced Continuous Simulation Language), are indispensable tools for working with PBTK models. They have been developed for modelling systems described by time-dependent non-linear differential equations and/or transfer functions. ACSL with its interfaces (ACSL Builder, ACSL Graphic Modeller, ACSL Math) ensures data input and communication inside the model through control, transfer, and computed parameters. Physiologically based toxicokinetic models employ a large number of different parameters, which enables, for example: forecasting the dose/effect or dose/response relationship; determining the absorption rate, metabolic pathways, and excretion or elimination as a function of the absorbed dose of a xenobiotic; risk assessment; extrapolation from high doses to the low doses characteristic of environmental exposure; or setting biological exposure limits.
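The classical first-order model that the review contrasts with PBTK approaches is compact enough to state directly: in a one-compartment model, concentration decays as C(t) = C0 · exp(-k·t), with half-life ln 2 / k. A minimal sketch:

```python
import math

def one_compartment(c0, k_elim, t):
    """Concentration after time t under first-order elimination:
    C(t) = C0 * exp(-k * t), the classical one-compartment model."""
    return c0 * math.exp(-k_elim * t)

def half_life(k_elim):
    """Elimination half-life t1/2 = ln(2) / k for first-order kinetics."""
    return math.log(2) / k_elim
```

The single rate constant k is exactly the simplification the abstract criticizes: a PBTK model replaces it with tissue-specific flows, volumes, partition coefficients, and metabolic parameters.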


Subject(s)
Computer Simulation , Models, Biological , Programming Languages , Xenobiotics/pharmacokinetics , Body Fluid Compartments/physiology , Nonlinear Dynamics
6.
Klin Oczna ; 98(1): 45-9, 1996 Jan.
Article in Polish | MEDLINE | ID: mdl-9019575

ABSTRACT

PURPOSE: This prospective study was designed to compare intraocular pressure changes and the haemodynamic response to insertion of either a laryngeal mask or an orotracheal tube during general anaesthesia for cataract surgery. MATERIAL AND METHODS: The effect of the airway-securing technique (tracheal tube, TT, vs laryngeal mask airway, LMA) on heart rate (HR), arterial pressure (SAP, DAP), oxygen saturation (SpO2), end-tidal carbon dioxide pressure (Et-CO2), intraocular pressure (IOP), and the incidence of coughing, stridor, and sore throat was analysed in 60 patients undergoing cataract surgery under general anaesthesia. RESULTS: The mean values of HR, SAP, DAP, and IOP measured before and after induction of anaesthesia did not differ between the groups. After the airway was secured, mean HR increased to 95.5/min in the TT group and decreased to 75.7/min in the LMA group. SAP increased in the TT group to 131 mmHg and DAP to 82.6 mmHg, whilst in the LMA group both values decreased, to 98.6 mmHg and 66.3 mmHg, respectively. A significant difference in IOP was observed after intubation or insertion of the laryngeal mask: in the TT group, intraocular pressure increased to 15 mmHg in the healthy eye and to 13.6 mmHg in the affected eye, whilst in the LMA group it decreased to 5.5 and 7.43 mmHg, respectively. Furthermore, a greater incidence of complications such as coughing, stridor, and sore throat was observed in the TT group. CONCLUSION: The results show that use of the LMA during general anaesthesia for microsurgery is more advantageous and safer for patients than tracheal intubation.


Subject(s)
Anesthesia, General/instrumentation , Cataract Extraction/methods , Intubation, Intratracheal , Laryngeal Masks , Adult , Aged , Aged, 80 and over , Cough/etiology , Female , Hemodynamics/physiology , Humans , Intraocular Pressure/physiology , Intraoperative Period , Intubation, Intratracheal/adverse effects , Male , Microsurgery , Middle Aged , Pharyngitis/etiology , Prospective Studies , Respiratory Sounds/etiology