Results 1 - 20 of 25
1.
Infection ; 42(6): 1013-22, 2014 Dec.
Article in English | MEDLINE | ID: mdl-25143193

ABSTRACT

PURPOSE: Vancomycin-resistant enterococci (VRE) are important causes of intensive care unit (ICU) infections. Our goal was to identify the prevalence of and risk factors for VRE colonization upon ICU admission and during ICU stay, as well as their impact on enterococcal infection, including vancomycin-susceptible (VSE) cases. METHODS: A prospective study of patients admitted to the ICU (n = 497) was conducted over a 24-month period. Rectal swabs were collected upon admission and during hospitalization and inoculated onto selective medium. Enterococci were phenotypically characterized. van genes were investigated by PCR, and clones were identified by pulsed-field gel electrophoresis and multilocus sequence typing. Epidemiologic data were collected from the ICU database. RESULTS: Risk factors for VRE carriage upon ICU admission (71/497) were duration of previous hospitalization, glycopeptide administration, chronic heart failure, malignancy, insulin-dependent diabetes mellitus, and previous enterococcal infection (VRE and/or VSE). Risk factors for VRE colonization during ICU stay (36/250) were quinolone administration, chronic obstructive pulmonary disease, chronic renal failure, and the number of VRE-positive patients in nearby beds. Risk factors for enterococcal infection during ICU stay (15/284), including VRE and VSE cases, were administration of third- or fourth-generation cephalosporins, cortisone use before ICU admission, and VRE colonization, whereas enteral nutrition was a protective factor. CONCLUSIONS: Previous VRE colonization and antibiotic usage are essential parameters for enterococcal infection (by VRE or VSE) during ICU stay. Previous enterococcal infection, co-morbidities and antibiotic usage are associated with VRE colonization upon ICU admission, whereas patient-to-patient transmission, co-morbidities and antibiotic usage constitute risk factors for VRE colonization during ICU hospitalization.


Subject(s)
Cross Infection/microbiology , Gram-Positive Bacterial Infections/microbiology , Vancomycin-Resistant Enterococci/pathogenicity , Adult , Aged , Analysis of Variance , Anti-Bacterial Agents/pharmacology , Anti-Bacterial Agents/therapeutic use , Critical Illness , Cross Infection/drug therapy , Environmental Microbiology , Female , Gram-Positive Bacterial Infections/drug therapy , Humans , Intensive Care Units , Male , Microbial Sensitivity Tests , Middle Aged , Prevalence , Prospective Studies , Risk Factors , Vancomycin-Resistant Enterococci/drug effects , Vancomycin-Resistant Enterococci/isolation & purification
2.
Infection ; 42(5): 883-90, 2014 Oct.
Article in English | MEDLINE | ID: mdl-25008195

ABSTRACT

PURPOSE: To identify the risk factors for incident enteric colonization by KPC-producing Klebsiella pneumoniae (KPC-Kp) resistant to colistin or tigecycline during intensive care unit (ICU) stay. METHOD: A prospective observational study of patients admitted to the ICU was conducted over a 27-month period. Rectal samples taken upon admission and weekly thereafter were inoculated on selective chromogenic agar. K. pneumoniae isolates were characterized by standard methodology. Minimum inhibitory concentrations (MICs) of colistin and tigecycline were determined by E-test. The presence of the bla KPC gene was confirmed by PCR. RESULTS: Among 254 patients, 62 (24.4%) became colonized by colistin-resistant KPC-Kp during their stay. Multivariate analysis revealed that corticosteroid and colistin administration and the number of colonized patients in nearby beds per day were significantly associated with colonization. Among 257 patients, 39 (17.9%) became colonized by tigecycline-resistant KPC-Kp during their stay. Risk factors identified by multivariate analysis were days at risk, obesity, the number of colonized patients treated in nearby beds per day, and administration of tigecycline. CONCLUSIONS: The high prevalence of colistin- or tigecycline-resistant KPC-Kp enteric carriage in ICU patients indicates that dissemination occurs by transfer from patient to patient via personnel, and underscores the importance of strict infection control protocols.


Subject(s)
Anti-Bacterial Agents/pharmacology , Colistin/pharmacology , Drug Resistance, Bacterial , Klebsiella Infections/epidemiology , Klebsiella pneumoniae/drug effects , Minocycline/analogs & derivatives , Adult , Aged , Bacterial Proteins/genetics , Bacterial Proteins/metabolism , Female , Greece/epidemiology , Hospitalization , Hospitals, University , Humans , Intensive Care Units , Klebsiella Infections/microbiology , Klebsiella Infections/transmission , Klebsiella pneumoniae/enzymology , Male , Middle Aged , Minocycline/pharmacology , Prospective Studies , Risk Factors , Tertiary Care Centers , Tigecycline , beta-Lactamases/genetics , beta-Lactamases/metabolism
3.
J Clin Pharm Ther ; 35(5): 603-8, 2010 Oct.
Article in English | MEDLINE | ID: mdl-20831684

ABSTRACT

BACKGROUND AND OBJECTIVE: Use of opioids is common in perioperative haemodialysis patients because they often suffer from intractable ischaemic or neuropathic lower extremity pain. Intravenous (IV) fentanyl patient-controlled analgesia (PCA) does not appear to have been evaluated in this setting; hence this study. METHODS AND RESULTS: This was a prospective, single-centre study. IV fentanyl PCA was used for pain control in 16 patients with lower extremity neuropathic/ischaemic pain scheduled for major lower extremity amputation. IV fentanyl PCA was used before and after amputation in eight patients, before but not after amputation in seven patients, and until death in one terminal cancer patient who chose to forgo surgery. Pain intensity was assessed with the Visual Analogue Scale (VAS) and the McGill Pain Questionnaire. Depth of sedation was assessed on a 4-point scale. Ischaemic pain scores were high before fentanyl PCA was started but decreased significantly and remained low with fentanyl PCA use (P < 0.001). Phantom pain scores were low (VAS ≤ 4). Respiratory depression was not a problem in any patient. CONCLUSIONS: Concerns about accumulation of active opioid metabolites make provision of adequate analgesia problematic in haemodialysis patients scheduled for amputation or emergency surgery. Our data from a small patient population suggest that IV fentanyl PCA is safe and effective for severe pain in haemodialysis patients.


Subject(s)
Analgesics, Opioid/administration & dosage , Fentanyl/administration & dosage , Neuralgia/drug therapy , Pain, Postoperative/drug therapy , Pain/drug therapy , Adult , Aged , Analgesia, Patient-Controlled/adverse effects , Analgesics, Opioid/adverse effects , Analgesics, Opioid/therapeutic use , Diabetic Neuropathies/drug therapy , Female , Fentanyl/adverse effects , Fentanyl/therapeutic use , Humans , Infusions, Intravenous , Injections, Intravenous , Male , Middle Aged , Neuralgia/etiology , Pain/etiology , Pain Measurement , Pain, Postoperative/etiology , Prospective Studies , Renal Dialysis
4.
Anaesth Intensive Care ; 38(1): 190-3, 2010 Jan.
Article in English | MEDLINE | ID: mdl-20191796

ABSTRACT

We investigated time-dependent ocular surface bacterial colonisation in sedated patients hospitalised in an intensive care unit and aimed to evaluate whether appropriate topical antibiotic prophylaxis could prevent corneal infection. The study lasted 12 months and included 134 patients undergoing sedation and mechanical respiratory support for various medical reasons. Patients hospitalised for less than seven days and those with pre-existing ocular surface pathology were excluded. All patients were examined on admission by inspecting the cornea for erosions. Follow-up examinations were performed each subsequent day. Cultures were also obtained from the conjunctival sac of both eyes on admission and every seventh day until the end of sedation. Standard laboratory techniques were used for isolation, identification and antibiotic susceptibility testing of bacteria. Prophylactic antibiotic treatment was administered accordingly. Analysis was carried out for 70 patients. Duration of sedation ranged from seven to 122 days. Fifty-four (77%) patients were colonised by at least one bacterial species other than normal flora within seven to 42 days. Multiple bacteria were isolated from 28 patients undergoing prolonged sedation. Prevalent isolates were Pseudomonas aeruginosa, Acinetobacter spp. and Staphylococcus epidermidis. Infectious keratitis was prevented in all cases. The ocular surface of long-term sedated patients was colonised by various bacterial species, and their isolation was closely associated with the duration of hospitalisation. These results suggest that early identification of ocular surface bacterial colonisation and administration of prophylactic topical antibiotics can prevent corneal infection in these patients.


Subject(s)
Bacteria/growth & development , Eye/microbiology , Intensive Care Units , Adult , Aged , Antibiotic Prophylaxis , Bacteria/drug effects , Conscious Sedation , Corneal Diseases/microbiology , Corneal Diseases/prevention & control , Drug Resistance, Multiple, Bacterial , Eye Infections, Bacterial/microbiology , Eye Infections, Bacterial/prevention & control , Female , Humans , Keratitis/pathology , Length of Stay , Male , Microbial Sensitivity Tests , Middle Aged , Prospective Studies , Respiration, Artificial , Time Factors , Young Adult
5.
Anaesth Intensive Care ; 38(6): 1090-3, 2010 Nov.
Article in English | MEDLINE | ID: mdl-21226443

ABSTRACT

We present the case of a 52-year-old female admitted with fever and multiple organ failure, initially treated for presumed sepsis. However, the combination of multiple organ failure, hyperthermia and vascular instability raised the suspicion of a phaeochromocytoma multisystem crisis. An emergency abdominal ultrasound in the intensive care unit disclosed a large tumour of the right adrenal gland. Despite specific medical treatment for the presumed adrenal emergency and multiple organ failure, the patient succumbed. Postmortem examination verified the diagnosis of phaeochromocytoma.


Subject(s)
Adrenal Gland Neoplasms/complications , Fever/etiology , Multiple Organ Failure/etiology , Pheochromocytoma/complications , Sepsis/complications , Female , Humans , Middle Aged
6.
Anaesth Intensive Care ; 37(6): 1005-7, 2009 Nov.
Article in English | MEDLINE | ID: mdl-20014610

ABSTRACT

Thyroid storm is a rare but life-threatening condition with a variety of clinical presentations. Atypical thyroid storm should be part of the differential diagnosis in patients with multiple organ dysfunction of unknown aetiology. In this case report, delayed recognition of thyroid storm in a young female who presented with an acute abdomen increased the risk of a poor outcome. Prompt initiation of anti-thyroid therapy once the diagnosis of thyroid storm was established, combined with adequate vital organ support using a goal-directed therapy protocol in the intensive care unit, resulted in a good outcome.


Subject(s)
Abdominal Pain/etiology , Multiple Organ Failure/diagnosis , Thyroid Crisis/diagnosis , Abdominal Pain/diagnosis , Adult , Critical Care/methods , Diagnosis, Differential , Female , Humans , Multiple Organ Failure/etiology , Sepsis/diagnosis , Thyroid Crisis/physiopathology , Treatment Outcome
7.
Eur J Cardiothorac Surg ; 20(2): 372-7, 2001 Aug.
Article in English | MEDLINE | ID: mdl-11463560

ABSTRACT

OBJECTIVE: Blood transfusion may adversely affect prognosis following surgery for non-small cell lung carcinoma (NSCLC). Most thoracic surgeons have conventionally regarded a perioperative haemoglobin (Hb) of less than 10 g/dl as a transfusion trigger. In this prospective trial we (a) evaluated overall blood transfusion requirements and factors associated with an increased need for transfusion, and (b) in a subsequent subset of patients, tested the hypothesis that elective anaemia after major lung resection can be safely tolerated in the early postoperative period. METHODS: A total of 198 patients (M/F 179/10, mean age 61.2 years, range 32-85) suffering from NSCLC underwent pneumonectomy (n = 89), bilobectomy (n = 19) or lobectomy (n = 90). A rather strict transfusion protocol was used. Transfusion requirements were analyzed, and seven parameters (gender, age > 65, preoperative Hb < 11.5 g/dl, chest wall resection, history of previous thoracotomy, pneumonectomy and total blood loss) were evaluated by univariate and logistic regression analysis. Subsequently, according to the perioperative Hb level during the first 48 h, patients were divided into group A (n = 49, Hb 8.5-10) and group B (n = 149, Hb > 10) to estimate the risks of elective perioperative anaemia. The groups were comparable in terms of age, sex, type of operation performed, preoperative Hb, creatinine level, FEV1, arterial blood gases and history of heart disease. RESULTS: The overall transfusion rate was 16%. Univariate analysis revealed that preoperative Hb < 11.5 g/dl (P < 0.01) and total blood loss (P < 0.0001) were associated with an increased need for transfusion, but only total blood loss was identified as an independent variable in multivariate analysis. Statistical comparison of groups A and B showed no significant difference in postoperative morbidity and mortality: atelectasis (3 vs. 6), chest infection (2 vs. 9), sputum retention requiring bronchoscopy (5 vs. 12), admission to intensive care unit (5 vs. 7), ARDS (0 vs. 3), postoperative hospital stay (7.7 +/- 2.6 vs. 9.1 +/- 3.8 days) and deaths (1 vs. 3). CONCLUSIONS: The use of a strict transfusion strategy could help reduce overall blood transfusion. Furthermore, a perioperative Hb of 8.5-10 g/dl can be considered safe in elective lung resections for carcinoma.


Subject(s)
Anemia/complications , Blood Transfusion , Carcinoma, Non-Small-Cell Lung/complications , Carcinoma, Non-Small-Cell Lung/surgery , Lung Neoplasms/pathology , Lung Neoplasms/surgery , Adenocarcinoma/complications , Adenocarcinoma/surgery , Adult , Aged , Aged, 80 and over , Carcinoma, Non-Small-Cell Lung/pathology , Carcinoma, Squamous Cell/complications , Carcinoma, Squamous Cell/pathology , Carcinoma, Squamous Cell/surgery , Contraindications , Female , Hematocrit , Humans , Length of Stay , Male , Middle Aged , Neoplasm Staging , Treatment Outcome
8.
J Am Coll Surg ; 190(4): 423-31, 2000 Apr.
Article in English | MEDLINE | ID: mdl-10757380

ABSTRACT

BACKGROUND: This study was undertaken to investigate the effect of growth hormone (GH) and insulin-like growth factor I (IGF-I), two well-known growth factors, on bacterial translocation, endotoxemia, enterocyte apoptosis, and intestinal and liver histology in a model of experimental obstructive jaundice in rats. STUDY DESIGN: One hundred six male Wistar rats were divided into five groups: I (n = 21), controls; II (n = 22), sham operated; III (n = 22), bile duct ligation (BDL); IV (n = 21), BDL and GH treatment; and V (n = 20), BDL and IGF-I administration. At the end of the experiment, on day 10, blood bilirubin was determined, and mesenteric lymph nodes (MLN), liver specimens, and bile from the bile duct stump were cultured. Endotoxin was measured in portal and aortic blood. Tissue samples from the terminal ileum and liver were examined histologically, and the apoptotic body count (ABC) in the intestinal mucosa was evaluated. Mucosal DNA and protein content were also determined. RESULTS: Bilirubin increased significantly after BDL (p < 0.001). Bile from the bile duct was sterile. In group III, MLN and liver specimens were contaminated by bacteria of gut origin (significant versus groups I and II, p < 0.001). GH significantly reduced positive cultures (p < 0.01), whereas IGF-I had no effect. BDL resulted in a significant increase in portal and aortic endotoxemia (p < 0.001); treatment with GH or IGF-I reduced it (p < 0.001). Mucosal DNA and protein content were reduced in animals with BDL; after treatment with GH or IGF-I, an increase to almost normal levels was noted in DNA, but not in protein. Overall, the ileal architecture remained intact in all animal groups. The ABC increased after BDL; after GH or IGF-I administration, the ABC decreased significantly, with no difference between GH- and IGF-I-treated animals. After BDL, liver biopsies displayed typical changes of biliary obstruction, which were significantly improved after administration of GH or IGF-I. CONCLUSIONS: Treatment with GH or IGF-I in rats with experimental obstructive jaundice reduces endotoxemia and improves liver histology. Apoptosis in the intestinal epithelium may serve as a morphologic marker of ileal mucosal integrity, demonstrating the proliferative potential of GH and IGF-I in obstructive jaundice, which might prove of value in patients with such conditions.


Subject(s)
Bacterial Translocation , Cholestasis/physiopathology , Human Growth Hormone/therapeutic use , Insulin-Like Growth Factor I/therapeutic use , Animals , Apoptosis , Bacterial Translocation/drug effects , Bilirubin/blood , Cholestasis/microbiology , Cholestasis/pathology , Endotoxemia/prevention & control , Ileum/pathology , Liver/microbiology , Lymph Nodes/microbiology , Male , Rats , Rats, Wistar
9.
Anesthesiology ; 92(5): 1286-92, 2000 May.
Article in English | MEDLINE | ID: mdl-10781273

ABSTRACT

BACKGROUND: Nonsurgical patients with sinus node dysfunction are at high risk for atrial tachyarrhythmias, but whether a similar relation exists for atrial fibrillation after coronary artery bypass graft surgery is not clear. The purpose of this study was to evaluate sinus nodal function before and after coronary artery bypass graft surgery and its relation to the risk of postoperative atrial arrhythmias. METHODS: Sixty patients undergoing uncomplicated elective coronary artery bypass graft surgery underwent sinus nodal function testing by measurement of sinoatrial conduction time (SACT) and corrected sinus nodal recovery time (CSNRT). Patients were categorized according to whether postoperative atrial fibrillation developed. RESULTS: Twenty patients developed atrial fibrillation between postoperative days 1 and 3. In patients remaining in sinus rhythm (n = 40), SACT was no different and CSNRT was shorter after surgery compared with measurements obtained after anesthesia induction. Sinus node function test results before surgery were similar between the sinus rhythm and atrial fibrillation groups. After surgery, patients who later developed atrial fibrillation had longer SACT than the sinus rhythm group (P = 0.006), but CSNRT did not differ between the groups. A SACT > 96 ms measured at this time point was associated with a 7.3-fold increased risk of postoperative atrial fibrillation (sensitivity, 62%; specificity, 81%; positive and negative predictive values, 56% and 85%, respectively; area under the receiver operating characteristic curve, 0.72). CONCLUSIONS: These data show that sinus nodal function is not adversely affected by uncomplicated coronary artery bypass surgery. Patients who later developed atrial fibrillation, however, had prolonged sinoatrial conduction immediately after surgery compared with patients remaining in sinus rhythm. These results suggest that injury to atrial conduction tissue at the time of surgery predisposes to postoperative atrial fibrillation, and that assessment of sinoatrial conduction times could provide a means of identifying patients at high risk for postoperative atrial fibrillation.


Subject(s)
Atrial Fibrillation/etiology , Coronary Artery Bypass , Postoperative Complications/etiology , Sinoatrial Node/physiology , Aged , Female , Hemodynamics , Humans , Intraoperative Care , Intraoperative Complications , Male , Middle Aged , Preoperative Care , Risk Factors , Sinoatrial Node/injuries , Time Factors
10.
Eur Surg Res ; 31(2): 97-107, 1999.
Article in English | MEDLINE | ID: mdl-10213847

ABSTRACT

Despite a growing trend toward better acute pain management, many deficiencies still account for the persistently high incidence of moderate to severe postoperative pain. Patients continue to receive inadequate doses of analgesics, and the identification and treatment of patients in pain remains a significant health care problem. Advanced techniques are available, including epidural or intrathecal administration of local anaesthetics and opioids, and various opioid administration techniques such as patient-controlled analgesia and delivery via sublingual, oral-transmucosal, nasal, intra-articular and rectal routes. Nonopioid analgesics such as nonsteroidal anti-inflammatory drugs, newer nonopioid drugs such as alpha2-adrenergic agonists and calcium channel antagonists, and various combinations of the above are also options. However, the solution to the problem of inadequate pain relief lies not so much in the development of new drugs and new techniques as in an effective strategy for delivering these to patients through the introduction of acute pain management services on surgical wards.


Subject(s)
Pain, Postoperative/therapy , Analgesia , Analgesia, Patient-Controlled , Analgesics, Opioid/therapeutic use , Humans , Incidence , Nerve Block , Pain Measurement , Pain, Postoperative/epidemiology
11.
Eur Surg Res ; 31(2): 122-32, 1999.
Article in English | MEDLINE | ID: mdl-10213850

ABSTRACT

Surgical trauma induces nociceptive sensitization, leading to amplification and prolongation of postoperative pain. In experimental studies, preinjury (i.e. pre-emptive) neural blockade using local anaesthetics or opioids has been shown to prevent or reduce postinjury sensitization of the central nervous system, whereas similar techniques applied after the injury had less or no effect. Several clinical studies have evaluated possible pre-emptive analgesic effects by administering a variety of analgesic drugs, either systemically or epidurally, before surgery. These treatment modalities were compared with the same treatment given after surgery or with control groups receiving no such treatment. In general, the results of these studies have been disappointing, although some clinical studies have confirmed the impressive results from animal studies. The present paper discusses deficiencies in the design of clinical trials, since the question of the effectiveness of pre-emptive analgesic regimens lies not so much in the timing of analgesic administration (i.e. preinjury vs. postinjury treatment) as in the effective prevention of altered central sensitization. Recent evidence suggests that, to effectively pre-empt postoperative pain, administration of analgesics should start before surgery and should be extended into the early postoperative period.


Subject(s)
Analgesics/administration & dosage , Pain, Postoperative/prevention & control , Analgesia, Epidural , Analgesics, Opioid/administration & dosage , Anesthetics, Local/administration & dosage , Anti-Inflammatory Agents, Non-Steroidal/administration & dosage , Humans , Pain/physiopathology
12.
Anesth Analg ; 84(3): 479-83, 1997 Mar.
Article in English | MEDLINE | ID: mdl-9052286

ABSTRACT

Previous studies have demonstrated that heparin concentrations during cardiopulmonary bypass (CPB) may vary considerably, which may be related to variability in redistribution, cellular and plasma protein binding, and clearance of heparin. The purpose of this study was to determine whether hemofiltration removes lower molecular weight fractions of heparin from plasma and thus contributes to variability in blood heparin levels. Twenty patients undergoing cardiac surgery with CPB were enrolled in this study after informed consent was obtained. The study was subdivided into two phases. The first 10 patients were enrolled in Phase I, which was designed to determine whether hemofiltration removes lower molecular weight fractions of heparin from blood. Blood specimens obtained from the inflow and outflow lines of the hemofiltration unit were used to measure complete blood counts (CBC) and plasma heparin activity by anti-Xa and anti-IIa assays. Phase II was designed to evaluate the effect of hemofiltration on circulating plasma heparin activity. In Phase II, blood specimens were obtained from 10 patients via the arterial cannula of the extracorporeal circuit before and after hemofiltration for measurement of CBCs and anti-Xa plasma heparin activity, as well as whole blood heparin concentration using an automated protamine titration assay (Hepcon instrument, Medtronic Inc., Parker, CO). Ultrafiltrate and reservoir volumes were measured in both phases of the study. Hemofiltration did not remove lower (anti-Xa-measurable) molecular weight heparin; instead, it resulted in a considerable increase in heparin activity in the outflow line, as measured by both anti-Xa and anti-IIa assays. Plasma anti-Xa heparin activity after hemofiltration (5 +/- 1.8 U/mL) was substantially (P = 0.003) greater than that before hemofiltration (3.9 +/- 1.7 U/mL). The increase in heparin activity with hemofiltration was directly related to ultrafiltrate volume (r = 0.63, P < 0.0001) and hematocrit (r = 0.73, P < 0.0001). Hemofiltration increases heparin concentration and may contribute to variability in heparin activity during CPB. Point-of-care heparin concentration measurement would allow identification of this anticipated rise in heparin concentration, with the apparent clinical implication of a reduced need for supplemental heparin to maintain a target whole blood heparin concentration.


Subject(s)
Cardiopulmonary Bypass , Heparin/blood , Factor Xa Inhibitors , Hemofiltration , Humans , Prospective Studies , Prothrombin/antagonists & inhibitors
13.
Anesthesiology ; 85(6): 1311-23, 1996 Dec.
Article in English | MEDLINE | ID: mdl-8968178

ABSTRACT

BACKGROUND: This study was designed to evaluate a new point-of-care test (HemoSTATUS) that assesses acceleration of the kaolin-activated clotting time (ACT) by platelet-activating factor (PAF) in patients undergoing cardiac surgery. Our specific objectives were to determine whether HemoSTATUS-derived measurements correlate with postoperative blood loss and identify patients at risk for excessive blood loss, and to characterize the effect of desmopressin acetate (DDAVP) and/or platelet transfusion on these measurements. METHODS: Demographic, operative, blood loss and hematologic data were recorded in 150 patients. Two Hepcon instruments were used to analyze ACT values in the absence (channels 1 and 2: Ch1 and Ch2) and presence of increasing doses of PAF (1.25, 6.25, 12.5, and 150 nM) in channels 3-6 (Ch3-Ch6). Clot ratio (CR) values were calculated for each PAF concentration as: clot ratio = 1 - (ACT/control ACT). These values were also expressed as percent of maximal (%M = clot ratio/0.51 x 100), using the mean CRCh6 (0.51) obtained in a reference population. RESULTS: Compared with baseline clot ratios before anesthetic induction, a marked reduction in clot ratios was observed in both Ch5 and Ch6 after protamine administration, despite average platelet counts greater than 100 K/microliter. There was a high degree of correlation between clot ratio values and postoperative blood loss (cumulative chest tube drainage in the first 4 postoperative hours) at higher concentrations of PAF: CRCh6 (r = -0.80), %M of CRCh6 (r = -0.82), CRCh5 (r = -0.70), and %M of CRCh5 (r = -0.85). A significant (P < 0.01) improvement in clot ratios was observed with time after arrival in the intensive care unit in both Ch5 and Ch6, particularly in patients receiving DDAVP and/or platelets. CONCLUSIONS: ACT-based clot ratio values correlate significantly with postoperative blood loss and detect recovery of PAF-accelerated coagulation after administration of DDAVP or platelet therapy. The HemoSTATUS assay may be useful in identifying patients who are at risk for excessive blood loss and who could benefit from administration of DDAVP and/or platelet transfusion.
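The clot-ratio arithmetic above follows directly from the two formulas quoted in the abstract (clot ratio = 1 - ACT/control ACT; %M = clot ratio/0.51 x 100). A minimal sketch in Python; the ACT values in the example are hypothetical illustrations, not data from the study:

```python
def clot_ratio(paf_act, control_act):
    """Clot ratio as defined in the abstract: 1 - (ACT / control ACT).

    PAF accelerates kaolin-activated clotting, so a shorter ACT in a
    PAF-containing channel yields a higher clot ratio.
    """
    return 1 - (paf_act / control_act)

def percent_of_maximal(cr, reference_cr=0.51):
    """%M = clot ratio / 0.51 x 100, where 0.51 is the mean CRCh6
    reported for the reference population."""
    return cr / reference_cr * 100

# Hypothetical example: control ACT 400 s, PAF-accelerated ACT 240 s
cr = clot_ratio(240, 400)    # clot ratio of 0.40
pm = percent_of_maximal(cr)  # about 78% of maximal
```

A lower clot ratio (and %M) indicates weaker PAF-accelerated coagulation, which is the direction the study found to correlate with greater postoperative blood loss.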


Subject(s)
Blood Loss, Surgical/prevention & control , Cardiac Surgical Procedures , Deamino Arginine Vasopressin/therapeutic use , Platelet Activating Factor/therapeutic use , Postoperative Complications/prevention & control , Renal Agents/therapeutic use , Whole Blood Coagulation Time , Aged , Evaluation Studies as Topic , Female , Hemostasis , Humans , Male , Middle Aged , Platelet Transfusion , Preoperative Care
14.
J Pain Symptom Manage ; 12(4): 255-60, 1996 Oct.
Article in English | MEDLINE | ID: mdl-8898510

ABSTRACT

Pruritus is a common opioid side effect and can be so severe that opioid therapy must be modified or abandoned. Antihistamines, opioid antagonists, and propofol have been proposed as treatment options, but none is universally effective. The use of intranasal butorphanol, an opioid agonist-antagonist, for pruritus has not been described previously. Six patients complaining of severe opioid-induced pruritus unresponsive to diphenhydramine received 2 mg of intranasal butorphanol every 4-6 hr. Scores for pruritus, pain, and sedation were recorded on separate visual analogue scales (VAS). All patients reported significant relief from pruritus 60 min after butorphanol administration (P < 0.001); five patients noted an improvement within 15 min (P < 0.08). Sedation and pain VAS scores were not significantly different from baseline at any time point. These preliminary data demonstrate a substantial effect of intranasal butorphanol on opioid-induced pruritus that has not responded to antihistamines. Prospective controlled studies are needed to validate these findings.


Subject(s)
Analgesics, Opioid/adverse effects , Butorphanol/therapeutic use , Narcotic Antagonists/therapeutic use , Pruritus/drug therapy , Administration, Intranasal , Adult , Analgesics, Opioid/antagonists & inhibitors , Female , Humans , Male , Middle Aged , Prospective Studies , Pruritus/chemically induced
15.
Spine (Phila Pa 1976) ; 21(17): 1979-84, 1996 Sep 01.
Article in English | MEDLINE | ID: mdl-8883198

ABSTRACT

STUDY DESIGN: This was a prospective study. OBJECTIVE: The authors investigated the effects of continuous bracing for idiopathic scoliosis on lung function variables at three consecutive time points over a 2-year period. SUMMARY OF BACKGROUND DATA: Only short-term results regarding lung function impairment caused by bracing exist. METHODS: Thirty adolescents (aged 13.6 +/- 1.8 years) with primary idiopathic thoracic scoliosis of 28.7 degrees +/- 4.1 degrees and primary lumbar scoliosis of 26.5 degrees +/- 10.4 degrees were treated with a Boston brace. All patients underwent pulmonary function studies at the beginning of brace treatment and 12 and 24 months after treatment initiation. The examinations were always performed while the patients were sitting, both in and out of the brace; patients removed the brace 1 hour before the out-of-brace measurements were taken. Vital capacity, forced expiratory volume in 1.0 second, and minute ventilation were determined with a low-inertia, low-resistance bell spirometer. Lung volumes, including total lung capacity and functional residual capacity, were recorded. RESULTS: The primary thoracic scoliosis was corrected to 14.5 degrees +/- 4.0 degrees and the primary lumbar scoliosis to 13.0 degrees +/- 6.0 degrees. Values taken while the brace was worn were significantly lower than those taken without the brace at all time points (one-way analysis of variance): vital capacity (P < 0.02), forced vital capacity (P < 0.03), functional residual capacity (P < 0.02), and residual volume (P < 0.05). Furthermore, the predicted negative residual volume and negative functional residual capacity values differed significantly at all time points from the negative residual volume and negative functional residual capacity values of patients while wearing the Boston brace (P < 0.01 and P < 0.02, respectively). CONCLUSIONS: The results suggest that brace wearing for mild idiopathic scoliosis does not harm adolescent lung function over a 2-year period, and brace treatment is recommended for idiopathic scoliosis in early adolescence when the generally accepted criteria for bracing are fulfilled.


Subject(s)
Braces , Respiration , Scoliosis/physiopathology , Scoliosis/therapy , Adolescent , Humans , Longitudinal Studies , Prospective Studies , Residual Volume , Respiratory Function Tests , Vital Capacity
16.
Anesth Analg ; 82(6): 1126-31, 1996 Jun.
Article in English | MEDLINE | ID: mdl-8638779

ABSTRACT

This study was designed to evaluate the effect of aprotinin on activated versus nonactivated whole blood clotting time using two different on-site methods, and to quantify these anticoagulant properties relative to heparin in a controlled, in vitro environment. Blood specimens were obtained prior to heparin administration from 56 patients undergoing cardiac surgery. Specimens from the first 20 consecutive patients were mixed with either normal saline (NS) or aprotinin (400 kallikrein inhibiting units [KIU]/mL), inserted into Hemochron tubes containing either NS or heparin (0.3 or 0.6 U/mL), and then used to measure celite-activated (celite ACT) and nonactivated whole blood clotting time (WBCT1) using four Hemochron instruments. Similarly, specimens from the second 20 consecutive patients were mixed with either NS or aprotinin, inserted into Automated Clot Timer cartridges containing either NS or heparin (0.06, 0.13, or 0.25 U/mL), and then used to measure kaolin-activated (kaolin ACT) or nonactivated whole blood clotting times (WBCT2) using four Automated Clot Timer instruments. Specimens from the last 16 patients were mixed with incrementally larger doses of either aprotinin (0, 100, 200, 300, or 400 KIU/mL) or heparin (0, 0.12, 0.24, 0.36, 0.48, or 0.72 U/mL) and were then used for measurement of whole blood clotting time (WBCT2) using six Automated Clot Timer instruments. Aprotinin significantly prolonged activated and nonactivated whole blood clotting time and potentiated the prolongation of whole blood clotting time by heparin. The linear relationship between whole blood clotting time and either heparin concentration (WBCT2 = H x 357 + 280, mean adjusted r2 = 0.88) or aprotinin concentration (WBCT2 = A x 0.97 + 300, mean adjusted r2 = 0.94) was variable among patients. On average, 200 KIU/mL of aprotinin prolonged WBCT2 to the same extent as 0.69 +/- 0.28 U/mL of heparin, using linear regression models fitted within each patient.
Aprotinin significantly prolongs activated or nonactivated whole blood clotting time measurements in a dose-dependent manner. Since prolongation of whole blood clotting time by heparin is potentiated by aprotinin in vitro, aprotinin's anticoagulant properties may in part account for the prolonged celite activated clotting time values observed in the presence of aprotinin.
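The two mean regression lines reported in this abstract can be turned into a small equivalence calculation. The sketch below is illustrative only: the function names are our own, the coefficients are the pooled means from the abstract, and clotting time is in the instrument's reported time units. Because the abstract's 0.69 +/- 0.28 U/mL figure came from models fitted within each patient, the pooled lines yield a somewhat lower heparin-equivalent value.

```python
# Mean regression models reported in the abstract (pooled across patients;
# the abstract notes the relationship was variable among patients):
#   WBCT2 = 357 * H + 280   where H = heparin concentration (U/mL)
#   WBCT2 = 0.97 * A + 300  where A = aprotinin concentration (KIU/mL)

def wbct2_from_heparin(h_u_per_ml: float) -> float:
    """Predicted whole blood clotting time for a heparin concentration."""
    return 357.0 * h_u_per_ml + 280.0

def wbct2_from_aprotinin(a_kiu_per_ml: float) -> float:
    """Predicted whole blood clotting time for an aprotinin concentration."""
    return 0.97 * a_kiu_per_ml + 300.0

def heparin_equivalent(a_kiu_per_ml: float) -> float:
    """Heparin concentration (U/mL) giving the same predicted WBCT2
    as the given aprotinin concentration, on the pooled mean lines."""
    return (wbct2_from_aprotinin(a_kiu_per_ml) - 280.0) / 357.0

if __name__ == "__main__":
    # 200 KIU/mL aprotinin maps to roughly 0.60 U/mL heparin on the
    # pooled lines; the within-patient models gave 0.69 +/- 0.28 U/mL.
    print(round(heparin_equivalent(200.0), 2))
```

This makes concrete why the within-patient estimate (0.69 U/mL) and the pooled-line estimate (about 0.60 U/mL) need not coincide: averaging the fitted coefficients is not the same as averaging the per-patient equivalence points.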


Subject(s)
Anticoagulants/pharmacology , Aprotinin/pharmacology , Blood Coagulation/drug effects , Hemostatics/pharmacology , Heparin/pharmacology , Adult , Cardiac Surgical Procedures/methods , Diatomaceous Earth/pharmacology , Drug Synergism , Humans , Kaolin/pharmacology , Whole Blood Coagulation Time
17.
Crit Care Med ; 24(5): 855-61, 1996 May.
Article in English | MEDLINE | ID: mdl-8706465

ABSTRACT

OBJECTIVE: To examine the effects of autotransfusion of unprocessed blood on hemodynamics and oxygen transport. DESIGN: Prospective, observational study. SETTING: Research laboratory of a university medical center. SUBJECTS: Six healthy, domestic pigs (20-33 kg). INTERVENTIONS: A left thoracotomy was performed and a 5-mm incision was created in the descending aorta, resulting in a controlled hemorrhage of 30 mL/kg (approximately 40% of blood volume) into the thoracic cavity over a 45-min period. During that period, mean arterial pressure (MAP) was maintained slightly > 50 mm Hg using intravenous lactated Ringer's solution. The shed blood was collected from the open thorax with compresses soaked in citric acid solution, extracted by manual squeezing, filtered through several layers of gauze, and stored in glass bottles. Repeat measurements were performed after hemorrhage, after retransfusion of the harvested blood, and thereafter every 15 mins up to 60 mins. The animals were supported for 2 more hrs and were observed for the following 48 hrs. MEASUREMENTS AND MAIN RESULTS: All animals survived and were in good condition 48 hrs after the experimental hemorrhage. The circulatory and oxygen transport response to hemorrhage, with concomitant maintenance of blood pressure at > 50 mm Hg, consisted of significant reductions in cardiac index, MAP, and oxygen transport (DO2) (46%, 50%, and 64%, respectively; p < .01), increases in heart rate (HR) (+21%, not significant), pulmonary vascular resistance index (+112%, p < .05), and oxygen extraction (+105%, p < .01), and a nonsignificant decrease in systemic vascular resistance index (-8%). After autotransfusion, the basic hemodynamic variables, MAP and HR, were corrected and remained near baseline (not significant) thereafter.
Cardiac index and DO2 increased after autotransfusion but remained below baseline until the end of the study protocol (p < .05). Pulmonary arterial pressure and pulmonary vascular resistance index increased significantly immediately after autotransfusion (p < .01); these values were partly corrected after 15 to 30 mins but remained higher than baseline throughout the observation period (29.5% and 89.8%, respectively; p < .05). The recently introduced relationship between cardiac index and oxygen extraction has been proposed to avoid problems of mathematical coupling between oxygen consumption and DO2 measurements. This relationship followed a similar course in all experiments throughout each phase: a shift downward and to the right marked the endpoint of the hemorrhagic phase, and a shift back toward baseline was noticed after autotransfusion. Prothrombin time and partial thromboplastin time remained unchanged after autotransfusion. Free hemoglobin concentrations increased immediately after autotransfusion (+33%, p < .05) but returned to baseline values 48 hrs later. Histologic examination showed no changes in the examined organs. CONCLUSIONS: Reinfusion of large amounts of unprocessed blood (up to 40% of blood volume), collected with compresses from a noncontaminated surgical field, is an inexpensive method that may benefit trauma patients when more sophisticated autotransfusion devices are unavailable. In the present study, this method resulted in transient but significant hemodynamic changes in the pulmonary circulation. Impairment of oxygen transport was noticed after the end of hemorrhage, but it cannot be attributed to the autotransfusion technique alone; factors such as hemorrhagic shock and surgical trauma also contributed.


Subject(s)
Blood Transfusion, Autologous/methods , Hemodynamics , Oxygen Consumption , Shock, Hemorrhagic/therapy , Animals , Disease Models, Animal , Male , Prospective Studies , Pulmonary Circulation , Shock, Hemorrhagic/blood , Shock, Hemorrhagic/physiopathology , Swine
18.
Anesth Analg ; 82(1): 13-21, 1996 Jan.
Article in English | MEDLINE | ID: mdl-8712388

ABSTRACT

The purpose of this study was to prospectively evaluate whether heparin and protamine doses administered using a standardized protocol based on body weight and activated clotting time values are associated with either transfusion of hemostatic blood products (HBPs) or excessive postoperative bleeding. Analysis using 10 multiple logistic or linear regression models in 487 cardiac surgical patients included perioperative variables that may be associated with transfusion of HBPs and/or excessive postoperative chest tube drainage (CTD). Prolonged duration of cardiopulmonary bypass (CPB), lower pre-CPB heparin dose, lower core body temperature in the intensive care unit, combined procedures, older age, repeat procedures, a larger volume of salvaged red cells reinfused intraoperatively, and abnormal laboratory coagulation results (prothrombin time, activated partial thromboplastin time, and platelet count) after CPB were associated with both transfusion of HBPs and increased CTD. Female gender, lower total heparin dose, preoperative aspirin use, and the number of HBPs administered intraoperatively were associated only with increased CTD, whereas a larger total protamine dose was associated only with perioperative transfusion of HBPs. Preoperative use of warfarin or heparin was not associated with excessive blood loss or perioperative transfusion of HBPs. In contrast to previous studies using bovine heparin, data from the present study do not support the use of reduced doses of porcine heparin during CPB.


Subject(s)
Blood Component Transfusion , Blood Loss, Surgical/prevention & control , Cardiopulmonary Bypass/adverse effects , Hemostasis, Surgical/methods , Age Factors , Aged , Animals , Chest Tubes , Drainage , Female , Heparin/adverse effects , Heparin/therapeutic use , Humans , Male , Middle Aged , Multivariate Analysis , Prospective Studies , Protamines/adverse effects , Protamines/therapeutic use , Risk Factors , Swine , Time Factors , Whole Blood Coagulation Time
19.
Spine (Phila Pa 1976) ; 20(9): 1061-7, 1995 May 01.
Article in English | MEDLINE | ID: mdl-7631236

ABSTRACT

STUDY DESIGN: This study analyzed the changes in the frontal plane of the deformed lower rib cage and the scoliosis-related alterations of the spine in patients with double major curve-pattern idiopathic scoliosis. OBJECTIVES: The results obtained preoperatively, after the Zielke operation, postoperatively after the Harrington instrumentation, and at the follow-up evaluation were compared to investigate which changes in the elements of the rib cage deformity are caused by each of the two instrumentations. SUMMARY OF BACKGROUND DATA: Previously, Wojcik reported on the effects of a Zielke operation on the lower rib cage in mild S-shaped idiopathic scoliosis. No previous data exist regarding the lower rib cage deformities in severe idiopathic double major-pattern scoliosis and their changes after combined VDS-Zielke and Harrington instrumentation. METHODS: Fifteen patients who underwent the staged Zielke operation followed by Harrington rod instrumentation were followed up for an average period of 31.1 months. The methods used in our study included Cobb angle measurement and a segmental analysis (T7-T12) of the convex and concave rib-vertebra angles, rib-vertebra angle differences, vertebral rotation, and vertebral tilt. RESULTS: In this series, the apical convex ribs showed an increased droop preoperatively compared with the concave apical ribs. The VDS-Zielke operation corrected the lumbar scoliosis by an average of 63%, whereas the thoracic scoliosis showed an immediate spontaneous correction of 30%. The VDS-Zielke operation also produced a significant correction of the scoliosis-related vertebral tilt (T10-T12), derotated the lumbar vertebrae and the T12 vertebra significantly, elevated the "mobile" concave ribs, and increased the droop of the lower (T11, T12) "mobile" convex ribs.
The Harrington instrumentation did not change the vertebral rotation, the vertebral tilt, the convex rib-vertebra angle, or the L4 obliquity, but it significantly changed the apical concave rib-vertebra angle. The combined Zielke-Harrington instrumentation significantly reduced the thoracic kyphosis and the thoracolumbar junction kyphosis, whereas the lumbar lordosis remained practically unchanged. CONCLUSIONS: In severe idiopathic double major curve-pattern scoliosis treated with the combined (Zielke-Harrington) operation, only the anterior VDS-Zielke instrumentation significantly corrects the spinal deformity, elevates the three lower ribs on the concavity, and increases the droop of the two lower ribs on the convexity. Therefore, the Harrington instrumentation should have only limited use in cosmetic scoliosis surgery and should be replaced with posterior multi-hook instrumentation with a derotation effect.


Subject(s)
Orthopedic Fixation Devices , Ribs/abnormalities , Scoliosis/surgery , Adolescent , Adult , Female , Humans , Kyphosis/surgery , Male
20.
Anesthesiology ; 81(3): 591-601; discussion 27A-28A, 1994 Sep.
Article in English | MEDLINE | ID: mdl-8092504

ABSTRACT

BACKGROUND: Epidural clonidine produces effective postoperative analgesia in humans. Observed side effects include hypotension, bradycardia, sedation, and dryness of the mouth. A recent clinical study demonstrated that 150 micrograms intrathecal clonidine administered postoperatively as the sole analgesic agent was effective but produced hypotension and sedation. Animal studies have provided evidence of a biphasic effect on blood pressure after intrathecal clonidine administration, but no data concerning this effect in humans currently exist. This study was performed to evaluate the dose-response hemodynamic and analgesic profiles of intrathecal clonidine administered after a standard surgical intervention, without perioperative administration of additional analgesics, local anesthetics, or tranquilizers. METHODS: In a randomized, prospective, double-blind study, 30 women who underwent elective cesarean section during general anesthesia with thiopental, nitrous oxide, and halothane were studied. Forty-five minutes after tracheal extubation, a lumbar intrathecal puncture was performed, and the patients received 150 (group 1), 300 (group 2), or 450 (group 3) micrograms clonidine. Postoperative analgesia was assessed on a visual analog scale at rest and after deep cough at standard time points up to 24 h. At the same time points, blood pressure, heart rate, sedation, and respiratory rate also were recorded. RESULTS: Intrathecal clonidine decreased pain in all three groups, both at rest and with coughing, very shortly after injection and in a dose-dependent fashion. Clonidine 450 and 300 micrograms reduced pain scores significantly earlier (at the 3rd and 6th min after intrathecal injection, respectively) than did 150 micrograms clonidine. Pain relief, defined as the time to first request for supplemental analgesic, lasted 402 +/- 75 min in group 1, 570 +/- 76 min in group 2, and 864 +/- 80 min in group 3 (significant differences among all groups; P < 0.01-0.001).
Clonidine reduced mean arterial pressure compared with baseline only in group 1 (21 +/- 13%, P < 0.05). Delayed hypotension or bradycardia was not encountered after any of the three doses studied. Sedation was evident in all groups, but group 3 patients were significantly more sedated than group 1 and 2 patients. Respiratory rate and motor activity of the lower extremities were unaffected in all three groups (differences not significant). CONCLUSIONS: These results demonstrate dose-dependent analgesia after intrathecal clonidine at doses as great as 450 micrograms. The nearly immediate analgesic effect observed after intrathecal injection of 300 and 450 micrograms clonidine strongly argues for a spinal rather than a systemic site of action of this alpha 2-adrenergic agonist. After 300 and 450 micrograms intrathecal clonidine, relative hemodynamic stability is observed, suggesting a pressor effect at peripheral sites.


Subject(s)
Analgesia, Obstetrical/methods , Clonidine/administration & dosage , Hemodynamics/drug effects , Adult , Blood Pressure/drug effects , Cesarean Section , Clonidine/adverse effects , Conscious Sedation , Dose-Response Relationship, Drug , Double-Blind Method , Female , Heart Rate/drug effects , Humans , Hypotension/chemically induced , Injections, Spinal , Pregnancy