Results 1 - 20 of 59
1.
Am J Respir Crit Care Med ; 149(2 Pt 1): 295-305, 1994 Feb.
Article in English | MEDLINE | ID: mdl-8306022

ABSTRACT

The impact of a new therapy that includes pressure-controlled inverse ratio ventilation followed by extracorporeal CO2 removal on the survival of patients with severe ARDS was evaluated in a randomized controlled clinical trial. Computerized protocols generated around-the-clock instructions for management of arterial oxygenation to ensure equivalent intensity of care for patients randomized to the new-therapy limb and those randomized to the control (mechanical ventilation) limb. We randomized 40 patients with severe ARDS who met the ECMO entry criteria. The main outcome measure was survival at 30 days after randomization. Survival did not differ significantly between the 19 mechanical ventilation patients (42%) and the 21 new-therapy (extracorporeal) patients (33%) (p = 0.8). All deaths occurred within 30 days of randomization. Overall patient survival was 38% (15 of 40), about four times that expected from historical data (p = 0.0002). Survival in the extracorporeal treatment group was not significantly different from other published survival rates after extracorporeal CO2 removal. Survival in the mechanical ventilation group was significantly higher than the 12% derived from published data (p = 0.0001). Protocols controlled care 86% of the time. Average PaO2 was 59 mm Hg in both treatment groups. The intensity of care required to maintain arterial oxygenation was similar in both groups (2.6 and 2.6 PEEP changes/day; 4.3 and 5.0 FIO2 changes/day). We conclude that there was no significant difference in survival between the mechanical ventilation and extracorporeal CO2 removal groups. We do not recommend extracorporeal support as a therapy for ARDS; it should be restricted to controlled clinical trials.
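
A quick plausibility check on the headline comparison: the reported percentages recover the raw counts (8 of 19 vs 7 of 21 survivors), and a two-sided Fisher's exact test on those counts gives a p-value consistent with the reported p = 0.8. The choice of test is an assumption; the abstract does not say which test the investigators used.

```python
# Rebuild the 2x2 survival table from the reported group sizes and percentages,
# then test it. Fisher's exact test is an assumption, not the paper's stated method.
from scipy.stats import fisher_exact

table = [[8, 19 - 8],   # mechanical ventilation: survived, died (42% of 19)
         [7, 21 - 7]]   # extracorporeal therapy: survived, died (33% of 21)
_, p = fisher_exact(table, alternative="two-sided")
print(f"p = {p:.2f}")   # large (about 0.7-0.8), consistent with the reported p = 0.8
```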


Subject(s)
Carbon Dioxide/blood , Extracorporeal Membrane Oxygenation/methods , Positive-Pressure Respiration , Respiration, Artificial/methods , Respiratory Distress Syndrome/therapy , Adult , Combined Modality Therapy , Female , Hospital Costs/statistics & numerical data , Humans , Length of Stay/statistics & numerical data , Life Tables , Male , Respiratory Distress Syndrome/mortality , Survival Analysis , Survival Rate , Treatment Outcome
2.
Am J Crit Care ; 2(6): 436-43, 1993 Nov.
Article in English | MEDLINE | ID: mdl-8275147

ABSTRACT

OBJECTIVE: To determine nursing resource utilization (acuity hours and dollars) by trauma patients based on analysis of a nursing acuity system and five trauma scoring systems. METHODS: Retrospective review of 448 trauma patients who required transport by aircraft to a level I trauma center. Values from the institution's automated nursing acuity system were compared with the Glasgow Coma Scale score, trauma score, revised trauma score, CRAMS score and injury severity score to obtain acuity hours and financial cost of care for trauma patients. RESULTS: Analysis of scores computed by all five scoring instruments consistently confirmed that nursing resource utilization is greatest for patients who are severely injured but likely to recover. For example, patients with a trauma score of 1 required a mean of 49 (±66) acuity hours of care; those with a trauma score of 8 needed 189 (±229) hours; and those with a trauma score of 16 used 73 (±120) hours. Mean dollar costs were $980 (±1293), $3812 (±4518) and $1492 (±2473), respectively. CONCLUSIONS: Nursing resource utilization can be determined for trauma patients by using an automated nursing acuity system and trauma scoring systems. Data acquired in this way provide a concrete basis for healthcare and reimbursement reform, for administrators who design nursing allocations and for nursing educators who prepare graduates to meet the needs of healthcare consumers.
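
A small consistency check using only the means quoted above: the dollar figures track the acuity hours at roughly $20 per acuity hour, presumably the system's costing rate (an inference; no rate is stated in the abstract).

```python
# Implied cost rate for the three (mean acuity hours, mean dollar cost) pairs.
pairs = [(49, 980), (189, 3812), (73, 1492)]
for hours, dollars in pairs:
    print(f"${dollars / hours:.2f} per acuity hour")  # $20.00, $20.17, $20.44
```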


Subject(s)
Critical Care/economics , Specialties, Nursing/economics , Trauma Severity Indices , Wounds and Injuries/economics , Health Care Costs , Health Resources/statistics & numerical data , Humans , Retrospective Studies , Time Factors , United States , Wounds and Injuries/nursing
4.
Circulation ; 87(6): 1829-39, 1993 Jun.
Article in English | MEDLINE | ID: mdl-8504495

ABSTRACT

BACKGROUND: Coronary patency has been used as a measure of thrombolysis success after acute myocardial infarction. The Thrombolysis in Myocardial Infarction (TIMI) Study Group perfusion grades have gained wide acceptance, with grade 0 (no distal flow) and grade 1 (minimal flow) designated as thrombolysis failures and grades 2 (partial perfusion) and 3 (complete perfusion) as thrombolysis successes. However, the significance of the individual TIMI grades for clinical outcome has not been adequately assessed. METHODS AND RESULTS: To evaluate the functional significance of TIMI perfusion grades, we compared 1-day coronary patency status with ventriculographic, enzymatic, and ECG indexes of acute myocardial infarction in 298 patients treated with anistreplase or alteplase within 4 hours of myocardial infarction symptom onset. Radionuclide ejection fraction was determined at 1 week and at 1 month. Perfusion grades for the entire study population were distributed as 12% (n = 37) grades 0/1, 13% (n = 40) grade 2, and 74% (n = 221) grade 3. The patency profile did not differ between the two thrombolytic regimens. Further coronary interventions were performed after the 1-day patency determination in 43% of patients (43%, 48%, and 42%, respectively, in grades 0/1, 2, and 3 patients). The outcome of grade 2 patients did not differ from that of grades 0/1 patients in ejection fraction, enzyme peaks, ECG markers, or morbidity index. In contrast, grade 3 patients, compared with grades 0-2 patients, showed 1) a greater global ejection fraction at 1 week (54% versus 49%, p = 0.006) and at 1 month (54% versus 49%, p = 0.01), 2) a greater infarct zone ejection fraction at 1 week (41% versus 33%, p = 0.003) and at 1 month (42% versus 32%, p = 0.003), 3) smaller enzyme peaks (significant for lactate dehydrogenase) and shorter times to enzyme peaks (significant for all four enzymes), 4) a smaller QRS score at discharge and at 1 month, and 5) a trend toward a lower morbidity index. CONCLUSIONS: Grade 3 flow predicts significantly better outcomes than lesser grades of flow and represents an important measure of reperfusion success.


Subject(s)
Anistreplase/therapeutic use , Myocardial Infarction/drug therapy , Thrombolytic Therapy , Tissue Plasminogen Activator/therapeutic use , Clinical Enzyme Tests , Coronary Angiography , Coronary Vessels/physiopathology , Double-Blind Method , Electrocardiography , Female , Gated Blood-Pool Imaging , Humans , Male , Middle Aged , Myocardial Infarction/diagnosis , Myocardial Infarction/epidemiology , Vascular Patency/physiology
5.
Surgery ; 113(6): 603-7, 1993 Jun.
Article in English | MEDLINE | ID: mdl-8506516

ABSTRACT

BACKGROUND: Several studies have suggested an association between blood transfusions and infection in surgical patients. However, previous reports have not documented the relationship of transfusion to specific infection sites and have not adequately explored the importance of the timing and type of blood product. METHODS: We reviewed the records of all patients undergoing operation for colon cancer at a large community hospital during the years 1974 to 1987. Data on hospital wound and other infections, wound infection risk factors, and the type and timing of transfusions were analyzed. RESULTS: Increased wound infection rates were associated with administration of both whole blood and packed red blood cells. However, multivariate analysis suggested that only the administration of packed red cells after operation independently predicted wound infections. The other independent variables were the presence of a colostomy and/or a drain. A highly predictive model for wound infection was constructed from these three variables. CONCLUSIONS: Postoperative blood transfusion, especially with packed red cells, is an independent risk factor for wound infection.


Subject(s)
Surgical Wound Infection/etiology , Transfusion Reaction , Colonic Neoplasms/surgery , Colostomy/adverse effects , Humans , Multivariate Analysis , Retrospective Studies , Risk Factors
6.
Neuropediatrics ; 23(5): 228-34, 1992 Oct.
Article in English | MEDLINE | ID: mdl-1454140

ABSTRACT

Between 1970 and 1986, 120 children with central nervous system malignancy were treated with radiation therapy. These included 44 low-grade astrocytomas, 11 high-grade astrocytomas, 32 medulloblastomas, 15 ependymomas/ependymoblastomas, 3 primitive neuroectodermal tumors and 8 pineal tumors. Seven children were treated without biopsy. Fifty-one treated children were evaluated for the effects of therapy on growth, endocrine function, IQ and hair regrowth. Mean height was 1.5 standard deviations below the mean height for the patient's age at study (range 0-5.7). Height was significantly less in patients receiving radiation to the pituitary and those with somatomedin-C deficiency. Height was also decreased with whole-CNS radiation and spine dose > 20 Gy, but not to a significant degree. Pituitary radiation in any dose increased the chance of endocrine deficiency (p = 0.004), and 21 of 51 patients had somatomedin-C deficiency. Mean IQ was 92.7 (±18.8), with a slight trend toward decreased IQ with increasing whole-brain dose of radiation. Hair regrowth was complete in 20 of 46 evaluated patients, with diminished regrowth occurring with increasing volume and dose of radiation. No difference in the measured late effects could be detected with respect to age at treatment, sex, histology or location of tumor.


Subject(s)
Central Nervous System Neoplasms/radiotherapy , Growth/radiation effects , Hair/radiation effects , Body Height/radiation effects , Brain Neoplasms/classification , Brain Neoplasms/mortality , Brain Neoplasms/radiotherapy , Central Nervous System Neoplasms/mortality , Child , Child, Preschool , Dose-Response Relationship, Radiation , Female , Growth Hormone/blood , Humans , Hypopituitarism/diagnosis , Hypopituitarism/etiology , Infant , Infant, Newborn , Intelligence/radiation effects , Intelligence Tests , Male , Radiotherapy/adverse effects
7.
Am Heart J ; 124(3): 557-64, 1992 Sep.
Article in English | MEDLINE | ID: mdl-1514481

ABSTRACT

To assess the effects of thrombolysis and reperfusion on late potentials after myocardial infarction, 101 patients (79 men; age 63.2 ± 10.5 years) underwent signal-averaged ECG studies at 10.7 ± 9.2 days, with the use of a 40 to 250 Hz band-pass filter. Patients were divided into four groups: (1) 54 patients treated with thrombolytic agents at 2.8 ± 1.1 hours, with 81% "early" patency/reperfusion (TIMI grades 2 and 3); (2) 47 patients treated conventionally, with 45% "late" patency/reperfusion; (3) 56 patients with patency (TIMI grades 2 and 3); and (4) 26 patients without patency (TIMI grades 0 and 1). A late potential was considered present when ≥2 of 3 defined criteria were met. There was a significant difference in the incidence of late potentials between groups 1 and 2 (22% vs 43%, respectively; p = 0.048) and between groups 3 and 4 (18% vs 50%, respectively; p = 0.006). Late potentials also tended to occur less often after "early" than after "late" patency/reperfusion (12.5% vs 25%). The odds ratio for developing a late potential was 0.39 for thrombolysis versus no thrombolysis (p < 0.05) and 0.22 for patency/reperfusion (TIMI grades 2 and 3) versus no patency/reperfusion (TIMI grades 0 and 1) (p < 0.05). By analysis of covariance, the effects of thrombolysis on late potentials were entirely explained by reperfusion. Thus the risk of late potentials after myocardial infarction is high but is reduced by thrombolysis and reperfusion. In addition, the effectiveness of "early" reperfusion appears to be greater than that of "late" reperfusion but requires further clarification.
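
Both odds ratios can be reproduced from the group sizes and late-potential rates reported above. A minimal sketch of the unadjusted 2x2 calculation (not the paper's analysis of covariance):

```python
# Unadjusted odds ratio for group A vs group B, from event counts and group sizes.
def odds_ratio(events_a, n_a, events_b, n_b):
    return (events_a / (n_a - events_a)) / (events_b / (n_b - events_b))

# Thrombolysis (22% of 54 -> 12 events) vs conventional therapy (43% of 47 -> 20)
print(round(odds_ratio(12, 54, 20, 47), 2))  # 0.39, as reported
# Patent (18% of 56 -> 10 events) vs occluded (50% of 26 -> 13)
print(round(odds_ratio(10, 56, 13, 26), 2))  # 0.22, as reported
```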


Subject(s)
Coronary Vessels/physiopathology , Electrocardiography , Myocardial Infarction/drug therapy , Thrombolytic Therapy , Adult , Aged , Aged, 80 and over , Analysis of Variance , Female , Humans , Male , Middle Aged , Myocardial Infarction/physiopathology , Myocardial Reperfusion , Streptokinase/therapeutic use , Time Factors , Tissue Plasminogen Activator/therapeutic use , Urokinase-Type Plasminogen Activator/therapeutic use , Vascular Patency
8.
J Heart Lung Transplant ; 11(3 Pt 2): S111-9, 1992.
Article in English | MEDLINE | ID: mdl-1622989

ABSTRACT

We have prospectively monitored 268 patients by our previously described method of routine immunofluorescence of endomyocardial biopsy specimens. We have classified these patients according to their rejection pattern: cellular, vascular, and mixed. The criteria for these designations have been previously described. In this study we retrospectively reviewed coronary angiograms of these patients to assess the presence and time-course of developing allograft coronary artery disease. All available explanted hearts and postmortem hearts were also assessed by light microscopic examination for acute coronary vasculitis and allograft coronary artery disease and by immunofluorescent microscopy for vascular immune complex deposition in a manner identical to immunofluorescent microscopic examination of endomyocardial biopsy specimens. Patients were also monitored for sensitization to immunoprophylactically administered murine monoclonal CD3 antibody (OKT3) and those demonstrated to be sensitized were separately analyzed. Clinical features and treatment of patients were retrospectively reviewed. We found that 141 patients could be classified as having cellular rejection, 76 as having vascular rejection, and 52 as having a mixed rejection pattern. The allograft survival in vascular rejection patients was significantly worse than in allografts of patients with cellular or mixed rejection, confirming our earlier results. Most importantly, we found a significant difference in the time to the development of allograft coronary artery disease based on the rejection pattern. This difference existed whether or not patients sensitized to OKT3 were excluded from evaluation. Patients with mixed rejection had an intermediate time to the development of allograft coronary artery disease between that of patients with cellular and vascular rejection.(ABSTRACT TRUNCATED AT 250 WORDS)


Subject(s)
Coronary Disease/pathology , Graft Rejection , Postoperative Complications/pathology , Coronary Disease/immunology , Coronary Disease/therapy , Female , Follow-Up Studies , HLA-DR Antigens/immunology , Humans , Male , Middle Aged , Muromonab-CD3/therapeutic use , Postoperative Complications/immunology , Postoperative Complications/therapy , Prospective Studies
9.
J Heart Lung Transplant ; 11(3 Pt 2): S142-58, 1992.
Article in English | MEDLINE | ID: mdl-1622993

ABSTRACT

To examine factors potentially predictive of outcome after repeat heart transplantation, data were analyzed for 449 recipients of second allografts reported to the registry of the International Society for Heart and Lung Transplantation and a matched group of 421 primary transplant recipients. Survival was markedly decreased in repeat transplantation patients (1-year actuarial survival rate, 48% vs 79%; p < 0.001). Univariate analysis showed no impact on survival of recipient age or gender, ischemic time, or transplant center experience. Accelerated coronary artery disease as the cause of allograft failure, longer interval between transplants, lack of preoperative mechanical assistance, and second transplantation after 1985 were predictive of increased survival after repeat transplantation. An "ideal candidate" defined by these predictive variables had a 1-year survival rate of 64%. In addition to the International Society for Heart and Lung Transplantation registry, a multicenter database was developed with data for 125 repeat transplant recipients and 1325 primary transplant recipients at 13 transplant centers in the United States. In this group of patients the 1-year survival rate was greater than that in the International Society for Heart and Lung Transplantation registry (60% vs 48%), and the impact of the predictive variables listed previously was decreased. The incidence of rejection, infection, and accelerated coronary artery disease was not different between secondary and primary allograft recipients. Nonskin malignancies occurred more frequently in repeat transplantation patients (8% vs 4%; p < 0.05). Recipients of second allografts were more likely to have major surgical complications, had a higher level of sensitization to HLA antigens, and were more likely to have a positive donor-specific crossmatch (17% vs 2%). A trend toward improved survival was noted in patients with repetition in the second donor of mismatched HLA antigens present in the first donor (1-year survival rate of 68% vs 47%; p = 0.06). We conclude that longer interval between transplants, accelerated coronary artery disease as cause of allograft loss, and lack of preoperative mechanical assistance are predictive of longer survival after repeat transplantation. Nonetheless, the "ideal candidate" for repeat transplantation has an anticipated survival rate significantly less than that expected for primary transplant recipients.


Subject(s)
Coronary Disease/surgery , Heart Transplantation/mortality , Adult , Coronary Disease/etiology , Female , Graft Rejection , Humans , Immunosuppressive Agents/therapeutic use , Male , Reoperation/mortality , Survival Rate , Time Factors
10.
J Clin Invest ; 89(3): 867-77, 1992 Mar.
Article in English | MEDLINE | ID: mdl-1541678

ABSTRACT

An interferon-gamma, tumor necrosis factor, and interleukin-1-inducible, high-output pathway synthesizing nitric oxide (NO) from L-arginine was recently identified in rodents. High-dose interleukin-2 (IL-2) therapy is known to induce the same cytokines in patients with advanced cancer. Therefore, we examined renal cell carcinoma (RCC; n = 5) and malignant melanoma (MM; n = 7) patients for evidence of cytokine-inducible NO synthesis. Activity of this pathway was evaluated by measuring serum and urine nitrate (the stable degradation product of NO) during IL-2 therapy. IL-2 administration caused a striking increase in NO generation, as reflected by serum nitrate levels (10- and 8-fold increases [p < 0.001 and p < 0.003] for RCC and MM patients, respectively) and 24-h urinary nitrate excretion (6.5- and 9-fold increases [both p < 0.001] for RCC and MM patients, respectively). IL-2-induced renal dysfunction made only a minor contribution to the increased serum nitrate levels. Metabolic tracer studies using L-[guanidino-15N2]arginine demonstrated that the increased nitrate production was derived from a terminal guanidino nitrogen atom of L-arginine. Our results showing increased endogenous nitrate synthesis in patients receiving IL-2 demonstrate for the first time that a cytokine-inducible, high-output L-arginine/NO pathway exists in humans.


Subject(s)
Arginine/metabolism , Interleukin-2/pharmacology , Nitric Oxide/metabolism , Adult , Carcinoma, Renal Cell/metabolism , Carcinoma, Renal Cell/therapy , Female , Humans , Interferon-gamma/pharmacology , Interleukin-2/therapeutic use , Kidney Neoplasms/metabolism , Kidney Neoplasms/therapy , Kidney Tubules/drug effects , Male , Melanoma/metabolism , Melanoma/therapy , Middle Aged , Vascular Resistance/drug effects
11.
Am J Infect Control ; 20(1): 4-10, 1992 Feb.
Article in English | MEDLINE | ID: mdl-1554148

ABSTRACT

Surveillance for hospital-acquired infections is required in U.S. hospitals, and statistical methods have been used to predict the risk of infection. We used the HELP (Health Evaluation through Logical Processing) Hospital Information System at LDS Hospital to develop computerized methods to identify and verify hospital-acquired infections. The criteria for hospital-acquired infection are standardized and based on the guidelines of the Study of the Efficacy of Nosocomial Infection Control and the Centers for Disease Control. The computer algorithms are automatically activated when key items of information, such as microbiology results, are reported. Computer surveillance identified more hospital-acquired infections than did traditional methods and has replaced manual surveillance in our 520-bed hospital. Data on verified hospital-acquired infections are electronically transferred to a microcomputer to facilitate outbreak investigation and the generation of reports on infection rates. Recently, we extended the HELP system with statistical methods to automatically identify high-risk patients. Data from more than 6000 patients were used to develop a high-risk equation: stepwise logistic regression identified 10 risk factors for nosocomial infection. The HELP system now uses this logistic-regression equation to monitor all hospitalized patients and determine their risk status each day, and it notifies infection control practitioners each morning of patients who are newly classified as being at high risk. Of 605 hospital-acquired infections during a 6-month period, 472 (78%) occurred in high-risk patients, and 380 (63%) were predicted before the onset of infection. Computerized regression equations to identify patients at risk of hospital-acquired infection can help focus prevention efforts.
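
A minimal sketch of the daily classification step described above: a fitted logistic-regression equation scores every inpatient each morning and flags those above a probability cutoff. The feature names, weights, and threshold here are hypothetical placeholders, not the published HELP-system equation.

```python
import math

# Hypothetical coefficients; the paper's equation used 10 risk factors.
HYPOTHETICAL_WEIGHTS = {
    "intercept": -4.0,
    "central_line": 1.2,
    "urinary_catheter": 0.9,
    "icu_stay": 1.1,
}
THRESHOLD = 0.15  # hypothetical probability cutoff for "high risk"

def infection_risk(patient: dict) -> float:
    """Logistic model: p = 1 / (1 + exp(-(b0 + sum(bi * xi))))."""
    z = HYPOTHETICAL_WEIGHTS["intercept"]
    for factor, weight in HYPOTHETICAL_WEIGHTS.items():
        if factor != "intercept":
            z += weight * patient.get(factor, 0)
    return 1.0 / (1.0 + math.exp(-z))

def morning_report(census: list[dict]) -> list[dict]:
    """Patients scoring above the cutoff today; a real system would also
    track which of them are newly above it."""
    return [p for p in census if infection_risk(p) >= THRESHOLD]
```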


Subject(s)
Cross Infection/prevention & control , Hospital Information Systems , Infection Control/methods , Inpatients/classification , Cross Infection/diagnosis , Hospital Bed Capacity, 500 and over , Humans , Pilot Projects , Population Surveillance , Regression Analysis , Risk Factors , Software , Utah
12.
N Engl J Med ; 326(5): 281-6, 1992 Jan 30.
Article in English | MEDLINE | ID: mdl-1728731

ABSTRACT

BACKGROUND: Randomized, controlled trials have shown that prophylactic antibiotics are effective in preventing surgical-wound infections. However, it is uncertain how the timing of antibiotic administration affects the risk of surgical-wound infection in actual clinical practice. METHODS: We prospectively monitored the timing of antibiotic prophylaxis and studied the occurrence of surgical-wound infections in 2847 patients undergoing elective clean or "clean-contaminated" surgical procedures at a large community hospital. The administration of antibiotics 2 to 24 hours before the surgical incision was defined as early; that during the 2 hours before the incision, as preoperative; that during the 3 hours after the incision, as perioperative; and that more than 3 but less than 24 hours after the incision, as postoperative. RESULTS: Of the 1708 patients who received the prophylactic antibiotics preoperatively, 10 (0.6 percent) subsequently had surgical-wound infections. Of the 282 patients who received the antibiotics perioperatively, 4 (1.4 percent) had such infections (p = 0.12; relative risk as compared with the preoperatively treated group, 2.4; 95 percent confidence interval, 0.9 to 7.9). Of 488 patients who received the antibiotics postoperatively, 16 (3.3 percent) had wound infections (p < 0.0001; relative risk, 5.8; 95 percent confidence interval, 2.6 to 12.3). Finally, of 369 patients who had antibiotics administered early, 14 (3.8 percent) had wound infections (p < 0.0001; relative risk, 6.7; 95 percent confidence interval, 2.9 to 14.7). Stepwise logistic-regression analysis confirmed that the administration of antibiotics in the preoperative period was associated with the lowest risk of surgical-wound infection. CONCLUSIONS: We conclude that in surgical practice there is considerable variation in the timing of prophylactic administration of antibiotics and that administration in the two hours before surgery reduces the risk of wound infection.
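
The reported relative risks and confidence intervals can be approximately reproduced from the raw counts. The Katz log method below is an assumption; the abstract does not name the method the authors used.

```python
import math

def rr_with_ci(a, n1, b, n2, z=1.96):
    """Relative risk of group 1 vs group 2 with a Katz-log 95% CI."""
    rr = (a / n1) / (b / n2)
    se = math.sqrt(1 / a - 1 / n1 + 1 / b - 1 / n2)
    lo, hi = (math.exp(math.log(rr) + s * z * se) for s in (-1, 1))
    return rr, lo, hi

# Postoperative (16/488) vs preoperative (10/1708) administration:
rr, lo, hi = rr_with_ci(16, 488, 10, 1708)
print(f"RR {rr:.1f} (95% CI {lo:.1f} to {hi:.1f})")
# -> RR 5.6 (95% CI 2.6 to 12.3); the abstract reports 5.8 (2.6 to 12.3)
```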


Subject(s)
Anti-Bacterial Agents/administration & dosage , Premedication/methods , Surgical Wound Infection/prevention & control , Adult , Age Factors , Aged , Bacteria/drug effects , Bacteria/isolation & purification , Female , Humans , Male , Middle Aged , Prospective Studies , Regression Analysis , Risk , Sex Factors , Surgical Wound Infection/microbiology
13.
J Am Coll Cardiol ; 19(1): 1-10, 1992 Jan.
Article in English | MEDLINE | ID: mdl-1729317

ABSTRACT

One measure of the success of thrombolysis is the early patency status of the infarct-related coronary artery. The Thrombolysis in Myocardial Infarction (TIMI) study group designated patency grades 0 (occluded) or 1 (minimal perfusion) as thrombolysis failure and grade 2 (partial perfusion) or 3 (complete perfusion) as success. To evaluate their true functional significance, perfusion grades were compared with enzymatic and electrocardiographic (ECG) indexes of myocardial infarction in 359 patients treated within 4 h with anistreplase (APSAC) or streptokinase. Serum enzymes and ECGs were assessed serially. Patency was determined at 90 to 240 min (median 2.1 h) and graded by an observer who had no knowledge of patient data. Results for the two drug arms were similar and combined. Distribution of patency was grade 0 = 20%, n = 72; grade 1 = 8%, n = 27; grade 2 = 16%, n = 58; and grade 3 = 56%, n = 202. Interventions were performed after angiography but within 24 h in 51% (n = 37), 70% (n = 19), 41% (n = 24) and 14% (n = 28) of patients with grades 0, 1, 2 and 3, respectively. Outcomes were compared among the four patency groups by the orthogonal contrast method. Patients with perfusion grade 2 did not differ significantly from those with grade 0 or 1 in enzymatic peaks, time to peak activity and evolution of summed ST segments, Q waves and R waves (contrast 2). Conversely, comparisons of patients with grade 3 perfusion with those with grades 0 to 2 yielded significant differences for enzymatic peaks and time to peak activity for three of the four enzymes (p = 0.02 to 0.0001) and ECG indexes of myocardial infarction (p = 0.02 to 0.0001) (contrast 3). Thus, patients with grade 2 flow have indexes of myocardial infarction similar to those in patients with an occluded artery (grades 0 and 1 flow). Only early grade 3 flow results in a significantly better outcome than that of the other grades. Because early achievement of grade 2 flow does not appear to lead to optimal myocardial salvage, the frequency of achieving grade 3 perfusion alone may best measure the reperfusion success of thrombolytic therapy.


Subject(s)
Anistreplase/therapeutic use , Clinical Enzyme Tests , Electrocardiography/drug effects , Myocardial Infarction/drug therapy , Streptokinase/therapeutic use , Thrombolytic Therapy , Vascular Patency/drug effects , Coronary Angiography , Double-Blind Method , Drug Therapy, Combination , Heparin/administration & dosage , Humans , Myocardial Infarction/diagnosis , Myocardium/enzymology , Prognosis , Time Factors , Treatment Outcome
14.
Am J Cardiol ; 68(9): 848-56, 1991 Oct 01.
Article in English | MEDLINE | ID: mdl-1927942

ABSTRACT

The effects of thrombolytic therapy on enzymatic and electrocardiographic indexes of myocardial infarction were examined in 370 patients who were enrolled within 4 hours of onset of symptoms and were randomized to blinded therapy with intravenous anistreplase (30 U/5 min, n = 188) or streptokinase (1.5 million IU/1 hour, n = 182). Creatine kinase and its MB isoenzyme were initially measured every 4 to 6 hours, and lactic dehydrogenase (LDH) and its cardiac isoenzyme (LDH-1) every 8 to 12 hours. Electrocardiograms were obtained before, at 90 minutes and 8 hours after starting thrombolysis, and on discharge. Enzymatic and electrocardiographic measures of infarction were compared between drug treatment and patency groups. Early patency was associated with significant reductions in peak values for each of the 4 cardiac enzymes (averaging 21 to 25%, p < 0.01 to 0.001), even though later rescue procedures were often used in the nonpatent group; times to peaks were also reduced for 3 of the enzymes. Treatment with anistreplase was associated with enzymatic peaks that tended to be lower than with streptokinase (by 6 to 16%), approaching or reaching significance for LDH (p ≤ 0.07) and LDH-1 (p ≤ 0.04); times to peaks were similar. Early patency also favorably affected electrocardiographic indexes: summed ST-segment elevations resolved more rapidly (p ≤ 0.04), summed Q-wave amplitude was reduced by 32% (p ≤ 0.01), and total QRS infarct score on discharge was 22% lower (p ≤ 0.006) in those achieving early patency. Small differences in electrocardiographic indexes between the 2 drug treatment groups were not significant. These results support the use of early reperfusion with streptokinase or anistreplase to reduce infarct size in acute myocardial infarction.


Subject(s)
Anistreplase/therapeutic use , Electrocardiography , Myocardial Infarction/drug therapy , Streptokinase/therapeutic use , Thrombolytic Therapy , Anistreplase/administration & dosage , Anistreplase/pharmacology , Creatine Kinase/blood , Humans , Isoenzymes , L-Lactate Dehydrogenase/blood , Myocardial Infarction/enzymology , Myocardial Infarction/physiopathology , Streptokinase/administration & dosage , Streptokinase/pharmacology , Time Factors , Vascular Patency/drug effects
15.
Am Heart J ; 122(4 Pt 1): 1007-15, 1991 Oct.
Article in English | MEDLINE | ID: mdl-1718156

ABSTRACT

Spontaneous variability of ventricular arrhythmia in patients with chronic heart failure is not well described. We measured this variability in 23 consecutive patients with chronic heart failure who were prospectively enrolled in the placebo limb of a heart failure treatment trial. Patients underwent one to three periods of ambulatory monitoring separated by 1 to 3 months while they were not receiving antiarrhythmic drug treatment. The variability in frequency of premature ventricular complexes (PVCs) was determined at interrecording intervals of 1, 2, and 3 months. The percentage reductions in total PVCs required to exceed the 95% confidence limits of spontaneous variability at these intervals were 91%, 90%, and 97%, respectively. Corresponding values for repetitive beats (beats in couplets and beats in ventricular tachycardia events) were 98%, 80%, and 97%, and for ventricular tachycardia events 98%, 83%, and 98%, respectively. The percentage increases in total PVCs, repetitive beats, and ventricular tachycardia events required to identify aggravation of arrhythmia in this study population were 1301%, 4050%, and 6147%, respectively, at 1-month intervals and 2950%, 2868%, and 5938%, respectively, at 3-month intervals. The percentage reductions required to show a true drug effect at 2- and 3-month intervals were 63% and 84% for patients with an ejection fraction less than 0.22, and 89% and 98% for those with an ejection fraction of 0.22 or greater (p < 0.05 for both). Ventricular arrhythmia would have been missed in 6 (26%) of the 23 patients if only one screening ambulatory recording had been available. Thus marked variability in PVCs occurs in patients with chronic heart failure.(ABSTRACT TRUNCATED AT 250 WORDS)
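
The abstract does not spell out how these thresholds were derived, but a common approach in this literature bases them on the standard deviation of differences between log-transformed PVC counts on paired drug-free recordings. A hedged sketch under that assumption:

```python
def required_reduction(sd_log10_diff: float, z: float = 1.96) -> float:
    """Percent reduction in PVC count needed to exceed the 95% limit of
    spontaneous variability, given the SD of paired log10-count differences."""
    return (1.0 - 10 ** (-z * sd_log10_diff)) * 100.0

# e.g. an SD of 0.55 log10 units implies a ~92% reduction is required,
# the same order as the 90-97% figures reported above (SD value hypothetical).
print(round(required_reduction(0.55), 1))
```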


Subject(s)
Cardiac Complexes, Premature/physiopathology , Heart Failure/complications , Tachycardia/physiopathology , Adult , Aged , Aging/physiology , Cardiac Complexes, Premature/complications , Chronic Disease , Electrocardiography, Ambulatory , Female , Heart Failure/classification , Heart Failure/physiopathology , Humans , Male , Middle Aged , Prospective Studies , Sex Factors , Stroke Volume , Tachycardia/complications
16.
J Air Med Transp ; 10(10): 18, 20-22, 24 passim, 1991 Oct.
Article in English | MEDLINE | ID: mdl-10115146

ABSTRACT

Cardiovascular disease results in more deaths and higher medical costs than any other medical problem. Cardiac patients may be transported to centers for specialized care. We evaluated 32 historical, current-event, and physiologic items for their ability to predict use of specialized care and hospital costs. For the 199 patients studied, seven items were prognostic. A model classifying patients by the presence of predictors was developed. For the group without predictors and the group with multiple predictors, the model's sensitivity and specificity were very good. For the 125 patients (63%) in the middle categories, the model was not sufficiently sensitive to be prognostic. A scoring system applicable to all cardiac patients could not be developed. Patients requiring only diagnostics were responsible for financial deficits, as were all Medicare patient groups. Patients staying longer than seven days, having surgery, or both were responsible for the largest deficits (if on Medicare) and the highest profits (if not on Medicare). Advance validation of the need to transport is difficult, with far-reaching medical and financial implications.


Subject(s)
Cardiology Service, Hospital/economics , Cardiovascular Diseases/classification , Health Care Costs/statistics & numerical data , Outcome Assessment, Health Care/methods , Severity of Illness Index , Adult , Aged , Cardiac Catheterization/economics , Critical Care/statistics & numerical data , Hospital Bed Capacity, 300 to 499 , Humans , Medicare , Middle Aged , Models, Statistical , Patient Transfer , Prognosis , United States , Utah
17.
Am J Cardiol ; 68(2): 166-70, 1991 Jul 15.
Article in English | MEDLINE | ID: mdl-2063776

ABSTRACT

Intracerebral hemorrhage is an important concern after thrombolytic therapy for acute myocardial infarction, but risk factors are controversial. Accordingly, we assessed risk factors in 107 treated patients, of whom 4 had intracerebral hemorrhage. Intracerebral hemorrhage occurred at a mean of 25 hours (range 3.5 to 48) after therapy and was fatal in 2 patients. Significant differences were found between patients with and without intracerebral hemorrhage for age (77 ± 7 vs 62 ± 11 years, p ≤ 0.01) and for initial (161 ± 23 vs 135 ± 23 mm Hg, p ≤ 0.03) and maximal (171 ± 30 vs 146 ± 20, p ≤ 0.02) systolic blood pressures. Initial and maximal diastolic blood pressures also tended to be higher (101 ± 25 vs 86 ± 16, p ≤ 0.07; 104 ± 24 vs 90 ± 13, p ≤ 0.06). Differences did not achieve significance for comparisons of gender, height, weight, site of infarction, time to therapy, specific thrombolytic agent used, concomitant therapy, interventions, and partial thromboplastin time. It is concluded that age (≥70 years) and elevated blood pressure (≥150/95 mm Hg) are important risk factors for intracerebral hemorrhage. The overall balance of benefit and risk of thrombolysis should continue to be assessed by large mortality trials.


Subject(s)
Cerebral Hemorrhage/chemically induced , Thrombolytic Therapy/adverse effects , Age Factors , Blood Pressure , Cerebral Hemorrhage/physiopathology , Female , Humans , Male , Middle Aged , Myocardial Infarction/drug therapy , Risk Factors
18.
Am Heart J ; 121(4 Pt 1): 1062-70, 1991 Apr.
Article in English | MEDLINE | ID: mdl-2008827

ABSTRACT

To assess the rate and variability of atherostenosis progression in patients with coronary artery disease at baseline angiography, we used a simplified quantitative method of analysis to study single angiograms in 54 patients and paired angiograms in 29 patients. All discrete lesions were identified, then traced and digitized to determine lumen diameter (LD), and summed to give the total LD; the differences in LD for paired angiograms were summed to give total stenosis change (TSC). The following results were obtained: Correlation between LD measured by our method and LD determined by the Brown/Dodge method was excellent (r = 0.99, n = 54). There was also a high correlation between interobserver (r = 0.98, n = 54) and intraobserver (r = 0.99, n = 54) findings. Short-term TSC (n = 9, angiograms paired at less than 1 week) was negligible (0.03 ± 0.38 mm). Long-term (n = 20, angiograms paired at 0.6 to 4.3 years) total LD differed significantly from baseline total LD (4.1 ± 2.5 mm vs 6.0 ± 3.0 mm; p < 0.001), and TSC (2.0 ± 1.3 mm) in long-term patients differed significantly from TSC in short-term patients (p < 0.001). These results show that true coronary disease progression occurring over 1 to 4 years can be distinguished from intraobserver, interobserver, and interstudy variability by means of a simplified method, and they provide approximate rates and variability of progression. These results will be useful for power calculations in therapeutic trials aimed at slowing progression. Further prospective studies with the use of this method appear indicated.
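
The bookkeeping behind total LD and TSC is straightforward summation over matched lesions. A small illustration with hypothetical measurements (values are not from the paper):

```python
# Total lumen diameter (LD) is the sum over all discrete lesions; total
# stenosis change (TSC) is the summed per-lesion narrowing between the
# paired angiograms, matched lesion-by-lesion.
def total_ld(diameters_mm):
    return sum(diameters_mm)

def total_stenosis_change(baseline_mm, followup_mm):
    return sum(b - f for b, f in zip(baseline_mm, followup_mm, strict=True))

baseline = [2.1, 1.4, 2.5]  # hypothetical lesion LDs at first angiogram (mm)
followup = [1.6, 1.0, 2.2]  # the same lesions at follow-up (mm)
print(round(total_ld(baseline), 1), round(total_ld(followup), 1))  # 6.0 vs 4.8 mm
print(round(total_stenosis_change(baseline, followup), 1))         # 1.2 mm of summed narrowing
```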


Subject(s)
Coronary Angiography , Coronary Disease/diagnostic imaging , Adult , Aged , Aged, 80 and over , Cardiac Catheterization , Cineangiography/instrumentation , Cineangiography/methods , Coronary Disease/epidemiology , Coronary Disease/pathology , Coronary Vessels/pathology , Female , Humans , Male , Middle Aged , Observer Variation , Prognosis , Radiographic Image Interpretation, Computer-Assisted/instrumentation , Radiographic Image Interpretation, Computer-Assisted/methods , Regression Analysis , Software , Time Factors
19.
J Heart Lung Transplant ; 10(2): 217-21; discussion 221-2, 1991.
Article in English | MEDLINE | ID: mdl-1903303

ABSTRACT

Because administration of murine monoclonal anti-CD3 antibody (OKT3) may result in the formation of human antimouse antibody, which complexes with OKT3, we conducted this study to assess the incidence and effect of human antimouse antibody formation during prophylactic administration of OKT3 in heart transplantation. Human antimouse antibody developed in eight of 55 (14%) cardiac allograft recipients receiving OKT3 prophylaxis as measured by enzyme-linked immunosorbent assay. Additionally, two recipients had an inexplicable rise in CD3+ lymphocytes during therapy without detectable antibody. The outcome of these 10 sensitized recipients was compared with that of 45 nonsensitized recipients. Age, preoperative diagnosis, hemodynamics, and the need for intravenous inotropes or mechanical assistance before transplantation were similar in both groups. No female patients were in the sensitized group, whereas 33% of the nonsensitized group were female patients. A trend toward greater sensitization when prophylaxis was extended to 21 days (28%) compared with the more conventional 14-day administration (10%) was not statistically significant. Retransplantation because of rejection was required in a single patient in each group. Allograft survival was significantly lower by 3 months in the sensitized group, and allograft loss caused by rejection selectively accounted for that difference. In survivors, rejection frequency and infectious complications were similar. These findings suggest that sensitization to OKT3 occurs at low frequency after prophylactic administration in heart transplantation but is associated with an increased frequency of graft loss because of rejection.


Subject(s)
Antibodies, Monoclonal/immunology , Antibodies/immunology , Graft Rejection/immunology , Heart Transplantation/immunology , Immunosuppression Therapy , Animals , Antibodies/analysis , Antibodies, Monoclonal/therapeutic use , Enzyme-Linked Immunosorbent Assay , Female , Humans , Immunization , Incidence , Male , Mice/immunology , Middle Aged , Muromonab-CD3 , Time Factors