1.
Occup Environ Med ; 66(5): 335-8, 2009 May.
Article in English | MEDLINE | ID: mdl-19017689

ABSTRACT

BACKGROUND: Mortality trends in the USA show that deaths from asbestosis are increasing, while deaths related to other pneumoconioses are declining. OBJECTIVES: To analyse the association between asbestos consumption and asbestosis mortality trends. METHODS: In an epidemiological time series study, we used a modern computer-intensive local regression method to evaluate the relationship between asbestos consumption per capita (1900-2006) as the predictor variable and number of deaths from asbestosis (1968-2004). The predictor variable was progressively lagged by annual increments from 30 to 60 years and the goodness of fit assessed for each lag period. The model having the smallest Akaike information criterion was used to derive extrapolated estimates of future mortality based on more recent asbestos consumption data. RESULTS: Asbestos consumption per capita reached a peak in 1951 and gradually declined until 1973, when it started to drop rapidly. In 2006, it was 0.0075 kg/person/year. There were 25 564 deaths from asbestosis over the period 1968-2004. The best-fitting model (adjusted coefficient of determination (R²) = 99.7%) for 1968-2004 deaths from asbestosis used asbestos consumption per capita 48 years prior (1920-1956) and the log value of asbestos consumption per capita 43 years prior (1925-1961). This model predicts a total of 29 667 deaths (95% CI 19 629 to 39 705) to occur during 2005-2027 (an average of 1290 deaths per year). CONCLUSIONS: This study demonstrates a clear association between asbestos consumption and deaths from asbestosis and indicates that asbestosis deaths are not expected to decrease sharply in the next 10-15 years.
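The lag-selection procedure described in this abstract can be sketched in a few lines: regress deaths on consumption lagged by each candidate number of years and keep the lag with the smallest AIC. The sketch below uses entirely synthetic data and a plain linear fit, not the study's local regression or its figures.

```python
# Hypothetical sketch of lag selection by AIC. All series below are
# synthetic illustrations, not the study's data.
import math
import random

random.seed(1)

# Synthetic per-capita consumption series, years 1900-2006
years = list(range(1900, 2007))
consumption = [2.0 * math.exp(-((y - 1951) / 25.0) ** 2) for y in years]

# Synthetic death counts 1968-2004, driven by consumption 48 years earlier
death_years = list(range(1968, 2005))
deaths = [500.0 * consumption[years.index(y - 48)] + random.gauss(0, 20)
          for y in death_years]

def ols_aic(x, y):
    """Fit y = a + b*x by least squares; return AIC = n*ln(RSS/n) + 2k."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    b = sxy / sxx
    a = my - b * mx
    rss = sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))
    return n * math.log(rss / n) + 2 * 2  # k = 2 parameters (a, b)

# Evaluate candidate lags of 30-60 years, as in the study design
aic_by_lag = {}
for lag in range(30, 61):
    x = [consumption[years.index(y - lag)] for y in death_years]
    aic_by_lag[lag] = ols_aic(x, deaths)

best_lag = min(aic_by_lag, key=aic_by_lag.get)
print(best_lag)
```

Because the synthetic deaths were generated from consumption lagged 48 years, the AIC search recovers a lag at or near 48; the study's extrapolation step then simply feeds more recent consumption into the chosen model.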


Subject(s)
Asbestos/supply & distribution , Asbestosis/mortality , Environmental Exposure/adverse effects , Adolescent , Adult , Aged , Aged, 80 and over , Asbestos/toxicity , Female , Forecasting , Humans , Male , Middle Aged , Pneumoconiosis/mortality , United States/epidemiology , Young Adult
2.
Appl Ergon ; 32(6): 541-7, 2001 Dec.
Article in English | MEDLINE | ID: mdl-11703040

ABSTRACT

This study investigated the effect of wearing a back belt on subjects' heart rate, oxygen consumption, systolic and diastolic blood pressure, and respiratory frequency during asymmetric repetitive lifting. Thirty subjects with materials-handling experience utilized three different belts (ten subjects per belt). Subjects completed six 30-min lifting sessions: three while wearing a belt and three without. Data analyses were conducted on the second, third, and fourth lifting periods. A 9.4 kg box, without handles, was lifted 3 times/min, starting at 10 cm above the floor, ending at 79 cm, with a 60 degree twist to the right. Data analysis indicates that belt-wearing did not have a significant effect on the overall mean values for heart rate, systolic and diastolic blood pressure, and respiratory frequency. Belt-wearing had a significant effect on the overall mean oxygen consumption of the subjects.


Subject(s)
Back Injuries/prevention & control , Hemodynamics , Lifting , Protective Devices , Respiration , Adult , Back Injuries/etiology , Blood Pressure , Female , Heart Rate , Humans , Lifting/adverse effects , Male , Oxygen Consumption
3.
Spine (Phila Pa 1976) ; 26(16): 1794-8, 2001 Aug 15.
Article in English | MEDLINE | ID: mdl-11493853

ABSTRACT

STUDY DESIGN: A crossover design was used to evaluate kinematic measurements collected with an infrared-based motion measurement system. OBJECTIVES: To evaluate belt effects on spine kinematics during asymmetric lifting of large and small boxes and to test for carryover effects between trials from belts. SUMMARY OF BACKGROUND DATA: Conflicting evidence in the literature exists regarding whether belts are beneficial or detrimental to manual material handlers. Studies have not examined belt effects when lifting different-sized boxes, nor carryover effects from belt use. METHODS: Twenty-eight subjects with manual-handling experience (17 male and 11 female) were randomly assigned to lift either a large or small box (weighing 9.4 kg), from a sagittally symmetric origin at pallet height to a 79 cm height, 60 degrees to the right. Spine flexion, lateral bending and twisting, hip and knee flexion, and angular velocity measurements of the torso with respect to the pelvis were collected for each of three lifting periods, 50 lifts each at 3 lifts per minute, with 18-minute breaks between periods. RESULTS: Belts significantly reduced maximum spine flexion, spine flexion and extension angular velocities, and torso left lateral bending angular velocity, and increased hip and knee flexion, regardless of box size. When lifting large boxes, belts significantly reduced torso right lateral bending and torso left twisting. No significant differential carryover effects were detected from belts. CONCLUSIONS: Subjects with belts lifted more slowly and used more of a squat-lift technique, regardless of box size. Belts reduced more torso motions while lifting large boxes.


Subject(s)
Back/physiology , Braces , Lifting , Spine/physiology , Adolescent , Adult , Biomechanical Phenomena , Female , Humans , Male , Occupational Diseases/prevention & control , Spinal Injuries/prevention & control , Weight-Bearing/physiology
4.
JAMA ; 284(21): 2727-32, 2000 Dec 06.
Article in English | MEDLINE | ID: mdl-11105177

ABSTRACT

CONTEXT: Despite scientific uncertainties about effectiveness, wearing back belts in the hopes of preventing costly and disabling low back injury in employees is becoming common in the workplace. OBJECTIVE: To evaluate the effectiveness of using back belts in reducing back injury claims and low back pain. DESIGN AND SETTING: Prospective cohort study. From April 1996 through April 1998, we identified material-handling employees in 160 new retail merchandise stores (89 required back belt use; 71 had voluntary back belt use) in 30 states (from New Hampshire to Michigan in the north and from Florida to Texas in the south); data collection ended December 1998, median follow-up was 6.5 months. PARTICIPANTS: A referred sample of 13,873 material handling employees provided 9377 baseline interviews and 6311 (67%) follow-up interviews; 206 (1.4%) refused baseline interview. MAIN OUTCOME MEASURES: Incidence rate of material-handling back injury workers' compensation claims and 6-month incidence rate of self-reported low back pain. RESULTS: Neither frequent back belt use nor a belt-requirement store policy was significantly associated with back injury claim rates or self-reported back pain. Rate ratios comparing back injury claims of those who reported wearing back belts usually every day and once or twice a week vs those who reported wearing belts never or once or twice a month were 1.22 (95% confidence interval [CI], 0.87-1.70) and 0.95 (95% CI, 0.56-1.59), respectively. The respective odds ratios for low back pain incidence were 0.97 (95% CI, 0.83-1.13) and 0.92 (95% CI, 0.73-1.16). CONCLUSIONS: In the largest prospective cohort study of back belt use, adjusted for multiple individual risk factors, neither frequent back belt use nor a store policy that required belt use was associated with reduced incidence of back injury claims or low back pain.


Subject(s)
Back Injuries/prevention & control , Back Pain/prevention & control , Occupational Diseases/prevention & control , Protective Clothing , Workplace/standards , Adult , Back Injuries/epidemiology , Back Pain/epidemiology , Female , Humans , Male , Occupational Diseases/epidemiology , Prospective Studies , Protective Clothing/statistics & numerical data , Regression Analysis , United States , Workers' Compensation , Workplace/statistics & numerical data
5.
Stat Med ; 18(23): 3355-63, 1999 Dec 15.
Article in English | MEDLINE | ID: mdl-10602157

ABSTRACT

Public health decision making based on data sources that are characterized by a lack of independence and other complicating factors requires the development of innovative statistical techniques. Studies of injuries in occupational cohorts require methods to account for recurrent injuries to workers over time and the temporary removal of workers from the 'risk set' while recuperating. In this study, the times until injury events are modelled in an occupational cohort of employees in a large power utility company where employees are susceptible to recurrent events. The injury history over a ten-year period is used to compare the hazards of specific jobs, adjusted for age when first hired, and race/ethnicity differences. Subject-specific random effects and multiple event-times are accommodated through the application of frailty models which characterize the dependence of recurrent events over time. The counting process formulation of the proportional hazards regression model is used to estimate the effects of covariates for subjects with discontinuous intervals of risk. In this application, subjects are not at risk of injury during recovery periods or other illness, changes in jobs, or other reasons. Previous applications of proportional hazards regression in frailty models have not needed to account for the changing composition of the risk set which is required to adequately model occupational injury data. Published in 1999 by John Wiley & Sons, Ltd. This article is a US Government work and is in the public domain in the United States.
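The counting-process data layout this abstract relies on can be illustrated with a small sketch: each worker contributes (start, stop, event) intervals, and time spent recuperating is simply omitted from the risk set. The function and numbers below are hypothetical illustrations, not the study's data.

```python
# Minimal sketch of discontinuous risk intervals for recurrent events.
# Names and values are illustrative only.

def risk_intervals(follow_up_end, injuries, recovery_days=30):
    """Split one worker's follow-up (in days) into at-risk intervals.

    injuries: sorted list of injury days. After each injury the worker
    is off risk for `recovery_days`, then re-enters the risk set.
    """
    rows = []
    start = 0
    for day in injuries:
        rows.append((start, day, 1))           # at risk until injury (event=1)
        start = day + recovery_days            # recovery gap: not at risk
    if start < follow_up_end:
        rows.append((start, follow_up_end, 0))  # censored tail (event=0)
    return rows

# One year of follow-up with injuries on days 100 and 250
rows = risk_intervals(365, injuries=[100, 250])
print(rows)   # [(0, 100, 1), (130, 250, 1), (280, 365, 0)]
```

Rows in this (start, stop, event) form feed directly into the counting-process formulation of proportional hazards regression; the recovery gaps are what make the risk set's composition change over time.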


Subject(s)
Models, Statistical , Occupational Diseases/epidemiology , Power Plants , Wounds and Injuries/epidemiology , Adolescent , Adult , Cohort Studies , Humans , Proportional Hazards Models , Recurrence , Time Factors
6.
Stat Med ; 15(17-18): 1951-60, 1996.
Article in English | MEDLINE | ID: mdl-8888487

ABSTRACT

Study designs in public health research often require the estimation of intervention effects that have been applied to a cluster of subjects in a common geographic area, rather than randomly assigned to individual subjects, and where the outcome is dichotomous. Statistical methods that account for the intracluster correlation of measurements must be used or the standard errors of regression coefficients will be under-estimated. Generalized estimating equations (GEE) can be used to account for this correlation, although there are no straightforward methods to determine sample-size requirements for adequate power. A simulation study was performed to calculate power in a GEE model for a proposed study of the effect of an intervention, designed to reduce lower-back injuries among nursing personnel employed in nursing homes. Nursing homes will be randomly assigned to either an intervention or control group and all employees within a nursing home will be treated alike. Historical injury data indicate that the baseline-injury risk for each home can be reasonably modelled using a beta distribution. It is assumed that the risk for any individual nurse within a nursing home follows a Bernoulli probability distribution, expressed as a logit function of fixed covariates representing characteristics of the study population, with odds ratio values taken from previous studies, plus a random-intercept term specific to each home. Results indicate that failure to account for intracluster correlation can lead to overestimates of power as well as inflation of type I error by as much as 20 per cent. Although the GEE method accounted for the intracluster correlation when present, estimates of the intracluster correlation were negatively biased when no intracluster correlation was present. In addition, and possibly related to the negatively biased estimates of intracluster correlation, we also found inflated type I error estimates from the GEE method.
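The core of the simulation design described here is a beta-Bernoulli model: home-level baseline risks drawn from a beta distribution, then Bernoulli outcomes for each nurse given the home's risk. A compact sketch with made-up parameters (not the study's) shows the resulting overdispersion relative to independence:

```python
# Hypothetical beta-Bernoulli cluster simulation; all parameters are
# illustrative, not from the study.
import random

random.seed(2)

def simulate_trial(n_homes=30, nurses_per_home=20, a=2.0, b=8.0):
    """Return per-home injury counts under a beta-Bernoulli model."""
    counts = []
    for _ in range(n_homes):
        p_home = random.betavariate(a, b)          # home-specific risk
        injured = sum(random.random() < p_home
                      for _ in range(nurses_per_home))
        counts.append(injured)
    return counts

counts = simulate_trial()
n = 30 * 20
p_hat = sum(counts) / n

# Between-home variance of proportions vs what independence predicts
obs_var = sum((c / 20 - p_hat) ** 2 for c in counts) / (len(counts) - 1)
indep_var = p_hat * (1 - p_hat) / 20
print(obs_var > indep_var)
```

The observed between-home variance exceeds the binomial variance implied by independence; ignoring that gap is exactly what leads to the understated standard errors and inflated type I error the abstract warns about.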


Subject(s)
Health Services Research/statistics & numerical data , Models, Statistical , Sample Size , Small-Area Analysis , Algorithms , Bias , Computer Simulation , Health Services Research/methods , Humans , Likelihood Functions , Logistic Models , Low Back Pain/prevention & control , Nursing Staff , Occupational Diseases/prevention & control
7.
Lifetime Data Anal ; 1(2): 161-70, 1995.
Article in English | MEDLINE | ID: mdl-9385098

ABSTRACT

The median service lifetime of respiratory safety devices produced by different manufacturers is determined using frailty models to account for unobserved differences in manufacturing processes and raw materials. The gamma and positive stable frailty distributions are used to obtain survival distribution estimates when the baseline hazard is assumed to be Weibull. Frailty distributions are compared using laboratory test data of the failure times for 104 respirator cartridges produced by 10 different manufacturers tested with three different challenge agents. Likelihood ratio tests indicate that both frailty models provide a significant improvement over a Weibull model assuming independence. Results are compared to fixed effects approaches for analysis of this data.
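For the gamma-frailty case mentioned in this abstract, the frailty can be integrated out in closed form, giving a simple marginal survival function over a Weibull baseline. The sketch below uses made-up parameter values, not the paper's fitted estimates:

```python
# Hypothetical gamma-frailty Weibull survival sketch; parameter values
# are illustrative only.
import math

def weibull_cum_hazard(t, scale, shape):
    """Baseline Weibull cumulative hazard H0(t) = (t/scale)**shape."""
    return (t / scale) ** shape

def marginal_survival(t, scale, shape, theta):
    """Population survival after integrating out a gamma frailty with
    variance theta: S(t) = (1 + theta*H0(t))**(-1/theta). As theta -> 0
    this recovers the independence (plain Weibull) model exp(-H0(t))."""
    h0 = weibull_cum_hazard(t, scale, shape)
    if theta == 0:
        return math.exp(-h0)
    return (1.0 + theta * h0) ** (-1.0 / theta)

# Frailty thickens the survival tail relative to the independence model,
# which is one way heterogeneity between manufacturers shows up
s_indep = marginal_survival(100.0, scale=60.0, shape=1.5, theta=0.0)
s_frail = marginal_survival(100.0, scale=60.0, shape=1.5, theta=1.0)
print(s_frail > s_indep)
```

The likelihood ratio comparison in the paper amounts to testing whether a shared frailty term like theta above improves on the theta = 0 (independence) Weibull fit.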


Subject(s)
Respiratory Protective Devices , Survival Analysis , Equipment Failure/statistics & numerical data , Humans , Likelihood Functions , Multivariate Analysis , Regression Analysis
8.
Conn Med ; 58(3): 165-71, 1994 Mar.
Article in English | MEDLINE | ID: mdl-8039381

ABSTRACT

OBJECTIVE: To identify predictors of treatment outcomes in methadone maintenance programs and to determine whether HIV counseling and testing influenced these outcomes. DESIGN: Retrospective record review. SETTING: Four methadone maintenance programs in four cities in Connecticut, USA. PARTICIPANTS: Five hundred and ninety-four clients, who began treatment over an 18-month period and for whom records were available, took part. INTERVENTIONS: HIV counseling and testing. MAIN OUTCOME MEASURES: Risk of treatment discontinuation and persistent in-treatment illicit drug use. RESULTS: The most important predictor of treatment discontinuation and of persistent in-treatment illicit drug use was self-reported pretreatment cocaine use. After controlling for this and demographic risk factors, clients who received initial HIV counseling, when compared with clients who did not, had a similar 12-month discontinuation risk (54% vs 59%; P = 0.08) but were less likely to show persistent illicit drug use (46% vs 53%; P = 0.01). Among counseled entrants who were tested for HIV antibodies, those receiving positive results had a 12-month discontinuation risk similar to those receiving negative results (50% vs 52%), but more often showed persistent illicit drug use (57% vs 44%), although this difference may have been due to chance (P = 0.28). The majority of clients who discontinued treatment did so because they were discharged for noncompliance with clinic rules, usually for failing to pay fees. CONCLUSIONS: HIV counseling and testing do not have a substantial adverse effect on methadone treatment outcomes. In the clinics under study, failure to pay clinic fees was an important factor contributing to discontinuation of treatment.


Subject(s)
AIDS Serodiagnosis , HIV Infections/prevention & control , Methadone/therapeutic use , Opioid-Related Disorders/rehabilitation , Patient Education as Topic , AIDS Serodiagnosis/psychology , Adult , Connecticut , Female , HIV Infections/transmission , Humans , Male , Opioid-Related Disorders/psychology , Outcome and Process Assessment, Health Care , Patient Dropouts/psychology , Substance Abuse Detection/psychology
9.
Epidemiol Infect ; 112(1): 13-23, 1994 Feb.
Article in English | MEDLINE | ID: mdl-8119352

ABSTRACT

The effects of ingested Salmonella enteritidis (SE) dose on incubation period and on the severity and duration of illness were estimated in a cohort of 169 persons who developed gastroenteritis after eating hollandaise sauce made from grade-A shell eggs. The cohort was divided into three groups based on self-reported dose of sauce ingested. As dose increased, median incubation period decreased (37 h in the low exposure group v. 21 h in the medium exposure group v. 17.5 h in the high exposure group, P = 0.006) and greater proportions reported body aches (71 v. 85 v. 94%, P = 0.0009) and vomiting (21 v. 56 v. 57%, P = 0.002). Among 118 case-persons who completed a follow-up questionnaire, increased dose was associated with increases in median weight loss in kilograms (3.2 v. 4.5 v. 5.0, P = 0.0001), maximum daily number of stools (12.5 v. 15.0 v. 20.0, P = 0.02), subjective rating of illness severity (P = 0.0007), and the number of days of confinement to bed (3.0 v. 6.5 v. 6.5, P = 0.04). In this outbreak, ingested dose was an important determinant of the incubation period, symptoms and severity of acute salmonellosis.


Subject(s)
Disease Outbreaks , Eggs/microbiology , Food Microbiology , Salmonella Food Poisoning/microbiology , Salmonella enteritidis , Adult , Animals , Bacteriophage Typing , Chickens , Cohort Studies , Eating , Feces/microbiology , Follow-Up Studies , Food Handling , Gastroenteritis/epidemiology , Gastroenteritis/microbiology , Humans , Middle Aged , Salmonella Food Poisoning/epidemiology , Salmonella enteritidis/classification , Salmonella enteritidis/growth & development , Salmonella enteritidis/isolation & purification , Severity of Illness Index , Surveys and Questionnaires , Time Factors , Weight Loss
10.
Stat Med ; 12(3-4): 241-8, 1993 Feb.
Article in English | MEDLINE | ID: mdl-8456209

ABSTRACT

Bivariate survival analysis models that incorporate random effects or 'frailty' provide a useful framework for determining the effectiveness of interventions. These models are based on the notion that two paired survival times are correlated because they share a common unobserved value of a random variate from a frailty distribution. In some applications, however, investigators may have some information that characterizes pairs and thus provides information about their frailty. Alternatively, there may be an interest in assessing whether the correlation within certain types of pairs is different from the correlation within other types of pairs. In this paper, we present a method to incorporate 'pair-wise' covariate information into the dependence parameter of the bivariate survival function. We provide an example using data from the Framingham Heart Study to investigate the times until the occurrence of two events within an individual: the first detection of hypertension and the first cardiovascular disease event. We model the dependence between these two events as a function of the age of the individual at the time of enrollment into the Framingham Study.
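One standard shared-frailty form that yields a closed-form bivariate survival function is the Clayton model, where a single dependence parameter theta links the two marginals; the pair-wise covariate idea in this abstract corresponds to letting theta vary with, say, age at enrollment. The link function and coefficients below are hypothetical illustrations, not the paper's model:

```python
# Hedged sketch of covariate-dependent bivariate dependence using a
# Clayton-type survival copula. b0, b1 and the log-linear link are
# hypothetical, not estimated values from the paper.
import math

def clayton_joint_survival(s1, s2, theta):
    """Joint survival S(t1,t2) from marginal survivals s1, s2 and
    dependence theta > 0 (theta -> 0 approaches independence s1*s2)."""
    return (s1 ** (-theta) + s2 ** (-theta) - 1.0) ** (-1.0 / theta)

def theta_from_age(age, b0=-2.0, b1=0.05):
    """Hypothetical log-linear link for the dependence parameter."""
    return math.exp(b0 + b1 * age)

def kendalls_tau(theta):
    """For the Clayton model, Kendall's tau = theta / (theta + 2)."""
    return theta / (theta + 2.0)

# Dependence between hypertension onset and first CVD event, by age
tau_young = kendalls_tau(theta_from_age(30))
tau_old = kendalls_tau(theta_from_age(60))
print(tau_young < tau_old)  # with b1 > 0, dependence rises with age
```

Modelling theta as a function of a pair-level covariate is what lets one ask whether the within-individual correlation of the two event times differs across types of pairs, as the abstract describes.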


Subject(s)
Cardiovascular Diseases/mortality , Hypertension/complications , Likelihood Functions , Proportional Hazards Models , Survival Analysis , Adult , Age Factors , Cardiovascular Diseases/epidemiology , Cardiovascular Diseases/etiology , Causality , Connecticut/epidemiology , Female , Follow-Up Studies , Forecasting , Humans , Hypertension/drug therapy , Male , Middle Aged , Time Factors
11.
Stat Med ; 12(3-4): 301-10, 1993 Feb.
Article in English | MEDLINE | ID: mdl-8456213

ABSTRACT

Survival analysis methods are valuable for detecting intervention effects because detailed information from patient records and sensitive outcome measures are used. The burn unit at a large university hospital replaced routine bathing with total body bathing using chlorhexidine gluconate for antimicrobial effect. A Cox proportional hazards model was used to analyse time from admission until either infection with Staphylococcus aureus or discharge for 155 patients, controlling for burn severity and two time-dependent covariates: days until first wound excision and days until first administration of prophylactic antibiotics. The risk of infection was 55 per cent higher in the historical control group, although not statistically significant. There was also some indication that early wound excision may be important as an infection-control measure for burn patients.


Subject(s)
Baths/standards , Burns/therapy , Chlorhexidine/analogs & derivatives , Clinical Protocols/standards , Infection Control/standards , Proportional Hazards Models , Staphylococcal Infections/epidemiology , Anti-Bacterial Agents/therapeutic use , Body Surface Area , Burn Units , Burns/classification , Burns/complications , Chlorhexidine/administration & dosage , Chlorhexidine/therapeutic use , Hospitals, University , Humans , Incidence , Infection Control/methods , Length of Stay/statistics & numerical data , Povidone-Iodine/administration & dosage , Povidone-Iodine/therapeutic use , Risk Factors , Staphylococcal Infections/etiology , Staphylococcal Infections/prevention & control , Time Factors , Treatment Outcome
12.
AIDS ; 6(1): 115-21, 1992 Jan.
Article in English | MEDLINE | ID: mdl-1543554

ABSTRACT

OBJECTIVE: To identify predictors of treatment outcomes in methadone maintenance programs and to determine whether HIV counseling and testing influenced these outcomes. DESIGN: Retrospective record review. SETTING: Four methadone maintenance programs in four cities in Connecticut, USA. PARTICIPANTS: Five hundred and ninety-four clients, who began treatment over an 18-month period and for whom records were available, took part. INTERVENTIONS: HIV counseling and testing. MAIN OUTCOME MEASURES: Risk of treatment discontinuation and persistent in-treatment illicit drug use. RESULTS: The most important predictor of treatment discontinuation and of persistent in-treatment illicit drug use was self-reported pre-treatment cocaine use. After controlling for this and demographic risk factors, clients who received initial HIV counseling, when compared with clients who did not, had a similar 12-month discontinuation risk (54 versus 59%; P = 0.08) but were less likely to show persistent illicit drug use (46 versus 53%; P = 0.01). Among counseled entrants who were tested for HIV antibodies, those receiving positive results had a 12-month discontinuation risk similar to those receiving negative results (50 versus 52%), but more often showed persistent illicit drug use (57 versus 44%), although this difference may have been due to chance (P = 0.28). The majority of clients who discontinued treatment did so because they were discharged for non-compliance with clinic rules, usually for failing to pay fees. CONCLUSIONS: HIV counseling and testing do not have a substantial adverse effect on methadone treatment outcomes. In the clinics under study, failure to pay clinic fees was an important factor contributing to discontinuation of treatment.


Subject(s)
Counseling , HIV Infections/prevention & control , Methadone/therapeutic use , Substance Abuse, Intravenous/rehabilitation , Adult , Female , HIV Infections/diagnosis , Humans , Male , Retrospective Studies , Risk Factors
13.
Am J Public Health ; 81(8): 1067-9, 1991 Aug.
Article in English | MEDLINE | ID: mdl-1854005

ABSTRACT

In January 1988, Oregon became the first state to require hospital-based reporting of attempted suicide (AS) in all adolescents less than 18 years old. From January to December 1988, 644 cases of AS were reported (annual rate of 214 per 100,000 population, ages 10 to 17 years). We compared these 644 cases of AS with all 137 Oregon adolescents less than 18 years old who committed suicide during the 10-year period 1979 through 1988, and found that the strongest predictor of outcome was method used.


Subject(s)
Suicide, Attempted/statistics & numerical data , Adolescent , Child , Female , Humans , Male , Oregon/epidemiology , Suicide/statistics & numerical data
14.
JAMA ; 264(19): 2529-33, 1990 Nov 21.
Article in English | MEDLINE | ID: mdl-2122013

ABSTRACT

Using data from a large measles outbreak that occurred in Dane County (Wisconsin) in 1986, we conducted a case-control study to evaluate risk factors for vaccine failure and assessed the cost-effectiveness of school-based revaccination strategies. Vaccination before a change in the measles vaccine stabilizer in 1979 (odds ratio, 5.5; 95% confidence interval, 1.05 to 28.9) and vaccination before age 15 months (odds ratio, 13.9; 95% confidence interval, 5.9 to 32.6) were identified as risk factors. Revaccination strategies for all students ($3444 per case prevented), students vaccinated before 1980 ($3166 per case prevented), and students vaccinated before age 15 months ($2546 per case prevented) were evaluated, assuming use of measles-mumps-rubella vaccine after the initial case was detected in a school. However, a large proportion of cases (43% to 53%) may not have been preventable using these strategies. Therefore, revaccination in all schools assessed to be at risk for measles may be necessary to prevent large outbreaks until a two-dose vaccination schedule is fully implemented.
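The cost-effectiveness figures quoted in this abstract are ratios of total strategy cost to cases averted. A trivial sketch with made-up inputs (not the study's costs or case counts) shows the arithmetic:

```python
# Illustrative arithmetic only; the inputs below are invented and do not
# reproduce the study's figures.

def cost_per_case_prevented(n_revaccinated, cost_per_dose, cases_prevented):
    """Total strategy cost divided by cases averted."""
    return (n_revaccinated * cost_per_dose) / cases_prevented

# e.g. revaccinating 1,000 students at $25/dose to prevent 10 cases
print(cost_per_case_prevented(1000, 25.0, 10))   # 2500.0
```

Narrower targeting (fewer students revaccinated for nearly the same cases prevented) is why the pre-15-months strategy comes out cheapest per case in the study's comparison.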


Subject(s)
Measles Vaccine , Measles/etiology , Vaccination/economics , Adolescent , Case-Control Studies , Cost-Benefit Analysis , Disease Outbreaks/prevention & control , Drug Administration Schedule , Female , Humans , Measles/epidemiology , Measles/prevention & control , Measles Vaccine/administration & dosage , Seasons , Sensitivity and Specificity , Wisconsin/epidemiology