1.
Int Arch Occup Environ Health ; 96(1): 1-26, 2023 Jan.
Article in English | MEDLINE | ID: mdl-35604441

ABSTRACT

PURPOSE: Human health risk assessments of glyphosate have focused on animal toxicology data for determining neurotoxic potential. Human epidemiological studies have not yet been systematically reviewed for glyphosate neurotoxicity hazard identification. The objective of this systematic literature review was to summarize the available epidemiology of glyphosate exposure and neurological outcomes in humans. METHODS: As of December 2021, 25 eligible epidemiological studies of glyphosate exposure and neurological endpoints were identified and assessed for five quality dimensions using guidance from the U.S. Environmental Protection Agency. Studies that assessed personal use of glyphosate were prioritized, whereas those assessing indirect exposure (other than personal use) were rated as low quality, since biomonitoring data indicate that indirect metrics of glyphosate exposure almost always equate to non-detectable glyphosate doses. RESULTS: Overall, the scientific evidence on glyphosate and neurotoxicity in humans is sparse and methodologically limited, based on nine included epidemiological studies of neurodegenerative outcomes (two high quality), five studies of neurobehavioral outcomes (two high quality), six studies of neurodevelopmental outcomes (none high quality), and five studies of other and mixed neurological outcomes (one high quality). The five high-quality studies showed no association between glyphosate use and risk of depression, Parkinson disease, or peripheral nerve conduction velocity. Results were mixed among the eight moderate-quality studies, which did not demonstrate consistent associations with any neurological endpoints or categories. Low-quality studies were considered uninformative about possible neurotoxic effects due primarily to questionable assessments of indirect exposure. CONCLUSIONS: No association has been demonstrated between glyphosate and any neurological outcomes in humans. To move the state of science forward, epidemiological studies should focus on scenarios involving direct and frequent use of glyphosate while collecting information on validated health outcomes, concomitant agricultural exposures, and relevant personal characteristics.
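
A minimal sketch of how the evidence tally in the RESULTS can be reproduced, assuming a hypothetical one-row-per-study table; the outcome categories and high-quality counts follow the abstract, the moderate and low tiers are lumped because the abstract does not give their split by category, and the pandas layout is purely illustrative.

    import pandas as pd

    # Hypothetical one-row-per-study table for the 25 included studies.
    studies = pd.DataFrame({
        "outcome_category": ["neurodegenerative"] * 9 + ["neurobehavioral"] * 5
                            + ["neurodevelopmental"] * 6 + ["other/mixed"] * 5,
        "quality": ["high"] * 2 + ["moderate/low"] * 7      # neurodegenerative
                   + ["high"] * 2 + ["moderate/low"] * 3    # neurobehavioral
                   + ["moderate/low"] * 6                   # neurodevelopmental
                   + ["high"] * 1 + ["moderate/low"] * 4,   # other/mixed
    })

    # Cross-tabulate quality tier by outcome category, as summarized in RESULTS.
    print(pd.crosstab(studies["outcome_category"], studies["quality"], margins=True))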


Subject(s)
Herbicides , Neurotoxicity Syndromes , Animals , Humans , Environmental Exposure/adverse effects , Herbicides/toxicity , Glycine/toxicity , Risk Assessment , Neurotoxicity Syndromes/epidemiology , Neurotoxicity Syndromes/etiology , Glyphosate
2.
Ann Epidemiol ; 36: 64, 2019 Aug.
Article in English | MEDLINE | ID: mdl-31327674
3.
J Rheumatol ; 44(10): 1476-1483, 2017 Oct.
Article in English | MEDLINE | ID: mdl-28765257

ABSTRACT

OBJECTIVE: Tenosynovial giant cell tumor (TGCT) is a rare benign proliferative and inflammatory disease arising from synovia of joints, bursae, or tendon sheaths. We aimed to estimate incidence rate and prevalence of TGCT in Denmark, to describe patient characteristics and treatment modalities among patients with TGCT, and to estimate risk of TGCT recurrence. METHODS: Using registry data on pathology examinations and inpatient and outpatient hospital diagnoses, we identified adult patients with diagnoses of diffuse TGCT (D-TGCT) or localized TGCT (L-TGCT) between 1997 and 2012, followed through 2012. We described patients' characteristics, treatment modalities, and recurrence. RESULTS: We identified 2087 patients with L-TGCT and 574 patients with D-TGCT. Their incidence rates per million person-years were 30.3 (95% CI 29.1-31.7) and 8.4 (95% CI 7.7-9.1), respectively. At the end of 2012, prevalence per 100,000 persons was 44.3 (95% CI 42.4-46.3) for L-TGCT and 11.5 (95% CI 10.6-12.6) for D-TGCT. Women made up 61% of the patients with L-TGCT and 51% of the patients with D-TGCT. Median age at diagnosis was 47 years. Ten-year risk of recurrence was 9.8% (95% CI 8.4-11.3%) after L-TGCT and 19.1% (95% CI 15.7-22.7%) after D-TGCT. CONCLUSION: This study contributes evidence about epidemiology of TGCT based on routinely collected population-based data gathered in a setting of universal equal access to healthcare and complete followup.
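
A minimal sketch of the incidence-rate arithmetic behind figures like those above, assuming hypothetical person-time; the exact Poisson confidence limits use the standard chi-square formulation and are not taken from the authors' methods.

    from scipy.stats import chi2

    def rate_per_million(cases, person_years, alpha=0.05):
        # Incidence rate per million person-years with an exact Poisson CI.
        rate = cases / person_years * 1e6
        lower = chi2.ppf(alpha / 2, 2 * cases) / 2 / person_years * 1e6
        upper = chi2.ppf(1 - alpha / 2, 2 * (cases + 1)) / 2 / person_years * 1e6
        return rate, lower, upper

    # Hypothetical denominator: 2087 L-TGCT cases over roughly 68.9 million
    # person-years gives a rate near 30 per million person-years.
    print(rate_per_million(2087, 68.9e6))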


Subject(s)
Giant Cell Tumor of Tendon Sheath/epidemiology , Soft Tissue Neoplasms/epidemiology , Adolescent , Adult , Aged , Aged, 80 and over , Denmark/epidemiology , Female , Humans , Incidence , Male , Middle Aged , Neoplasm Recurrence, Local , Prevalence , Registries , Young Adult
4.
Clin Epidemiol ; 4: 87-93, 2012.
Article in English | MEDLINE | ID: mdl-22570568

ABSTRACT

BACKGROUND: The prevalence of metastatic bone disease in the US population is not well understood. We sought to estimate the current number of US adults with metastatic bone disease using two large administrative data sets. METHODS: Prevalence was estimated from a commercially insured cohort (ages 18-64 years, MarketScan database) and from a fee-for-service Medicare cohort (ages ≥65 years, Medicare 5% database) with coverage on December 31, 2008, representing approximately two-thirds of the US population in each age group. We searched for claims-based evidence of metastatic bone disease from January 1, 2004, using a combination of relevant diagnosis and treatment codes. The number of cases in the US adult population was extrapolated from age- and sex-specific prevalence estimated in these cohorts. Results are presented for all cancers combined and separately for primary breast, prostate, and lung cancer. RESULTS: In the commercially insured cohort (mean age = 42.3 years [SD = 13.1]), we identified 9505 patients (0.052%) with metastatic bone disease. Breast cancer was the most common primary tumor type (n = 4041). In the Medicare cohort (mean age = 75.6 years [SD = 7.8]), we identified 6427 (0.495%) patients with metastatic bone disease. Breast (n = 1798) and prostate (n = 1862) cancers were the most common primary tumor types. We estimate that 279,679 (95% confidence interval: 274,579-284,780) US adults alive on December 31, 2008, had evidence of metastatic bone disease in the previous 5 years. Breast, prostate, and lung cancers accounted for 68% of these cases. CONCLUSION: Our findings suggest that approximately 280,000 US adults were living with metastatic bone disease on December 31, 2008. This likely underestimates the true frequency; not all cases of metastatic bone disease are diagnosed, and some diagnosed cases might lack documentation in claims data.
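
A minimal sketch of the age- and sex-specific extrapolation described in METHODS, using entirely hypothetical stratum counts; the column names and numbers are illustrative and do not come from the MarketScan or Medicare data.

    import pandas as pd

    # Stratum-level prevalence in the insured cohorts is applied to US population
    # counts in the same age/sex stratum, then summed (hypothetical numbers).
    strata = pd.DataFrame({
        "age_sex":       ["F 18-64", "M 18-64", "F 65+", "M 65+"],
        "cohort_cases":  [5200, 4305, 3300, 3127],
        "cohort_n":      [9.1e6, 9.0e6, 0.64e6, 0.66e6],
        "us_population": [95e6, 94e6, 22e6, 17e6],
    })
    strata["prevalence"] = strata["cohort_cases"] / strata["cohort_n"]
    strata["extrapolated_cases"] = strata["prevalence"] * strata["us_population"]

    print(strata[["age_sex", "prevalence", "extrapolated_cases"]])
    print("total US estimate:", int(strata["extrapolated_cases"].sum()))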

5.
Pharmacoepidemiol Drug Saf ; 21(8): 857-64, 2012 Aug.
Article in English | MEDLINE | ID: mdl-22450901

ABSTRACT

PURPOSE: In July 2007, the Centers for Medicare & Medicaid Services released a national coverage determination (NCD) for erythropoiesis-stimulating agent (ESA) use in cancer patients, mandating payment restrictions likely to reduce ESA use and possibly increase red blood cell transfusions. We aimed to quantify ESA and transfusion use pre-NCD and post-NCD. METHODS: Medicare 5% sample data, 2005-2007, were used. Patients were 66 years or older, had lung, breast, or colorectal cancer or lymphomas, and initiated chemotherapy in pre-NCD and post-NCD periods (September-November 2006, September-November 2007). ESA use and transfusions were identified from claims. Differences in proportions of patients using ESAs and receiving transfusions pre-NCD and post-NCD were evaluated using logistic regression; differences in transfusion event rates were evaluated using a Poisson model. RESULTS: The pre-NCD cohort included 1897 patients and the post-NCD cohort 1877. In the pre-NCD cohort, 31% of patients had lung cancer, 29% lymphoma, 20% colorectal cancer, and 20% breast cancer; distribution was similar in the post-NCD cohort. Overall, ESA use decreased from 35.0% pre-NCD to 15.2% post-NCD. Transfusion use increased from 9.3% to 10.4%, and transfusion event rates from 19.0 to 21.8 per 100 patient-quarters. Results adjusted for baseline characteristics and comorbid conditions were similar. ESA use reduction achieved statistical significance; transfusion use and rate increases did not. CONCLUSIONS: ESA use decreased sharply post-NCD. This was accompanied by an estimated 1.1% (95% confidence interval -0.8% to 3.0%) absolute increase in transfusion use.
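
A minimal sketch of the two comparisons described in METHODS, assuming a hypothetical patient-level table with a post-NCD indicator; the column names are invented and the statsmodels calls are illustrative rather than the authors' code.

    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    import statsmodels.formula.api as smf

    # Hypothetical analysis file: one row per patient with a post-NCD indicator,
    # ESA use, transfusion event count, and follow-up in patient-quarters.
    df = pd.read_csv("esa_ncd_cohort.csv")  # hypothetical file name

    # Pre/post difference in the proportion of patients using ESAs.
    esa_model = smf.logit("esa_use ~ post_ncd + age + female + cancer_type", data=df).fit()

    # Pre/post difference in transfusion event rates via Poisson regression
    # with log patient-quarters as the offset.
    rate_model = smf.glm(
        "transfusion_events ~ post_ncd + age + female + cancer_type",
        data=df,
        family=sm.families.Poisson(),
        offset=np.log(df["patient_quarters"]),
    ).fit()

    print(esa_model.summary())
    print(rate_model.summary())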


Subject(s)
Anemia/therapy , Centers for Medicare and Medicaid Services, U.S./legislation & jurisprudence , Erythrocyte Transfusion/statistics & numerical data , Hematinics/administration & dosage , Insurance Coverage/legislation & jurisprudence , Aged , Aged, 80 and over , Anemia/drug therapy , Anemia/etiology , Antineoplastic Agents/adverse effects , Centers for Medicare and Medicaid Services, U.S./economics , Comorbidity , Drug Utilization , Female , Hematinics/economics , Hematinics/therapeutic use , Humans , Insurance Claim Review/statistics & numerical data , Male , Neoplasms/drug therapy , Retrospective Studies , United States
6.
Clin J Am Soc Nephrol ; 5(2): 182-8, 2010 Feb.
Article in English | MEDLINE | ID: mdl-20019122

ABSTRACT

BACKGROUND AND OBJECTIVES: Observational studies relating epoetin alfa (EPO) dose and mortality frequently use analytic methods that do not control time-dependent confounding by indication (CBI). The relationship between EPO dose and 1-year mortality, adjusting for the effects of time-dependent CBI, was examined using a marginal structural model. DESIGN, SETTING, PARTICIPANTS, & MEASUREMENTS: This retrospective cohort study included 27,791 hemodialysis patients between July 2000 and June 2002. Patients were grouped at successive 2-wk intervals into a zero-dose category or four nonzero-dose categories. Ordinal regression was used to calculate inverse probability of treatment weights based on each patient's probability of receiving his or her observed dose level, given covariate and treatment history. Three treatment models with an increasing number of treatment predictors were evaluated to assess the effect of model specification. A small number of excessively large patient weights were truncated. Relative hazards for higher-dose groups compared with the lowest nonzero-dose group varied by treatment model specification and by level of weight truncation. RESULTS: Results differed appreciably between the simplest treatment model, which incorporated only hemoglobin and EPO dosing history with 2% weight truncation (hazard ratio: 1.51; 95% confidence interval: 1.09, 1.89 for highest-dose patients), and the most comprehensive treatment model with 1% weight truncation (hazard ratio: 0.98; 95% confidence interval: 0.76, 1.74). CONCLUSIONS: There is appreciable CBI at higher EPO doses, and EPO dose was not associated with increased mortality in marginal structural model analyses that more completely addressed this confounding.
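
A minimal sketch of stabilized inverse-probability-of-treatment weighting with truncation, in the spirit of the design described above; it substitutes a multinomial logistic model for the paper's ordinal regression, and every column name and file name is a hypothetical stand-in.

    import numpy as np
    import pandas as pd
    from sklearn.linear_model import LogisticRegression

    # Hypothetical person-period file: one row per patient per 2-week interval,
    # sorted by patient and interval, with a dose category (0 = zero dose) plus
    # treatment/hemoglobin history and time-varying confounders.
    pp = pd.read_csv("epo_person_periods.csv")  # hypothetical file name

    num_X = pp[["prior_dose_cat", "prior_hgb"]]                          # history only (numerator)
    den_X = pp[["prior_dose_cat", "prior_hgb", "albumin", "hosp_days"]]  # history + confounders

    # Multinomial logistic models stand in for the ordinal regression in the paper.
    num_fit = LogisticRegression(max_iter=1000).fit(num_X, pp["dose_cat"])
    den_fit = LogisticRegression(max_iter=1000).fit(den_X, pp["dose_cat"])

    # Probability assigned to the dose category each patient actually received.
    col = pp["dose_cat"].map({c: i for i, c in enumerate(den_fit.classes_)}).to_numpy()
    row = np.arange(len(pp))
    pp["ratio"] = num_fit.predict_proba(num_X)[row, col] / den_fit.predict_proba(den_X)[row, col]

    # Stabilized weight: cumulative product of interval-level ratios within patient.
    pp["sw"] = pp.groupby("patient_id")["ratio"].cumprod()

    # Truncate excessively large weights, e.g. at the 1st and 99th percentiles.
    lo, hi = pp["sw"].quantile([0.01, 0.99])
    pp["sw_trunc"] = pp["sw"].clip(lo, hi)
    # The weighted person-period file can then feed a weighted survival model.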


Subject(s)
Erythropoietin/administration & dosage , Hematinics/administration & dosage , Kidney Diseases/mortality , Kidney Diseases/therapy , Models, Statistical , Renal Dialysis , Adult , Aged , Biomarkers/blood , Confounding Factors, Epidemiologic , Dose-Response Relationship, Drug , Epoetin Alfa , Female , Hemoglobins/metabolism , Humans , Kidney Diseases/blood , Kidney Diseases/drug therapy , Logistic Models , Male , Middle Aged , Recombinant Proteins , Retrospective Studies , Risk Assessment , Risk Factors , Time Factors , Treatment Outcome , United States/epidemiology
7.
Am J Kidney Dis ; 54(3): 554-60, 2009 Sep.
Article in English | MEDLINE | ID: mdl-19592144

ABSTRACT

Findings from randomized controlled trials examining the efficacy of therapy with erythropoiesis-stimulating agents (ESAs) to normalize hemoglobin levels in patients with chronic kidney disease or kidney failure have raised questions regarding the safety of this class of drugs. However, no trial to date has specifically assessed the safety of ESA-dosing algorithms used to achieve the lower hemoglobin targets typically used in clinical practice. Although a wealth of nonexperimental data is available for dialysis patients, analyses based on these data are more susceptible to confounding bias than randomized controlled trials. Conducting valid pharmacoepidemiologic studies of drug effects in hemodialysis patients is complicated by the extent of their comorbidities, frequent hospitalizations, various concomitant medications, and an exceedingly high mortality rate. The need for greater ESA doses for the treatment of anemia in sicker patients plausibly generates confounding by indication, the control of which is complicated by the presence of time-dependent confounding. Here, we describe sources of bias in nonexperimental studies of ESA therapy in hemodialysis patients and critically appraise analytical methods that may help minimize bias in such studies.


Subject(s)
Hematinics/administration & dosage , Hematinics/adverse effects , Models, Statistical , Randomized Controlled Trials as Topic/mortality , Biomedical Research/methods , Drug Evaluation/methods , Drug Evaluation/mortality , Humans , Mortality , Randomized Controlled Trials as Topic/methods
9.
Am J Kidney Dis ; 51(1): 62-70, 2008 Jan.
Article in English | MEDLINE | ID: mdl-18155534

ABSTRACT

BACKGROUND: Confounding-by-indication is a bias in nonexperimental studies that occurs when outcomes are compared for treated and untreated patients and the treatment or medication dose is related to predictors of the outcome. Two recent publications reported that greater epoetin alfa (EPO) doses were associated with increased mortality rates. We assessed whether confounding-by-indication might account for these results. STUDY DESIGN: We used a retrospective cohort study design. SETTING & PARTICIPANTS: Hemodialysis patients were randomly selected from a large dialysis organization from July 2000 to June 2002 and were required to have completed a 9-month baseline period. PREDICTOR: EPO dose assessed during months 7 to 9 of the baseline period and monthly throughout the follow-up period. Hemoglobin (Hb) was assessed as average value during months 4 to 6 of the baseline period and monthly throughout the follow-up period. All other covariates were assessed during months 1 to 6 of the baseline period. OUTCOME: All-cause mortality during the 1 year of follow-up. Baseline Cox models were fitted with log EPO and Hb with and without adjustment for baseline patient characteristics. Time-dependent models were fitted with time-varying log EPO and Hb and, separately, lagged log EPO and Hb, with adjustment for baseline patient characteristics. RESULTS: 22,955 patients met our inclusion criteria. In the unadjusted model, we observed increased mortality risk with increasing EPO dose (hazard ratio [HR], 1.31 per log unit increase; 95% confidence interval [CI], 1.26 to 1.36). Adjustment for baseline patient characteristics resulted in an appreciably decreased HR (HR, 1.21; 95% CI, 1.15 to 1.28). In the lagged time-dependent analyses, estimates ranged from HR of 0.93 (95% CI, 0.92 to 0.95) to HR of 1.01 (95% CI, 0.99 to 1.03) for the 1- and 2-month lagged models, respectively. LIMITATIONS: This analysis was limited to prevalent hemodialysis patients, and inhospital EPO dosing information was unavailable. CONCLUSIONS: The observed mortality risk estimates associated with EPO dose in nonexperimental studies in dialysis patients may be highly sensitive to the analytic method used. This highlights the complexity of evaluating the association between EPO dose, Hb level, and mortality in these studies.
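
A minimal sketch of a lagged time-dependent Cox analysis of the kind described above, using a counting-process data layout and the lifelines CoxTimeVaryingFitter; the monthly file, variable names, and one-month lag construction are assumptions for illustration, not the authors' code.

    import pandas as pd
    from lifelines import CoxTimeVaryingFitter

    # Hypothetical counting-process file: one row per patient per month with
    # start/stop times, a death indicator, log EPO dose, and hemoglobin (Hb).
    rows = pd.read_csv("epo_monthly.csv")  # hypothetical file name
    rows = rows.sort_values(["patient_id", "start"])

    # Lag exposure by one month within patient, so the hazard in month t is
    # modeled against the dose and Hb observed in month t-1.
    rows["log_epo_lag1"] = rows.groupby("patient_id")["log_epo"].shift(1)
    rows["hgb_lag1"] = rows.groupby("patient_id")["hgb"].shift(1)
    rows = rows.dropna(subset=["log_epo_lag1", "hgb_lag1"])

    ctv = CoxTimeVaryingFitter()
    ctv.fit(
        rows[["patient_id", "start", "stop", "death", "log_epo_lag1", "hgb_lag1", "age", "diabetes"]],
        id_col="patient_id",
        start_col="start",
        stop_col="stop",
        event_col="death",
    )
    ctv.print_summary()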


Subject(s)
Erythropoietin/administration & dosage , Erythropoietin/adverse effects , Renal Dialysis/mortality , Adult , Aged , Cohort Studies , Dose-Response Relationship, Drug , Drug Administration Schedule , Epoetin Alfa , Female , Follow-Up Studies , Humans , Kidney Failure, Chronic/drug therapy , Kidney Failure, Chronic/mortality , Male , Middle Aged , Recombinant Proteins , Retrospective Studies
10.
Environ Health Perspect ; 115(3): 370-6, 2007 Mar.
Article in English | MEDLINE | ID: mdl-17431485

ABSTRACT

OBJECTIVE: We estimated 2,4-dichlorophenoxyacetic acid (2,4-D) exposure and systemic dose in farm family members following an application of 2,4-D on their farm. METHODS: Farm families were recruited from licensed applicators in Minnesota and South Carolina. Eligible family members collected all urine during five 24-hr intervals, 1 day before through 3 days after an application of 2,4-D. Exposure profiles were characterized with 24-hr urine 2,4-D concentrations, which then were related to potential predictors of exposure. Systemic dose was estimated using the urine collections from the application day through the third day after application. RESULTS: Median urine 2,4-D concentrations at baseline and the day after application were 2.1 and 73.1 microg/L for applicators, below the limit of detection and 1.2 microg/L for spouses, and 1.5 and 2.9 microg/L for children. The younger children (4-11 years of age) had higher median post-application concentrations than the older children (> or = 12 years of age) (6.5 vs. 1.9 microg/L). The geometric mean systemic doses (micrograms per kilogram body weight) were 2.46 (applicators), 0.8 (spouses), 0.22 (all children), 0.32 (children 4-11 years of age), and 0.12 (children > or = 12 years of age). Exposure to the spouses and children was primarily determined by direct contact with the application process and the number of acres treated. Multivariate models identified glove use, repairing equipment, and number of acres treated as predictors of exposure in the applicators. CONCLUSIONS: We observed considerable heterogeneity of 2,4-D exposure among farm family members, primarily attributable to level of contact with the application process. Awareness of this variability and the actual magnitude of exposures are important for developing exposure and risk characterizations in 2,4-D-exposed agricultural populations.
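
A minimal sketch of a multivariate predictor model in the spirit of the one mentioned above, regressing log-transformed post-application urinary 2,4-D on reported work practices; the variable names are hypothetical and ordinary least squares on the log concentration is an assumed model form.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    # Hypothetical applicator-level table: post-application urinary 2,4-D (ug/L)
    # plus candidate predictors recorded by field observers.
    d = pd.read_csv("applicator_24d.csv")  # hypothetical file name
    d["log_conc"] = np.log(d["urine_24d_ugL"])

    fit = smf.ols("log_conc ~ glove_use + repaired_equipment + acres_treated", data=d).fit()
    print(fit.summary())

    # Exponentiated coefficients read as multiplicative effects on the
    # geometric mean urinary concentration.
    print(np.exp(fit.params))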


Subject(s)
2,4-Dichlorophenoxyacetic Acid/urine , Herbicides/urine , Adolescent , Adult , Agriculture , Child , Child, Preschool , Environmental Monitoring , Family , Female , Humans , Male , Middle Aged , Minnesota
11.
J Expo Sci Environ Epidemiol ; 17(4): 350-7, 2007 Jul.
Article in English | MEDLINE | ID: mdl-16788681

ABSTRACT

In pesticide biomonitoring studies, researchers typically collect either single voids or daily (24-h) urine samples. Collection of 24-h urine samples is considered the "gold standard", but this method places a high burden on study volunteers, requires greater resources, and may result in misclassification of exposure or underestimation of dose due to noncompliance with urine collection protocols. To evaluate the potential measurement error introduced by single void samples, we present an analysis of exposure and dose for two commonly used pesticides based on single morning void (MV) and 24-h urine collections in farmers and farm children. The agreement between the MV concentration and its corresponding 24-h concentration was analyzed using simple graphical and statistical techniques and risk assessment methodology. A consistent bias towards overprediction of pesticide concentration was found among the MVs, likely in large part due to the pharmacokinetic time course of the analytes in urine. These results suggest that the use of single voids can either over- or underestimate daily exposure if recent pesticide applications have occurred. This held true for farmers as well as for farm children, who were not directly exposed to the applications. As a result, reliance on single-void samples changed the number of chlorpyrifos-exposed children whose daily dose estimates were above levels of toxicologic significance. In populations where fluctuations in pesticide exposure are expected (e.g., farm families), the pharmacokinetics of the pesticide and the timing of exposure events and urine collection must be understood when relying on single voids as a surrogate for longer time-frames of exposure.
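
A minimal sketch of the kind of bias check described above, comparing single morning-void (MV) concentrations with their matching 24-h concentrations; the pairing file and column names are hypothetical.

    import numpy as np
    import pandas as pd

    # Hypothetical paired measurements: one row per person-day with the MV
    # concentration and the corresponding 24-h composite concentration.
    pairs = pd.read_csv("mv_vs_24h.csv")  # hypothetical file name

    # Bias on the log scale: a geometric mean ratio above 1 means the MV tends
    # to overpredict the 24-h concentration, the direction reported here.
    log_ratio = np.log(pairs["mv_conc"] / pairs["day_conc"])
    gm_ratio = np.exp(log_ratio.mean())
    pct_over = (pairs["mv_conc"] > pairs["day_conc"]).mean() * 100

    print(f"GM ratio MV/24-h = {gm_ratio:.2f}; MV exceeds the 24-h value in {pct_over:.0f}% of pairs")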


Subject(s)
2,4-Dichlorophenoxyacetic Acid/urine , Agriculture , Family , Pesticides/urine , Pyridones/urine , Adolescent , Biomarkers/urine , Child , Child, Preschool , Chlorpyrifos/pharmacokinetics , Chlorpyrifos/urine , Environmental Exposure/analysis , Environmental Monitoring/methods , Female , Humans , Male , Occupational Exposure/analysis , Pesticides/pharmacokinetics , Urinalysis/methods , Urinalysis/standards
12.
Epidemiology ; 17(3): 248-9, 2006 May.
Article in English | MEDLINE | ID: mdl-16617269
13.
J Expo Sci Environ Epidemiol ; 16(5): 447-56, 2006 Sep.
Article in English | MEDLINE | ID: mdl-16570094

ABSTRACT

We used urinary biological monitoring to characterize chlorpyrifos (O,O-diethyl-O-(3,5,6-trichloro-2-pyridinyl) phosphorothioate) exposure to farm family members from Minnesota and South Carolina who participated in the Farm Family Exposure Study. Five consecutive 24-h urine samples were obtained from 34 families of licensed pesticide applicators 1 day before through 3 days after a chlorpyrifos application. Daily 3,5,6-trichloro-2-pyridinol (TCP) urinary concentrations characterized exposure profiles of the applicator, the spouse, and children aged 4-17 years. Self-reported and observed determinants of exposure were compared to the maximum postapplication TCP concentration. All participants had detectable (> or = 1 microg/l) urinary TCP concentrations at baseline. Applicators' peak TCP levels occurred the day after the application (geometric mean (GM) = 19.0 microg/l). Postapplication TCP change from baseline in the spouses and children was negligible, and the only reliable predictor of exposure was assisting with the application for children aged 12 years and older. The applicators' exposure was primarily influenced by the chemical formulation (GM = 11.3 microg/l for granular and 30.9 microg/l for liquid) and the number of loads applied. Repairing equipment, observed skin contact, and eating during the application were moderately associated with TCP levels for those who applied liquid formulations. Estimated absorbed doses (microg chlorpyrifos/kg bodyweight) were calculated based on TCP excretion summed over the 4 postapplication days and corrected for pharmacokinetic recovery. The GM doses were 2.1, 0.7, and 1.0 microg/kg bodyweight for applicators, spouses, and children, respectively. Chlorpyrifos exposure to farm family members from the observed application was largely determined by the extent of contact with the mixing, loading, and application process.
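
A minimal sketch of the dose reconstruction described above: TCP excreted over the four post-application days is summed, converted to a chlorpyrifos-equivalent mass, corrected for urinary recovery, and divided by body weight. All input values are hypothetical, the molecular weights are approximate, and the recovery fraction of 0.7 is an assumption rather than a value stated in the abstract.

    # Hypothetical daily 24-h urine results for one applicator.
    tcp_conc_ugL = [19.0, 12.0, 6.0, 3.0]   # TCP on the application day + 3 days (ug/L)
    urine_volume_L = [1.6, 1.5, 1.8, 1.4]   # matching 24-h urine volumes (L)
    body_weight_kg = 80.0

    MW_CHLORPYRIFOS = 350.6   # approximate g/mol
    MW_TCP = 198.4            # approximate g/mol
    RECOVERY_FRACTION = 0.7   # assumed fraction of absorbed dose excreted as urinary TCP

    # Total TCP excreted (ug), converted to chlorpyrifos equivalents and
    # corrected for incomplete urinary recovery.
    tcp_excreted_ug = sum(c * v for c, v in zip(tcp_conc_ugL, urine_volume_L))
    chlorpyrifos_equiv_ug = tcp_excreted_ug * MW_CHLORPYRIFOS / MW_TCP
    dose_ug_per_kg = chlorpyrifos_equiv_ug / RECOVERY_FRACTION / body_weight_kg

    print(f"estimated absorbed dose: {dose_ug_per_kg:.2f} ug chlorpyrifos/kg body weight")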


Subject(s)
Agriculture , Chlorpyrifos/toxicity , Occupational Exposure , Adolescent , Adult , Child , Child, Preschool , Chlorpyrifos/urine , Environmental Exposure , Environmental Monitoring/methods , Female , Humans , Male , Middle Aged , Pesticides/toxicity , Pyridones/urine
14.
Epidemiology ; 17(1): 69-74, 2006 Jan.
Article in English | MEDLINE | ID: mdl-16357597

ABSTRACT

BACKGROUND: Epidemiologists often assess lifetime pesticide exposure by questioning participants about use of specific pesticides and associated work practices. Recently, Dosemeci and colleagues proposed an algorithm to estimate lifetime average exposure intensity from questionnaire information. We evaluated this algorithm against measured urinary pesticide concentrations for farmers who applied glyphosate (n = 48), 2,4-D (n = 34), or chlorpyrifos (n = 34). METHODS: Algorithm scores were calculated separately based on trained field observers' and farmers' evaluations of application conditions. Statistical analyses included nonparametric correlations, assessment of categorical agreement, and categorical evaluation of exposure distributions. RESULTS: Based on field observers' assessments, there were moderate correlations between algorithm scores and urine concentrations for glyphosate (r = 0.47; 95% confidence interval [CI] = 0.21 to 0.66) and 2,4-D (0.45; 0.13 to 0.68). Correlations were lower when algorithm scores were based on participants' self-reports (for glyphosate, r = 0.23 [CI = -0.07 to 0.48]; for 2,4-D, r = 0.25 [-0.10 to 0.54]). For chlorpyrifos, there were contrasting correlations for liquid (0.42; 0.01 to 0.70) and granular formulations (-0.44; -0.83 to 0.29) based on both observers' and participants' inputs. Percent agreement in categorical analyses for the 3 pesticides ranged from 20% to 44%, and there was appreciable overlap in the exposure distributions across categories. CONCLUSIONS: Our results demonstrate the importance of collecting information on the type of pesticide formulation and suggest that a generic exposure assessment approach is likely to result in appreciable exposure misclassification for many pesticides.
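
A minimal sketch of the validation comparisons named in METHODS: a nonparametric (Spearman) correlation between algorithm scores and urinary concentrations, plus percent agreement after grouping both into categories. The column names, the use of tertiles, and the quantile-based cut points are hypothetical choices.

    import pandas as pd
    from scipy.stats import spearmanr

    # Hypothetical per-farmer table: questionnaire-based algorithm score and the
    # measured urinary concentration for one pesticide.
    d = pd.read_csv("algorithm_vs_urine.csv")  # hypothetical file name

    # Nonparametric correlation.
    rho, p = spearmanr(d["algorithm_score"], d["urine_conc"])

    # Categorical agreement: tertiles of each measure, then the percent of
    # farmers placed in the same category by both.
    score_cat = pd.qcut(d["algorithm_score"], 3, labels=False, duplicates="drop")
    urine_cat = pd.qcut(d["urine_conc"], 3, labels=False, duplicates="drop")
    pct_agree = (score_cat == urine_cat).mean() * 100

    print(f"Spearman rho = {rho:.2f} (p = {p:.3f}); categorical agreement = {pct_agree:.0f}%")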


Subject(s)
Agriculture , Environmental Monitoring , Occupational Exposure/classification , Humans , Minnesota , Protective Clothing , South Carolina
15.
Scand J Work Environ Health ; 31 Suppl 1: 98-104; discussion 63-5, 2005.
Article in English | MEDLINE | ID: mdl-16190155

ABSTRACT

OBJECTIVES: The Farm Family Exposure Study was conducted to evaluate real-world pesticide exposure for farmers, spouses, and children. METHODS: Eligible farm families from Minnesota and South Carolina were randomly selected from a roster of licensed private pesticide applicators. Eligibility required that the family include a farmer, spouse, and at least one child between the ages of 4 and 17 years, that the family live on the farm, and that the farmer planned to apply one of the target pesticides [glyphosate, chlorpyrifos, 2,4-dichlorophenoxyacetic acid (2,4-D)] to at least 10 acres (4.1 hectares) of land within 1 mile (1.6 kilometers) of the house. For each family member, geometric mean concentrations were calculated from 24-hour composite urine samples, with a 1 ppb (part per billion) limit of detection, collected the day before, the day of, and for 3 days after the application. RESULTS: For the farmers, the peak geometric mean concentrations were 3 ppb for glyphosate, 64 ppb for 2,4-D, and 19 ppb for the primary chlorpyrifos metabolite. For the spouses and children, the percentage with detectable values varied by chemical, although the average values for each chemical did not vary during the study period. The applicators had the highest urine pesticide concentrations, children had much lower values, and spouses had the lowest values. Exposure to family members was largely, though not exclusively, determined by the degree of direct contact with the application process. The exposure profile varied for the three chemicals for each family member. CONCLUSIONS: The data from this study indicate the importance of chemical-specific considerations when exposure assessments are planned in epidemiologic studies.
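
A minimal sketch of a geometric mean calculation for 24-hour composite samples when some values fall below the 1 ppb limit of detection; substituting LOD divided by the square root of 2 for non-detects is a common convention assumed here, not a method stated in the abstract.

    import numpy as np

    LOD = 1.0  # ppb

    # Hypothetical daily 24-hour composite concentrations for one family member;
    # None marks a non-detect (< 1 ppb).
    daily_ppb = [None, 3.0, 2.0, None, 1.5]

    # Substitute LOD / sqrt(2) for non-detects, then take the geometric mean.
    values = np.array([v if v is not None else LOD / np.sqrt(2) for v in daily_ppb])
    geometric_mean = np.exp(np.log(values).mean())

    detectable = sum(v is not None for v in daily_ppb)
    print(f"geometric mean = {geometric_mean:.2f} ppb ({detectable}/{len(daily_ppb)} days detectable)")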


Subject(s)
Environmental Exposure , Family , Pesticides/toxicity , Humans , Minnesota , South Carolina
16.
Scand J Work Environ Health ; 31 Suppl 1: 105-9; discussion 63-5, 2005.
Article in English | MEDLINE | ID: mdl-16190156

ABSTRACT

Epidemiologic studies rely on participants' recall to classify their exposure to specific pesticides. As exposure classification evolves, an important issue is whether the recall of pesticide application details can be used to derive measures of exposure intensity or cumulative exposure. A preliminary analysis of biomonitoring data for farmers before, during, and after a pesticide application suggests variation for different pesticides in the proportion with detectable urinary concentrations, urinary levels, and patterns of uptake and elimination. These findings, and the limited predictive modeling done to date, suggest that chemical-specific differences need to be considered in exposure classification schemes. An analysis of biomonitoring data for farm spouses and children found few with appreciable changes in the urinary concentration after a pesticide application. These findings point to the need to validate assumptions about exposures in studies of people who are not directly involved in pesticide application.


Subject(s)
Agricultural Workers' Diseases/chemically induced , Environmental Exposure , Epidemiologic Studies , Family , Occupational Exposure , Pesticides/toxicity , Humans
18.
J Expo Anal Environ Epidemiol ; 15(6): 491-9, 2005 Nov.
Article in English | MEDLINE | ID: mdl-15900312

ABSTRACT

PURPOSE: The Farm Family Exposure Study was initiated to characterize pesticide exposure to farm family members around the time of one pesticide application in a manner that will facilitate exposure assessment in epidemiologic studies of pesticides. METHODS: A sample of farm families with children was recruited by randomly selecting farmers from lists of licensed pesticide applicators in Minnesota and South Carolina. Eligible families were selected from among those who planned to apply one of three chemicals, glyphosate, 2,4-D, or chlorpyrifos, as part of their normal operations. The applicator, spouse, and all children in the family ages 4-17 years were included in the study. The applicator and spouse completed self-administered questionnaires addressing demographics, farming practices and potential exposures to them and their children. Field observers documented the application, recorded application practices, equipment, potential exposures, and the presence of children or spouses in the immediate vicinity of pesticide activities. All study participants were asked to collect each urine void for 5 days, 1 day before through 3 days after the application. Pesticides were measured in 24-h composite urine samples with a one part per billion limit of detection. RESULTS: Of 11,164 applicators screened, 994 families met the inclusion criteria. Of these, 95 families were enrolled. Enrollees were similar in most characteristics to their peers who were not participants in the study. In total, there were 106 applications, 10 of which involved more than one chemical. This resulted in urinary data for 48 farmers and spouses and their 79 children for glyphosate, 34 farmers and spouses and their 50 children for chlorpyrifos, and 34 farmers and spouses and their 53 children for 2,4-D. Compliance with the 24-h urine collection was particularly good for the adult participants. There were more missing samples for children than for adults, but overall compliance was high. CONCLUSION: The Farm Family Exposure Study should provide insights about pesticide exposure under real world conditions and thereby facilitate improved exposure assessment in epidemiologic studies of agricultural populations.


Subject(s)
Agriculture , Environmental Monitoring , Family , Pesticides/toxicity , Adolescent , Child , Child, Preschool , Humans , Minnesota , Pesticides/urine , Pilot Projects , Sensitivity and Specificity , South Carolina
19.
Environ Health Perspect ; 112(3): 321-6, 2004 Mar.
Article in English | MEDLINE | ID: mdl-14998747

ABSTRACT

Glyphosate is the active ingredient in Roundup agricultural herbicides and other herbicide formulations that are widely used for agricultural, forestry, and residential weed control. As part of the Farm Family Exposure Study, we evaluated urinary glyphosate concentrations for 48 farmers, their spouses, and their 79 children (4-18 years of age). We evaluated 24-hr composite urine samples for each family member the day before, the day of, and for 3 days after a glyphosate application. Sixty percent of farmers had detectable levels of glyphosate in their urine on the day of application. The geometric mean (GM) concentration was 3 ppb, the maximum value was 233 ppb, and the highest estimated systemic dose was 0.004 mg/kg. Farmers who did not use rubber gloves had higher GM urinary concentrations than did other farmers (10 ppb vs. 2.0 ppb). For spouses, 4% had detectable levels in their urine on the day of application. Their maximum value was 3 ppb. For children, 12% had detectable glyphosate in their urine on the day of application, with a maximum concentration of 29 ppb. All but one of the children with detectable concentrations had helped with the application or were present during herbicide mixing, loading, or application. None of the systemic doses estimated in this study approached the U.S. Environmental Protection Agency reference dose for glyphosate of 2 mg/kg/day. Nonetheless, it is advisable to minimize exposure to pesticides, and this study did identify specific practices that could be modified to reduce the potential for exposure.


Subject(s)
Agriculture , Environmental Exposure , Glycine/analogs & derivatives , Glycine/urine , Herbicides/urine , Occupational Exposure , Adolescent , Adult , Child , Child, Preschool , Cross-Sectional Studies , Family Relations , Female , Humans , Male , Middle Aged , Protective Clothing , Glyphosate
20.
J Toxicol Clin Toxicol ; 40(7): 885-92, 2002.
Article in English | MEDLINE | ID: mdl-12507058

ABSTRACT

Glyphosate is among the pesticides most frequently reported to the California EPA Pesticide Illness Surveillance Program. We analyzed glyphosate-related calls to the Pesticide Illness Surveillance Program in order to assess the number of reports involving systemic symptoms and to better understand the nature and severity of reported cases. Data on glyphosate and other pesticides are available for the years 1982-1997 including: type of exposure (agricultural/other); target organ(s) affected (skin/eye/respiratory/systemic); exposure(s); an assessment of causal relationship (possible, probable, or definite); and limited medical text. Of 815 total glyphosate calls, most involved topical irritation of the eye (n = 399), skin (n = 250), upper airway (n = 7), or combinations of these sites (n = 32) without systemic symptoms. Of the 187 systemic cases, only 22 had symptoms recorded as probably or definitely related to glyphosate exposure alone. The reported symptoms were not severe, expected to be limited in duration, and frequently inconsistent with the route of exposure and/or previous experience with glyphosate. We conclude that call volume is not a reliable indicator of the actual incidence or severity of glyphosate-related incidents in California.


Subject(s)
Glycine/analogs & derivatives , Glycine/poisoning , Herbicides/poisoning , Occupational Exposure/statistics & numerical data , Accidents, Occupational , Administration, Topical , Adult , Agricultural Workers' Diseases/chemically induced , Agricultural Workers' Diseases/epidemiology , California/epidemiology , Databases, Factual , Humans , Middle Aged , Occupational Exposure/adverse effects , Poison Control Centers , Glyphosate