Results 1 - 20 of 24
1.
Am Heart J ; 273: 72-82, 2024 07.
Article in English | MEDLINE | ID: mdl-38621575

ABSTRACT

BACKGROUND: The reduction in cardiovascular disease (CVD) events with edetate disodium (EDTA) in the Trial to Assess Chelation Therapy (TACT) suggested that chelation of toxic metals might provide novel opportunities to reduce CVD in patients with diabetes. Lead and cadmium are vasculotoxic metals chelated by EDTA. We present baseline characteristics for participants in TACT2, a randomized, double-masked, placebo-controlled trial designed as a replication of the TACT trial limited to patients with diabetes. METHODS: TACT2 enrolled 1,000 participants with diabetes and prior myocardial infarction, aged 50 years or older, between September 2016 and December 2020. Among 959 participants with at least one infusion, 933 had blood and/or urine metals measured at the Centers for Disease Control and Prevention using the same methodology as in the National Health and Nutrition Examination Survey (NHANES). We compared metal levels in TACT2 to a contemporaneous subset of NHANES participants with CVD, diabetes, and other inclusion criteria similar to TACT2's. RESULTS: At baseline, the median (interquartile range, IQR) age was 67 (60, 72) years, 27% were women, 78% reported white race, mean (SD) BMI was 32.7 (6.6) kg/m2, 4% reported type 1 diabetes, 46.8% were treated with insulin, 22.3% with GLP-1 receptor agonists or SGLT-2 inhibitors, 90.2% with aspirin, warfarin or P2Y12 inhibitors, and 86.5% with statins. Blood lead was detectable in all participants; the median (IQR) was 9.19 (6.30, 13.9) µg/L. Blood and urine cadmium were detectable in 97% of participants; median (IQR) levels were 0.28 (0.18, 0.43) µg/L and 0.30 (0.18, 0.51) µg/g creatinine, respectively. Metal levels were largely similar to those in the contemporaneous NHANES subset. CONCLUSIONS: TACT2 participants were characterized by high use of medication to treat CVD and diabetes and baseline metal levels similar to those in the general US population.
TACT2 will determine whether chelation therapy reduces the occurrence of subsequent CVD events in this high-risk population. CLINICAL TRIALS REGISTRATION: ClinicalTrials.gov. Identifier: NCT02733185. https://clinicaltrials.gov/study/NCT02733185.


Subject(s)
Chelation Therapy , Humans , Female , Male , Middle Aged , Aged , Chelation Therapy/methods , Double-Blind Method , Edetic Acid/therapeutic use , Lead/blood , Lead/urine , Cadmium/urine , Cadmium/blood , Chelating Agents/therapeutic use , Cardiovascular Diseases/prevention & control , Cardiovascular Diseases/blood
2.
J Am Heart Assoc ; 13(2): e031256, 2024 Jan 16.
Article in English | MEDLINE | ID: mdl-38205795

ABSTRACT

BACKGROUND: Chronic lead exposure is associated with both subclinical and clinical cardiovascular disease. We evaluated whether declines in blood lead were associated with changes in systolic and diastolic blood pressure in adult American Indian participants from the SHFS (Strong Heart Family Study). METHODS AND RESULTS: Lead in whole blood was measured in 285 SHFS participants in 1997 to 1999 and 2006 to 2009. Blood pressure and measures of cardiac geometry and function were obtained in 2001 to 2003 and 2006 to 2009. We used generalized estimating equations to evaluate the association of declines in blood lead with changes in blood pressure; cardiac function and geometry measures were considered secondary. Mean blood lead was 2.04 µg/dL at baseline. After ≈10 years, mean decline in blood lead was 0.67 µg/dL. In fully adjusted models, the mean difference in systolic blood pressure comparing the highest to lowest tertile of decline (>0.91 versus <0.27 µg/dL) in blood lead was -7.08 mm Hg (95% CI, -13.16 to -1.00). A significant nonlinear association between declines in blood lead and declines in systolic blood pressure was detected, with significant linear associations where blood lead decline was 0.1 µg/dL or higher. Declines in blood lead were nonsignificantly associated with declines in diastolic blood pressure and significantly associated with declines in interventricular septum thickness. CONCLUSIONS: Declines in blood lead levels in American Indian adults, even when small (0.1-1.0 µg/dL), were associated with reductions in systolic blood pressure. These findings suggest the need to further study the cardiovascular impacts of reducing lead exposures and the importance of lead exposure prevention.


Subject(s)
Cardiovascular Diseases , Hypertension , Lead , Adult , Humans , American Indian or Alaska Native , Blood Pressure , Cardiovascular Diseases/complications , Hypertension/diagnosis , Hypertension/epidemiology , Lead/blood
3.
Cancer Epidemiol Biomarkers Prev ; 33(1): 80-87, 2024 01 09.
Article in English | MEDLINE | ID: mdl-37823832

ABSTRACT

BACKGROUND: Biomarkers of exposure are tools for understanding the impact of tobacco use on health outcomes, provided that confounders such as demographics, use behavior, biological half-life, and other sources of exposure are accounted for in the analysis. METHODS: We performed multiple regression analysis of longitudinal measures of urinary biomarkers of alkaloids, tobacco-specific nitrosamines, polycyclic aromatic hydrocarbons, volatile organic compounds (VOC), and metals to examine sample-to-sample consistency in Waves 1 and 2 of the Population Assessment of Tobacco and Health (PATH) Study, including demographic characteristics and use behavior variables of persons who smoked exclusively. Regression coefficients, within- and between-person variance, and intra-class correlation coefficients (ICC) were compared with biomarker smoking/nonsmoking population mean ratios and biological half-lives. RESULTS: Most biomarkers were similarly associated with sex, age, race/ethnicity, and product use behavior. Biomarkers with larger smoking/nonsmoking population mean ratios had greater regression coefficients related to recency of exposure. For VOC and alkaloid metabolites, longer biological half-life was associated with lower within-person variance. For each chemical class studied, there were biomarkers that demonstrated good ICCs. CONCLUSIONS: For most of the biomarkers of exposure reported in the PATH Study, among people who smoke cigarettes exclusively, associations between urinary biomarkers of exposure and demographic and use behavior covariates are similar. The within-person consistency of biomarkers of exposure is likely associated with nontobacco sources of exposure and biological half-life. IMPACT: Biomarkers measured in the PATH Study provide consistent sample-to-sample measures from which to investigate the association of adverse health outcomes with the characteristics of cigarettes and their use.
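An intra-class correlation coefficient of the kind compared in this abstract can be illustrated with a one-way random-effects decomposition: the between-person variance divided by the total (between- plus within-person) variance. A minimal sketch with hypothetical repeated measurements; the PATH analysis itself used multiple regression on longitudinal data, not this simplified estimator:

```python
# Illustrative one-way random-effects ICC: between-person variance over
# total variance. Hypothetical data, not PATH Study values.
def icc_oneway(groups):
    # groups: list of per-person measurement lists, each of equal size k
    k = len(groups[0])
    n = len(groups)
    grand = sum(sum(g) for g in groups) / (n * k)
    means = [sum(g) / k for g in groups]
    ms_between = k * sum((m - grand) ** 2 for m in means) / (n - 1)
    ms_within = sum((x - m) ** 2
                    for g, m in zip(groups, means) for x in g) / (n * (k - 1))
    var_between = (ms_between - ms_within) / k
    return var_between / (var_between + ms_within)

# Two repeated measures per person (e.g., Wave 1 and Wave 2 log levels)
data = [[1.0, 1.2], [2.0, 2.1], [3.1, 2.9], [0.5, 0.7]]
print(round(icc_oneway(data), 2))
```

An ICC near 1 indicates that most variability lies between people rather than within repeated samples from the same person, i.e., good sample-to-sample consistency.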


Subject(s)
Alkaloids , Tobacco Products , Volatile Organic Compounds , Humans , Half-Life , Biomarkers , Smoking/epidemiology
4.
Environ Res ; 215(Pt 3): 114101, 2022 12.
Article in English | MEDLINE | ID: mdl-35977585

ABSTRACT

BACKGROUND: Many American Indian (AI) communities are in areas affected by environmental contamination, such as toxic metals. However, studies assessing exposures in AI communities are limited. We measured blood metals in AI communities to assess historical exposure and identify participant characteristics associated with these levels in the Strong Heart Study (SHS) cohort. METHODS: Archived blood specimens collected from participants (n = 2014; all participants were 50 years of age or older) in Arizona, Oklahoma, and North and South Dakota during SHS Phase-III (1998-1999) were analyzed for cadmium, lead, manganese, mercury, and selenium using inductively coupled plasma triple quadrupole mass spectrometry. We conducted descriptive analyses for the entire cohort and stratified by selected subgroups, including demographics, health behaviors, income, waist circumference, and body mass index. Bivariate analyses examined associations between blood metal levels and selected socio-demographic and behavioral covariates. Finally, multivariate regression models were used to identify the best-fitting models predicting blood metal levels. FINDINGS: All elements were detected in 100% of study participants, with the exception of mercury (detected in 73% of participants). The SHS population had higher levels of blood cadmium and manganese than the general U.S. population aged 50 years and older. The median blood mercury in the SHS cohort was about 30% of that in the U.S. reference population, potentially due to low fish consumption. Participants in North Dakota and South Dakota had the highest blood cadmium, lead, manganese, and selenium, and the lowest total mercury levels, even after adjusting for covariates. In addition, each of the blood metals was associated with selected demographic, behavioral, income, and/or weight-related factors in multivariate models.
These findings will help guide the tribes to develop education, outreach, and strategies to reduce harmful exposures and increase beneficial nutrient intake in these AI communities.


Subject(s)
American Indian or Alaska Native , Cadmium , Lead , Manganese , Mercury , Selenium , Cadmium/blood , Humans , Lead/blood , Manganese/blood , Mercury/blood , Middle Aged , Selenium/blood , American Indian or Alaska Native/statistics & numerical data
5.
Int J Hyg Environ Health ; 234: 113713, 2021 05.
Article in English | MEDLINE | ID: mdl-33621861

ABSTRACT

We developed an inductively coupled plasma mass spectrometry (ICP-MS) method using Universal Cell Technology (UCT) on a PerkinElmer NexION ICP-MS to measure arsenic (As), chromium (Cr), and nickel (Ni) in human urine samples. The advancements of the UCT allowed us to expand the calibration range, making the method applicable both to the low concentrations of biomonitoring applications and to the high concentrations that may be observed in acute exposures and emergency response. Our method analyzes As and Ni in kinetic energy discrimination (KED) mode with helium (He) gas, and Cr in dynamic reaction cell (DRC) mode with ammonia (NH3) gas. The combination of these elements is challenging because a carbon source, ethanol (EtOH), is required for normalization of As ionization in urine samples, which creates a spectral overlap (40Ar12C+) on 52Cr. This method additionally improved lab efficiency by combining elements from two of our previously published methods (Jarrett et al., 2007; Quarles et al., 2014), allowing us to measure Cr and Ni concentrations in urine samples collected as part of the National Health and Nutrition Examination Survey (NHANES) beginning with the 2017-2018 survey cycle. We present our rigorous validation of the method selectivity and accuracy using National Institute of Standards and Technology (NIST) Standard Reference Materials (SRM), precision using in-house prepared quality control materials, and a discussion of the use of a modified UCT, a BioUCell, to address an ion transmission phenomenon we observed on the NexION 300 platform when using higher elemental concentrations and high cell gas pressures. The rugged method detection limits, calculated from measurements in more than 60 runs, for As, Cr, and Ni are 0.23 µg/L, 0.19 µg/L, and 0.31 µg/L, respectively.


Subject(s)
Arsenic , Biological Monitoring , Chromium , Humans , Nickel , Nutrition Surveys , Technology
6.
Nicotine Tob Res ; 23(5): 790-797, 2021 05 04.
Article in English | MEDLINE | ID: mdl-33590857

ABSTRACT

INTRODUCTION: Concurrent use of tobacco cigarettes and e-cigarettes ("dual use") is common among tobacco users. Little is known about differences in demographics and toxicant exposure among subsets of dual users. AIMS AND METHODS: We analyzed data from adult dual users (current every/some day users of tobacco cigarettes and e-cigarettes, n = 792) who were included in Wave 1 (2013-2014) of the PATH Study and provided urine samples. Samples were analyzed for biomarkers of exposure to nicotine and selected toxicants (tobacco-specific nitrosamine NNK [NNAL], lead, cadmium, naphthalene [2-naphthol], pyrene [1-hydroxypyrene], acrylonitrile [CYMA], acrolein [CEMA], and acrylamide [AAMA]). Subsets of dual users were compared on demographic, behavioral, and biomarker measures with exclusive cigarette smokers (n = 2411) and exclusive e-cigarette users (n = 247). RESULTS: Most dual users were predominant cigarette smokers (70%), followed by daily dual users (13%), non-daily concurrent dual users (10%), and predominant vapers (7%). Dual users who smoked daily showed significantly higher biomarker concentrations than those who did not smoke daily. Patterns of e-cigarette use had little effect on toxicant exposure. Dual users with high toxicant exposure were generally older, female, and smoked more cigarettes per day. Dual users with low levels of biomarkers of exposure were generally younger, male, and smoked non-daily. CONCLUSIONS: In 2013-2014, most dual users smoked cigarettes daily and used e-cigarettes occasionally. Cigarette smoking appears to be the primary driver of toxicant exposure among dual users, with little-to-no effect of e-cigarette use on biomarker levels. Results reinforce the need for dual users to stop smoking tobacco cigarettes to reduce toxicant exposure.
IMPLICATIONS: With considerable dual use of tobacco cigarettes and e-cigarettes in the United States, it is important to understand differences in toxicant exposure among subsets of dual users and how these differences align with user demographics. Findings suggest most dual users smoke daily and use e-cigarettes intermittently. Low exposure to toxicants was most common among younger users, males, and intermittent smokers; high exposure was most common among older users, females, and heavier cigarette smokers. Results underscore the heterogeneity among dual users and the need to quit smoking cigarettes completely in order to reduce toxicant exposure.


Subject(s)
Cigarette Smoking/urine , Electronic Nicotine Delivery Systems , Health Behavior , Nicotine/urine , Tobacco Products/adverse effects , Vaping/urine , Adult , Biomarkers/urine , Cigarette Smoking/adverse effects , Cigarette Smoking/epidemiology , Female , Humans , Male , Metals, Heavy/urine , Middle Aged , Nitrosamines/urine , Polycyclic Aromatic Hydrocarbons/urine , Pyrenes/urine , Smokers , Nicotiana , United States , Vaping/epidemiology
7.
Clin Chem Lab Med ; 59(4): 671-679, 2021 03 26.
Article in English | MEDLINE | ID: mdl-33098630

ABSTRACT

OBJECTIVES: Matrix differences among serum samples from non-pregnant and pregnant patients could bias measurements. Standard Reference Material 1949, Frozen Human Prenatal Serum, was developed to provide a quality assurance material for the measurement of hormones and nutritional elements throughout pregnancy. METHODS: Serum from non-pregnant women and from women in each trimester was bottled into four levels based on pregnancy status and trimester. Liquid chromatography tandem mass spectrometry (LC-MS/MS) methods were developed and applied to the measurement of thyroid hormones, vitamin D metabolites, and vitamin D-binding protein (VDBP). Copper, selenium, and zinc measurements were conducted by inductively coupled plasma dynamic reaction cell MS. Thyroid stimulating hormone (TSH), thyroglobulin (Tg), and thyroglobulin antibody concentrations were analyzed using immunoassays and LC-MS/MS (Tg only). RESULTS: Certified values for thyroxine and triiodothyronine, reference values for vitamin D metabolites, VDBP, selenium, copper, and zinc, and information values for reverse triiodothyronine, TSH, Tg, and Tg antibodies were assigned. Significant differences in serum concentrations were evident for all analytes across the four levels (p≤0.003). TSH measurements were significantly different (p<0.0001) among research-only immunoassays. Tg concentrations were elevated in research-only immunoassays vs. the Food and Drug Administration-approved automated immunoassay and LC-MS/MS. The presence of Tg antibodies increased differences between the automated immunoassay and LC-MS/MS. CONCLUSIONS: Changes in analyte concentrations consistent with the literature, together with the demonstration of matrix interferences in immunoassay Tg measurements, indicate that this material functions as a relevant matrix-matched reference material for the different stages of pregnancy.


Subject(s)
Selenium , Trace Elements , Biomarkers/blood , Chromatography, Liquid , Copper , Female , Humans , Pregnancy , Tandem Mass Spectrometry , Thyroglobulin/blood , Thyroid Gland , Thyrotropin , Trace Elements/blood , Vitamin D/blood , Vitamins , Zinc
8.
Environ Res ; 190: 109943, 2020 11.
Article in English | MEDLINE | ID: mdl-32750552

ABSTRACT

Navajo Nation residents are at risk for exposure to uranium and other co-occurring metals found in abandoned mine waste. The Navajo Birth Cohort Study (NBCS) was initiated in 2010 to address community concerns regarding the impact of chronic environmental exposure to metals on pregnancy and birth outcomes. The objectives of this paper were to 1) evaluate maternal urine concentrations of key metals at enrollment and delivery from a pregnancy cohort; and 2) compare the NBCS to the US general population by comparing representative summary statistical values. Pregnant Navajo women (N = 783, age range 14-45 years) were recruited from hospital facilities on the Navajo Nation during prenatal visits, and urine samples were collected by trained staff in pre-screened containers. The U.S. Centers for Disease Control and Prevention (CDC), National Center for Environmental Health's (NCEH) Division of Laboratory Sciences (DLS) analyzed urine samples for metals. Creatinine-corrected urine concentrations of cadmium decreased between enrollment (1st or 2nd trimester) and delivery (3rd trimester), while urine uranium concentrations were not observed to change. Median and 95th percentile values of maternal NBCS urine concentrations of uranium, manganese, cadmium, and lead exceeded the respective National Health and Nutrition Examination Survey (NHANES) percentiles for women (ages 14-45, pregnant or not pregnant). Median NBCS maternal urine uranium concentrations were 2.67 (enrollment) and 2.8 (delivery) times the NHANES median concentration, indicating that pregnant Navajo women are exposed to metal mixtures and have higher uranium exposure than women in NHANES. These findings support community concerns about uranium exposure and suggest a need for additional analyses to evaluate the impact of maternal exposure to metal mixtures on birth outcomes.
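Creatinine correction of the kind reported in this abstract expresses a urinary analyte per gram of creatinine by dividing the analyte concentration by the urinary creatinine concentration, which dampens the effect of urine dilution. A minimal sketch with hypothetical values; the units and inputs are assumptions for illustration only:

```python
def creatinine_corrected(analyte_ug_per_l, creatinine_mg_per_dl):
    """Return an analyte concentration in µg/g creatinine.

    Hypothetical example values; converts creatinine from mg/dL to g/L
    (1 mg/dL = 0.01 g/L), then divides: (µg/L) / (g/L) = µg/g creatinine.
    """
    creatinine_g_per_l = creatinine_mg_per_dl * 0.01
    return analyte_ug_per_l / creatinine_g_per_l

# e.g., a cadmium level of 0.30 µg/L with creatinine at 100 mg/dL
print(creatinine_corrected(0.30, 100.0))
```

In dilute urine (low creatinine) the corrected value rises relative to the raw concentration, and vice versa, which is why corrected and uncorrected medians can differ.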


Subject(s)
Environmental Exposure , Uranium , Adolescent , Adult , Cohort Studies , Environmental Exposure/analysis , Female , Humans , Middle Aged , Nutrition Surveys , Pregnancy , Young Adult
9.
Neurology ; 94(14): e1495-e1501, 2020 04 07.
Article in English | MEDLINE | ID: mdl-32127386

ABSTRACT

OBJECTIVE: To identify the etiology of an outbreak of spastic paraparesis among women and children in the Western Province of Zambia suspected to be konzo. METHODS: We conducted an outbreak investigation of individuals from Mongu District, Western Province, Zambia, who previously developed lower extremity weakness. Cases were classified using the World Health Organization definition of konzo. Active case finding was conducted through door-to-door evaluation in affected villages and sensitization at local health clinics. Demographic, medical, and dietary history was used to identify common exposures in all cases. Urine and blood specimens were collected to evaluate for konzo and alternative etiologies. RESULTS: We identified 32 cases of konzo, affecting children 6 to 14 years of age and, among persons >14 years of age, predominantly females. Fourteen of 15 (93%) cases ≥15 years of age were female, 11 (73%) of whom were breastfeeding at the time of symptom onset. Cassava was the most commonly consumed food (median [range] 14 [4-21] times per week), while protein-rich foods were consumed <1 time per week by all cases. Of the 30 patients providing urine specimens, the median thiocyanate level was 281 (interquartile range 149-522) µmol/L, and 73% of urine samples had thiocyanate levels >136 µmol/L, the 95th percentile of the US population in 2013 to 2014. CONCLUSION: This investigation revealed the first documented cases of konzo in Zambia, occurring in poor communities with diets high in cassava and low in protein, consistent with previous descriptions from neighboring countries.


Subject(s)
Paraparesis, Spastic/epidemiology , Adolescent , Age Factors , Breast Feeding , Child , Cyanides/analysis , Diet , Disease Outbreaks , Female , Humans , Male , Manihot/chemistry , Muscle Weakness/epidemiology , Muscle Weakness/etiology , Protein Deficiency/epidemiology , Rain , Seasons , Thiocyanates/urine , Young Adult , Zambia/epidemiology
10.
Environ Res ; 183: 109208, 2020 04.
Article in English | MEDLINE | ID: mdl-32058143

ABSTRACT

OBJECTIVE: The objective of the current study is to report urine, blood, and serum metal concentrations to characterize exposures to trace elements and micronutrient levels in both pregnant women and women of childbearing age in the U.S. National Health and Nutrition Examination Survey (NHANES) years 1999-2016. METHODS: Urine and blood samples taken from NHANES participants were analyzed for thirteen urine metals, three blood metals, three serum metals, speciated mercury in blood, and speciated arsenic in urine. Adjusted and unadjusted least squares geometric means and 95% confidence intervals were calculated for all participants among women aged 15-44 years. Changes in exposure levels over time were also examined. Serum cotinine levels were used to adjust for smoke exposure, as smoking is a source of metal exposure. RESULTS: Detection rates for four urine metals from the ATSDR Substance Priority List (arsenic, lead, mercury, and cadmium) were ~83-99% for both pregnant and non-pregnant women of childbearing age. Most metal concentrations were higher in pregnant women than in non-pregnant women. Pregnant women had higher mean urine total arsenic, urine mercury, and urine lead; however, blood lead and mercury were higher in non-pregnant women. Blood lead, cadmium, and mercury, as well as urine antimony, cadmium, and lead, decreased over time in women of childbearing age, while urine cobalt increased over time. CONCLUSIONS: Pregnant women in the US have been exposed to several trace metals, with observed concentrations for some trace elements decreasing since 1999.
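Geometric means of the kind reported in this abstract are conventionally computed on the log scale and back-transformed. A simplified, unweighted sketch with hypothetical concentrations; actual NHANES estimates are survey-weighted least squares means with covariate adjustment, which this does not reproduce:

```python
import math

def geometric_mean_ci(values, z=1.96):
    """Unweighted geometric mean with a normal-approximation 95% CI.

    Computed on the log scale and exponentiated; hypothetical data,
    unlike the weighted, adjusted NHANES estimates.
    """
    logs = [math.log(v) for v in values]
    n = len(logs)
    mean = sum(logs) / n
    sd = math.sqrt(sum((x - mean) ** 2 for x in logs) / (n - 1))
    se = sd / math.sqrt(n)
    return (math.exp(mean), math.exp(mean - z * se), math.exp(mean + z * se))

# Hypothetical urine metal concentrations (µg/L)
gm, lo, hi = geometric_mean_ci([0.8, 1.1, 0.9, 1.3, 1.0])
```

Because the mean is taken on the log scale, a single high concentration pulls the geometric mean up far less than it would an arithmetic mean, which suits right-skewed biomonitoring data.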


Subject(s)
Arsenic , Maternal Exposure , Mercury , Trace Elements , Adolescent , Adult , Cadmium , Child , Female , Humans , Maternal Exposure/statistics & numerical data , Nutrition Surveys , Pregnancy , United States , Young Adult
11.
Cancer Epidemiol Biomarkers Prev ; 29(3): 659-667, 2020 03.
Article in English | MEDLINE | ID: mdl-31988072

ABSTRACT

BACKGROUND: Monitoring population-level toxicant exposures from smokeless tobacco (SLT) use is important for assessing population health risks due to product use. In this study, we assessed tobacco biomarkers of exposure (BOE) among SLT users from Wave 1 (2013-2014) of the Population Assessment of Tobacco and Health (PATH) Study. METHODS: Urinary biospecimens were collected from adults ages 18 and older. Biomarkers of nicotine, tobacco-specific nitrosamines (TSNA), polycyclic aromatic hydrocarbons (PAH), volatile organic compounds (VOC), metals, and inorganic arsenic were analyzed and reported among exclusive current established SLT users in comparison with exclusive current established cigarette smokers, dual SLT and cigarette users, and never tobacco users. RESULTS: In general, SLT users (n = 448) had significantly higher concentrations of BOE to nicotine, TSNAs, and PAHs compared with never tobacco users; significant dose-response relationships between frequency of SLT use and biomarker concentrations were also observed among exclusive daily SLT users. Exclusive daily SLT users had higher geometric mean concentrations of total nicotine equivalent-2 (TNE2) and TSNAs than exclusive daily cigarette smokers. In contrast, geometric mean concentrations of PAHs and VOCs were substantially lower among exclusive daily SLT users than exclusive daily cigarette smokers. CONCLUSIONS: Our study produced a comprehensive assessment of SLT product use and 52 biomarkers of tobacco exposure. Compared with cigarette smokers, SLT users experience greater concentrations of some tobacco toxicants, including nicotine and TSNAs. IMPACT: Our data add information on the risk assessment of exposure to SLT-related toxicants. High levels of harmful constituents in SLT remain a health concern.


Subject(s)
Tobacco Use/adverse effects , Tobacco, Smokeless/toxicity , Adolescent , Adult , Biomarkers/urine , Carcinogens/analysis , Carcinogens/toxicity , Female , Humans , Longitudinal Studies , Male , Middle Aged , Nicotine/toxicity , Nicotine/urine , Nitrosamines , Polycyclic Aromatic Hydrocarbons/toxicity , Polycyclic Aromatic Hydrocarbons/urine , Prevalence , Smokers/statistics & numerical data , Tobacco Use/epidemiology , Tobacco Use/urine , United States/epidemiology , Volatile Organic Compounds/toxicity , Volatile Organic Compounds/urine , Young Adult
13.
J Public Health Manag Pract ; 25 Suppl 1, Lead Poisoning Prevention: S23-S30, 2019.
Article in English | MEDLINE | ID: mdl-30507766

ABSTRACT

CONTEXT: The Lead and Multielement Proficiency (LAMP) program is an external quality assurance program promoting high-quality blood-lead measurements. OBJECTIVES: To investigate the ability of US laboratories participating in the Centers for Disease Control and Prevention (CDC) LAMP program to accurately measure blood-lead levels (BLL) of 0.70 to 47.5 µg/dL, using evaluation criteria of ±2 µg/dL or 10%, whichever is greater. METHODS: The CDC distributes bovine blood specimens to participating laboratories 4 times per year. We evaluated participant performance over 5 challenges on samples with BLL between 0.70 and 47.5 µg/dL. The CDC sent 15 pooled samples (3 samples shipped in 5 rounds) to US laboratories. The LAMP laboratories used 3 primary technologies to analyze lead in blood: inductively coupled plasma mass spectrometry, graphite furnace atomic absorption spectroscopy, and LeadCare technologies based on anodic stripping voltammetry. Laboratories reported their BLL analytical results to the CDC. The LAMP program uses these results to provide performance feedback to the laboratories. SETTING: The CDC sent blood samples to approximately 50 US laboratories for lead analysis. PARTICIPANTS: Of the approximately 200 laboratories enrolled in LAMP, 38 to 46 US laboratories provided data used in this report (January 2017 to March 2018). RESULTS: Laboratory precision ranged from 0.26 µg/dL for inductively coupled plasma mass spectrometry to 1.50 µg/dL for LeadCare instruments. Participating US LAMP laboratories reported accurate BLLs for 89% of challenge samples using the ±2 µg/dL or 10% evaluation criteria. CONCLUSIONS: Laboratories participating in the CDC's LAMP program can accurately measure blood lead using the current Clinical Laboratory Improvement Amendments of 1988 guidance of ±4 µg/dL or ±10%, with a success rate of 96%. However, when we apply limits of ±2 µg/dL or ±10%, the success rate drops to 89%.
When challenged with samples that have target values between 3 and 5 µg/dL, nearly 100% of reported results fall within ±4 µg/dL, while 5% fall outside the acceptability criteria used by the CDC's LAMP program. As public health focuses on lower blood lead levels, laboratories must evaluate their ability to meet the analytical challenges of measuring blood lead at these levels. In addition, the proposed CLIA guidelines (±2 µg/dL or 10%) would be achievable for a majority of US laboratories participating in the LAMP program.
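The evaluation criterion described in this entry (±2 µg/dL or ±10% of the target, whichever is greater) reduces to a single comparison against the larger of the two tolerances. A minimal sketch of that acceptance check, with hypothetical reported/target pairs:

```python
def within_criteria(reported, target, abs_limit=2.0, rel_limit=0.10):
    """Return True if a reported BLL (µg/dL) meets the LAMP-style
    criterion: within ±2 µg/dL or ±10% of target, whichever is greater.

    Example values are hypothetical, not actual LAMP challenge data.
    """
    allowed = max(abs_limit, rel_limit * target)
    return abs(reported - target) <= allowed

print(within_criteria(9.0, 10.0))   # off by 1.0, allowed ±2.0
print(within_criteria(42.0, 47.5))  # off by 5.5, allowed ±4.75
```

At low targets the absolute ±2 µg/dL term dominates (10% of a 1.48 µg/dL target would be only 0.148 µg/dL), which is why tightening the absolute limit is what makes low-level challenges harder to pass.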


Subject(s)
Clinical Laboratory Techniques/standards , Lead/analysis , Quality Assurance, Health Care/methods , Centers for Disease Control and Prevention, U.S./organization & administration , Centers for Disease Control and Prevention, U.S./statistics & numerical data , Clinical Laboratory Techniques/methods , Clinical Laboratory Techniques/statistics & numerical data , Humans , Lead/blood , Program Development/methods , Quality Assurance, Health Care/statistics & numerical data , United States
14.
Environ Int ; 122: 310-315, 2019 01.
Article in English | MEDLINE | ID: mdl-30503317

ABSTRACT

INTRODUCTION: Cross-sectional studies suggest that postnatal blood lead (PbB) concentrations are negatively associated with child growth. Few studies have prospectively examined this association in populations with lower PbB concentrations. We investigated longitudinal associations of childhood PbB concentrations and subsequent anthropometric measurements in a multi-ethnic cohort of girls. METHODS: Data were from The Breast Cancer and the Environment Research Program at three sites in the United States (U.S.): New York City, Cincinnati, and San Francisco Bay Area. Girls were enrolled at ages 6-8 years in 2004-2007. Girls with PbB concentrations collected at ≤10 years old (mean 7.8 years, standard deviation (SD) 0.82) and anthropometry collected at ≥3 follow-up visits were included (n = 683). The median PbB concentration was 0.99 µg/dL (10th percentile = 0.59 µg/dL and 90th percentile = 2.00 µg/dL) and the geometric mean was 1.03 µg/dL (95% Confidence Interval (CI): 0.99, 1.06). For analyses, PbB concentrations were dichotomized as <1 µg/dL (n = 342) and ≥1 µg/dL (n = 341). Anthropometric measurements of height, body mass index (BMI), waist circumference (WC), and percent body fat (%BF) were collected at enrollment and follow-up visits through 2015. Linear mixed effects regression estimated how PbB concentrations related to changes in girls' measurements from ages 7-14 years. RESULTS: At 7 years, the mean difference in height was -2.0 cm (95% CI: -3.0, -1.0) for girls with ≥1 µg/dL versus <1 µg/dL PbB concentrations; differences persisted, but were attenuated, with age to -1.5 cm (95% CI: -2.5, -0.4) at 14 years. Mean differences for BMI, WC, and BF% at 7 years between girls with ≥1 µg/dL versus <1 µg/dL PbB concentrations were -0.7 kg/m2 (95% CI: -1.2, -0.2), -2.2 cm (95% CI: -3.8, -0.6), and -1.8% (95% CI: -3.2, -0.4), respectively.
Overall, these differences generally persisted with advancing age and at 14 years, differences were -0.8 kg/m2 (95% CI: -1.5, -0.02), -2.9 cm (95% CI: -4.8, -0.9), and -1.7% (95% CI: -3.1, -0.4) for BMI, WC, and BF%, respectively. CONCLUSIONS: These findings suggest that higher concentrations of PbB during childhood, even though relatively low by screening standards, may be inversely associated with anthropometric measurements in girls.


Subject(s)
Body Mass Index , Environmental Exposure , Lead/blood , Waist Circumference , Adolescent , Child , Cross-Sectional Studies , Environmental Exposure/analysis , Environmental Exposure/statistics & numerical data , Female , Humans , New York City/epidemiology
15.
Clin Chim Acta ; 485: 1-6, 2018 Oct.
Article in English | MEDLINE | ID: mdl-29894782

ABSTRACT

BACKGROUND: Comprehensive information on the effect of storage time and temperature on the measurement of elements in human whole blood (WB) by inductively coupled plasma-dynamic reaction cell-mass spectrometry (ICP-DRC-MS) is lacking, particularly for Mn and Se. METHODS: Human WB was spiked at 3 concentration levels, dispensed, and then stored at 5 different temperatures: -70 °C, -20 °C, 4 °C, 23 °C, and 37 °C. At 3 and 5 weeks, and at 2, 4, 6, 8, 10, 12, and 36 months, samples were analyzed for Pb, Cd, Mn, Se, and total Hg using ICP-DRC-MS. We used a multiple linear regression model, including time and temperature as covariates, to fit the data with the measurement value as the outcome. We used a ratio-based equivalence test to determine whether results from the test storage conditions (warmer temperatures and longer times) were comparable to the reference storage condition of 3 weeks at -70 °C. RESULTS: Model estimates for all elements in human WB samples stored in polypropylene cryovials at -70 °C were equivalent to estimates from samples stored at 37 °C for up to 2 months, at 23 °C for up to 10 months, and at -20 °C and 4 °C for up to 36 months. Model estimates for samples stored for 3 weeks at -70 °C were equivalent to estimates from samples stored for 2 months at -20 °C, 4 °C, 23 °C, and 37 °C; 10 months at -20 °C, 4 °C, and 23 °C; and 36 months at -20 °C and 4 °C. This equivalence held for all elements and pools except the low-concentration blood pool for Cd. CONCLUSIONS: Storage temperatures of -20 °C and 4 °C are equivalent to -70 °C for stability of Cd, Mn, Pb, Se, and Hg in human whole blood for at least 36 months when blood is stored in sealed polypropylene vials. Increasing the sample storage temperature from -70 °C to -20 °C or 4 °C can lead to large energy savings. The best analytical results are obtained when storage time at higher temperature conditions (e.g., 23 °C and 37 °C) is minimized because recovery of Se and Hg is reduced.
Blood samples stored in polypropylene cryovials also lose volume over time and develop clots at higher temperature conditions (e.g., 23 °C and 37 °C), making them unacceptable for elemental testing after 10 months and 2 months, respectively.
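The ratio-based equivalence test described above can be sketched as a two one-sided test (TOST) on the log-scale ratio of test-condition to reference-condition means. This is an illustrative reconstruction under a normal approximation, not the authors' exact analysis (which used a regression model with time and temperature covariates); all function names and measurement values below are invented.

```python
import math
from statistics import mean, stdev, NormalDist

def equivalence_ratio_test(test_vals, ref_vals, margin=0.10, alpha=0.05):
    """Two one-sided test (TOST) on the log ratio of means.

    Declares the test condition equivalent to the reference if the
    (1 - 2*alpha) confidence interval for the geometric mean ratio
    lies entirely within [1 - margin, 1 + margin].  Uses a normal
    approximation for simplicity; a real analysis would use t
    quantiles and the regression model described in the abstract.
    """
    log_t = [math.log(v) for v in test_vals]
    log_r = [math.log(v) for v in ref_vals]
    diff = mean(log_t) - mean(log_r)                # log ratio of geometric means
    se = math.sqrt(stdev(log_t) ** 2 / len(log_t) +
                   stdev(log_r) ** 2 / len(log_r))  # standard error of the difference
    z = NormalDist().inv_cdf(1 - alpha)             # one-sided critical value
    lo, hi = diff - z * se, diff + z * se
    equivalent = math.log(1 - margin) < lo and hi < math.log(1 + margin)
    return math.exp(diff), equivalent

# Invented Pb results (ug/L): storage at 4 degC vs the -70 degC reference
ref = [9.1, 9.3, 8.9, 9.2, 9.0, 9.4, 9.1, 9.2]
test = [9.0, 9.2, 9.1, 9.3, 8.9, 9.1, 9.2, 9.0]
ratio, ok = equivalence_ratio_test(test, ref)
```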


Subject(s)
Cadmium/blood , Lead/blood , Manganese/blood , Mercury/blood , Selenium/blood , Temperature , Humans , Mass Spectrometry , Time Factors
16.
Pediatrics ; 140(2)2017 Aug.
Article in English | MEDLINE | ID: mdl-28771411

ABSTRACT

In 2012, the Centers for Disease Control and Prevention (CDC) adopted its Advisory Committee on Childhood Lead Poisoning Prevention's recommendation to use a population-based reference value to identify children and environments associated with lead hazards. The current reference value of 5 µg/dL is calculated as the 97.5th percentile of the distribution of blood lead levels (BLLs) in children 1 to 5 years old from 2007 to 2010 NHANES data. We calculated and updated selected percentiles, including the 97.5th percentile, using NHANES 2011 to 2014 blood lead data, and examined the demographic characteristics of children whose blood lead was at or above the 90th percentile value. The updated 97.5th percentile BLL of 3.48 µg/dL highlighted analytical laboratory and clinical interpretation challenges for blood lead measurements ≤5 µg/dL. A review of 5 years of results for target blood lead values <11 µg/dL among US clinical laboratories participating in the CDC's voluntary Lead and Multi-Element Proficiency quality assurance program showed that 40% of laboratories were unable to quantify lead and reported a nondetectable result at a target blood lead value of 1.48 µg/dL, compared with 5.5% at a target BLL of 4.60 µg/dL. We describe actions taken at the CDC's Environmental Health Laboratory in the National Center for Environmental Health, which measures blood lead for NHANES, to improve analytical accuracy and precision and to reduce external lead contamination during blood collection and analysis.
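The population-based reference value is defined as an upper percentile of the BLL distribution. A minimal sketch of the percentile calculation on hypothetical, unweighted data follows; actual NHANES estimates must incorporate the complex survey sampling weights, which this sketch omits.

```python
def percentile(values, p):
    """Return the p-th percentile by linear interpolation between
    order statistics.  Unweighted: NHANES analyses must instead
    apply the survey design weights."""
    xs = sorted(values)
    k = (len(xs) - 1) * p / 100.0   # fractional rank
    lo = int(k)
    hi = min(lo + 1, len(xs) - 1)
    return xs[lo] + (xs[hi] - xs[lo]) * (k - lo)

# Hypothetical blood lead levels (ug/dL) for children aged 1-5
blls = [0.5, 0.7, 0.8, 0.9, 1.0, 1.1, 1.2, 1.4, 1.6, 1.9,
        2.2, 2.6, 3.0, 3.5, 4.1, 4.8, 5.6, 6.5, 7.7, 9.0]
ref_value = percentile(blls, 97.5)   # reference value = 97.5th percentile
```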


Subject(s)
Lead Poisoning/blood , Lead Poisoning/prevention & control , Lead/blood , Child, Preschool , Female , Humans , Infant , Laboratory Proficiency Testing , Male , Mass Screening , Nutrition Surveys , Quality Assurance, Health Care , Reference Values , United States
17.
Lancet Glob Health ; 5(4): e458-e466, 2017 04.
Article in English | MEDLINE | ID: mdl-28153514

ABSTRACT

BACKGROUND: Outbreaks of unexplained illness frequently remain under-investigated. In India, outbreaks of an acute neurological illness with high mortality among children occur annually in Muzaffarpur, the country's largest litchi cultivation region. In 2014, we aimed to investigate the cause of and risk factors for this illness. METHODS: In this hospital-based surveillance and nested age-matched case-control study, we did laboratory investigations to assess potential infectious and non-infectious causes of this acute neurological illness. Cases were children aged 15 years or younger who were admitted to two hospitals in Muzaffarpur with new-onset seizures or altered sensorium. Age-matched controls were residents of Muzaffarpur who were admitted to the same two hospitals for a non-neurological illness within seven days of the date of admission of the case. Clinical specimens (blood, cerebrospinal fluid, and urine) and environmental specimens (litchis) were tested for evidence of infectious pathogens, pesticides, toxic metals, and other non-infectious causes, including the presence of hypoglycin A and methylenecyclopropylglycine (MCPG), naturally occurring fruit-based toxins that cause hypoglycaemia and metabolic derangement. Matched and unmatched (controlling for age) bivariate analyses were done, and risk factors for illness were expressed as matched odds ratios (matched analyses) and odds ratios (unmatched analyses). FINDINGS: Between May 26 and July 17, 2014, 390 patients meeting the case definition were admitted to the two referral hospitals in Muzaffarpur, of whom 122 (31%) died. On admission, 204 (62%) of 327 patients tested had a blood glucose concentration of 70 mg/dL or less. 104 cases were compared with 104 age-matched hospital controls. Litchi consumption (matched odds ratio [mOR] 9·6 [95% CI 3·6-24]) and absence of an evening meal (2·2 [1·2-4·3]) in the 24 h preceding illness onset were associated with illness.
The absence of an evening meal significantly modified the effect of eating litchis on illness (odds ratio [OR] 7·8 [95% CI 3·3-18·8] without an evening meal; OR 3·6 [1·1-11·1] with an evening meal). Tests for infectious agents and pesticides were negative. Metabolites of hypoglycin A, MCPG, or both were detected in 48 (66%) of 73 urine specimens from case-patients and in none of 15 specimens from controls; 72 (90%) of 80 case-patient specimens had abnormal plasma acylcarnitine profiles, consistent with severe disruption of fatty acid metabolism. In 36 litchi arils tested from Muzaffarpur, hypoglycin A concentrations ranged from 12·4 µg/g to 152·0 µg/g and MCPG ranged from 44·9 µg/g to 220·0 µg/g. INTERPRETATION: Our investigation suggests an outbreak of acute encephalopathy in Muzaffarpur associated with both hypoglycin A and MCPG toxicity. To prevent illness and reduce mortality in the region, we recommended minimising litchi consumption, ensuring receipt of an evening meal, and implementing rapid glucose correction for suspected illness. A comprehensive investigative approach in Muzaffarpur led to timely public health recommendations, underscoring the importance of using systematic methods in other unexplained illness outbreaks. FUNDING: US Centers for Disease Control and Prevention.
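For the 1:1 age-matched design above, the matched odds ratio can be estimated from exposure-discordant pairs as b/c: pairs in which only the case was exposed divided by pairs in which only the control was exposed. The pair counts below are invented for illustration (chosen to reproduce the reported mOR of 9·6); they are not the study's data.

```python
def matched_odds_ratio(pairs):
    """Conditional (matched-pair) odds ratio for a 1:1 matched
    case-control study.  `pairs` is a list of
    (case_exposed, control_exposed) booleans; only discordant
    pairs contribute to the estimate."""
    b = sum(1 for case, ctrl in pairs if case and not ctrl)   # case exposed only
    c = sum(1 for case, ctrl in pairs if not case and ctrl)   # control exposed only
    return b / c

# Invented pair counts: 48 case-only exposed, 5 control-only exposed,
# 40 both exposed, 11 neither exposed (concordant pairs are uninformative)
pairs = ([(True, False)] * 48 + [(False, True)] * 5 +
         [(True, True)] * 40 + [(False, False)] * 11)
mor = matched_odds_ratio(pairs)
```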


Subject(s)
Acute Febrile Encephalopathy/diagnosis , Disease Outbreaks/statistics & numerical data , Fruit/toxicity , Litchi/toxicity , Neurotoxicity Syndromes/diagnosis , Acute Febrile Encephalopathy/epidemiology , Acute Febrile Encephalopathy/etiology , Adolescent , Case-Control Studies , Child , Cyclopropanes/analysis , Female , Glycine/analogs & derivatives , Glycine/analysis , Humans , Hypoglycins/analysis , India , Male , Neurotoxicity Syndromes/epidemiology , Neurotoxicity Syndromes/etiology , Odds Ratio
18.
Talanta ; 162: 114-122, 2017 Jan 01.
Article in English | MEDLINE | ID: mdl-27837806

ABSTRACT

We improved our inductively coupled plasma mass spectrometry (ICP-MS) whole blood method [1] for determination of lead (Pb), cadmium (Cd), and mercury (Hg) by including manganese (Mn) and selenium (Se) and expanding the calibration range of all analytes. The method is validated on a PerkinElmer (PE) ELAN® DRC II ICP-MS (ICP-DRC-MS) and uses Dynamic Reaction Cell (DRC) technology to attenuate interfering background ion signals via ion-molecule reactions. Methane gas (CH4) eliminates the background signal from 40Ar2+ to permit determination of 80Se+, and oxygen gas (O2) eliminates several polyatomic interferences (e.g., 40Ar15N+ and 54Fe1H+) on 55Mn+. Hg sensitivity in DRC mode is a factor of two higher than in vented mode when measured under the same DRC conditions as Mn, due to collisional focusing of the ion beam. To compensate for the expanded method's longer analysis time (due to DRC-mode pause delays), we implemented an SC4-FAST autosampler (ESI Scientific, Omaha, NE), which vacuum-loads the sample onto a loop, keeping the sample-to-sample measurement time under 5 min and allowing preparation and analysis of 60 samples in an 8-h work shift. The longer analysis time also resulted in faster breakdown of the hydrocarbon oil in the interface roughing pump. Replacing the standard roughing pump with a pump using a fluorinated lubricant, Fomblin®, extended the time between pump maintenance. We optimized the diluent and rinse solution components to reduce carryover from high-concentration samples and prevent the formation of precipitates. We performed a robust calculation to determine the following limits of detection (LOD) in whole blood: 0.07 µg/dL for Pb, 0.10 µg/L for Cd, 0.28 µg/L for Hg, 0.99 µg/L for Mn, and 24.5 µg/L for Se.
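The "robust calculation" behind these LODs is not detailed in the abstract. One common approach estimates S0, the standard deviation extrapolated to zero concentration from replicate measurements of low-level pools, and takes LOD = 3 × S0. The sketch below assumes that approach, with invented pool concentrations and SDs; the paper's actual procedure may differ.

```python
from statistics import mean

def lod_3s0(concs, sds):
    """Estimate the limit of detection as 3 * S0, where S0 is the
    replicate SD extrapolated to zero concentration via an
    ordinary least-squares fit of SD against concentration for
    low-level pools.  Illustrative only; the paper's 'robust
    calculation' is not specified in the abstract."""
    mx, my = mean(concs), mean(sds)
    slope = (sum((x - mx) * (y - my) for x, y in zip(concs, sds)) /
             sum((x - mx) ** 2 for x in concs))
    s0 = my - slope * mx          # OLS intercept = SD at zero concentration
    return 3 * s0

# Invented low-level Pb pools: concentrations (ug/dL) and replicate SDs
concs = [0.05, 0.10, 0.20, 0.40]
sds = [0.022, 0.024, 0.027, 0.034]
lod = lod_3s0(concs, sds)
```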


Subject(s)
Dietary Exposure/analysis , Environmental Monitoring/methods , Inhalation Exposure/analysis , Mass Spectrometry/methods , Trace Elements/blood , Cadmium/blood , Calibration , Environmental Monitoring/instrumentation , Humans , Lead/blood , Manganese/blood , Mercury/blood , Quality Control , Reference Standards , Reproducibility of Results , Selenium/blood , Trace Elements/standards
19.
J Anal At Spectrom ; 2014(2): 297-303, 2014.
Article in English | MEDLINE | ID: mdl-26229219

ABSTRACT

Biomonitoring and emergency response measurements are an important aspect of the work of the Division of Laboratory Sciences of the National Center for Environmental Health, Centers for Disease Control and Prevention (CDC). Continuing advancements in instrumentation allow for enhancements to existing analytical methods. Prior to this work, chromium and nickel were analyzed on a sector field inductively coupled plasma-mass spectrometer (SF-ICP-MS). This type of instrumentation provides the necessary sensitivity, selectivity, accuracy, and precision, but because of its greater complexity of instrumentation and operation, it is not preferred for routine high-throughput biomonitoring needs. Instead, a quadrupole-based method has been developed on a PerkinElmer NexION™ 300D ICP-MS. The instrument is operated with helium at 6.0 mL/min as the collision cell gas in kinetic energy discrimination (KED) mode, which successfully removes interferences for the analysis of 52Cr (40Ar12C and 35Cl16O1H) and 60Ni (44Ca16O). The limits of detection are 0.162 µg/L for Cr and 0.248 µg/L for Ni. Method accuracy using NIST SRM 2668 Level 1 (1.08 µg/L Cr and 2.31 µg/L Ni) and Level 2 (27.7 µg/L Cr and 115 µg/L Ni) was within the 95% confidence intervals reported in the NIST certificate. Among-run precision is less than 10% RSD (N = 20) for in-house quality control and NIST SRM urine samples. While the limits of detection (LOD) for the new quadrupole ICP-UCT-MS with KED method are similar to those of the SF-ICP-MS method, better measurement precision is observed for the quadrupole method. The new method provides fast, accurate, and more precise results on a less complex and more robust ICP-MS platform.
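Among-run precision above is reported as percent relative standard deviation (%RSD), i.e., the SD of repeated QC results expressed as a percentage of their mean. A minimal sketch with invented QC results:

```python
from statistics import mean, stdev

def percent_rsd(values):
    """Percent relative standard deviation: 100 * SD / mean,
    the usual figure of merit for among-run precision."""
    return 100 * stdev(values) / mean(values)

# Invented among-run QC results for Cr (ug/L) over 20 runs
qc = [1.05, 1.10, 1.02, 1.08, 1.12, 1.04, 1.09, 1.06, 1.11, 1.03,
      1.07, 1.05, 1.10, 1.08, 1.04, 1.09, 1.06, 1.12, 1.02, 1.07]
rsd = percent_rsd(qc)   # well under the 10% acceptance limit
```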

20.
J Radioanal Nucl Chem ; 299(3): 1555-1563, 2014 Mar.
Article in English | MEDLINE | ID: mdl-26300575

ABSTRACT

A newly developed procedure for determination of arsenic by radiochemical neutron activation analysis (RNAA) was used to measure arsenic at four levels in SRM 955c Toxic Elements in Caprine Blood and at two levels in SRM 2668 Toxic Elements in Frozen Human Urine for the purpose of providing mass concentration values for certification. Samples were freeze-dried prior to analysis, followed by neutron irradiation for 3 h at a fluence rate of 1 × 10¹⁴ cm⁻² s⁻¹. After sample dissolution in perchloric and nitric acids, arsenic was separated from the matrix by extraction into zinc diethyldithiocarbamate in chloroform, and 76As was quantified by gamma-ray spectroscopy. Differences in chemical yield and counting geometry between samples and standards were monitored by measuring the count rate of a 77As tracer added before sample dissolution. RNAA results were combined with inductively coupled plasma-mass spectrometry (ICP-MS) values from NIST and collaborating laboratories to provide certified values of (10.81 ± 0.54) µg/kg and (213.1 ± 0.73) µg/kg for SRM 2668 Levels I and II, and certified values of (21.66 ± 0.73) µg/kg, (52.7 ± 1.1) µg/kg, and (78.8 ± 4.9) µg/kg for SRM 955c Levels 2, 3, and 4, respectively. Because of discrepancies between values obtained by different methods for SRM 955c Level 1, an information value of <5 µg/kg was assigned for this material.
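The 77As tracer correction amounts to scaling each sample's 76As result by the relative tracer recovery in the sample versus the standard, compensating for differences in chemical yield and counting geometry. The sketch below is schematic, with invented count rates; a real comparator RNAA calculation also corrects for decay and irradiation/counting timing, which are omitted here.

```python
def yield_corrected_mass(sample_as76_rate, std_as76_rate, std_mass_ug,
                         sample_as77_rate, std_as77_rate):
    """Yield/geometry-corrected As mass via comparator NAA.

    The sample's 76As count rate is compared with that of a standard
    of known mass, then divided by the relative 77As tracer recovery
    (sample rate / standard rate) to correct for chemical yield and
    counting geometry.  Schematic only: decay and timing corrections
    required in real RNAA are omitted.
    """
    yield_factor = sample_as77_rate / std_as77_rate
    return std_mass_ug * (sample_as76_rate / std_as76_rate) / yield_factor

# Invented count rates (counts/s) against a 1.00 ug comparator standard
mass = yield_corrected_mass(sample_as76_rate=250.0, std_as76_rate=1000.0,
                            std_mass_ug=1.00,
                            sample_as77_rate=90.0, std_as77_rate=100.0)
```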
