Results 1 - 18 of 18
1.
J Pediatric Infect Dis Soc ; 8(2): 122-127, 2019 May 11.
Article in English | MEDLINE | ID: mdl-29522133

ABSTRACT

BACKGROUND: In 2007, a routine second dose of varicella vaccine was recommended in the United States for children aged 4 to 6 years to better control varicella-zoster virus circulation and outbreaks. Sentinel varicella outbreak surveillance was established to assess the feasibility of surveillance and to describe the outbreaks that occur. METHODS: Through Centers for Disease Control and Prevention Epidemiology and Laboratory Capacity funding, health departments conducted active surveillance for varicella outbreaks in schools from 2012 to 2015. Outbreaks of varicella were defined as ≥5 cases in a school within at least 1 incubation period (21 days). School nurses, healthcare providers, or laboratories reported cases and outbreaks of varicella to health departments; demographic, vaccination, and clinical data were collected. RESULTS: Georgia, Houston, Maine, Minnesota, New York City, and Philadelphia participated in all 3 years; Puerto Rico and West Virginia participated in 2012 to 2013; and Kansas and Arkansas participated in 2014 to 2015. Twenty-nine outbreaks comprising 262 cases were reported. The median outbreak size was 7 cases (range, 5-31 cases), and the median duration was 31 days (range, 4-100 days). Of the case-patients associated with larger outbreaks (≥8 cases), 55.4% were unvaccinated, and 15.7% and 18.1% had received 1 or 2 doses of vaccine, respectively. In small outbreaks (5-7 cases), 33.3% of case-patients were unvaccinated, and 16.7% and 38.5% had received 1 or 2 doses of vaccine, respectively. CONCLUSIONS: The majority of outbreak-associated cases occurred in undervaccinated children (unvaccinated children and 1-dose vaccine recipients). Outbreaks with a greater proportion of 2-dose vaccine recipients were smaller. Varicella outbreak surveillance is feasible, and continued monitoring of outbreaks remains important for describing the epidemiology of varicella under the 2-dose varicella vaccination program.


Subject(s)
Chickenpox/epidemiology , Sentinel Surveillance , Adolescent , Age Factors , Centers for Disease Control and Prevention, U.S. , Chickenpox/prevention & control , Chickenpox Vaccine/administration & dosage , Child , Child, Preschool , Disease Outbreaks/prevention & control , Disease Outbreaks/statistics & numerical data , Herpesvirus 3, Human/immunology , Humans , Immunization Programs , Infant , Schools , United States/epidemiology , Vaccination , Young Adult
3.
Consult Pharm ; 33(1): 37-47, 2018 Jan 01.
Article in English | MEDLINE | ID: mdl-29336277

ABSTRACT

OBJECTIVES: To describe hypoglycemic events in a Veterans Affairs (VA) community living center (CLC) population and to determine risk factors predictive of hypoglycemia. DESIGN: Retrospective, exploratory, observational chart review. SETTING: Tertiary-care VA Healthcare System CLC. PATIENTS: Residents of a VA CLC with at least one active order for insulin between June 1, 2009, and June 30, 2013, were evaluated over a 90-day study period. MAIN OUTCOME MEASURES: The primary outcome was the number of days to the first hypoglycemic event, described by survival curve analysis. Secondary outcomes included the overall incidence of hypoglycemia and the associations of potential risk factors with the proportion of hypoglycemic events and with the development of an additional hypoglycemic event. RESULTS: The incidence of hypoglycemia was 49% over the 90-day study period, with a 24% incidence within the first 7 days after admission, accounting for approximately half of all events. The only statistically significant risk factor for a hypoglycemic event was the prescribed insulin dose in units/kg/day (hazard ratio = 1.008, 95% confidence interval 1.001-1.015; P = 0.0317). CONCLUSIONS: Residents are at increased risk for hypoglycemia within the first 7 days of admission to a CLC. It is imperative that providers closely monitor and reevaluate antidiabetic regimens during this time of transition.
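
The time-to-first-event analysis described above is the kind of question a Cox proportional hazards model answers. The sketch below is hypothetical: it is not the study's code, and the toy data and column names are invented. It uses the open-source lifelines package, where exp(coef) in the printed summary plays the role of the hazard ratio per additional unit of insulin/kg/day reported in the abstract.

```python
# Hypothetical sketch of a time-to-first-hypoglycemic-event analysis with a Cox
# proportional hazards model (lifelines). Data and column names are invented.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.DataFrame({
    "days_to_event": [3, 90, 7, 45, 12, 90, 20, 60],  # follow-up capped at 90 days
    "event":         [1, 0, 1, 1, 1, 0, 1, 0],        # 1 = hypoglycemia observed, 0 = censored
    "insulin_units_per_kg_day": [0.9, 0.3, 1.2, 0.6, 1.0, 0.4, 0.8, 0.5],
})

cph = CoxPHFitter()
cph.fit(df, duration_col="days_to_event", event_col="event")
cph.print_summary()  # exp(coef) is the hazard ratio per unit increase in insulin/kg/day
```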


Subject(s)
Hypoglycemia/etiology , Aged , Female , Humans , Independent Living , Male , Middle Aged , Retrospective Studies , Risk Factors , United States , United States Department of Veterans Affairs , Veterans
4.
Autism ; 22(8): 983-994, 2018 11.
Article in English | MEDLINE | ID: mdl-28914086

ABSTRACT

Daily living skills deficits are strongly associated with poor adult outcomes for individuals with high-functioning autism spectrum disorder, yet there are no group interventions targeting daily living skills. Seven adolescents with autism spectrum disorder and their parents participated in a feasibility pilot of a 12-week manualized group treatment targeting specific daily living skills (i.e., morning routine, cooking, laundry, and money management). Outcomes included Vineland Adaptive Behavior Scales, Second Edition (Vineland-II) age-equivalent scores and four goal attainment scaling scores. Adolescents demonstrated significant improvement on two Vineland-II subdomains and on all goal attainment scaling scores at post-treatment and at 6-month follow-up. The intervention shows promise for improving critical daily living skills deficits that affect independent living and employment. Limitations and implications for future studies are discussed.


Subject(s)
Activities of Daily Living , Autism Spectrum Disorder/rehabilitation , Parents/education , Adolescent , Cooking , Financial Management , Goals , Humans , Hygiene , Independent Living , Laundering , Patient Acceptance of Health Care , Pilot Projects , Self Care
5.
J Assist Reprod Genet ; 34(11): 1427-1434, 2017 Nov.
Article in English | MEDLINE | ID: mdl-28942525

ABSTRACT

PURPOSE: The main purposes of the study were to investigate the endocrine function of ovarian tissue transplanted to heterotopic subcutaneous sites and the reproductive competence and telomere length of a nonhuman primate originating from transplanted tissue. METHODS: Pieces of ovarian cortex were transplanted back into the original rhesus macaques, subcutaneously in the arm, in the abdomen next to muscle, or in the kidney. Serum estradiol (E2) and progesterone (P4) concentrations were measured weekly for up to 8 years following tissue transplantation. A monkey derived from an oocyte in transplanted ovarian tissue entered time-mated breeding and underwent controlled ovarian stimulation. Pregnancy and offspring were evaluated. Telomere length and oocytes obtained following controlled ovarian stimulation were assessed. RESULTS: Monkeys with transplants in the arm and abdomen had cyclic E2 of 100 pg/ml, while an animal with arm transplants had E2 of 50 pg/ml. One monkey with transplants in the abdomen and kidney had ovulatory cycles for 3 years. A monkey derived from an oocyte in transplanted tissue conceived and had a normal gestation until intrapartum fetal demise. She conceived again and delivered a healthy offspring at term. Controlled ovarian stimulations of this monkey yielded mature oocytes comparable to controls. Her telomere length was long relative to controls. CONCLUSIONS: Heterotopic ovarian tissue transplants yielded long-term endocrine function in macaques. A monkey derived from an oocyte in transplanted tissue was reproductively competent. Her telomere length showed no evidence of epigenetically induced premature cellular aging. Ovarian tissue transplantation to heterotopic sites for fertility preservation should move forward cautiously, yet optimistically.


Subject(s)
Fertility Preservation/methods , Oocytes/growth & development , Ovarian Follicle/transplantation , Ovary/transplantation , Reproduction/physiology , Animals , Cryopreservation , Estradiol/blood , Female , Macaca mulatta/genetics , Macaca mulatta/physiology , Ovarian Follicle/growth & development , Ovary/growth & development , Ovulation Induction/methods , Pregnancy , Progesterone/blood , Reproduction/genetics , Telomere Homeostasis/genetics
6.
J Reprod Immunol ; 121: 42-48, 2017 06.
Article in English | MEDLINE | ID: mdl-28622535

ABSTRACT

Vitamin D is thought to modulate innate immune responses, and recent studies have highlighted the autocrine and paracrine functions of vitamin D in the placenta. Our objective was to determine the relationship between maternal vitamin D status and placental antimicrobial peptide (AMP) expression in a group of racially and ethnically diverse pregnant adolescents. In this study, 158 pregnant adolescents were recruited from the Rochester Adolescent Maternity Program (RAMP) in Rochester, NY. Maternal serum concentrations of the vitamin D biomarkers 25-hydroxyvitamin D (25(OH)D) and 1,25-dihydroxyvitamin D (1,25(OH)2D) were measured at mid-gestation (∼26 weeks) and at delivery. At the placental level, vitamin D regulatory proteins (cubilin, megalin, 1α-hydroxylase (CYP27B1), 24-hydroxylase (CYP24A1), and the vitamin D receptor (VDR)) and AMPs (cathelicidin and hepcidin) were analyzed using quantitative PCR and western blot techniques. Placental CYP27B1 mRNA expression was significantly positively associated with both placental cathelicidin mRNA expression (P < 0.0001) and placental hepcidin mRNA expression (P = 0.002). In teens with positive recto-vaginal group B streptococcus (GBS) colonization, placental mRNA expression of cathelicidin (P = 0.007), cubilin (P = 0.03), and CYP27B1 (P = 0.04) was significantly lower than in those who tested negative for GBS. A mediation analysis showed that the indirect relationship between GBS colonization and placental cathelicidin mRNA expression was mediated by placental mRNA expression of the vitamin D proteins cubilin and CYP27B1 (P = 0.02). Additional research is needed to identify the role and relative contributions of placental and systemic vitamin D metabolites in relation to potentially pathogenic microorganisms that may be present during pregnancy.


Subject(s)
25-Hydroxyvitamin D3 1-alpha-Hydroxylase/metabolism , Antimicrobial Cationic Peptides/metabolism , Placenta/metabolism , Receptors, Cell Surface/metabolism , Rectum/microbiology , Streptococcal Infections/immunology , Streptococcus agalactiae/immunology , Vagina/immunology , Vitamin D/metabolism , 25-Hydroxyvitamin D3 1-alpha-Hydroxylase/genetics , Adolescent , Adult , Antimicrobial Cationic Peptides/genetics , Female , Gene Expression Regulation, Bacterial , Host-Pathogen Interactions , Humans , Maternal Exposure , Pregnancy , Pregnancy, High-Risk , RNA, Messenger/analysis , Receptors, Cell Surface/genetics , Vagina/microbiology , Young Adult , Cathelicidins
7.
Public Health Rep ; 132(2): 188-195, 2017.
Article in English | MEDLINE | ID: mdl-28182514

ABSTRACT

OBJECTIVES: In January 2014, 4-methylcyclohexanemethanol spilled into the Elk River near Charleston, West Virginia, contaminating the water supply for approximately 120,000 households. The West Virginia American Water Company (WVAWC) issued a "do not use" water order for 9 counties. After the order was lifted 10 days after the spill, public health officials lacked information on the communities' use of public water systems, their information sources, their alternative sources of water, and the perceived impact of the spill on households. To assist in recovery efforts, the West Virginia Bureau for Public Health and the Centers for Disease Control and Prevention conducted a Community Assessment for Public Health Emergency Response (CASPER). METHODS: We used the CASPER 2-stage cluster sampling design to select a representative sample of households to interview, and we conducted interviews in 171 households in April 2014. We used a weighted cluster analysis to generate population estimates in the sampling frame. RESULTS: Before the spill, 74.4% of households did not have a 3-day alternative water supply for each household member and pet. Although 83.6% of households obtained an alternative water source within 1 day of the "do not use" order, 37.4% of households reportedly used WVAWC water for at least one purpose. Nearly 3 months after the spill, 36.1% of households believed that their WVAWC water was safe, and 33.5% reported using their household water for drinking. CONCLUSIONS: CASPER results identified the need to focus on basic public health messaging and household preparedness efforts. Recommendations included (1) encouraging households to maintain a 3-day emergency water supply, (2) identifying additional alternative sources of water for future emergencies, and (3) increasing community education to address ongoing concerns about water.
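
For readers unfamiliar with weighted estimates from a two-stage cluster design such as CASPER, the sketch below illustrates one common household weight: each completed interview represents (housing units in the sampling frame) / (clusters selected x interviews completed in its cluster) households. The numbers, and the specific weighting choice, are illustrative assumptions rather than figures from this assessment.

```python
# Illustrative two-stage cluster (CASPER-style) weighting; all inputs are hypothetical.

def household_weight(frame_housing_units: int, clusters_selected: int,
                     interviews_in_cluster: int) -> float:
    """Households represented by one completed interview in a given cluster."""
    return frame_housing_units / (clusters_selected * interviews_in_cluster)

w = household_weight(frame_housing_units=120_000, clusters_selected=30,
                     interviews_in_cluster=6)

# Weighted population estimate for a yes/no item (e.g. "had a 3-day water supply"):
# sum the weights of households answering "yes".
answers = [1, 0, 1, 1, 0, 0]                # hypothetical responses in that cluster
households_estimated = sum(a * w for a in answers)
print(round(w, 1), round(households_estimated))
```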


Subject(s)
Chemical Hazard Release , Cyclohexanes/analysis , Disasters , Rivers/chemistry , Water Pollution, Chemical/analysis , Adolescent , Adult , Aged , Child , Child, Preschool , Cluster Analysis , Female , Humans , Infant , Interviews as Topic , Male , Middle Aged , Public Health , Qualitative Research , West Virginia , Young Adult
8.
Am J Clin Nutr ; 102(5): 1088-95, 2015 Nov.
Article in English | MEDLINE | ID: mdl-26447159

ABSTRACT

BACKGROUND: Vitamin D and iron deficiencies frequently co-exist, and it is now appreciated that mechanistic interactions between iron and vitamin D metabolism may underlie these associations. OBJECTIVE: We examined interrelations between iron and vitamin D status and their regulatory hormones in pregnant adolescents, a group at risk of both suboptimal vitamin D and suboptimal iron status. DESIGN: This was a prospective longitudinal study of 158 pregnant adolescents (aged ≤18 y). Maternal circulating biomarkers of vitamin D and iron were determined at midgestation (∼25 wk) and at delivery (∼40 wk). Linear regression was used to assess associations between vitamin D and iron status indicators. Bivariate and multivariate logistic regressions were used to estimate the odds of anemia as a function of vitamin D status. A mediation analysis was performed to examine direct and indirect relations between vitamin D status, hemoglobin, and erythropoietin in maternal serum. RESULTS: Maternal 25-hydroxyvitamin D [25(OH)D] was positively associated with maternal hemoglobin at both midgestation and delivery (P < 0.01 for both). After adjustment for age at enrollment and race, the odds of anemia at delivery were 8 times greater in adolescents with delivery 25(OH)D concentrations <50 nmol/L than in those with 25(OH)D concentrations ≥50 nmol/L (P < 0.001). Maternal 25(OH)D was inversely associated with erythropoietin at both midgestation (P < 0.05) and delivery (P < 0.001). The significant relation observed between 25(OH)D and hemoglobin could be explained by a direct relation between 25(OH)D and hemoglobin and an indirect relation mediated by erythropoietin. CONCLUSIONS: In this group of pregnant adolescents, suboptimal vitamin D status was associated with an increased risk of iron insufficiency and vice versa. These findings emphasize the need to screen for multiple nutrient deficiencies during pregnancy and to pay greater attention to overlapping metabolic pathways when selecting prenatal supplementation regimens.
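
The adjusted odds ratio reported above comes from multivariate logistic regression. A minimal sketch of that kind of model is shown below using statsmodels; the simulated data, variable names, and coefficients are assumptions for illustration only, not the study's dataset or estimates.

```python
# Hypothetical logistic-regression sketch: odds of anemia at delivery as a function of
# low vitamin D status (25(OH)D < 50 nmol/L), adjusted for age and race (simulated data).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 200
low_vit_d = rng.integers(0, 2, n)                       # 1 = 25(OH)D < 50 nmol/L
age = rng.integers(14, 19, n)
race_black = rng.integers(0, 2, n)
p = 1 / (1 + np.exp(-(-2.0 + 1.5 * low_vit_d + 0.05 * (age - 16))))
anemia = rng.binomial(1, p)

df = pd.DataFrame(dict(anemia=anemia, low_vit_d=low_vit_d, age=age, race_black=race_black))
fit = smf.logit("anemia ~ low_vit_d + age + race_black", data=df).fit()
print(np.exp(fit.params))  # exponentiated coefficients are adjusted odds ratios
```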


Subject(s)
Anemia, Iron-Deficiency/epidemiology , Erythropoietin/blood , Maternal Nutritional Physiological Phenomena , Nutritional Status , Pregnancy Complications/epidemiology , Vitamin D Deficiency/epidemiology , 25-Hydroxyvitamin D 2/blood , Adolescent , Anemia, Iron-Deficiency/complications , Biomarkers/blood , Calcifediol/blood , Cohort Studies , Cross-Sectional Studies , Female , Hemoglobins/analysis , Humans , Linear Models , Longitudinal Studies , New York/epidemiology , Pregnancy , Pregnancy Complications/blood , Prospective Studies , Risk , Vitamin D Deficiency/blood , Vitamin D Deficiency/complications
9.
Pediatr Infect Dis J ; 34(10): 1105-9, 2015 Oct.
Article in English | MEDLINE | ID: mdl-26186103

ABSTRACT

BACKGROUND: A routine 2-dose varicella vaccination program was adopted in 2007 in the US to help further decrease varicella disease and prevent varicella outbreaks. We describe trends and characteristics of varicella outbreaks reported to the Centers for Disease Control and Prevention (CDC) during 2005-2012 from 9 states. METHODS: Data on varicella outbreaks collected by 9 state health departments were submitted to CDC using the CDC outbreak reporting worksheet. Information was collected on outbreak dates, setting, and number of cases per outbreak; aggregate data were provided on the numbers of outbreak-related cases by age group, vaccination status, and laboratory confirmation. RESULTS: Nine hundred twenty-nine outbreaks were reported from the 6 states that provided data for each year during 2005-2012. Based on data from these 6 states, the number of outbreaks declined by 78%, from 147 outbreaks in 2005 to 33 in 2012 (P = 0.0001). In total, the 9 states reported 1015 varicella outbreaks involving 13,595 cases from 2005 to 2012. The size and duration of outbreaks declined significantly over time (P < 0.001): the median outbreak size was 12, 9, and 7 cases, and the median outbreak duration was 38, 35, and 26 days during 2005-2006, 2007-2009, and 2010-2012, respectively. The majority of outbreaks (95%) were reported from schools, with this proportion declining from 97% in 2005-2006 to 89% in 2010-2012. Sixty-five percent of outbreak-related cases occurred among 5- to 9-year-olds, with the proportion declining from 76% in 2005-2006 to 45% during 2010-2012. CONCLUSIONS: The routine 2-dose varicella vaccination program appears to have significantly reduced the number, size, and duration of varicella outbreaks in the US.
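
A decline in annual outbreak counts like the one reported here is often tested with Poisson regression on year. The sketch below is illustrative only: the 2005 and 2012 counts are taken from the abstract, but the intermediate yearly counts are invented placeholders, so the output will not reproduce the published P value.

```python
# Illustrative Poisson-regression trend test for annual outbreak counts (statsmodels).
# Only the 2005 and 2012 counts come from the abstract; other years are placeholders.
import numpy as np
import statsmodels.api as sm

years = np.arange(2005, 2013)
counts = np.array([147, 130, 110, 90, 75, 60, 45, 33])

X = sm.add_constant(years - years.min())                 # intercept + year index
fit = sm.GLM(counts, X, family=sm.families.Poisson()).fit()
print(fit.params[1], fit.pvalues[1])                     # log-linear annual change, P value
```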


Subject(s)
Chickenpox Vaccine , Chickenpox , Disease Outbreaks , Mass Vaccination/statistics & numerical data , Adolescent , Adult , Chickenpox/epidemiology , Chickenpox/prevention & control , Chickenpox Vaccine/administration & dosage , Chickenpox Vaccine/therapeutic use , Child , Child, Preschool , Disease Outbreaks/prevention & control , Disease Outbreaks/statistics & numerical data , Humans , Infant , Infant, Newborn , Retrospective Studies , United States/epidemiology , Young Adult
10.
J Nutr ; 144(11): 1710-7, 2014 Nov.
Article in English | MEDLINE | ID: mdl-25332470

ABSTRACT

BACKGROUND: Dietary heme contributes to iron intake, yet regulation of heme absorption and tissue utilization of absorbed heme remains undefined. OBJECTIVES: In a rat model of iron overload, we used stable iron isotopes to examine heme- and nonheme-iron absorption in relation to liver hepcidin and to compare relative utilization of absorbed heme and nonheme iron by erythroid (RBC) and iron storage tissues (liver and spleen). METHODS: Twelve male Sprague-Dawley rats were randomly assigned to groups for injections of either saline or iron dextran (16 or 48 mg Fe over 2 wk). After iron loading, rats were administered oral stable iron in the forms of (57)Fe-ferrous sulfate and (58)Fe-labeled hemoglobin. Expression of liver hepcidin and duodenal iron transporters and tissue stable iron enrichment was determined 10 d postdosing. RESULTS: High iron loading increased hepatic hepcidin by 3-fold and reduced duodenal expression of divalent metal transporter 1 (DMT1) by 76%. Nonheme-iron absorption was 2.5 times higher than heme-iron absorption (P = 0.0008). Absorption of both forms of iron was inversely correlated with hepatic hepcidin expression (heme-iron absorption: r = -0.77, P = 0.003; nonheme-iron absorption: r = -0.80, P = 0.002), but hepcidin had a stronger impact on nonheme-iron absorption (P = 0.04). Significantly more (57)Fe was recovered in RBCs (P = 0.02), and more (58)Fe was recovered in the spleen (P = 0.01). CONCLUSIONS: Elevated hepcidin significantly decreased heme- and nonheme-iron absorption but had a greater impact on nonheme-iron absorption. Differential tissue utilization of heme vs. nonheme iron was evident between erythroid and iron storage tissues, suggesting that some heme may be exported into the circulation in a form different from that of nonheme iron.


Subject(s)
Ferrous Compounds , Hemoglobins , Iron/pharmacokinetics , Animals , Duodenum/metabolism , Ferrous Compounds/administration & dosage , Ferrous Compounds/chemistry , Ferrous Compounds/pharmacology , Hemoglobins/administration & dosage , Hemoglobins/chemistry , Hemoglobins/pharmacology , Hepcidins/genetics , Hepcidins/metabolism , Iron/metabolism , Iron, Dietary/pharmacokinetics , Iron-Dextran Complex/administration & dosage , Liver/metabolism , Male , Random Allocation , Rats , Rats, Sprague-Dawley
11.
Pediatr Infect Dis J ; 33(11): 1164-8, 2014 Nov.
Article in English | MEDLINE | ID: mdl-24911894

ABSTRACT

BACKGROUND: Universal 2-dose varicella vaccination was recommended in 2006 to further reduce the varicella disease burden. This study examined 2-dose varicella vaccine effectiveness (VE) and rash severity in the setting of school-associated varicella outbreaks. METHODS: A case-control study was conducted from January 2010 to May 2011 in all West Virginia public schools. Clinically diagnosed cases from varicella outbreaks were matched with classmate controls. Vaccination information was collected from school, health department, and healthcare provider immunization information systems. RESULTS: Among the 133 cases and 365 controls enrolled, VE against all varicella was 83.2% [95% confidence interval (CI): 69.2%-90.8%] for 1 dose of varicella vaccine and 93.9% (95% CI: 86.9%-97.1%) for 2 doses; the incremental VE (2 doses vs. 1 dose) was 63.6% (95% CI: 32.6%-80.3%). In preventing moderate/severe varicella, 1-dose varicella vaccination was 88.2% (95% CI: 72.7%-94.9%) effective and 2-dose vaccination was 97.5% (95% CI: 91.6%-99.2%) effective, with an incremental VE of 78.6% (95% CI: 40.9%-92.3%). One-dose VE declined with time since vaccination (VE = 93.0%, 88.0%, and 81.8% at <5, 5-9, and ≥10 years after vaccination; P = 0.001 for trend). Both 1- and 2-dose breakthrough cases had milder rash than unvaccinated cases (<50 lesions: 24.6%, 49.1%, and 70.0% in unvaccinated, 1-dose, and 2-dose cases; P < 0.001), and no severe disease was found in 2-dose cases. CONCLUSIONS: Two-dose varicella vaccination is highly effective and confers greater protection than a 1-dose regimen. High 2-dose varicella vaccination coverage should maximize the benefits of the varicella vaccination program and further reduce the varicella disease burden in the United States.
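
Vaccine effectiveness in a case-control design is conventionally estimated as VE = (1 - OR) x 100. A minimal sketch follows; the odds ratios used are illustrative values implied by the VE point estimates quoted above, not the study's matched-analysis output.

```python
# VE = (1 - OR) * 100 for a case-control design. Odds ratios below are illustrative
# values implied by the quoted VE point estimates, not the study's matched estimates.

def vaccine_effectiveness(odds_ratio: float) -> float:
    """VE (%) implied by the odds ratio of varicella in vaccinated vs. unvaccinated."""
    return (1.0 - odds_ratio) * 100.0

print(vaccine_effectiveness(0.168))  # ~83.2% for 1 dose
print(vaccine_effectiveness(0.061))  # ~93.9% for 2 doses
```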


Subject(s)
Chickenpox Vaccine/administration & dosage , Chickenpox/epidemiology , Chickenpox/prevention & control , Disease Outbreaks/prevention & control , Vaccination/statistics & numerical data , Adolescent , Case-Control Studies , Chickenpox Vaccine/immunology , Child , Child, Preschool , Female , Humans , Immunization Schedule , Immunologic Memory , Male , Schools , Severity of Illness Index , West Virginia/epidemiology
12.
Virchows Arch ; 464(6): 709-16, 2014 Jun.
Article in English | MEDLINE | ID: mdl-24771120

ABSTRACT

Clear cell papillary renal cell carcinoma (CCPRCC) is a recently recognized tumor entity, now classified as a distinct epithelial tumor within the current classification system. Nonclassic morphologic variants have rarely been reported. We present six challenging cases of CCPRCC with a prominent (>75%) tubular, acinar, and/or solid component and angioleiomyomatous stroma. The tumors lacked well-organized papillary architecture. All tumors had a capsule of variable thickness formed by bands of smooth muscle. The leiomyomatous tissue often entirely encased patches of tubular structures, or it formed only small leiomyomatous islands within the epithelial component. The vascular network bore a striking relationship to the epithelial component: every tubule or acinus was associated with a fine capillary network that intimately surrounded the tubular or acinar circumference. CCPRCC with variant morphology expressed carbonic anhydrase IX (CA-IX) in a cup-shaped distribution. In addition, the tumor cells stained positive for cytokeratin 34betaE12, CK7, and vimentin. Renal cell carcinoma marker (RCC), P504s/AMACR, Melan A, and HMB45 were negative in tumor cells in all cases examined. Fluorescence in situ hybridization studies showed a normal copy number for chromosomes 7 and 17 and for chromosome arms 3p and 3q. CCPRCC with variant morphology appears to have a favorable prognosis: in the current series, tumor stage was low at presentation, and none of the patients had local recurrence or metastatic disease. The distinction between CCPRCC with variant morphology and clear cell RCC is critical because no case of CCPRCC has behaved aggressively.


Subject(s)
Biomarkers, Tumor/analysis , Carcinoma, Papillary/pathology , Carcinoma, Renal Cell/pathology , Kidney Neoplasms/pathology , Aged , Carcinoma, Papillary/genetics , Carcinoma, Papillary/metabolism , Carcinoma, Renal Cell/genetics , Carcinoma, Renal Cell/metabolism , Female , Humans , Immunohistochemistry , In Situ Hybridization, Fluorescence , Kidney Neoplasms/genetics , Kidney Neoplasms/metabolism , Male , Middle Aged
13.
Am J Ind Med ; 56(7): 733-41, 2013 Jul.
Article in English | MEDLINE | ID: mdl-23450749

ABSTRACT

BACKGROUND: In 2000, a manufacturer of beryllium materials and products introduced a comprehensive program to prevent beryllium sensitization and chronic beryllium disease (CBD). We assessed the program's efficacy in preventing sensitization 9 years after implementation. METHODS: Current and former workers hired since program implementation completed questionnaires and provided blood samples for the beryllium lymphocyte proliferation test (BeLPT). Using these data, as well as company medical surveillance data, we estimated the prevalence of beryllium sensitization. RESULTS: The cross-sectional prevalence of sensitization was 0.7% (2/298). Combining survey results with surveillance results, a total of seven workers were identified as sensitized (2.3%). Early Program workers were more likely to be sensitized than Late Program workers; one of the latter was newly identified. All sensitization was identified while participants were employed. One worker was diagnosed with CBD during employment. CONCLUSIONS: The combination of increased respiratory and dermal protection, enclosure and improved ventilation of high-risk processes, dust migration control, improved housekeeping, and worker and management education showed utility in reducing sensitization during the program's first 9 years. The low rate (0.6%, 1/175) among Late Program workers suggests that continuing refinements have provided additional protection against sensitization compared with the program's early years.


Subject(s)
Berylliosis/prevention & control , Immunization , Occupational Exposure/adverse effects , Occupational Health , Primary Prevention/organization & administration , Adult , Berylliosis/epidemiology , Berylliosis/immunology , Beryllium/blood , Chronic Disease , Cross-Sectional Studies , Female , Humans , Inhalation Exposure , Male , Middle Aged , Prognosis , Program Development , Program Evaluation , Protective Clothing , Risk Assessment , Surveys and Questionnaires , Time Factors
14.
FASEB J ; 27(6): 2476-83, 2013 Jun.
Article in English | MEDLINE | ID: mdl-23447582

ABSTRACT

Iron (Fe) deficiency is endemic worldwide, yet few data are available regarding the acute effects of dietary protein on intestinal Fe absorption. The current study evaluated the short-term effects of increasing dietary protein on Fe absorption and on the expression of genes involved in Fe homeostasis. Twenty-four female Sprague-Dawley rats were randomly assigned to custom-formulated isocaloric diets containing 40, 20 (control), or 5% protein (as a percentage of total kilocalories) for 7 d. Whole-body Fe balance studies demonstrated that Fe retention was greater in the 40% group than in the 5% group (30.8 vs. 7.3%; P < 0.01). In a separate study utilizing stable iron isotopes, the 40% group absorbed 30% of ingested Fe, while the 20% group absorbed 18% (P = 0.005). Whole-genome profiling revealed that increasing dietary protein from 5 to 40% increased duodenal transcript expression of divalent metal transporter 1 (DMT1) 3.2-fold, duodenal cytochrome b (Dcytb) 1.8-fold, and transferrin receptor (TfR) 1.8-fold. Consistent with these findings, DMT1 transcript expression was 4-fold higher in RNA prepared from duodenal mucosa in the 40% group than in the 20% group (P < 0.001). These data suggest that increasing dietary protein increases intestinal Fe absorption, in part by up-regulating DMT1, Dcytb, and TfR.


Subject(s)
Cation Transport Proteins/genetics , Cytochromes b/genetics , Dietary Proteins/administration & dosage , Intestinal Absorption/genetics , Iron, Dietary/pharmacokinetics , Receptors, Transferrin/genetics , Up-Regulation , Animals , Caseins/administration & dosage , Duodenum/metabolism , FMN Reductase/genetics , Female , Intestinal Absorption/physiology , RNA, Messenger/genetics , RNA, Messenger/metabolism , Rats , Rats, Sprague-Dawley
15.
J Occup Environ Med ; 52(5): 505-12, 2010 May.
Article in English | MEDLINE | ID: mdl-20431418

ABSTRACT

OBJECTIVE: We evaluated the effectiveness of a workplace preventive program, which emphasized skin and respiratory protection, workplace cleanliness, and beryllium migration control, in lowering beryllium sensitization. METHODS: We compared sensitization prevalence and incidence rates for workers hired before and after the program using available cross-sectional and longitudinal surveillance data. RESULTS: Sensitization prevalence was 8.9% for the Pre-Program Group and 2.1% for the Program Group. The sensitization incidence rate was 3.7/1000 person-months for the Pre-Program Group and 1.7/1000 person-months for the Program Group. After adjustment for potential selection and information bias, sensitization prevalence for the Pre-Program Group was 3.8 times higher (95% CI = 1.5 to 9.3) than for the Program Group. The sensitization incidence rate ratio comparing the Pre-Program Group with the Program Group was 1.6 (95% CI = 0.8 to 3.6). CONCLUSIONS: This preventive program reduced the prevalence of beryllium sensitization but did not eliminate it.


Subject(s)
Alloys , Beryllium/adverse effects , Industry , Occupational Exposure/prevention & control , Oxides , Cross-Sectional Studies , Female , Humans , Male
16.
Public Health Rep ; 124 Suppl 1: 112-24, 2009.
Article in English | MEDLINE | ID: mdl-19618813

ABSTRACT

OBJECTIVES: In 2000, 7% of workers at a copper-beryllium facility were beryllium sensitized, and risk was associated with work near a wire annealing/pickling process. The facility then implemented a preventive program including particle migration control, respiratory and dermal protection, and process enclosure. We assessed the program's efficacy in preventing beryllium sensitization. METHODS: In 2000, the facility began testing new hires (program workers) with beryllium lymphocyte proliferation tests (BeLPTs) at hire and at intervals during employment. We compared sensitization incidence rates (IRs) and prevalence rates for workers hired before the program (legacy workers) with rates for program workers, including program worker subgroups. We also examined trends in BeLPTs from a single laboratory. RESULTS: In all, five of 43 legacy workers (IR = 3.8/1,000 person-months) and three of 82 program workers (IR = 1.9/1,000 person-months) were beryllium sensitized, for an incidence rate ratio (IRR) of 2.0 (95% confidence interval [CI] 0.5, 10.1). Two of 37 pre-enclosure program workers (IR = 2.4/1,000 person-months) and one of 45 post-enclosure program workers (IR = 1.4/1,000 person-months) were beryllium sensitized, for IRRs of 1.6 (95% CI 0.3, 11.9) and 2.8 (95% CI 0.4, 66.2), respectively, compared with legacy workers. A test for trend in prevalence rates was significant. Among 2,159 first-draw BeLPTs performed during 95 months, we identified seven months in which high numbers of redraws were required, with one possible misclassification in this facility. CONCLUSIONS: Fewer workers became sensitized after implementation of the preventive program. However, low statistical power due to the facility's small workforce prevents a definitive conclusion about the program's efficacy. These findings have implications for other copper-beryllium facilities, where program components may merit application.
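
The incidence rates and rate ratio above follow from simple person-time arithmetic: IR = cases / person-months x 1,000 and IRR = IR1 / IR2. The sketch below is illustrative; the person-month denominators are assumed values chosen only to roughly match the quoted rates, and the confidence interval uses a large-sample approximation rather than the exact method a published analysis may use.

```python
# Illustrative incidence-rate and rate-ratio arithmetic; person-months are assumed,
# and the CI is a large-sample (log-normal) approximation, not an exact Poisson CI.
import math

def incidence_rate(cases: int, person_months: float, per: float = 1000.0) -> float:
    return cases / person_months * per

def irr_with_ci(cases1: int, pm1: float, cases2: int, pm2: float, z: float = 1.96):
    irr = (cases1 / pm1) / (cases2 / pm2)
    se_log = math.sqrt(1 / cases1 + 1 / cases2)          # SE of log(IRR)
    return irr, irr * math.exp(-z * se_log), irr * math.exp(z * se_log)

print(incidence_rate(5, 1300))         # about 3.8 per 1,000 person-months
print(irr_with_ci(5, 1300, 3, 1580))   # IRR near 2 with an approximate 95% CI
```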


Subject(s)
Air Pollutants, Occupational/poisoning , Berylliosis/prevention & control , Beryllium/chemistry , Chemical Industry/standards , Occupational Exposure/prevention & control , Safety Management/methods , Adult , Berylliosis/etiology , Beryllium/blood , Copper/chemistry , Dust , Equipment Safety , Female , Humans , Male , Middle Aged , Monitoring, Physiologic/methods , Program Evaluation , Protective Clothing , Protective Devices , Risk Factors
17.
Am J Primatol ; 69(8): 917-29, 2007 Aug.
Article in English | MEDLINE | ID: mdl-17358011

ABSTRACT

The vervet monkey was evaluated as a primate model for use in assisted reproductive technologies (ARTs). Eight adult female vervets were hormonally monitored for their potential use as egg donors, and the six females displaying regular menstrual cycles were subjected to controlled ovarian stimulation with recombinant human gonadotropins. Three animals failed to respond, while laparoscopic follicular aspiration was performed on the other three females 27-30 h after human chorionic gonadotropin administration. A total of 62, 40, and 18 oocytes were recovered from these three animals, of which 30, 20, and 4, respectively, matured to the metaphase II stage and were subsequently inseminated using intracytoplasmic sperm injection. An average of 40 ± 15% (SEM) of the inseminated oocytes were fertilized based on pronucleus formation and timely cleavage. One embryo from each of two stimulated females developed into an expanded blastocyst. Two adult male vervets were assessed as sperm donors; neither adjusted well to the restraint and collection procedure required for penile electroejaculation. Samples collected via rectal electroejaculation were very low in sperm motility and concentration; however, cauda epididymal aspirations from one male yielded an adequate concentration of motile sperm. These results emphasize the need to establish species-specific ovarian stimulation protocols and semen collection techniques if vervets are to be considered for basic and applied (ART) research on primate gametes or embryos.


Subject(s)
Cercopithecinae , Models, Animal , Sperm Injections, Intracytoplasmic , Animals , Blastocyst/cytology , Ejaculation , Embryo Transfer , Embryonic Development , Female , Male , Ovarian Follicle/diagnostic imaging , Ovulation Induction , Species Specificity , Sperm Motility , Ultrasonography
18.
Addiction ; 101(9): 1313-22, 2006 Sep.
Article in English | MEDLINE | ID: mdl-16911731

ABSTRACT

AIM: To evaluate the effects of prenatal marijuana exposure (PME) on the age of onset and frequency of marijuana use among 14-year-olds, while controlling for identified confounds of early marijuana use. DESIGN: In this longitudinal cohort study, women were recruited in their fourth prenatal month, and women and children were followed throughout pregnancy and at multiple time-points into adolescence. SETTING AND PARTICIPANTS: Recruitment was from a hospital-based prenatal clinic. The women ranged in age from 18 to 42 years, half were African American and half Caucasian, and most were of lower socio-economic status. The women were generally light to moderate substance users during pregnancy and subsequently. At 14 years, 580 of the 763 offspring-mother pairs (76%) were assessed, and 563 pairs (74%) were included in this analysis. MEASUREMENTS: Socio-demographic, environmental, psychological, behavioral, biological and developmental factors were assessed. Outcomes were age of onset and frequency of marijuana use at age 14. FINDINGS: PME predicted age of onset and frequency of marijuana use among the 14-year-old offspring. This finding remained significant after controlling for other variables, including the child's current alcohol and tobacco use, pubertal stage, sexual activity, delinquency, peer drug use, family history of drug abuse, and characteristics of the home environment, including parental depression, current drug use, and strictness/supervision. CONCLUSIONS: Prenatal exposure to marijuana, in addition to other factors, is a significant predictor of marijuana use at age 14.


Subject(s)
Marijuana Smoking/adverse effects , Prenatal Exposure Delayed Effects , Adolescent , Adult , Black or African American , Age of Onset , Child , Female , Follow-Up Studies , Humans , Marijuana Smoking/epidemiology , Pregnancy , Prevalence , United States/epidemiology