Results 1 - 20 of 56
1.
Pharmacol Res Perspect ; 9(4): e00823, 2021 08.
Article in English | MEDLINE | ID: mdl-34339112

ABSTRACT

Many Americans take multiple medications simultaneously (polypharmacy). Polypharmacy's effects on mortality are uncertain. We endeavored to assess the association between polypharmacy and mortality in a large U.S. cohort and examine potential effect modification by chronic kidney disease (CKD) status. The REasons for Geographic And Racial Differences in Stroke cohort data (n = 29,627, comprising U.S. black and white adults) were used. During a baseline home visit, pill bottle inspections ascertained medications used in the previous 2 weeks. Polypharmacy status (major [≥8 ingredients], minor [6-7 ingredients], and none [0-5 ingredients]) was determined by counting the total number of generic ingredients. Cox models (time-on-study and age-time-scale methods) assessed the association between polypharmacy and mortality. Alternative models examined confounding by indication and possible effect modification by CKD. Over 4.9 years median follow-up, 2538 deaths were observed. Major polypharmacy was associated with increased mortality in all models, with hazard ratios and 95% confidence intervals ranging from 1.22 (1.07-1.40) to 2.35 (2.15-2.56), with weaker associations in more adjusted models. Minor polypharmacy was associated with mortality in some, but not all, models. The polypharmacy-mortality association did not differ by CKD status. While residual confounding by indication cannot be excluded, in this large American cohort, major polypharmacy was consistently associated with mortality.
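The polypharmacy categorization described in this abstract (major ≥8, minor 6-7, none 0-5 generic ingredients) can be sketched as a simple classifier. This is an illustration of the reported cutoffs only; the function name and structure are not from the study's code.

```python
# Hedged sketch of the ingredient-count cutoffs reported in the abstract:
# major polypharmacy >= 8 ingredients, minor 6-7, none 0-5.
# The function name is illustrative, not taken from the study.

def polypharmacy_status(n_ingredients: int) -> str:
    """Classify a participant's polypharmacy status by generic-ingredient count."""
    if n_ingredients < 0:
        raise ValueError("ingredient count cannot be negative")
    if n_ingredients >= 8:
        return "major"
    if n_ingredients >= 6:
        return "minor"
    return "none"

print(polypharmacy_status(9))  # major
print(polypharmacy_status(5))  # none
```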


Subject(s)
Polypharmacy , Renal Insufficiency, Chronic/drug therapy , Renal Insufficiency, Chronic/mortality , Aged , Aged, 80 and over , Black People , Cohort Studies , Female , Humans , Male , Middle Aged , Renal Insufficiency, Chronic/ethnology , United States/epidemiology , United States/ethnology , White People
2.
Infect Control Hosp Epidemiol ; 40(6): 639-648, 2019 06.
Article in English | MEDLINE | ID: mdl-30963987

ABSTRACT

OBJECTIVE: To compare risk of surgical site infection (SSI) following cesarean delivery between women covered by Medicaid and private health insurance. STUDY DESIGN: Retrospective cohort. STUDY POPULATION: Cesarean deliveries covered by Medicaid or private insurance and reported to the National Healthcare Safety Network (NHSN) and state inpatient discharge databases by hospitals in California (2011-2013). METHODS: Deliveries reported to NHSN and state inpatient discharge databases were linked to identify SSIs in the 30 days following cesarean delivery, primary payer, and patient and procedure characteristics. Additional hospital-level characteristics were obtained from public databases. Relative risk of SSI by primary payer was assessed using multivariable logistic regression adjusting for patient, procedure, and hospital characteristics, accounting for facility-level clustering. RESULTS: Of 291,757 cesarean deliveries included, 48% were covered by Medicaid. SSIs were detected following 1,055 deliveries covered by Medicaid (0.75%) and 955 deliveries covered by private insurance (0.63%) (unadjusted odds ratio, 1.2; 95% confidence interval [CI], 1.1-1.3; P < .0001). The adjusted odds of SSI following cesarean deliveries covered by Medicaid were 1.4 (95% CI, 1.2-1.6; P < .0001) times the odds of those covered by private insurance. CONCLUSIONS: In this, the largest and only multicenter study to investigate SSI risk following cesarean delivery by primary payer, Medicaid-insured women had a higher risk of infection than privately insured women. These findings suggest the need to evaluate and better characterize the quality of maternal healthcare for, and the needs of, women covered by Medicaid to inform targeted infection prevention and policy.
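The unadjusted odds ratio reported in this abstract can be approximately reconstructed from the published counts. Note the group denominators below are assumptions derived from "48% of 291,757 deliveries were covered by Medicaid"; the exact per-payer totals are not reported.

```python
# Hedged sketch: rebuild the unadjusted OR and a Wald 95% CI from the
# abstract's counts. Denominators are approximated from the 48% figure.
import math

total = 291_757
n_medicaid = round(total * 0.48)   # ~140,043 deliveries (approximation)
n_private = total - n_medicaid     # ~151,714 deliveries
ssi_medicaid, ssi_private = 1_055, 955

odds_medicaid = ssi_medicaid / (n_medicaid - ssi_medicaid)
odds_private = ssi_private / (n_private - ssi_private)
odds_ratio = odds_medicaid / odds_private

# Wald interval on the log-odds-ratio scale:
se_log_or = math.sqrt(1 / ssi_medicaid + 1 / (n_medicaid - ssi_medicaid)
                      + 1 / ssi_private + 1 / (n_private - ssi_private))
lo = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
hi = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)

print(round(odds_ratio, 1), round(lo, 1), round(hi, 1))  # 1.2 1.1 1.3
```

The result matches the abstract's unadjusted OR of 1.2 (95% CI, 1.1-1.3), which suggests the approximation of the denominators is close.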


Subject(s)
Cesarean Section/adverse effects , Insurance, Health/statistics & numerical data , Medicaid/statistics & numerical data , Surgical Wound Infection/epidemiology , Adolescent , Adult , California/epidemiology , Cesarean Section/economics , Cesarean Section/statistics & numerical data , Child , Female , Hospitals , Humans , Logistic Models , Multivariate Analysis , Pregnancy , Private Sector , Retrospective Studies , Risk Factors , United States , Young Adult
3.
BMC Nutr ; 5: 7, 2019.
Article in English | MEDLINE | ID: mdl-32153921

ABSTRACT

BACKGROUND: Initiation of complementary feeding is often delayed in Bangladesh and likely contributes to the high burden of infant undernutrition in the country. METHODS: Pregnant women at 28-32 weeks' gestation were recruited for a cohort-based evaluation of a community-based nutrition education program. To identify predictors of the timing of introduction of solid/semi-solid/soft foods (complementary feeding initiation), we prospectively interviewed 2078 women (1042 from the intervention area, 1036 from the control area) at the time of recruitment and at child ages 3 and 9 months. Maternal knowledge of and attitudes towards complementary feeding, and the nutritional importance and cost of complementary foods, were assessed at child age 3 months. Two scales were created from the sum of correct responses. Tertiles were created for analysis (Knowledge: 0-7, 8-9, 10-15; Attitudes: 18-25, 26, 27-34). Infant age at complementary feeding initiation was characterized as early (≤4 months), timely (5-6 months), or late (≥7 months), based on maternal recall at child age 9 months. We used stratified polytomous logistic regression, adjusted for socioeconomic status, infant gender, maternal age, literacy, and parity, to identify predictors of early or late vs. timely complementary feeding initiation. RESULTS: Complementary feeding initiation was early for 7%, timely for 49%, and late for 44% of infants. Only 19% of mothers knew the WHO-recommended age for complementary feeding initiation. The knowledge score was not associated with timely complementary feeding initiation. Mothers with the most favorable attitudes (highest attitudes score tertile) were more likely to initiate complementary feeding late compared to those in the lowest attitudes score tertile (adjusted OR = 2.2, 95% CI: 1.1-4.4). CONCLUSION: Late introduction of complementary foods is still widely prevalent in Bangladesh. 
Improved maternal knowledge or favorable attitudes towards complementary feeding were not associated with timely introduction of complementary foods, indicating other factors likely determine timing of complementary feeding initiation. This presents an avenue for future research.

4.
J Nutr ; 147(5): 948-954, 2017 05.
Article in English | MEDLINE | ID: mdl-28298543

ABSTRACT

Background: Childhood undernutrition is a major public health problem in Bangladesh. Evaluating child nutrition programs is a priority. Objective: The objective of this study was to evaluate a community-based nutrition education program (implemented from 2011 to 2013) aimed at improving infant and young child feeding (IYCF) practices and growth in rural Bangladesh. Methods: A cohort-based evaluation was conducted that included 2400 women (1200 from Karimganj, the intervention subdistrict, and 1200 from Katiadi, the control subdistrict) enrolled at 28-31 wk gestation in 3 waves between January and October 2011. Follow-up occurred at 3, 9, 16, and 24 mo of offspring age. The main outcomes were exclusive breastfeeding (EBF), measured at 3 mo; timing of complementary feeding (CF) initiation and minimum acceptable diet (MAD), measured at 9 mo; and child growth [assessed via length-for-age z score (LAZ) and weight-for-length z score], measured at all follow-ups. The main exposures were subdistrict of residence and wave of enrollment. Logistic regressions were used for IYCF practices as outcomes. Generalized estimating equations were used for child growth as outcome. Results: EBF rates at 3 mo remained unchanged between waves 1 and 3 in Karimganj (55.6% compared with 57.3%), but the proportion of infants receiving timely CF initiation and MAD at 9 mo increased significantly (CF: 27.1-54.7%; MAD: 8.4-35.3%). Mean LAZ at 24 mo remained unchanged between waves 1 and 3 in Karimganj (-2.18 compared with -1.98). Conclusions: The program was successful in improving the quality of infant diet at 9 mo and timely CF initiation, but not EBF at 3 mo or LAZ. These findings support the case for implementing simple messages in all programs aimed at improving infant diet, especially in settings in which supplementing the overall household diet may not be feasible.


Subject(s)
Breast Feeding , Diet , Health Education , Health Promotion , Infant Nutritional Physiological Phenomena , Nutritional Status , Rural Population , Adult , Bangladesh , Body Height , Child Nutrition Disorders/prevention & control , Child Nutritional Physiological Phenomena , Child, Preschool , Feeding Behavior , Humans , Infant , Malnutrition/prevention & control , Mothers , Program Evaluation , Weight Gain , Young Adult
5.
PLoS One ; 11(10): e0165128, 2016.
Article in English | MEDLINE | ID: mdl-27776161

ABSTRACT

The association between suboptimal infant feeding practices and growth faltering is well established. However, most of this evidence comes from cross-sectional studies. To prospectively assess the association between suboptimal infant feeding practices and growth faltering, we interviewed pregnant women at 28-32 weeks' gestation and followed up their offspring at postnatal months 3, 9, 16, and 24 in rural Bangladesh. Using maternal recall over the past 24 hours, exclusive breastfeeding (EBF) status at 3 months, age at complementary feeding (CF) initiation, and receipt of minimum acceptable diet (MAD; as defined by WHO) at 9 months were assessed. Infant length and weight measurements were used to produce length-for-age (LAZ) and weight-for-length (WLZ) z-scores at each follow-up. Generalized estimating equations were used to estimate associations of LAZ and WLZ with infant feeding practices. All models were adjusted for baseline SES, infant sex, maternal height, age, literacy, and parity. Follow-up was completed by 2189, 2074, 1969, and 1885 mother-child dyads at 3, 9, 16, and 24 months, respectively. Stunting prevalence increased from 28% to 57% between infant ages 3 and 24 months. EBF at 3 months and age at CF initiation were not associated with linear infant growth, but receipt of MAD at 9 months was. By age 24 months, infants receiving MAD had attained a higher LAZ compared to infants who did not receive MAD (adjusted β = 0.25, 95% CI: 0.13-0.37). Although the prevalence of stunting was already high at age 3 months, ensuring infants receive a diverse, high-quality diet from 6 months onwards may reduce rates of stunting in the second year of life.
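The LAZ and WLZ outcomes used in this abstract express a child's measurement in SD units relative to a growth-standard reference. A minimal sketch follows; note the WHO standard actually uses an LMS transformation, so the simple (x - median) / SD form is an approximation, and the reference values below are hypothetical, not WHO figures.

```python
# Hedged sketch: a z-score expresses a measurement in SD units relative to
# a reference median for the child's age and sex. The WHO growth standard
# uses an LMS transformation; this simple form is an approximation, and the
# reference numbers below are hypothetical.

def z_score(measurement: float, ref_median: float, ref_sd: float) -> float:
    """Measurement in SD units relative to a reference median."""
    return (measurement - ref_median) / ref_sd

# Hypothetical example: a child measuring 81.5 cm against an assumed
# reference median of 86.0 cm and SD of 3.0 cm.
laz = z_score(81.5, 86.0, 3.0)
print(round(laz, 2))  # -1.5 (stunting is conventionally LAZ < -2)
```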


Subject(s)
Breast Feeding , Diet , Growth , Rural Population , Adult , Bangladesh , Female , Humans , Infant , Male , Social Class , Young Adult
6.
Matern Child Health J ; 20(8): 1598-606, 2016 08.
Article in English | MEDLINE | ID: mdl-26994608

ABSTRACT

Objective Evaluate variation in fruit and vegetable intake by Special Supplemental Nutrition Program for Women, Infants, and Children (WIC) participation and poverty status among pregnant and postpartum women participating in the Infant Feeding Practices Study II (IFPSII). Methods IFPSII (2005-2007) followed US women from the third trimester through 1 year postpartum using mailed questionnaires measuring income, WIC participation, and breastfeeding, and dietary history questionnaires (DHQ) assessing prenatal/postnatal fruit and vegetable consumption. Poverty measurements used U.S. Census Bureau federal poverty thresholds to calculate the percent of poverty index ratio (PIR) corresponding to WIC's financial eligibility (≤185% PIR). Comparison groups: WIC recipients; women WIC eligible (≤185% PIR) but not receiving WIC; and women not financially WIC eligible (>185% PIR). IFPSII participants who completed at least one DHQ were included. Intake variation among WIC/poverty groups was assessed by Kruskal-Wallis tests, and between groups by Mann-Whitney-Wilcoxon tests and logistic regression. Mann-Whitney-Wilcoxon tests examined postnatal intake by breastfeeding status. Results Prenatal vegetable intake varied significantly by WIC/poverty group (p = 0.04), with WIC recipients reporting significantly higher intake than women not financially WIC eligible (p = 0.02); the association remained significant after adjusting for confounders [odds ratio 0.66 (95% confidence interval: 0.49-0.90)]. Prenatal fruit and postnatal consumption did not significantly differ by WIC/poverty group. Postnatal intake was significantly higher among breastfeeding than non-breastfeeding women (fruit: p < 0.0001; vegetable: p = 0.006). Conclusions for Practice Most intakes did not significantly differ by WIC/poverty group, prompting research on WIC recipients' dietary behaviors, reasons for non-participation in WIC, and the influence of recent changes to the WIC food package.


Subject(s)
Food Assistance , Fruit , Poverty , Vegetables , Adult , Breast Feeding/statistics & numerical data , Female , Food Supply , Humans , Postpartum Period , Pregnancy , Pregnant Women , Surveys and Questionnaires , United States , Young Adult
7.
Infect Control Hosp Epidemiol ; 37(7): 863-6, 2016 07.
Article in English | MEDLINE | ID: mdl-26868605

ABSTRACT

Among dialysis facilities participating in a bloodstream infection (BSI) prevention collaborative, access-related BSI incidence rate improvements observed immediately following implementation of a bundle of BSI prevention interventions were sustained for up to 4 years. Overall, BSI incidence remained unchanged from baseline in the current analysis.


Subject(s)
Cross Infection/prevention & control , Renal Dialysis/adverse effects , Sepsis/prevention & control , Ambulatory Care/methods , Ambulatory Care/statistics & numerical data , Cross Infection/epidemiology , Humans , Interinstitutional Relations , Patient Care Bundles , Renal Dialysis/methods , Sepsis/epidemiology
8.
Public Health Nutr ; 19(10): 1875-81, 2016 07.
Article in English | MEDLINE | ID: mdl-26563771

ABSTRACT

OBJECTIVE: To determine the association between household food security and infant complementary feeding practices in rural Bangladesh. DESIGN: Prospective cohort study using structured home interviews during pregnancy and 3 and 9 months after delivery. We used two indicators of household food security (HHFS) at the 3-month follow-up: maternal Food Consumption Score (FCS), calculated via the World Food Programme method, and an HHFS index created from an eleven-item food security questionnaire. Infant feeding practices were characterized using WHO definitions. SETTING: Two rural sub-districts of Kishoreganj, Bangladesh. SUBJECTS: Mother-child dyads (n 2073) who completed the 9-month follow-up. RESULTS: Complementary feeding was initiated at age ≤4 months for 7%, at 5-6 months for 49%, and at ≥7 months for 44% of infants. Based on 24 h dietary recall, 98% of infants were still breast-feeding at age 9 months, and 16% received ≥4 food groups and ≥4 meals (minimally acceptable diet) in addition to breast milk. Mothers' diets were more diverse than infants'. The odds of receiving a minimally acceptable diet for infants living in the most food-secure households were three times those for infants living in the least food-secure households (adjusted OR = 3.0; 95% CI 2.1, 4.3). Socio-economic status, maternal age, literacy, parity, and infant sex were not associated with infant diet. CONCLUSIONS: HHFS and maternal FCS were significant predictors of subsequent infant feeding practices. Nevertheless, even the more food-secure households had poor infant diets. Interventions aimed at improving infant nutritional status need to focus on both complementary food provision and education.


Subject(s)
Feeding Behavior , Food Supply , Infant Nutritional Physiological Phenomena , Bangladesh , Breast Feeding , Female , Humans , Infant , Infant Food , Mothers , Pregnancy , Prospective Studies
9.
J Health Care Poor Underserved ; 26(4): 1304-18, 2015 Nov.
Article in English | MEDLINE | ID: mdl-26548680

ABSTRACT

The food environment is described by two measures: store-level (actual) and individual-level (perceived). Understanding the relationship between actual and perceived fruit and vegetable (F&V) nutrition environments is important, as their association may influence F&V purchases and consumption. The study objective was to assess agreement between perceived and actual environment measures of availability, quality, and affordability/price for fresh and canned/frozen F&V. African American WIC recipients (n=84) self-reported perceptions corresponding to chain food stores (n=13), which were then assessed by surveyors. Nearly 80% of participants had positive perceptions of stores' F&V availability, quality, and affordability. Store assessments indicated high F&V availability and quality, with the lowest prices for canned varieties. Kappa statistics, sensitivity, and specificity were used to calculate agreement between perceived and actual measures. Results indicated slight to fair agreement. Agreement was highest for quality measures (kappa=0.25 (95% CI: 0.08-0.42), p=.008). Research implications include promoting nutrition education and resident interviewing to understand F&V expectations.
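The kappa statistic used in this abstract measures agreement between perceived and store-assessed ratings beyond what chance would produce. A minimal from-scratch sketch for a 2x2 agreement table follows; the example counts are hypothetical, not the study's data.

```python
# Hedged sketch: Cohen's kappa for two binary raters (perceived vs. actual)
# from a 2x2 agreement table. The counts in the example are hypothetical.

def cohens_kappa(a: int, b: int, c: int, d: int) -> float:
    """Kappa from a 2x2 table:
    a = both positive, b = perceived+/actual-,
    c = perceived-/actual+, d = both negative."""
    n = a + b + c + d
    p_observed = (a + d) / n
    # Expected agreement under chance, from the marginal proportions:
    p_both_yes = ((a + b) / n) * ((a + c) / n)
    p_both_no = ((c + d) / n) * ((b + d) / n)
    p_expected = p_both_yes + p_both_no
    return (p_observed - p_expected) / (1 - p_expected)

# Perfect agreement gives kappa = 1.0:
print(cohens_kappa(42, 0, 0, 42))  # 1.0
# A hypothetical table with optimistic perceptions but modest agreement:
print(round(cohens_kappa(40, 25, 10, 9), 2))
```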


Subject(s)
Black or African American/psychology , Food Supply/statistics & numerical data , Fruit , Poverty/ethnology , Urban Population , Vegetables , Adolescent , Adult , Black or African American/statistics & numerical data , Commerce/statistics & numerical data , Environment , Female , Food/economics , Food/standards , Humans , Reproducibility of Results , Self Report , Urban Population/statistics & numerical data , Young Adult
10.
PLoS One ; 10(10): e0141047, 2015.
Article in English | MEDLINE | ID: mdl-26492462

ABSTRACT

INTRODUCTION: Episodes of acute adenolymphangitis (ADL) are often the first clinical sign of lymphatic filariasis (LF). They are often accompanied by swelling of the affected limb, inflammation, fever, and general malaise, and lead to the progression of lymphedema. Although ADL episodes have been studied for a century or more, questions still remain as to their etiology. We quantified antibody levels to pathogens that potentially contribute to ADL episodes during and after an episode among lymphedema patients in Léogâne, Haiti, and estimated the proportion of ADL episodes hypothesized to be attributable to specific pathogens. METHODS: We measured antibody levels to specific pathogens during and following an ADL episode among 41 lymphedema patients enrolled in a cohort study in Léogâne, Haiti. We calculated the absolute and relative changes in antibody levels between the ADL and convalescent time points. We calculated the proportion of episodes that demonstrated a two-fold increase in antibody level for several bacterial, fungal, and filarial pathogens. RESULTS: Our results showed the greatest proportion of two-fold changes in antibody levels for the Streptococcus group A carbohydrate antigen, followed by IgG2 responses to a soluble filarial antigen (BpG2), Streptococcal Pyrogenic Exotoxin B, and an antigen for the fungal pathogen Candida. When comparing the median antibody level during the ADL episode to the median antibody level at the convalescent time point, only the antigens for Pseudomonas species (P-value = 0.0351) and Streptolysin O (P-value = 0.0074) showed a significant result. CONCLUSION: Although our results are limited by the lack of a control group and the small number of antibody responses measured, they provide some evidence for infection with group A Streptococcus as a potential contributing factor to ADL episodes. 
Our results add to the current evidence and illustrate the importance of determining the causal role of bacterial and fungal pathogens and immunological antifilarial response in ADL episodes.


Subject(s)
Antibodies, Bacterial/blood , Antibodies, Fungal/blood , Antibodies, Protozoan/blood , Lymphangitis/etiology , Streptococcus pyogenes/immunology , Adult , Animals , Antibodies, Bacterial/immunology , Antibodies, Fungal/immunology , Antibodies, Protozoan/immunology , Cohort Studies , Elephantiasis, Filarial/etiology , Female , Haiti , Humans , Lymphangitis/blood , Lymphangitis/immunology , Lymphedema/blood , Lymphedema/immunology , Male , Wuchereria bancrofti/immunology
11.
Vaccine ; 33(48): 6865-70, 2015 Nov 27.
Article in English | MEDLINE | ID: mdl-26263200

ABSTRACT

Rotavirus vaccine was introduced in El Salvador in 2006 and is recommended to be given concomitantly with DTP-HepB-Haemophilus influenzae type b (pentavalent) vaccine at ages 2 months (upper age limit 15 weeks) and 4 months (upper age limit 8 months). However, rotavirus vaccination coverage continues to lag behind that of pentavalent vaccine, even in years when national rotavirus vaccine stock-outs have not occurred. We analyzed factors associated with receipt of oral rotavirus vaccine among children who received at least 2 doses of pentavalent vaccine in a stratified cluster survey of children aged 24-59 months conducted in El Salvador in 2011. Vaccine doses included were documented on vaccination cards (94.4%) or in health facility records (5.6%). Logistic regression and survival analysis were used to assess factors associated with vaccination status and age at vaccination. Receipt of pentavalent vaccine by age 15 weeks was associated with rotavirus vaccination (OR: 5.1; 95% CI 2.7, 9.4), and receipt of the second pentavalent dose by age 32 weeks was associated with receipt of two rotavirus vaccine doses (OR: 5.0; 95% CI 2.1-12.3). Timely coverage with the first pentavalent vaccine dose was 88.2% in the 2007 cohort and 91.1% in the 2008 cohort (p=0.04). Children born in 2009, when a four-month national rotavirus vaccine stock-out occurred, had an older median age at receipt of rotavirus vaccine and were less likely to receive rotavirus vaccine on the same date as the corresponding dose of pentavalent vaccine than children born in 2007 and 2008. Upper age limit recommendations for rotavirus vaccine administration contributed to suboptimal vaccination coverage. Survey data suggest that late rotavirus vaccination and co-administration with later doses of pentavalent vaccine among children born in 2009 helped increase rotavirus vaccine coverage following shortages.
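The upper-age-limit logic this abstract analyzes can be sketched as a simple eligibility check. The 32-week cutoff for dose 2 below mirrors the abstract's analysis variable and approximates the 8-month limit; names and structure are illustrative, not from the study.

```python
# Hedged sketch of the schedule cutoffs discussed in the abstract:
# dose 1 by 15 weeks of age; dose 2 by ~8 months, approximated here
# as 32 weeks to match the abstract's analysis variable.

FIRST_DOSE_LIMIT_WEEKS = 15
SECOND_DOSE_LIMIT_WEEKS = 32  # ~8 months (approximation)

def eligible_for_dose(age_weeks: int, dose: int) -> bool:
    """True if a child of the given age is within the upper age limit."""
    if dose == 1:
        return age_weeks <= FIRST_DOSE_LIMIT_WEEKS
    if dose == 2:
        return age_weeks <= SECOND_DOSE_LIMIT_WEEKS
    raise ValueError("the schedule in the abstract has two doses")

print(eligible_for_dose(16, 1))  # False: past the 15-week upper limit
```

This illustrates why a child who misses the narrow dose-1 window (e.g., during a stock-out) can fall permanently behind on rotavirus vaccine while remaining eligible for pentavalent vaccine.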


Subject(s)
Rotavirus Infections/epidemiology , Rotavirus Infections/prevention & control , Rotavirus Vaccines/administration & dosage , Child, Preschool , El Salvador/epidemiology , Female , Haemophilus influenzae , Humans , Immunization Schedule , Infant , Infant, Newborn , Male , Vaccination/statistics & numerical data
12.
Infect Control Hosp Epidemiol ; 36(8): 886-92, 2015 Aug.
Article in English | MEDLINE | ID: mdl-25990620

ABSTRACT

OBJECTIVE: To determine whether central line-associated bloodstream infections (CLABSIs) increase the likelihood of readmission. DESIGN: Retrospective matched cohort study for the years 2008-2009. SETTING: Acute care hospitals. PARTICIPANTS: Medicare recipients. CLABSI and readmission status were determined by linking National Healthcare Safety Network surveillance data to the Centers for Medicare and Medicaid Services' Medicare Provider Analysis and Review files in 8 states. Frequency matching was used on International Classification of Diseases, Ninth Revision, Clinical Modification procedure code category and intensive care unit status. METHODS: We compared the rate of readmission among patients with and without CLABSI during an index hospitalization. Cox proportional hazards analysis was used to assess the rate of readmission (the first hospitalization within 30 days after index discharge). Multivariate models included the following covariates: race, sex, length of index hospitalization stay, central line procedure code, Gagne comorbidity score, and individual chronic conditions. RESULTS: Of the 8,097 patients, 2,260 were readmitted within 30 days (27.9%). The rate of first readmission was 7.1 events per person-year for CLABSI patients and 4.3 events per person-year for non-CLABSI patients (P<.001). The final model revealed a small but significant increase in the rate of 30-day readmissions for patients with a CLABSI compared with similar non-CLABSI patients. In the first readmission for CLABSI patients, we also observed an increase in diagnostic categories consistent with CLABSI, including septicemia and complications of a device. CONCLUSIONS: Our analysis found a statistically significant association between CLABSI status and readmission, suggesting that CLABSI may have adverse health impacts that extend beyond hospital discharge.


Subject(s)
Catheter-Related Infections/epidemiology , Cross Infection/epidemiology , Patient Readmission/statistics & numerical data , Aged , Aged, 80 and over , Central Venous Catheters/adverse effects , Female , Humans , Length of Stay/statistics & numerical data , Male , Medicare/statistics & numerical data , Retrospective Studies , United States/epidemiology
13.
Water (Basel) ; 7(2): 818-832, 2015 Feb 13.
Article in English | MEDLINE | ID: mdl-25995956

ABSTRACT

The use of contaminated surface water continues to be a pressing issue in areas of the world where people lack improved drinking water sources. In northern coastal Ecuador, many communities rely on untreated surface water as their primary source of drinking water. We undertook a study to explore how microscale river hydrodynamics affect microbial water quality at community water collection locations at three rivers with varying stream velocity and turbidity profiles. To examine how distance from the river shore and physicochemical water quality variables affect microbial contamination levels in the rivers, we collected a total of 355 water samples within six villages on three rivers and tested for Escherichia coli concentrations using the IDEXX Quanti-Tray method. We found that log10 E. coli concentrations decreased with increasing distance from shore (β = -0.017; p = 0.003). Water in the main channel had E. coli concentrations on average 0.12 log10 lower than within eddies along the river shore, and 0.27 log10 lower between the sample closest to shore and any sample >6 m from the shore. Higher E. coli concentrations were also significantly associated with increased turbidity (β = 0.003; p < 0.0001) and decreased dissolved oxygen levels (β = -0.310; p < 0.0001). The results of this study can help inform community members about the safest locations to collect drinking water and also provide information on watershed-scale transport of microbial contaminants between villages.
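The distance coefficient reported in this abstract can be applied to predict the concentration difference between two collection points. This is illustrative arithmetic only; the fitted model also included turbidity and dissolved oxygen, which this sketch ignores.

```python
# Hedged sketch: apply the abstract's reported slope (-0.017 log10 E. coli
# per meter from shore) to two hypothetical collection distances. The full
# model also adjusted for turbidity and dissolved oxygen, omitted here.

SLOPE_LOG10_PER_M = -0.017

def predicted_log10_change(dist_from_m: float, dist_to_m: float) -> float:
    """Expected change in log10 E. coli when moving between two distances."""
    return SLOPE_LOG10_PER_M * (dist_to_m - dist_from_m)

# Moving a collection point from 1 m to 7 m from shore:
change = predicted_log10_change(1.0, 7.0)
print(round(change, 3))        # -0.102 log10 units
print(round(10 ** change, 2))  # 0.79, i.e. ~21% lower concentration
```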

14.
Ann Epidemiol ; 25(6): 433-438.e1, 2015 Jun.
Article in English | MEDLINE | ID: mdl-25908300

ABSTRACT

PURPOSE: Medications can have unintended effects. Populations with high medication use may benefit from increased regimen oversight. Limited knowledge exists concerning racial and regional variation in polypharmacy. We estimated total medication distributions (excluding supplements) of American black and white adults and assessed racial and regional polypharmacy variation. METHODS: REasons for Geographic And Racial Differences in Stroke (REGARDS) cohort data (n = 30,239 U.S. blacks and whites aged ≥45 years) were analyzed. Home pill bottle inspections assessed the last two weeks' medications. Polypharmacy (≥8 medications) was determined by summing prescription and/or over-the-counter ingredients. Population-weighted logistic regression assessed polypharmacy's association with census region, race, and sex. RESULTS: The mean ingredient number was 4.12 (standard error = 0.039), with 15.7% of REGARDS participants using 8 or more ingredients. In crude comparisons, women used more medications than men, and blacks and whites reported similar mean ingredient numbers. A cross-sectional logistic model adjusting for demographics, socioeconomics, and comorbidities showed increased polypharmacy prevalence in whites versus blacks (OR for blacks vs. whites [95% CI]: 0.63 [0.55-0.72]), in women (1.94 [1.68-2.23]), and in Southerners (broadly Southeasterners and Texans; 1.48 [1.17-1.87]) versus Northeasterners (broadly New England and the upper Mid-Atlantic). Possible limitations include polypharmacy misclassification and model misspecification. CONCLUSION: Polypharmacy is common. Race and geography are associated with polypharmacy variation. Further study of underlying factors explaining these differences is warranted.


Subject(s)
Black or African American/statistics & numerical data , Polypharmacy , White People/statistics & numerical data , Aged , Aged, 80 and over , Cohort Studies , Comorbidity , Cross-Sectional Studies , Female , Geography , Humans , Logistic Models , Male , Middle Aged , Multivariate Analysis , United States
15.
Clin J Am Soc Nephrol ; 10(7): 1162-9, 2015 Jul 07.
Article in English | MEDLINE | ID: mdl-25901090

ABSTRACT

BACKGROUND AND OBJECTIVES: Molecular evidence suggests that levels of vitamin D are associated with kidney function loss. Still, population-based studies are limited, and few have considered the potential confounding effect of baseline kidney function. This study evaluated the association of serum 25-hydroxyvitamin D with change in eGFR, rapid eGFR decline, and incidence of CKD and albuminuria. DESIGN, SETTING, PARTICIPANTS, & MEASUREMENTS: Baseline (2003-2006) and 5.5-year follow-up data from a Swiss adult general population were used to evaluate the association of serum 25-hydroxyvitamin D with change in eGFR, rapid eGFR decline (annual loss >3 ml/min per 1.73 m(2)), and incidence of CKD and albuminuria. Serum 25-hydroxyvitamin D was measured at baseline using liquid chromatography-tandem mass spectrometry. eGFR and albuminuria were collected at baseline and follow-up. Multivariate linear and logistic regression models were used, considering potential confounding factors. RESULTS: Among the 4280 people included in the analysis, the mean±SD annual eGFR change was -0.57±1.78 ml/min per 1.73 m(2), and 287 (6.7%) participants presented rapid eGFR decline. Before adjustment for baseline eGFR, baseline 25-hydroxyvitamin D level was associated with both mean annual eGFR change and risk of rapid eGFR decline, independently of baseline albuminuria. Once adjusted for baseline eGFR, the associations were no longer significant. For every 10 ng/ml higher baseline 25-hydroxyvitamin D, the adjusted mean annual eGFR change was -0.005 ml/min per 1.73 m(2) (95% confidence interval, -0.063 to 0.053; P=0.87) and the association with rapid eGFR decline was null (odds ratio, 0.93; 95% confidence interval, 0.79 to 1.08; P=0.33). Baseline 25-hydroxyvitamin D level was not associated with incidence of CKD or albuminuria. CONCLUSIONS: The association of 25-hydroxyvitamin D with eGFR decline is confounded by baseline eGFR. 
Sufficient 25-hydroxyvitamin D levels do not seem to protect from eGFR decline independently from baseline eGFR.
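The "rapid eGFR decline" outcome in this abstract (annual loss >3 ml/min per 1.73 m²) can be sketched as a simple flag. Function names and the example values are illustrative, not from the study.

```python
# Hedged sketch of the abstract's rapid-decline definition:
# annual eGFR loss greater than 3 ml/min per 1.73 m^2.

def annual_egfr_change(egfr_baseline: float, egfr_followup: float,
                       years: float) -> float:
    """Annualized eGFR change (negative = decline), ml/min per 1.73 m^2."""
    return (egfr_followup - egfr_baseline) / years

def is_rapid_decline(egfr_baseline: float, egfr_followup: float,
                     years: float) -> bool:
    """True if annual loss exceeds 3 ml/min per 1.73 m^2."""
    return annual_egfr_change(egfr_baseline, egfr_followup, years) < -3.0

# Hypothetical participant: eGFR falling from 90 to 70 over 5.5 years.
print(round(annual_egfr_change(90, 70, 5.5), 2))  # -3.64
print(is_rapid_decline(90, 70, 5.5))              # True
```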


Subject(s)
Albuminuria/epidemiology , Glomerular Filtration Rate , Kidney/physiopathology , Renal Insufficiency, Chronic/epidemiology , Vitamin D Deficiency/epidemiology , Vitamin D/analogs & derivatives , Adult , Aged , Albuminuria/diagnosis , Albuminuria/physiopathology , Biomarkers/blood , Chromatography, Liquid , Female , Follow-Up Studies , Humans , Incidence , Linear Models , Logistic Models , Male , Middle Aged , Multivariate Analysis , Population Surveillance , Prevalence , Prognosis , Protective Factors , Renal Insufficiency, Chronic/diagnosis , Renal Insufficiency, Chronic/physiopathology , Risk Assessment , Risk Factors , Switzerland/epidemiology , Tandem Mass Spectrometry , Time Factors , Vitamin D/blood , Vitamin D Deficiency/blood , Vitamin D Deficiency/diagnosis
16.
J Med Internet Res ; 16(11): e246, 2014 Nov 10.
Article in English | MEDLINE | ID: mdl-25386801

ABSTRACT

BACKGROUND: Men who have sex with men (MSM) are the most affected risk group in the United States' human immunodeficiency virus (HIV) epidemic. Sexual concurrency, the overlapping of partnerships in time, accelerates HIV transmission in populations and has been documented at high levels among MSM. However, concurrency is challenging to measure empirically, and variations in the assessment techniques used (primarily the date overlap and direct question approaches) and the outcomes derived from them have led to heterogeneity and questionable validity of estimates among MSM and other populations. OBJECTIVE: The aim was to evaluate a novel Web-based and interactive partnership-timing module designed for measuring concurrency among MSM, and to compare outcomes measured by the partnership-timing module to those of typical approaches in an online study of MSM. METHODS: In an online study of MSM aged ≥18 years, we assessed concurrency by using the direct question method and by gathering the dates of first and last sex, with enhanced programming logic, for each reported partner in the previous 6 months. From these methods, we computed multiple concurrency cumulative prevalence outcomes: direct question; day resolution / date overlap; and month resolution / date overlap, both including 1-month ties and excluding them. We additionally computed variants of the UNAIDS point prevalence outcome. The partnership-timing module was also administered. It uses an interactive month-resolution calendar to improve recall and follow-up questions to resolve temporal ambiguities, combining elements of the direct question and date overlap approaches. The agreement between the partnership-timing module and other concurrency outcomes was assessed with percent agreement, kappa statistic (κ), and matched odds ratios at the individual, dyad, and triad levels of analysis. 
RESULTS: Among 2737 MSM who completed the partnership section of the partnership-timing module, 41.07% (1124/2737) of individuals had concurrent partners in the previous 6 months. The partnership-timing module had the highest degree of agreement with the direct question. Agreement was lower with date overlap outcomes (agreement range 79%-81%, κ range .55-.59) and lowest with the UNAIDS outcome at 5 months before interview (65% agreement, κ=.14, 95% CI .12-.16). All agreements declined after excluding individuals with 1 sex partner (always classified as not engaging in concurrency), although the highest agreement was still observed with the direct question technique (81% agreement, κ=.59, 95% CI .55-.63). Similar patterns in agreement were observed with dyad- and triad-level outcomes. CONCLUSIONS: The partnership-timing module showed strong concurrency detection ability and agreement with previous measures. These levels of agreement were greater than others have reported among previous measures. The partnership-timing module may be well suited to quantifying concurrency among MSM at multiple levels of analysis.
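The date-overlap method and the agreement statistics described in this abstract can be sketched minimally. This is illustrative code, not the study's instrument; the function names, the adjacent-pair overlap check, and the binary-classification form of kappa are assumptions:

```python
from datetime import date

def has_concurrency(partnerships):
    """Date-overlap method: True if any two partnerships overlap in time.
    partnerships: list of (first_sex, last_sex) date pairs.
    After sorting by start date, any overlap implies an adjacent pair overlaps,
    so checking neighbors suffices."""
    spans = sorted(partnerships)
    return any(prev_end >= start
               for (_, prev_end), (start, _) in zip(spans, spans[1:]))

def cohen_kappa(a, b):
    """Chance-corrected agreement between two binary classifications
    (e.g., date-overlap vs. direct-question concurrency) of the same people."""
    n = len(a)
    po = sum(x == y for x, y in zip(a, b)) / n       # observed agreement
    pa, pb = sum(a) / n, sum(b) / n                  # prevalence under each method
    pe = pa * pb + (1 - pa) * (1 - pb)               # agreement expected by chance
    return (po - pe) / (1 - pe)
```

Ties at month resolution (last sex with one partner in the same month as first sex with another) would be handled by choosing `>=` versus `>` in the overlap test, mirroring the "including ties" and "excluding ties" outcomes above.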


Subject(s)
HIV Infections/transmission , Homosexuality, Male , Internet , Sexual Behavior , Sexual Partners , Adult , Epidemics , HIV Infections/epidemiology , Humans , Male , Prevalence , Risk Factors , Time Factors , United States/epidemiology
17.
PLoS Negl Trop Dis ; 8(9): e3140, 2014 Sep.
Article in English | MEDLINE | ID: mdl-25211334

ABSTRACT

BACKGROUND: Lymphedema management programs have been shown to decrease episodes of adenolymphangitis (ADLA), but their impact on lymphedema progression, and the role of program compliance, have not been thoroughly explored. Our objectives were to determine the rate of ADLA episodes and lymphedema progression over time for patients enrolled in a community-based lymphedema management program, and to explore the association of program compliance with ADLA episodes and with lymphedema progression. METHODOLOGY/PRINCIPAL FINDINGS: A lymphedema management program was implemented in Odisha State, India from 2007-2010 by the non-governmental organization Church's Auxiliary for Social Action, in consultation with the Centers for Disease Control and Prevention. A cohort of patients was followed over 24 months. The crude 30-day rate of ADLA episodes decreased from 0.35 episodes per person-month at baseline to 0.23 at 24 months. Over the study period, the percentage of patients who progressed to more severe lymphedema decreased (P-value = 0.0004), while the percentage whose lymphedema regressed increased (P-value < 0.0001). Overall compliance with lymphedema management, lagged one time point, appeared to have little to no association with the frequency of ADLA episodes among those without entry lesions (RR = 0.87; 95% CI: 0.69, 1.10) and was associated with an increased rate (RR = 1.44; 95% CI: 1.11, 1.86) among those with entry lesions. Lagged two time points, compliance was associated with a decreased rate of ADLA episodes among those with entry lesions (RR = 0.77; 95% CI: 0.59, 0.99) and somewhat associated among those without entry lesions (RR = 0.83; 95% CI: 0.64, 1.06). Compliance with soap use was associated with a decreased rate of ADLA episodes among those without inter-digital entry lesions. CONCLUSIONS/SIGNIFICANCE: These results indicate that a community-based lymphedema management program benefits lymphedema patients with respect to both ADLA episodes and lymphedema progression.
This is one of the first studies to demonstrate an association between program compliance and the rate of ADLA episodes.
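The crude per-person-month rate and the lagging of compliance described in this abstract reduce to simple operations. A hedged sketch (illustrative names only; the study's actual person-time accounting and regression models are not reproduced here):

```python
def episode_rate_per_30_days(episodes, follow_up_days):
    """Crude rate: total ADLA episodes per 30 person-days of follow-up.
    episodes and follow_up_days are parallel per-patient lists."""
    return sum(episodes) / (sum(follow_up_days) / 30)

def lag(values, k=1):
    """Shift a per-visit series back by k visits, so that compliance
    measured at visit t is paired with outcomes at visit t + k."""
    return [None] * k + values[:-k]
```

Lagging the exposure, as above, is what lets compliance at one time point be related to episode rates one or two visits later, rather than to concurrent outcomes.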


Subject(s)
Lymphangitis/epidemiology , Lymphangitis/therapy , Lymphedema/epidemiology , Lymphedema/therapy , Adult , Aged , Community Health Services , Disease Progression , Female , Humans , India/epidemiology , Male , Middle Aged , Patient Compliance
18.
J Ren Nutr ; 24(5): 303-12, 2014 Sep.
Article in English | MEDLINE | ID: mdl-25030223

ABSTRACT

OBJECTIVE: Vitamin D deficiency is frequent in the general population and might be even more prevalent among populations with kidney failure. We compared serum vitamin D levels, vitamin D insufficiency/deficiency status, and vitamin D level determinants in populations without chronic kidney disease (CKD) and with CKD not requiring renal dialysis. DESIGN AND METHODS: This was a cross-sectional, multicenter, population-based study conducted from 2010 to 2011. Participants were from 10 centers that represent the geographical and cultural diversity of the Swiss adult population (≥15 years old). INTERVENTION: CKD was defined using estimated glomerular filtration rate and 24-hour albuminuria. Serum vitamin D was measured by liquid chromatography-tandem mass spectrometry. Statistical procedures adapted for survey data were used. MAIN OUTCOME MEASURE: We compared 25-hydroxy-vitamin D (25(OH)D) levels and the prevalence of vitamin D insufficiency/deficiency (serum 25(OH)D < 30 ng/mL) in participants with and without CKD. We tested the interaction of CKD status with 6 a priori defined attributes (age, sex, body mass index, walking activity, serum albumin-corrected calcium, and altitude) on serum vitamin D level or insufficiency/deficiency status, taking into account potential confounders. RESULTS: Overall, 11.8% (135 of 1,145) of participants had CKD. The 25(OH)D adjusted means (95% confidence interval [CI]) were 23.1 (22.6-23.7) and 23.5 (21.7-25.3) ng/mL in participants without and with CKD, respectively (P = .70). Vitamin D insufficiency or deficiency was frequent among participants without and with CKD (75.3% [95% CI 69.3-81.5] and 69.1% [95% CI 53.9-86.1], P = .054). CKD status did not interact with major determinants of vitamin D (age, sex, BMI, walking minutes, serum albumin-corrected calcium, or altitude) in their effects on vitamin D status or levels.
CONCLUSION: Vitamin D concentration and insufficiency/deficiency status are similar in people with or without CKD not requiring renal dialysis.
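The outcome in this abstract is a single threshold on measured serum 25(OH)D, so the prevalence comparison reduces to a proportion below the cutoff in each group. A minimal sketch (illustrative only; the study's survey-weighted adjustment is not shown):

```python
def insufficiency_prevalence(levels_ng_ml, cutoff=30.0):
    """Share of participants with serum 25(OH)D below the cutoff (ng/mL);
    30 ng/mL is the insufficiency/deficiency threshold stated in the abstract."""
    flagged = [level < cutoff for level in levels_ng_ml]
    return sum(flagged) / len(flagged)
```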


Subject(s)
Nutritional Status , Renal Dialysis , Renal Insufficiency, Chronic/blood , Urine Specimen Collection/methods , Vitamin D Deficiency/epidemiology , Vitamin D/blood , Adolescent , Adult , Albuminuria/urine , Body Mass Index , Calcium/blood , Chromatography, Liquid , Creatinine/blood , Cross-Sectional Studies , Female , Glomerular Filtration Rate , Humans , Logistic Models , Male , Middle Aged , Multivariate Analysis , Prevalence , Renal Insufficiency, Chronic/complications , Serum Albumin/metabolism , Sunlight , Switzerland , Tandem Mass Spectrometry , Vitamin D/administration & dosage , Vitamin D Deficiency/blood , Vitamin D Deficiency/complications , Young Adult
19.
BMC Public Health ; 14: 209, 2014 Feb 28.
Article in English | MEDLINE | ID: mdl-24580732

ABSTRACT

BACKGROUND: Intimate partner violence (IPV) and coercion have been associated with negative health outcomes, including increased HIV risk behaviors, among men who have sex with men (MSM). This is the first study to describe the prevalence and factors associated with experiencing IPV or coercion among US MSM dyads using the actor-partner interdependence model (APIM), an analytic framework to describe interdependent outcomes within dyads. METHODS: Among MSM couples enrolled as dyads in an HIV prevention randomized controlled trial (RCT), two outcomes are examined in this cross-sectional analysis: 1) the actor experiencing physical or sexual IPV from the study partner in the past 3 months and 2) the actor feeling coerced to participate in the RCT by the study partner. Two multilevel APIM logistic regression models evaluated the association between each outcome and actor, partner, and dyad-level factors. RESULTS: Of 190 individuals (95 MSM couples), 14 reported experiencing physical or sexual IPV from their study partner in the past 3 months (7.3%) and 12 reported feeling coerced to participate in the RCT by their study partner (6.3%). Results of multivariate APIM analyses indicated that reporting experienced IPV was associated (p < 0.1) with non-Black/African American actor race, lower actor education, and lower partner education. Reporting experienced coercion was associated (p < 0.1) with younger actor age and lower partner education. CONCLUSIONS: These findings from an HIV prevention RCT for MSM show considerable levels of IPV experienced in the past 3 months and coercion to participate in the research study, indicating the need for screening tools and support services for these behaviors. The identification of factors associated with IPV and coercion demonstrates the importance of considering actor and partner effects, as well as dyadic-level effects, to improve development of screening tools and support services for these outcomes.


Subject(s)
HIV Infections/prevention & control , Interpersonal Relations , Sexual Partners , Spouse Abuse/prevention & control , Adult , Coercion , Cross-Sectional Studies , Georgia/epidemiology , Humans , Logistic Models , Male , Middle Aged , Patient Selection , Prevalence , Randomized Controlled Trials as Topic , Risk-Taking , Spouse Abuse/statistics & numerical data , Surveys and Questionnaires
20.
Am J Nephrol ; 39(1): 50-8, 2014.
Article in English | MEDLINE | ID: mdl-24434854

ABSTRACT

BACKGROUND: Receipt of nephrology care prior to end-stage renal disease (ESRD) is a strong predictor of decreased mortality and morbidity, and neighborhood poverty may influence access to care. Our objective was to examine whether neighborhood poverty is associated with lack of pre-ESRD care at dialysis facilities. METHODS: In a multi-level ecological study using geospatially linked 2007-2010 Dialysis Facility Report and 2006-2010 American Community Survey data, we examined whether high neighborhood poverty (≥20% of households in census tract living below poverty) was associated with dialysis facility-level lack of pre-ESRD care (percentage of patients with no nephrology care prior to dialysis start) in mixed-effects models, adjusting for facility and neighborhood confounders and allowing for neighborhood and regional random effects. RESULTS: Among the 5,184 facilities examined, 1,778 (34.3%) were located in a high-poverty area. Lack of pre-ESRD care was similar in poverty areas (30.8%) and other neighborhoods (29.6%). With adjustment, the absolute increase in percentage of patients at a facility with no pre-ESRD care associated with facility location in a poverty area versus other neighborhood was only 0.08% (95% CI -1.32, 1.47; p = 0.9). Potential effect modification by race and income inequality was detected. CONCLUSION: Despite previously reported detrimental effects of neighborhood poverty on health, facility neighborhood poverty was not associated with receipt of pre-ESRD care, suggesting no need to target interventions to increase access to pre-ESRD care at facilities in poorer geographic areas.
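The exposure definition in this abstract is a fixed threshold applied to census-tract data. A minimal sketch (function and field names are illustrative, not from the Dialysis Facility Report or ACS files):

```python
def is_high_poverty(households_below_poverty, total_households):
    """High-poverty census tract as defined above:
    >= 20% of households living below the poverty line."""
    return households_below_poverty / total_households >= 0.20

def share_of_high_poverty_facilities(facility_tracts):
    """Fraction of facilities located in high-poverty tracts, given
    a list of (households_below_poverty, total_households) per facility."""
    flags = [is_high_poverty(below, total) for below, total in facility_tracts]
    return sum(flags) / len(flags)
```

In the study itself this flag enters a mixed-effects model with facility- and neighborhood-level covariates; the sketch covers only the exposure classification step.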


Subject(s)
Kidney Failure, Chronic/epidemiology , Kidney Failure, Chronic/therapy , Poverty , Renal Dialysis/methods , Aged , Ambulatory Care Facilities , Female , Geography , Health Services Accessibility , Healthcare Disparities , Humans , Male , Middle Aged , Quality of Health Care , Residence Characteristics , United States