Results 1 - 20 of 26
1.
J Surg Orthop Adv ; 33(2): 61-67, 2024.
Article in English | MEDLINE | ID: mdl-38995058

ABSTRACT

Rural patients have poorer health indicators, including a higher risk of developing osteoarthritis. The objective of this study was to compare rural patients undergoing primary total joint arthroplasty (TJA) at rural hospitals with those undergoing primary TJA at urban hospitals with regard to demographics, comorbidities, and complications, and to determine the preferred location of care for rural patients. Data from the Healthcare Cost and Utilization Project National Inpatient Sample between 2016 and 2018 were analyzed. Demographics, comorbidities, inpatient complications, hospital length of stay, inpatient mortality, and discharge disposition were compared between rural patients who underwent TJA at rural hospitals and at urban hospitals. Rural patients undergoing primary TJA in rural hospitals were more likely to be women, to be treated in the South, to have Medicaid payer status, to have dementia, diabetes mellitus, lung disease, and postoperative pulmonary complications, and to have a longer hospital length of stay. These patients were also less likely to have baseline obesity, heart disease, kidney disease, liver disease, cancer, postoperative infection, and cardiovascular complications, and were less likely to be discharged home. Rural patients undergoing primary TJA tend to pursue surgery at their rural hospital when their comorbidity profile is manageable. They have their surgery performed in an urban setting when they have the means for travel and cost and when their comorbidity profile is more complicated, requiring more specialized care. Overall, rural patients are choosing to undergo primary TJA in urban hospitals rather than in their local rural hospitals. (Journal of Surgical Orthopaedic Advances 33(2):061-067, 2024).


Subject(s)
Arthroplasty, Replacement, Hip , Arthroplasty, Replacement, Knee , Length of Stay , Humans , Female , Male , Arthroplasty, Replacement, Knee/statistics & numerical data , Aged , Arthroplasty, Replacement, Hip/statistics & numerical data , Middle Aged , Length of Stay/statistics & numerical data , United States/epidemiology , Postoperative Complications/epidemiology , Hospitals, Urban/statistics & numerical data , Hospitals, Rural/statistics & numerical data , Comorbidity , Rural Population/statistics & numerical data
2.
Injury ; 54(11): 111036, 2023 Nov.
Article in English | MEDLINE | ID: mdl-37769424

ABSTRACT

INTRODUCTION: Nitinol continuous compression staples have shown clinical utility in the management of various orthopaedic injuries. While the literature is most robust in foot/ankle and spine surgery, the use of nitinol staples has also been documented in fixation of wrist, olecranon, patella, and pelvis fractures. METHODOLOGY: A narrative review was conducted by searching three online databases (PubMed, Web of Science, and Cochrane) for articles published between 2003 and 2023 using the terms "Nitinol" and "Staple". A total of 42 articles met the inclusion/exclusion criteria and were included in this review. REVIEW: Literature outside of foot/ankle and spine surgery is largely limited to biomechanical studies, case reports, and finite element analyses. The literature is summarized within this review by anatomic location, including foot/ankle, lower extremity, hand, upper extremity, spine, and pelvis. CONCLUSION: Existing literature demonstrates a diverse array of applications for nitinol continuous compression staples in both axial and appendicular orthopaedic care. Advantages of these implants include ease of application, the ability to capture small bony fragments, continuous compression across a fracture or arthrodesis, and full coaptation, which maximizes the surface area for healing and/or fusion.


Subject(s)
Fractures, Bone , Orthopedics , Humans , Alloys , Fractures, Bone/surgery , Arthrodesis
4.
Surgeon ; 21(5): e292-e300, 2023 Oct.
Article in English | MEDLINE | ID: mdl-37028955

ABSTRACT

INTRODUCTION: Evidence on the impact of autoimmune skin disorders on post-operative outcomes after TJA is conflicting, and existing studies are limited by small sample sizes. The purpose of this study was to analyze a range of common autoimmune skin disorders and identify whether an increased risk of post-operative complications exists after total joint arthroplasty. METHODS: Data were collected from the NIS database for patients diagnosed with an autoimmune skin disorder (psoriasis, lupus, scleroderma, atopic dermatitis) who underwent total hip arthroplasty (THA), total knee arthroplasty (TKA), or other TJA (shoulder, elbow, wrist, ankle) between 2016 and 2019. Demographic, social, and comorbidity data were collected. Multivariate regression analyses were performed to assess the independent influence of autoimmune skin disorder on each post-operative outcome, including implant infection, transfusion, revision, length of stay, cost, and mortality. RESULTS: Among 55,755 patients with autoimmune skin disease who underwent TJA, psoriasis was associated with an increased risk of periprosthetic joint infection following THA (odds ratio 2.44 [1.89-3.15]) and an increased risk of transfusion following TKA (odds ratio 1.33 [1.076-1.64]). Similar analyses were performed for systemic lupus erythematosus, atopic dermatitis, and scleroderma; however, no statistically significant associations were observed for any of the six collected post-operative outcomes. CONCLUSION: This study suggests that psoriasis is an independent risk factor for poorer post-operative outcomes following total joint arthroplasty; however, similar risk was not observed for other autoimmune skin disorders such as lupus, atopic dermatitis, or scleroderma.
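Odds ratios with 95% confidence intervals like those reported above (e.g., 2.44 [1.89-3.15]) come from a 2x2 table of events by exposure. A minimal sketch of the standard Wald method follows; the counts used in the example are hypothetical illustrations, not the study's data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI for a 2x2 table:
    a = exposed with event, b = exposed without event,
    c = unexposed with event, d = unexposed without event."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # standard error of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts for illustration only:
print(odds_ratio_ci(30, 470, 50, 1950))
```

Multivariate (adjusted) odds ratios such as those in the abstract additionally control for covariates via logistic regression; the sketch shows only the unadjusted calculation.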


Subject(s)
Arthroplasty, Replacement, Hip , Arthroplasty, Replacement, Knee , Dermatitis, Atopic , Psoriasis , Skin Diseases , Humans , Dermatitis, Atopic/complications , Dermatitis, Atopic/epidemiology , Arthroplasty, Replacement, Knee/adverse effects , Postoperative Complications/epidemiology , Postoperative Complications/etiology , Arthroplasty, Replacement, Hip/adverse effects , Risk Factors , Skin Diseases/complications , Psoriasis/complications , Retrospective Studies
5.
Ann Fam Med ; 21(2): 119-124, 2023.
Article in English | MEDLINE | ID: mdl-36973046

ABSTRACT

PURPOSE: We assessed low-dose computed tomography (LDCT) screening for lung cancer using a proactive patient education/recruitment program. METHODS: We identified patients aged 55-80 years from a family medicine group. In the retrospective phase (March-August 2019), patients were categorized as current/former/never smokers, and screening eligibility was determined. Patients who underwent LDCT in the past year, along with their outcomes, were documented. In the prospective phase (2020), patients in the same cohort who had not undergone LDCT were proactively contacted by a nurse navigator to discuss eligibility and prescreening. Eligible and willing patients were referred to their primary care physician. RESULTS: In the retrospective phase, of 451 current/former smokers, 184 (40.8%) were eligible for LDCT, 104 (23.1%) were ineligible, and 163 (36.1%) had an incomplete smoking history. Of those eligible, 34 (18.5%) had LDCT ordered. In the prospective phase, 189 (41.9%) were eligible for LDCT (150 [79.4%] of whom had no prior LDCT or diagnostic CT), 106 (23.5%) were ineligible, and 156 (34.6%) had an incomplete smoking history. The nurse navigator identified an additional 56/451 (12.4%) patients as eligible after contacting those with an incomplete smoking history. In total, 206 patients (45.7%) were eligible, an increase of 37.3% compared with the retrospective phase (150). Of these, 122 (59.2%) verbally agreed to screening, 94 (45.6%) met with their physician, and 42 (20.4%) were prescribed LDCT. CONCLUSIONS: A proactive education/recruitment model increased the number of patients eligible for LDCT by 37.3%. Of the eligible patients identified, 59.2% verbally agreed to pursue LDCT. It is essential to identify strategies that increase the delivery of LDCT screening among eligible and willing patients.
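The eligibility figures above reduce to simple arithmetic. A quick sketch using the counts as reported in the abstract:

```python
# Counts reported in the abstract (prospective phase, cohort of 451 current/former smokers)
cohort = 451
prior_eligible_no_ldct = 150   # eligible patients with no prior LDCT or diagnostic CT
newly_identified = 56          # identified by the nurse navigator after completing smoking histories

total_eligible = prior_eligible_no_ldct + newly_identified   # 206

eligible_rate = total_eligible / cohort                                        # share of cohort eligible
relative_increase = (total_eligible - prior_eligible_no_ldct) / prior_eligible_no_ldct

print(total_eligible, round(eligible_rate * 100, 1), round(relative_increase * 100, 1))
# -> 206 45.7 37.3
```

This reproduces the abstract's 45.7% eligibility and 37.3% relative increase, confirming that the 37.3% figure is computed against the 150-patient baseline.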


Subject(s)
Lung Neoplasms , Humans , Lung Neoplasms/diagnostic imaging , Smoking , Early Detection of Cancer/methods , Retrospective Studies , Prospective Studies , Family Practice , Mass Screening
6.
J Am Acad Orthop Surg ; 31(2): e107-e117, 2023 Jan 15.
Article in English | MEDLINE | ID: mdl-36580056

ABSTRACT

INTRODUCTION: Perioperative cefazolin is the first-line antibiotic recommended by the American Academy of Orthopaedic Surgeons (AAOS) guidelines for the prevention of periprosthetic joint infections (PJIs) in total joint arthroplasty. We aimed to analyze the clinical viability of giving patients with a documented penicillin allergy (PA) a perioperative full-strength cefazolin "test dose" under anesthesia. METHODS: This is a retrospective chart review of 2,451 total joint arthroplasties performed by a single high-volume arthroplasty surgeon over a 5-year period from January 2013 through December 2017. This surgeon routinely gave patients with a documented PA a full-strength cefazolin test dose while under anesthesia instead of administering a second-line antibiotic. The primary outcomes examined were allergic reaction and postoperative infection. RESULTS: Cefazolin was given to 87.1% of all patients (1,990) and 46.0% of patients with a PA (143). The overall rate of allergic reactions among all patients was 0.5% (11). Only one patient with a documented PA who received cefazolin had an allergic reaction; the reaction was not severe and did not require additional treatment. Of patients with no reported allergies who received cefazolin, 0.3% (6) had an allergic reaction. There was no statistically significant difference in the rate of allergic reaction between patients with and without a PA (P = 0.95). Patients receiving cefazolin had an overall PJI rate of 2.9% (57), versus a 5.5% PJI rate (16) in patients receiving antibiotics other than cefazolin, a statistically significant difference (P = 0.02). CONCLUSION: This study found that a full-strength test dose of cefazolin in patients with a documented PA is a feasible, safe, and effective way of increasing the rate of cefazolin administration and thus mitigating the risk of PJIs.


Subject(s)
Arthritis, Infectious , Arthroplasty, Replacement, Hip , Arthroplasty, Replacement, Knee , Drug Hypersensitivity , Hypersensitivity , Prosthesis-Related Infections , Humans , Cefazolin , Arthroplasty, Replacement, Knee/adverse effects , Retrospective Studies , Antibiotic Prophylaxis , Anti-Bacterial Agents , Penicillins/adverse effects , Drug Hypersensitivity/etiology , Drug Hypersensitivity/prevention & control , Arthritis, Infectious/etiology , Arthritis, Infectious/prevention & control , Hypersensitivity/etiology , Arthroplasty, Replacement, Hip/adverse effects , Prosthesis-Related Infections/etiology , Prosthesis-Related Infections/prevention & control , Prosthesis-Related Infections/drug therapy
7.
J Foot Ankle Surg ; 62(1): 191-196, 2023.
Article in English | MEDLINE | ID: mdl-36182644

ABSTRACT

The fragility index (FI) is a metric used to interpret the results of randomized controlled trials (RCTs); it describes the number of subjects whose outcome would need to be switched between event and non-event for a result to no longer be statistically significant. Studies analyzing the FI of RCTs in various orthopedic subspecialties have shown those RCTs to be largely underpowered and highly fragile. However, the FI has not been assessed in foot and ankle RCTs. The MEDLINE and Embase online databases were searched from 1/1/2011 through 11/19/2021 for RCTs involving foot and ankle conditions. The FI, the fragility quotient (FQ), and the difference between the FI and the number of subjects lost to follow-up were calculated. Spearman correlation was performed to determine the relationship between sample size and FI. Overall, 1262 studies were identified, of which 18 were included in the final analysis. The median sample size was 65 (interquartile range [IQR] 57-95.5), the median FI was 2 (IQR 1-2.5), and the median FQ was 0.026 (IQR 0.012-0.033). Ten of 15 (67%) studies with non-zero FI values had an FI less than the number of subjects lost to follow-up. There was a linear association between FI and sample size (R2 = 0.495, p = .031). This study demonstrates that RCTs in the field of foot and ankle surgery are highly fragile, similar to those in other orthopedic subspecialties.
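The fragility index lends itself to a short computational sketch. The following is a minimal Python implementation, assuming the conventional procedure (flip non-events to events in the lower-event arm until a two-sided Fisher exact test is no longer significant at p = 0.05); the trial counts in the example are hypothetical, not drawn from the reviewed studies:

```python
from math import comb

def fisher_p(a, b, c, d):
    """Two-sided Fisher exact p-value for the 2x2 table [[a, b], [c, d]]."""
    row1, row2, col1 = a + b, c + d, a + c
    denom = comb(row1 + row2, col1)
    def prob(x):  # hypergeometric probability of x events in row 1
        return comb(row1, x) * comb(row2, col1 - x) / denom
    p_obs = prob(a)
    lo, hi = max(0, col1 - row2), min(row1, col1)
    # Two-sided: sum all tables as or less probable than the observed one.
    return sum(prob(x) for x in range(lo, hi + 1) if prob(x) <= p_obs + 1e-12)

def fragility_index(e1, n1, e2, n2, alpha=0.05):
    """Number of non-event -> event flips in the lower-event arm needed
    to make the two-sided Fisher p-value cross alpha."""
    if fisher_p(e1, n1 - e1, e2, n2 - e2) >= alpha:
        return 0                    # result not significant to begin with
    if e1 > e2:                     # always flip in the arm with fewer events
        e1, n1, e2, n2 = e2, n2, e1, n1
    flips = 0
    while e1 < n1:
        e1 += 1
        flips += 1
        if fisher_p(e1, n1 - e1, e2, n2 - e2) >= alpha:
            return flips
    return flips

# Hypothetical trial: 0/20 vs. 10/20 events; four flips erase significance.
print(fragility_index(0, 20, 10, 20))  # -> 4
```

A median FI of 2 against a median sample size of 65, as reported above, means that changing the outcomes of just two subjects would typically overturn a trial's conclusion.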


Subject(s)
Ankle , Humans , Ankle/surgery , Randomized Controlled Trials as Topic , Sample Size , Databases, Factual
8.
Ann Thorac Surg ; 114(4): 1128-1134, 2022 10.
Article in English | MEDLINE | ID: mdl-35331700

ABSTRACT

BACKGROUND: The objective of this single-blind randomized study was to compare local infiltration of bupivacaine versus liposomal bupivacaine (LipoB) in narcotic-naïve patients undergoing minimally invasive lobectomy for early stage lung cancer. METHODS: Adult patients without previous lung surgery undergoing minimally invasive lobectomy (robotic or thoracoscopic) for early stage lung cancer were randomly assigned to bupivacaine (with epinephrine 0.25%, 1:200 000) or LipoB 1.3%. Pain level was documented using the visual analog scale, and narcotic pain medications were recorded as morphine equivalents. In-hospital treatment cost and pharmacy cost were compared. RESULTS: The study enrolled 50 patients (bupivacaine, 24; LipoB, 26). The mean age of patients was 66 years, 94% were non-Hispanic white, and 48% were male. There was no difference in baseline characteristics and comorbidities. Duration of surgery (105 vs 137 minutes, P = .152), chest tube duration (49 vs 55 hours, P = .126), and length of stay (2.45 vs 3.28 days, P = .326) were similar between treatments. In-hospital morphine equivalents were 42.7 mg vs 48 mg (P = .714), and the median pain score was 5.2 vs 4.75 (P = .602) for bupivacaine vs LipoB, respectively. There was no difference in narcotic use at 2 to 4 weeks (57.1% [12 of 21] vs 54.5% [12 of 22], P = 1.00) or at 6 months (5.9% [1 of 17] vs 9.5% [2 of 21], P = 1.00) after surgery. The overall cost ($20 252 vs $22 775, P = .225) was similar; however, the pharmacy cost for LipoB was higher ($1052 vs $596, P = .0001). CONCLUSIONS: In narcotic-naïve patients undergoing minimally invasive lobectomy, short-term narcotic use, postoperative pain scores, length of stay, and long-term narcotic use were similar between bupivacaine and LipoB.


Subject(s)
Bupivacaine , Lung Neoplasms , Adult , Aged , Anesthetics, Local , Epinephrine , Female , Humans , Liposomes , Lung Neoplasms/drug therapy , Lung Neoplasms/surgery , Male , Morphine Derivatives/therapeutic use , Narcotics/therapeutic use , Pain, Postoperative/drug therapy , Pain, Postoperative/prevention & control , Single-Blind Method
9.
Osteoporos Int ; 33(5): 1067-1078, 2022 May.
Article in English | MEDLINE | ID: mdl-34988626

ABSTRACT

This study examines differences in length of stay and total hospital charge by income quartile in hip fracture patients. Length of stay increased in lower income groups, while total charge demonstrated a U-shaped relationship, with the highest charges in the highest and lowest income quartiles. INTRODUCTION: Socioeconomic factors have an impact on outcomes in hip fracture patients. This study aims to determine whether there is a difference in hospital length of stay (LOS) and total hospital charge between income quartiles in hospitalized hip fracture patients. METHODS: National Inpatient Sample (NIS) data from 2016 to 2018 were used to determine differences in LOS, total charge, and other demographic/clinical outcomes by income quartile in patients hospitalized for hip fracture. Multivariate regressions were performed for both LOS and total hospital charge to determine variable impact and significance. RESULTS: A total of 860,045 hip fracture patients were included in this study: 222,625 in the lowest income quartile, 234,215 in the second, 215,270 in the third, and 190,395 in the highest. LOS decreased as income quartile increased. Total charge was highest in the highest quartile and lowest in the middle two quartiles. The comorbidities with the largest effect on both LOS and total charge were lung disease, kidney disease, and heart disease. Time to surgery post-admission also had a large effect on both outcomes of interest. CONCLUSION: The results demonstrate that income quartile affects both hospital LOS and total charge. This may be the result of differences in demographics and other clinical variables between quartiles and of increased comorbidities at lower income levels. The overall summation of these socioeconomic, demographic, and medical factors affecting patients at lower income levels may result in worse outcomes following hip fracture.


Subject(s)
Hip Fractures , Hospital Charges , Hip Fractures/epidemiology , Hip Fractures/surgery , Hospitals , Humans , Length of Stay , Retrospective Studies
10.
Thorac Surg Clin ; 31(2): 177-188, 2021 May.
Article in English | MEDLINE | ID: mdl-33926671

ABSTRACT

Lung volume reduction surgery (LVRS) patient selection guidelines are based on the National Emphysema Treatment Trial. Because of increased mortality and poor improvement in functional outcomes, patients with non-upper lobe emphysema and low baseline exercise capacity are considered poor candidates for LVRS. In well-selected patients with heterogeneous emphysema, LVRS has a durable long-term outcome at up to 5 years of follow-up, with 5-year survival rates ranging between 63% and 78%. LVRS appears to be a durable alternative for end-stage heterogeneous emphysema in patients not eligible for lung transplantation. Future studies will help identify eligible patients with homogeneous emphysema for LVRS.


Subject(s)
Life Expectancy , Lung/surgery , Pneumonectomy/adverse effects , Pneumonectomy/methods , Pulmonary Emphysema/surgery , Aged , Clinical Trials as Topic , Female , Guidelines as Topic , Humans , Kaplan-Meier Estimate , Lung Transplantation , Male , Middle Aged , Patient Selection , Pulmonary Emphysema/mortality , Pulmonary Emphysema/physiopathology , Survival Rate , Treatment Outcome
12.
Ann Thorac Surg ; 109(3): 902-906, 2020 03.
Article in English | MEDLINE | ID: mdl-31610165

ABSTRACT

BACKGROUND: Historically, pulmonary hypertension (PH) has been considered a contraindication for lung volume reduction surgery (LVRS). Newer studies have shown that LVRS can be successful in select emphysema patients with PH. METHODS: In-hospital and 1-year functional and quality of life (QOL) outcomes were studied in patients with PH post-LVRS. PH was defined as pulmonary artery pressure (PAP) exceeding 35 mm Hg by right heart catheterization (RHC) where available, or otherwise by echocardiogram. RESULTS: Of 124 patients who underwent LVRS, 56 (45%) had PH (mean PAP, 41 mm Hg); 48 had mild to moderate and 8 had severe PH. In-hospital outcomes were similar between patients with and without PH: hours of artificial ventilation (1.8 vs 0.06, P = .882), days in intensive care (4 vs 6, P = .263), prolonged air leak (12% vs 19%, P = .402), and days of hospital stay (13 vs 16, P = .072). Lung function improved significantly at the 1-year follow-up in patients with PH: forced expiratory volume in 1 second % predicted (26 vs 38, P = .001), forced vital capacity % predicted (62 vs 90, P = .001), residual volume % predicted (224 vs 174, P = .001), diffusion capacity of the lung for carbon monoxide % predicted (36 vs 43, P = .001), 6-minute walk distance (1104 vs 1232 feet, P = .001), and QOL utility scores (0.67 vs 0.77, P = .001). There were no differences in in-hospital, baseline, or follow-up functional and QOL outcomes between patients with and without PH. CONCLUSIONS: In this small, single-institution cohort, outcomes of patients undergoing LVRS for emphysema with PH were similar to those of patients without PH. LVRS may be a viable option for select emphysema patients with PH.


Subject(s)
Contraindications, Procedure , Hypertension, Pulmonary/complications , Pneumonectomy/adverse effects , Pulmonary Emphysema/surgery , Pulmonary Wedge Pressure/physiology , Aged , Female , Follow-Up Studies , Humans , Hypertension, Pulmonary/physiopathology , Male , Pulmonary Emphysema/complications , Pulmonary Emphysema/diagnosis , Retrospective Studies , Risk Factors , Tomography, X-Ray Computed , Vital Capacity
14.
Ann Thorac Surg ; 108(3): 866-872, 2019 09.
Article in English | MEDLINE | ID: mdl-31055037

ABSTRACT

BACKGROUND: Lung volume reduction surgery (LVRS) is the definitive treatment for patients with severe emphysema, yet there is still a need for long-term data on its outcomes. This study presents long-term longitudinal data on LVRS, including correlation of quality of life (QOL) with pulmonary function testing metrics, and includes additional analysis of patients with heterogeneous and homogeneous emphysema. METHODS: A retrospective analysis of data collected from patients undergoing LVRS over a 9-year period at a single center was performed (N = 93). Pulmonary function and 6-minute walk tests as well as QOL questionnaires were administered before and 1 year after surgery. Descriptive statistics were reported for clinical outcomes and QOL indices. Wilcoxon signed-rank tests were used to examine changes from baseline to the end of 1-year follow-up. Spearman correlation coefficients were used to evaluate relationships between clinical and QOL outcomes. RESULTS: At 1 year post surgery, mean forced vital capacity (46%, P ≤ .0001), forced expiratory volume (43%, P ≤ .0001), diffusing capacity of the lungs for carbon monoxide (16%, P ≤ .0001), and 6-minute walk distance (20%, P ≤ .0001) were increased from baseline, while residual volume decreased (23%, P ≤ .0001). There was a positive correlation between changes in QOL and changes in forced expiratory volume, forced vital capacity, and 6-minute walk distance. Patients with heterogeneous disease had greater improvements in forced expiratory volume, forced vital capacity, residual volume, and diffusing capacity of the lungs for carbon monoxide, and greater QOL gains, compared with patients with homogeneous disease. CONCLUSIONS: LVRS continues to be a valuable treatment option for patients with advanced emphysema, with reproducible improvements in clinical and QOL metrics. Careful patient selection and optimization prior to surgery are crucial to successful outcomes.
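The Spearman coefficient used above is simply the Pearson correlation computed on ranks (with ties sharing their average rank). A minimal self-contained sketch, using hypothetical paired changes rather than the study's data:

```python
def _ranks(xs):
    """Average ranks (1-based); tied values share the mean of their positions."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    ranks = [0.0] * len(xs)
    i = 0
    while i < len(xs):
        j = i
        while j + 1 < len(xs) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1          # mean of positions i..j, 1-based
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman_rho(x, y):
    """Spearman correlation = Pearson correlation of the rank vectors."""
    rx, ry = _ranks(x), _ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    vx = sum((a - mx) ** 2 for a in rx)
    vy = sum((b - my) ** 2 for b in ry)
    return cov / (vx * vy) ** 0.5

# Hypothetical paired 1-year changes (FEV1 vs. QOL); identical rank order gives rho = 1.
print(spearman_rho([5, 12, 9, 20, 15], [2, 10, 7, 18, 12]))  # -> 1.0
```

Because it operates on ranks, Spearman correlation captures any monotone relationship, which suits skewed clinical change scores better than Pearson's linear assumption.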


Subject(s)
Hospital Mortality , Pneumonectomy/methods , Pneumonectomy/psychology , Pulmonary Emphysema/surgery , Quality of Life , Academic Medical Centers , Aged , Female , Follow-Up Studies , Forced Expiratory Volume , Humans , Illinois , Length of Stay , Longitudinal Studies , Male , Middle Aged , Patient Selection , Pneumonectomy/mortality , Pulmonary Emphysema/diagnosis , Pulmonary Emphysema/mortality , Pulmonary Emphysema/psychology , Respiratory Function Tests , Retrospective Studies , Risk Assessment , Statistics, Nonparametric , Survival Analysis , Treatment Outcome
15.
Qual Life Res ; 28(7): 1885-1892, 2019 Jul.
Article in English | MEDLINE | ID: mdl-30707368

ABSTRACT

PURPOSE: Lung volume reduction surgery (LVRS) has been shown to improve not only lung function but also overall quality of life (QOL). The aim of this study was to compare two QOL questionnaires, the EuroQol Questionnaire (EQ-5D-3L) and the 36-item Short Form Health Survey (SF-36), in patients post-LVRS. METHODS: All patients undergoing LVRS for severe chronic obstructive pulmonary disease (COPD) at a single center of excellence were analyzed (n = 94). Baseline demographic and clinical outcomes were characterized. Both the EQ-5D-3L and SF-36 questionnaires were administered to all patients at baseline (n = 94) and at the end of 1 year (n = 89) post-surgery. SF-36 responses were converted to the Short Form six-dimensions (SF-6D) using the standard algorithm. Correlation, discrimination, responsiveness, and differences across the two questionnaires were examined. RESULTS: The mean age of patients enrolled in the cohort was 66 years. There were significant increases in forced expiratory volume (FEV1, 43%), forced vital capacity (FVC, 46%), diffusion capacity (DLCO, 15%), and 6-minute walk distance (6MWD, 21%), and a significant decrease in residual volume (RV, 23%), at the end of 1-year follow-up. The overall mean utility index improved significantly for both the SF-6D and EQ-5D-3L at the end of follow-up (p = 0.0001). However, the magnitude of percentage increase was higher with the EQ-5D-3L than with the SF-6D (32% vs. 13%). Stronger correlations between similar domains at the end of 1-year follow-up confirmed convergent validity. Both questionnaires failed to discriminate between different levels of disease severity post-LVRS in patients with severe COPD. CONCLUSIONS: Both questionnaires responded similarly in patients with COPD post-LVRS. Combining results from QOL questionnaires with symptoms of disease and history of exacerbation may be a possible way to identify disease severity in older, sicker patients unwilling or unable to come to the hospital for pulmonary function testing post-LVRS.


Subject(s)
Pneumonectomy/psychology , Pulmonary Disease, Chronic Obstructive/psychology , Quality of Life/psychology , Surveys and Questionnaires , Aged , Female , Forced Expiratory Volume/physiology , Humans , Male , Middle Aged , Psychometrics , Pulmonary Disease, Chronic Obstructive/surgery , Vital Capacity/physiology , Walking
16.
Nutrients ; 10(3)2018 Mar 12.
Article in English | MEDLINE | ID: mdl-29534543

ABSTRACT

Proactive nutrition screening is an effective public health strategy for identifying and targeting individuals who could benefit from dietary improvements for primary and secondary prevention of disease. The Dietary Screening Tool (DST) was developed and validated to assess nutritional risk among rural older adults. The purpose of this study was to evaluate the utility and validity of the DST for identifying nutritional risk in middle-aged adults. This cross-sectional study of middle-aged adults (45-64 years old, n = 87) residing in Appalachia examined nutritional status using an online health survey, biochemical measures, anthropometry, and three representative 24-h dietary recalls. The Healthy Eating Index (HEI) was calculated to describe overall diet quality. Adults identified by the DST as at nutritional risk had lower HEI scores (50 vs. 64, p < 0.001) and were much more likely to also be considered at dietary risk by the HEI (OR 11.6; 3.2-42.6) compared with those not at risk. Those at risk had higher energy-adjusted intakes of total fat, saturated fat, and added sugar, and lower intakes of dietary fiber and several micronutrients, than those classified as not at risk by the DST. Similarly, the at-risk group had significantly lower serum levels of α-carotene, β-carotene, cryptoxanthin, lutein, and zeaxanthin, but did not differ in retinol or methylmalonic acid, compared with those not at risk. The DST is a valid tool to identify middle-aged adults at nutritional risk.


Subject(s)
Diet/adverse effects , Malnutrition/etiology , Mass Screening , Nutrition Assessment , Nutritional Status , Overweight/etiology , Rural Health , Appalachian Region/epidemiology , Biomarkers/blood , Body Mass Index , Cross-Sectional Studies , Diet/ethnology , Diet, Healthy/ethnology , Diet, Western/adverse effects , Diet, Western/ethnology , Female , Humans , Internet , Male , Malnutrition/blood , Malnutrition/epidemiology , Malnutrition/ethnology , Middle Aged , Nutritional Status/ethnology , Obesity/blood , Obesity/epidemiology , Obesity/ethnology , Obesity/etiology , Overweight/blood , Overweight/epidemiology , Overweight/ethnology , Patient Compliance/ethnology , Risk , Rural Health/ethnology , West Virginia/epidemiology
17.
Nutrients ; 10(3)2018 Mar 08.
Article in English | MEDLINE | ID: mdl-29518042

ABSTRACT

The Supplemental Nutrition Assistance Program-Education (SNAP-Ed) program aims to improve the nutritional intakes of low-income individuals (<185% of the poverty threshold). The objective of this study was to describe compliance with Dietary Guidelines for Americans (DGA) recommendations for fruits, vegetables, and whole grains among SNAP-Ed eligible (n = 3142) and ineligible (n = 3168) adult women (19-70 years) nationwide and among SNAP-Ed participating women in Indiana (n = 2623), using NHANES 2007-2012 and Indiana SNAP-Ed survey data, respectively. Sensitivity analyses further stratified women by race/ethnicity and by current SNAP participation (<130% of the poverty threshold). Nationally, lower-income women were less likely than higher-income women to meet the fruit (21% vs. 25%) and vegetable (11% vs. 19%) guidelines, but did not differ on whole grains, for which compliance was ~5% regardless of income. The income differences in fruit and vegetable intakes were driven by non-Hispanic whites. Fewer SNAP-Ed-eligible U.S. women met fruit (21% vs. 55%) and whole grain (4% vs. 18%) recommendations than Indiana SNAP-Ed women, but the groups did not differ on vegetable recommendations (11% vs. 9%). The same trend was observed among current SNAP participants. Different relationships between racial/ethnic group and DGA compliance were found in Indiana compared with the nation. Nevertheless, most low-income women in the U.S. are at risk of not meeting DGA recommendations for fruits (79%), vegetables (89%), and whole grains (96%); SNAP-Ed participants in Indiana had higher compliance with DGA recommendations. Increased consumption of these three critical food groups would improve nutrient density, likely reduce calorie consumption by replacing high-calorie choices, and improve fiber intakes.


Subject(s)
Diet/standards , Nutrition Policy , Nutrition Surveys , Poverty , Adult , Aged , Diet/economics , Energy Intake , Female , Food Assistance , Food Supply , Fruit , Humans , Middle Aged , Nordazepam , Whole Grains , Young Adult
18.
Nutrients ; 9(9)2017 Aug 24.
Article in English | MEDLINE | ID: mdl-28837086

ABSTRACT

Little is known about the relationship between perceptions of nutrient adequacy and biomarkers of nutrition status. This cross-sectional study of U.S. and German adults (n = 200; 18-80 years) compared dietary practices, knowledge, and beliefs about omega-3 fatty acids (O3-FA) with the omega-3 index (O3-I), an erythrocyte-based biomarker associated with cardiovascular disease (CVD) risk. More than half of adults believed that O3-FAs are beneficial for heart and brain health and could correctly identify food sources of O3-FA. However, the mean O3-I in the U.S. (4.3%) and Germany (5.5%) places the majority of adults sampled (99%) in the intermediate or high CVD-risk categories. More Americans (40%) than Germans (10%) were considered at high CVD risk. In the U.S., but not Germany, women had a significantly higher O3-I than men (4.8% vs. 3.8%, p < 0.001). In the intermediate CVD-risk group, about one-third of adults in both countries (30% in the U.S. and 27% in Germany) believed their diet was adequate in O3-FA. Notably, mean O3-I concentrations did not differ significantly with dietary perceptions of adequacy. More adults in Germany (26%) than in the U.S. (10%) believed that dietary supplements are needed to achieve a balanced diet. In spite of adequate knowledge about food sources and a consistent belief that O3-FAs are important for health, very few participants had O3-I concentrations in the range associated with CVD protection.
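The risk categories referenced above map the omega-3 index onto CVD-risk zones. A minimal sketch of that mapping, assuming the commonly cited cut points (<4% high risk, 4-8% intermediate, ≥8% low), which the abstract does not state explicitly:

```python
def o3i_risk_category(o3i_percent):
    """Classify an omega-3 index (% of erythrocyte fatty acids) into a CVD-risk zone.
    Cut points (<4% high, 4-8% intermediate, >=8% low) are the commonly cited
    values and are an assumption here, not taken from the abstract."""
    if o3i_percent < 4.0:
        return "high"
    if o3i_percent < 8.0:
        return "intermediate"
    return "low"

# Mean values reported in the abstract:
print(o3i_risk_category(4.3))   # U.S. mean  -> intermediate
print(o3i_risk_category(5.5))   # German mean -> intermediate
```

Under these cut points, both national means fall in the intermediate zone, consistent with the abstract's finding that 99% of the sample was at intermediate or high risk.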


Subject(s)
Diet , Dietary Supplements , Erythrocytes/chemistry , Fatty Acids, Omega-3/administration & dosage , Fatty Acids, Omega-3/blood , Health Knowledge, Attitudes, Practice , Nutritional Status , Recommended Dietary Allowances , Adolescent , Adult , Aged , Aged, 80 and over , Biomarkers/blood , Cross-Sectional Studies , Diet, Healthy , Female , Germany , Humans , Male , Middle Aged , Nutrition Assessment , Nutritive Value , Perception , United States , Young Adult
19.
J Nutr Educ Behav ; 49(8): 639-646.e3, 2017 09.
Article in English | MEDLINE | ID: mdl-28539198

ABSTRACT

OBJECTIVE: To examine shortfall nutrient intakes (ie, calcium, folate, potassium, magnesium, and vitamins A, C, D, and E) by poverty-to-income ratio (PIR). DESIGN: National Health and Nutrition Examination Survey 2011-2012, a nationally representative, cross-sectional survey. PARTICIPANTS: US adults with complete data on poverty status and diet were included (n = 4,524). ANALYSIS: The National Cancer Institute method was used to estimate total usual micronutrient intakes from foods, beverages, medications, and dietary supplements reported on 2 24-hour dietary recalls using measurement error correction. MAIN OUTCOME MEASURES: Calcium, folate, potassium, magnesium, and vitamins A, C, D, and E across 3 PIR categories: <130%, 130% to 350%, and ≥350%. RESULTS: Mean intakes of folate, vitamin C, and vitamin D were significantly greater in men, and magnesium in women, across all PIR categories. Except for calcium in men and vitamin C in women, the highest PIR category had significantly higher mean total usual intakes of all remaining shortfall micronutrients. Importantly, men and women in the highest PIR category (≥350%) were significantly less likely to have intakes below the Estimated Average Requirement across all micronutrients compared with those in the lower PIR categories. CONCLUSIONS AND IMPLICATIONS: Even with dietary supplements, large proportions of US adults have micronutrient intakes below the Estimated Average Requirement. Adults at the highest adjusted income have higher micronutrient intakes and lower risk of inadequacy than those with lower incomes.
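The inadequacy estimates above rest on the EAR cut-point approach: the share of a group whose usual intake falls below the Estimated Average Requirement is taken as the prevalence of inadequate intake. A minimal sketch of that final step, with hypothetical intake values; the full NCI method also corrects the intake distribution for within-person variation, which is omitted here.

```python
import numpy as np

def prevalence_below_ear(usual_intakes, ear):
    """EAR cut-point method: fraction of the group whose usual
    intake is below the Estimated Average Requirement. Assumes
    the intakes have already been adjusted for within-person
    variation (e.g., via the NCI method)."""
    intakes = np.asarray(usual_intakes, dtype=float)
    return float(np.mean(intakes < ear))

# Hypothetical magnesium intakes (mg/d) vs. an adult EAR of ~350 mg/d
sample = [210, 320, 400, 280, 510, 330]
print(prevalence_below_ear(sample, 350))  # 4 of 6 below the EAR
```

With survey data such as NHANES, this proportion would additionally be weighted by the sampling weights; the unweighted mean is shown only to make the cut-point logic explicit.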


Subject(s)
Diet/statistics & numerical data , Micronutrients , Nutrition Surveys , Vitamins , Adult , Diet/standards , Dietary Supplements , Female , Humans , Male , Poverty , United States/epidemiology , Young Adult
20.
Am J Clin Nutr ; 105(6): 1336-1343, 2017 06.
Article in English | MEDLINE | ID: mdl-28446502

ABSTRACT

Background: Public health concerns with regard to both low and high folate status exist in the United States. Recent publications have questioned the utility of self-reported dietary intake data in research and monitoring. Objectives: The purpose of this analysis was to examine the relation between self-reported folate intakes and folate status biomarkers and to evaluate their usefulness for several types of applications. Design: We examined usual dietary intakes of folate by using the National Cancer Institute method to adjust two 24-h dietary recalls (including dietary supplements) for within-person variation and then compared these intakes with serum and red blood cell (RBC) folate among 4878 men and nonpregnant, nonlactating women aged ≥19 y in NHANES 2011-2012, a nationally representative, cross-sectional survey, with respect to consistency across prevalence estimates and rank order comparisons. Results: There was a very low prevalence (<1%) of folate deficiency when serum (<7 nmol/L) and RBC (<305 nmol/L) folate were considered, whereas a higher proportion of the population reported inadequate total dietary folate intakes (6%). Similar patterns of change occurred between intakes and biomarkers of folate status when distributions were examined (i.e., dose response), particularly when diet was expressed in µg. Intakes greater than the Tolerable Upper Intake Level greatly increased the odds of having high serum folate (OR: 17.6; 95% CI: 5.5, 56.0). Conclusions: When assessing folate status in the United States, where fortification and supplement use are common, similar patterns in the distributions of diet and biomarkers suggest that these 2 types of status indicators reflect the same underlying folate status; however, the higher prevalence estimates for inadequate intakes compared with biomarkers suggest, among other factors, a systematic underestimation bias in intake data. Caution is needed in the use of dietary folate data to estimate the prevalence of inadequacy among population groups. The use of dietary data for rank order comparisons or to estimate the potential for dietary excess is likely more reliable.
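The abstract reports an odds ratio with a 95% CI (OR: 17.6; 95% CI: 5.5, 56.0). A minimal sketch of how such an estimate is formed from a 2x2 table using the Woolf (log) confidence interval; the cell counts below are hypothetical, and the published estimate would also account for NHANES survey design, which a simple 2x2 calculation does not.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio with a Woolf (log-scale) 95% CI from a 2x2 table:
        a = exposed cases,    b = exposed non-cases
        c = unexposed cases,  d = unexposed non-cases
    Counts are assumed nonzero; hypothetical values for illustration."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: intake above the UL vs. high serum folate
print(odds_ratio_ci(12, 30, 8, 350))
```

A wide interval like the abstract's (5.5 to 56.0) typically reflects small cell counts in the exposed group, which the log-scale standard error makes explicit.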


Subject(s)
Energy Intake , Feeding Behavior , Folic Acid Deficiency/blood , Folic Acid/blood , Adult , Aged , Aged, 80 and over , Biomarkers/blood , Cross-Sectional Studies , Diet , Dietary Supplements , Erythrocytes/metabolism , Female , Folic Acid/administration & dosage , Folic Acid Deficiency/epidemiology , Humans , Male , Mental Recall , Middle Aged , Nutrition Surveys , Prevalence , Self Report , United States/epidemiology , Young Adult