Results 1 - 20 of 24
1.
Nutrients ; 14(11)2022 May 25.
Article in English | MEDLINE | ID: mdl-35683992

ABSTRACT

Background: We have previously reported that the addition of resistant maltodextrin (RMD), a fermentable functional fiber, to the diet increases fecal weight as well as the amount of fecal bifidobacteria. Here, we report on the targeted analysis of changes in potentially beneficial gut bacteria associated with the intervention. Objective: The primary objective of this study was to determine the effect of adding 0, 15 and 25 g RMD to the diets of healthy free-living adults on potentially beneficial gut bacteria. Methods: We expanded on our previously reported microbiota analysis in a double-blind, placebo-controlled feeding study (NCT02733263) by performing additional qPCR analyses targeting fecal lactic acid bacteria (LAB), Akkermansia muciniphila, Faecalibacterium prausnitzii and Fusicatenibacter saccharivorans in samples from 49 participants. Results: RMD resulted in an approximately two-fold increase in fecal Fusicatenibacter saccharivorans (p = 0.024 for 15 g/day RMD and p = 0.017 for 25 g/day RMD). For Akkermansia muciniphila and Faecalibacterium prausnitzii, we obtained borderline evidence of increased amounts in participants who had low baseline levels of these bacteria (p < 0.1 for 25 g/day RMD). We did not detect any effects of RMD on LAB. Conclusions: RMD supplementation in healthy individuals increases Fusicatenibacter saccharivorans. Albeit to a lesser extent, RMD at the higher intake level may also increase Akkermansia muciniphila and Faecalibacterium prausnitzii in individuals with low baseline levels of those two species. Potential benefits associated with these microbiota changes remain to be established in studies with quantifiable health-related endpoints.
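For illustration of the relative quantification behind a reported "approximately two-fold increase", the sketch below applies the standard 2^-ΔΔCt method to hypothetical Ct values; the abstract does not describe the study's actual qPCR pipeline, so all names and numbers here are assumed.

```python
# Minimal sketch of relative qPCR quantification via the standard
# 2^-ddCt method (Livak & Schmittgen). The study's actual pipeline is
# not described in the abstract; all Ct values below are hypothetical.

def fold_change(ct_target_treated: float, ct_ref_treated: float,
                ct_target_control: float, ct_ref_control: float) -> float:
    """Fold change of a target taxon in treated vs control samples,
    each normalized to a reference assay."""
    dd_ct = ((ct_target_treated - ct_ref_treated)
             - (ct_target_control - ct_ref_control))
    return 2.0 ** -dd_ct

# A ddCt of -1 corresponds to the ~two-fold increase reported for
# Fusicatenibacter saccharivorans.
print(fold_change(24.0, 18.0, 25.0, 18.0))  # -> 2.0
```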


Subject(s)
Faecalibacterium prausnitzii , Polysaccharides , Adult , Akkermansia , Clostridiales , Double-Blind Method , Feces/microbiology , Humans , Polysaccharides/pharmacology , Verrucomicrobia
2.
Ophthalmol Glaucoma ; 4(6): 638-645, 2021.
Article in English | MEDLINE | ID: mdl-33722789

ABSTRACT

PURPOSE: In this study, we describe common demographic and clinical characteristics of the glaucoma patient population attending vision rehabilitation. DESIGN: Cross-sectional study. PARTICIPANTS: Patients attending a hospital-based vision rehabilitation center with a primary ocular diagnosis of glaucoma. METHODS: Participants' charts were retrospectively reviewed. Data extracted from medical records included demographics, referring physician, ocular history, glaucoma diagnosis, past ocular surgery, intraocular pressure, optic nerve findings, results of a functional intake assessing activities of daily living, depression, visual hallucinations, best-corrected visual acuity (BCVA), mean deviation (MD) scores on visual field testing, and log contrast sensitivity (CS). MAIN OUTCOME MEASURES: Participant demographic information, ocular history, self-reported difficulty with activities of daily living, depression, visual hallucinations, BCVA, visual field, and CS. RESULTS: The mean age of patients was 77 years (range, 8-103 years). Ninety percent of patients were referred to vision rehabilitation by an ophthalmologist. Median BCVA was 20/50. Fifty-five percent of patients were functionally monocular, and across all patients there was a median 9-line difference in BCVA between eyes. Median MD score was -13.95 decibels (dB). Median CS was 1.05. Patients reported the greatest difficulty with reading (88%), writing (72%), and mobility (67%). Seventy-eight percent of patients had stopped driving, and 12% reported difficulty driving. Among patients experiencing depression, those reporting difficulty with mobility outnumbered those who did not by 4:1. One-third of patients experienced visual hallucinations. CONCLUSIONS: Most glaucoma patients attending vision rehabilitation are not legally blind, but many are functionally monocular, which may cause greater difficulty performing functions that require binocularity. Increasing the referral of younger glaucoma patients to vision rehabilitation may help patients learn to cope with the loss of visual function that occurs over time.


Subject(s)
Activities of Daily Living , Glaucoma , Adolescent , Adult , Aged , Aged, 80 and over , Child , Cross-Sectional Studies , Glaucoma/diagnosis , Glaucoma/epidemiology , Humans , Middle Aged , Retrospective Studies , Young Adult
3.
BMC Womens Health ; 20(1): 136, 2020 06 29.
Article in English | MEDLINE | ID: mdl-32600463

ABSTRACT

BACKGROUND: Little is known about how the menstrual cycle affects gastrointestinal function and self-reported stress in young, healthy women taking oral contraceptives (OC). This study prospectively characterized gastrointestinal function and symptoms on each day throughout the menstrual cycle. METHODS: Healthy women aged 18-35 years (n = 78) who took OC participated in this 5-week observational study. Stool frequency, self-reported stress, stool form measured by the Bristol Stool Form Scale (BSFS), and gastrointestinal symptoms measured by a modified version of the Gastrointestinal Symptom Rating Scale (GSRS) were assessed daily. GSRS scores were reported on a 7-point scale (1 = no discomfort at all, 7 = very severe discomfort) and were averaged for individual syndrome scores or summed for the total score. The validated, weekly version of the GSRS was completed at two time points to reflect menstruation and the week prior to menstruation (n = 72). Outcomes were analyzed in linear mixed models with Dunnett's post hoc test against day 1 of menstrual bleeding or with nonparametric tests. RESULTS: Daily stress (P = 0.0018), BSFS score (P = 0.0493), stool frequency (P = 0.0241), abdominal pain (P < 0.0001), diarrhea (P = 0.0022), constipation (P = 0.0446), reflux (P = 0.0193), and indigestion (P < 0.0001) all varied significantly by day of the menstrual cycle. Dunnett's post hoc tests showed that scores (mean ± SEM) on the first day of bleeding (day 1) for daily abdominal pain (2.6 ± 0.2), diarrhea (1.7 ± 0.1), and indigestion (2.1 ± 0.2) were higher than scores on all other days of the menstrual cycle (P < 0.05), with scores on days other than day 1 falling below 1.5, that is, between "no discomfort at all" and "slight discomfort". Reflux, stool frequency, BSFS, self-reported stress, and constipation were higher on day 1 (P < 0.05) than on 12, 8, 6, 4, and 2 other days of the menstrual cycle, respectively. The median (IQR) GSRS score was higher during the week of menstruation than the week prior to menstruation for diarrhea [1.50 (1.00-2.33) vs 1.33 (1.00-2.00), P = 0.002] and abdominal pain [2.00 (1.33-2.67) vs 1.67 (1.33-2.33), P = 0.011] syndrome scores. CONCLUSION: Bowel habits appear to vary across the menstrual cycle, with more gastrointestinal discomfort on day 1 of menstrual bleeding in healthy women taking OC. Future interventional studies could identify ways to improve gastrointestinal symptoms in healthy women during menstruation.
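The daily outcomes above were analyzed with linear mixed models and Dunnett's post hoc test against day 1. A simplified sketch of the Dunnett comparison follows, using scipy.stats.dunnett (SciPy 1.11+); it treats days as independent samples rather than repeated measures, and the scores are simulated, so it illustrates the test rather than reproducing the analysis.

```python
# Simplified Dunnett-style comparison of later cycle days against
# day 1 of bleeding. Unlike the linear mixed models in the abstract,
# this treats each day as an independent sample; the GSRS-like scores
# below are simulated, not study data.
import numpy as np
from scipy.stats import dunnett  # requires SciPy >= 1.11

rng = np.random.default_rng(0)
day1_pain = rng.normal(2.6, 0.9, size=78)    # day 1 (control)
day5_pain = rng.normal(1.4, 0.9, size=78)    # hypothetical later days
day14_pain = rng.normal(1.3, 0.9, size=78)

res = dunnett(day5_pain, day14_pain, control=day1_pain)
print(res.pvalue)  # one multiplicity-adjusted p-value per day vs day 1
```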


Subject(s)
Contraceptives, Oral/adverse effects , Defecation/physiology , Gastrointestinal Transit/physiology , Menstrual Cycle/physiology , Menstruation/physiology , Adolescent , Adult , Contraceptives, Oral/administration & dosage , Feces , Female , Gastrointestinal Diseases , Humans , Prospective Studies , Young Adult
4.
Biomaterials ; 239: 119839, 2020 05.
Article in English | MEDLINE | ID: mdl-32065973

ABSTRACT

Differences in glucose uptake between peripheral and neural tissues account for the reduced efficacy of insulin in nervous tissue. Herein, we report the design of short peptides, referred to as amino acid compounds (AAC), with and without a modified side-chain moiety. At nanomolar concentrations, a candidate therapeutic molecule, AAC2, containing a 7-(diethylamino)coumarin-3-carboxamide side chain, improved glucose control in human peripheral adipocytes and brain endothelial barrier cells by activating the insulin-insensitive glucose transporter 1 (GLUT1). AAC2 interacted specifically with the leptin receptor (LepR) and activated atypical protein kinase C zeta (PKCζ) to increase glucose uptake. The effects induced by AAC2 were absent in leptin receptor-deficient preadipocytes and in Leprdb mice. In contrast, AAC2 established glycemic control by altering food intake in leptin-deficient Lepob mice. Therefore, AAC2 activated LepR and acted in a cytokine-like manner distinct from leptin. In a monogenic Ins2Akita mouse model of the phenotypes associated with type 1 diabetes, AAC2 rescued systemic glucose uptake without the increase in insulin levels and adiposity seen in insulin-treated Ins2Akita mice. In contrast to insulin, AAC2 treatment increased brain mass and reduced anxiety-related behavior in Ins2Akita mice. Our data suggest that the unique mechanism of action of AAC2, activation of the LepR/PKCζ/GLUT1 axis, offers an effective strategy to broaden glycemic control for the prevention of diabetic complications of the nervous system and, possibly, of other insulin-insensitive or insulin-resistant tissues.


Subject(s)
Blood Glucose , Diabetes Mellitus, Experimental , Amino Acids , Animals , Anxiety , Diabetes Mellitus, Experimental/drug therapy , Insulin , Mice , Mice, Inbred C57BL , Receptors, Leptin
5.
J Glaucoma ; 28(6): 512-518, 2019 06.
Article in English | MEDLINE | ID: mdl-30807440

ABSTRACT

PRECIS: Rabbit model studies suggested that a novel slow-release antifibrotic drug delivery system releasing small quantities of mitomycin C (MMC) and 5-fluorouracil (5-FU) produced blebs of better morphology with intraocular pressure (IOP) efficacy equal to a standard MMC trabeculectomy. PURPOSE: To evaluate 2 different concentrations of a biodegradable poly(lactic-co-glycolic acid) (PLGA) system carrying 5-FU and MMC (ElutiGLASS) for their ability to reduce fibrosis, and to compare the results with standard trabeculectomy with MMC in a rabbit model. MATERIALS AND METHODS: Nineteen New Zealand albino rabbits were divided into 3 groups (A, B, C), and a standard trabeculectomy was performed in the right eye of each rabbit. Group A had trabeculectomy with MMC (0.4 mg/mL) applied using a Weck cell sponge; group B, trabeculectomy with slow-release ElutiGLASS (0.23 mg 5-FU/0.33 µg MMC released over 23 to 30 d); group C, trabeculectomy with rapid-release ElutiGLASS (0.45 mg 5-FU/0.65 µg MMC released over 5 to 7 d). The rabbits were followed for 3 months before euthanasia. RESULTS: Bleb vascularity and fibrosis were less pronounced in groups B and C than in group A at 3 months. Group B had a lower, more diffuse bleb with a honeycomb appearance on both clinical examination and ultrasound biomicroscopy imaging, a higher percentage of maintained bleb space (83%), and less fibrosis than group A, while maintaining the same low histologic inflammation score as the other 2 groups. At 3 months, the PLGA polymer had completely disappeared in all rabbits. There were no statistical differences in the degree of IOP reduction or histologic inflammation among the 3 groups. CONCLUSIONS: We successfully created a sustained-release antifibrotic drug delivery system that delivered known dosages of the drugs at levels significantly lower than the current standard and resulted in less fibrosis while maintaining a healthy bleb and an equal reduction of IOP. TRANSLATIONAL RELEVANCE: These results support the antifibrotic effect of the slow-release drug delivery system used in conjunction with trabeculectomy, paving the way for human pilot studies to improve and simplify existing surgical techniques for glaucoma filtering surgery.


Subject(s)
Drug Delivery Systems , Fluorouracil , Glaucoma , Mitomycin , Trabeculectomy , Animals , Humans , Male , Rabbits , Absorbable Implants , Drug Implants , Drug Liberation , Endophthalmitis/drug therapy , Endophthalmitis/etiology , Endophthalmitis/metabolism , Fibrosis/etiology , Fibrosis/metabolism , Fibrosis/prevention & control , Filtering Surgery/adverse effects , Fluorouracil/administration & dosage , Fluorouracil/adverse effects , Fluorouracil/pharmacokinetics , Glaucoma/metabolism , Glaucoma/surgery , Intraocular Pressure , Mitomycin/administration & dosage , Mitomycin/adverse effects , Mitomycin/pharmacokinetics , Postoperative Complications/drug therapy , Postoperative Complications/metabolism , Tonometry, Ocular , Trabeculectomy/adverse effects , Trabeculectomy/methods
6.
Nutr Res ; 60: 33-42, 2018 12.
Article in English | MEDLINE | ID: mdl-30527258

ABSTRACT

Dietary fiber stimulates the growth of potentially beneficial bacteria (eg, bifidobacteria), yet most Americans do not meet daily fiber recommendations. Resistant maltodextrin (RMD), a fermentable functional fiber, may help individuals meet total fiber recommendations and potentially increase bifidobacteria. It was hypothesized that fecal bifidobacteria counts/ng fecal DNA would increase after adding 25 g RMD to inadequate fiber diets of healthy adults. In this double-blind, controlled crossover study, 51 participants (26.3 ± 6.8 years, mean ± SD) were randomized to consume 0, 15, and 25 g RMD daily for 3 weeks followed by a 2-week washout. Participants collected all stools for 2 days at weeks 0 and 3 of each intervention for stool wet weight (WW) measurements and fecal bifidobacteria counts. Weekly 24-hour dietary recalls assessed total fiber intake. Only 25 g RMD resulted in a change (final minus baseline) in bifidobacteria that was significant compared with 0 g (0.17 ± 0.09 vs -0.17 ± 0.09 log10[counts], respectively, mean ± SEM, P = .008). Stool WW increased only with 25 g (150 ± 11 vs baseline 121 ± 11 g/d; P = .011). Mean daily total fiber intake (including RMD) was significantly higher (both P < .001) with 15 g (17.8 ± 0.6 g/1000 kcal or 4184 kJ) and 25 g (25.3 ± 1.1 g/1000 kcal) compared with 0 g RMD (8.4 ± 0.4 g/1000 kcal). Mean daily total fiber intakes exceeded recommendations (14 g/1000 kcal) with 15 and 25 g of RMD, and 25 g RMD increased fecal bifidobacteria counts and stool WW, suggesting health benefits from increasing total fiber intake.
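The intakes above are expressed as grams of total fiber per 1000 kcal and judged against the 14 g/1000 kcal recommendation; a minimal sketch of that normalization follows, with illustrative inputs.

```python
# Fiber density as used in the abstract: grams of total fiber (diet
# plus supplement) per 1000 kcal, compared against the 14 g/1000 kcal
# recommendation. The intake and energy values below are illustrative.

RECOMMENDED_G_PER_1000_KCAL = 14.0

def fiber_density(total_fiber_g: float, energy_kcal: float) -> float:
    return total_fiber_g / (energy_kcal / 1000.0)

density = fiber_density(total_fiber_g=50.6, energy_kcal=2000.0)
print(f"{density:.1f} g/1000 kcal")            # 25.3, as with 25 g RMD
print(density >= RECOMMENDED_G_PER_1000_KCAL)  # True
```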


Subject(s)
Bifidobacterium/drug effects , Defecation/drug effects , Dietary Fiber/pharmacology , Feces , Gastrointestinal Microbiome/drug effects , Intestines/drug effects , Polysaccharides/pharmacology , Adult , Bifidobacterium/growth & development , Cross-Over Studies , Diet , Dietary Fiber/administration & dosage , Double-Blind Method , Feces/microbiology , Female , Fermentation , Humans , Intestines/microbiology , Male , Polysaccharides/administration & dosage , Reference Values , Starch , Young Adult
7.
J Vis Exp ; (137)2018 07 22.
Article in English | MEDLINE | ID: mdl-30080202

ABSTRACT

Many waterbird populations have declined over the last century, including the common tern (Sterna hirundo), a waterbird species with a widespread breeding distribution that has recently been listed as endangered in parts of its range. Waterbird monitoring programs exist to track populations through time; however, some of the more intensive approaches require entering colonies and can be disruptive to nesting populations. This paper describes a protocol that uses a minimally invasive surveillance system to continuously monitor common tern nesting behavior in typical ground-nesting colonies. The video monitoring system uses wireless cameras focused on individual nests as well as on the colony as a whole, and allows observation without entering the colony. The system is powered by several 12 V car batteries that are continuously recharged by solar panels. Footage is recorded using a digital video recorder (DVR) connected to a hard drive, which can be replaced when full. The DVR may be placed outside the colony to reduce disturbance. In this study, 3,624 h of footage recorded over 63 days in weather conditions ranging from 12.8 °C to 35.0 °C produced 3,006 h (83%) of usable behavioral data. The types of data retrieved from the recorded video can vary; we used it to detect external disturbances and to measure nesting behavior during incubation. Although the protocol detailed here was designed for ground-nesting waterbirds, the underlying system could easily be modified to accommodate alternative scenarios, such as colonial arboreal nesting species, making it widely applicable to a variety of research needs.


Subject(s)
Breeding/methods , Charadriiformes/growth & development , Nesting Behavior/physiology , Video Recording/methods , Animals
8.
BMC Pulm Med ; 18(1): 17, 2018 Jan 25.
Article in English | MEDLINE | ID: mdl-29370846

ABSTRACT

BACKGROUND: Exacerbations of chronic obstructive pulmonary disease (COPD) are an important marker of disease severity, associated with accelerated disease progression, prolonged recovery time, increased healthcare resource utilization, and overall morbidity and mortality. We aimed to quantify exacerbation and healthcare resource utilization rates among COPD patients in Sweden with respect to baseline treatments, exacerbation history, and comorbidities. METHODS: Patients with a COPD or chronic bronchitis (CB) diagnosis in secondary care who were aged ≥40 years on 1 July 2009 were identified and followed until 1 July 2010 or death. Severe exacerbations were defined as hospitalizations due to respiratory disease, and healthcare resource utilization was measured by all-cause hospitalizations and secondary care visits. Poisson regression was used, adjusting for age, gender, time since COPD/CB diagnosis, and Charlson comorbidity index. RESULTS: In 88,548 patients (54% female; mean age, 72 years), previous respiratory hospitalizations and current high use of COPD medication (double or triple therapy) predicted an 8.3-fold increase in severe exacerbation rates and a 1.8-fold increase in healthcare resource utilization rates in the following year, compared with patients without combination treatment or a history of severe exacerbations. CONCLUSIONS: COPD/CB patients with a history of severe exacerbations and high use of COPD medication experienced a significantly increased rate of severe exacerbations and healthcare resource utilization during the one-year follow-up.
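A sketch of the kind of adjusted Poisson model described above is shown below, using the statsmodels formula API. The data frame, column names, and values are all hypothetical, and a log person-time offset accounts for patients followed for less than the full year.

```python
# Sketch of a Poisson regression of severe exacerbation counts adjusted
# for age, gender, time since diagnosis, and Charlson comorbidity index.
# All data below are toy values; real registry data would be required.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "severe_exacerbations": [0, 2, 1, 5, 0, 1, 3, 0],
    "age": [67, 74, 71, 80, 55, 62, 78, 49],
    "female": [1, 0, 1, 1, 0, 1, 0, 1],
    "years_since_dx": [3.2, 8.5, 1.1, 12.0, 0.5, 4.7, 9.3, 2.0],
    "charlson": [1, 3, 2, 4, 1, 2, 3, 1],
    "followup_years": [1.0, 1.0, 0.6, 1.0, 1.0, 1.0, 0.8, 1.0],
})

model = smf.glm(
    "severe_exacerbations ~ age + female + years_since_dx + charlson",
    data=df,
    family=sm.families.Poisson(),
    offset=np.log(df["followup_years"]),  # exposure differs per patient
).fit()
print(np.exp(model.params))  # exponentiated coefficients = rate ratios
```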


Subject(s)
Bronchitis, Chronic/drug therapy , Bronchitis, Chronic/epidemiology , Health Resources/statistics & numerical data , Hospitalization/statistics & numerical data , Pulmonary Disease, Chronic Obstructive/drug therapy , Pulmonary Disease, Chronic Obstructive/epidemiology , Adrenal Cortex Hormones/therapeutic use , Adrenergic beta-2 Receptor Agonists/therapeutic use , Aged , Aged, 80 and over , Asthma/epidemiology , Cardiovascular Diseases/epidemiology , Comorbidity , Disease Progression , Female , Humans , Male , Middle Aged , Muscarinic Antagonists/therapeutic use , Registries , Severity of Illness Index , Sweden/epidemiology , Symptom Flare Up
9.
J Burn Care Res ; 38(5): 299-303, 2017.
Article in English | MEDLINE | ID: mdl-28296670

ABSTRACT

Enteral nutrition support is a critical component of modern burn care for severely burned patients. However, tube feeds are frequently withheld during the perioperative period because of aspiration concerns. As a result, patients requiring multiple operative procedures risk accumulating significant protein-calorie deficits. The objective of this study was to describe our American Burn Association-certified burn center's experience implementing an intraoperative feeding protocol in severely burned patients, defined as those with a cutaneous burn ≥20% total body surface area (TBSA). We performed a retrospective review of patients with major thermal injuries (2008-2013). Thirty-three patients with an average of seven operating room trips (range, 2-21 trips) were evaluated. Seventeen patients received intraoperative enteral feeds (protocol group) and 16 patients did not (standard group). Feeding was performed using an enteral feeding tube placed postpylorically and was continued intraoperatively, regardless of operative positioning. There was no statistically significant difference in mortality between the groups (P = .62). No intraoperative aspiration or regurgitation events were recorded. The protocol group received significantly higher proportions of calculated protein and caloric requirements, 98.06% and 98.4%, respectively, compared with 70.6% and 73.2% in the standard group (P < .001). Goal tube feed infusion rate was achieved on average 3 days sooner in the protocol group than in the standard group (3.35 vs 6.18 days, P = .008). Early initiation and continuation of enteral feeds in severely burned patients led to higher percentages of prescribed goal protein and caloric needs being received, without increased rates of aspiration, regurgitation, or mortality.


Subject(s)
Burns/surgery , Enteral Nutrition/methods , Nutritional Requirements , Perioperative Care/methods , Burn Units , Critical Care/methods , Female , Humans , Male , Retrospective Studies
10.
Am J Clin Nutr ; 105(3): 758-767, 2017 03.
Article in English | MEDLINE | ID: mdl-28228426

ABSTRACT

Background: Rhinoconjunctivitis-specific quality of life is often reduced during seasonal allergies. The Mini Rhinoconjunctivitis Quality of Life Questionnaire (MRQLQ) is a validated tool used to measure quality of life in people experiencing allergies (0 = not troubled to 6 = extremely troubled). Probiotics may improve quality of life during allergy season by increasing the percentage of regulatory T cells (Tregs) and inducing tolerance.Objective: The objective of this study was to determine whether consuming Lactobacillus gasseri KS-13, Bifidobacterium bifidum G9-1, and B. longum MM-2 compared with placebo would result in beneficial effects on MRQLQ scores throughout allergy season in individuals who typically experience seasonal allergies. Secondary outcomes included changes in immune markers as part of a potential mechanism for changes in MRQLQ scores.Design: In this double-blind, placebo-controlled, parallel, randomized clinical trial, 173 participants (mean ± SEM: age 27 ± 1 y) who self-identified as having seasonal allergies received either a probiotic (2 capsules/d, 1.5 billion colony-forming units/capsule) or placebo during spring allergy season for 8 wk. MRQLQ scores were collected weekly throughout the study. Fasting blood samples were taken from a subgroup (placebo, n = 37; probiotic, n = 35) at baseline and week 6 (predicted peak of pollen) to determine serum immunoglobulin (Ig) E concentrations and Treg percentages.Results: The probiotic group reported an improvement in the MRQLQ global score from baseline to pollen peak (-0.68 ± 0.13) when compared with the placebo group (-0.19 ± 0.14; P = 0.0092). Both serum total IgE and the percentage of Tregs increased from baseline to week 6, but changes were not different between groups.Conclusions: This combination probiotic improved rhinoconjunctivitis-specific quality of life during allergy season for healthy individuals with self-reported seasonal allergies; however, the associated mechanism is still unclear. This trial was registered at clinicaltrials.gov as NCT02349711.
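The abstract does not name the test behind the between-group comparison of MRQLQ change scores; one plausible approach is sketched below with Welch's t-test on simulated change-from-baseline values.

```python
# Hypothetical comparison of MRQLQ global-score change (pollen peak
# minus baseline) between probiotic and placebo arms. The abstract does
# not specify the test used; Welch's t-test is one reasonable choice,
# and the change scores below are simulated around the reported means.
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(1)
probiotic_change = rng.normal(-0.68, 1.2, size=86)
placebo_change = rng.normal(-0.19, 1.3, size=87)

t, p = ttest_ind(probiotic_change, placebo_change, equal_var=False)
print(f"t = {t:.2f}, p = {p:.4f}")
```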


Subject(s)
Bifidobacterium bifidum , Bifidobacterium longum , Conjunctivitis, Allergic , Lactobacillus gasseri , Probiotics/therapeutic use , Quality of Life , Rhinitis, Allergic, Seasonal , Activities of Daily Living , Adult , Conjunctivitis, Allergic/complications , Conjunctivitis, Allergic/drug therapy , Double-Blind Method , Eye/pathology , Female , Humans , Immunoglobulin E/blood , Male , Nose/pathology , Rhinitis, Allergic, Seasonal/complications , Rhinitis, Allergic, Seasonal/drug therapy , T-Lymphocytes, Regulatory/metabolism
11.
J Burn Care Res ; 38(3): e670-e677, 2017.
Article in English | MEDLINE | ID: mdl-27617405

ABSTRACT

Concurrent injuries to multiple extremities present unique challenges to the reconstructive surgeon. The primary goal in such scenarios is to optimize functional outcomes. The goal of this article is to present an overview of various techniques necessary to provide sufficient soft tissue and to preserve amputation limb lengths and function. The concept of innovative techniques for maximizing limb salvage and function is presented using an index patient with multiple extremity third- and fourth-degree burn injuries resulting in nonsalvageable lower extremities and severe left-hand wounds. A review of other potential innovative techniques is discussed. The burn injury resulted in a need for bilateral guillotine below-knee amputations. Above-knee amputation was avoided in the left leg using a parascapular free fasciocutaneous flap, while through-knee amputation was preferred to above-knee amputation in the right leg. The left hand was salvaged by preserving areas of questionable viability and using digital palmar flaps to resurface the dorsum and create a first web space. Maintenance of the maximal viable length of limbs, and of any residual function in them, can be of significant functional benefit to patients with multiple limb amputations. Maximizing limb length in such patients is critical, and typical "rules" that have traditionally been used to minimize the number of operations and optimize prosthetic fit may not apply.


Subject(s)
Burns/surgery , Extremities/injuries , Extremities/surgery , Limb Salvage , Accidents, Traffic , Activities of Daily Living , Adult , Amputation, Surgical , Burns/etiology , Humans , Male , Recovery of Function , Surgical Flaps , Treatment Outcome
12.
Burns ; 42(2): 457-65, 2016 Mar.
Article in English | MEDLINE | ID: mdl-26774601

ABSTRACT

OBJECTIVE: The use of negative-pressure wound therapy (NPWT) is associated with improved outcomes in smaller burns. We report our experience using extra-large (XL) NPWT dressings to treat burns of ≥15% total body surface area (TBSA), describe our technique, and present early outcomes. We also provide NPWT exudate volumes to inform predictive fluid resuscitation in these critically ill patients. METHODS: We retrospectively reviewed patients treated with XL-NPWT from 2012 to 2014. Following excision/grafting, graft and donor sites were sealed with a layered NPWT dressing. We documented wound size, dressing size, NPWT outputs, graft take, wound infections, and length of stay (LOS). Mean NPWT exudate volume per %TBSA per day was calculated. RESULTS: Twelve burn patients (mean TBSA burned, 30%; range, 15-60%) were treated with XL-NPWT (dressings covering burned areas and skin graft donor sites of 17-44% TBSA). Average graft take was 97%. No wound infections occurred. Two patients had burns ≥50% TBSA, and their LOS was reduced compared with American Burn Association (ABA) averages. XL-NPWT outputs peaked at day 1 after grafting, followed by a steady decline until dressings were removed. Average XL-NPWT dressing output during the first 5 days was 101 ± 66 mL/%BSA covered per day. Two patients developed acute kidney injury. CONCLUSION: The use of XL-NPWT to treat extensive burns is feasible with attention to application technique. NPWT dressings appear to improve graft take and to decrease the risk of infection, LOS, and the pain and anxiety associated with wound care. Measured fluid losses can improve patient care in future applications of NPWT to large burn wounds.
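The reported fluid-loss metric (mL per %BSA covered per day) is a simple normalization of dressing output; a minimal sketch with illustrative numbers follows.

```python
# NPWT exudate volume normalized to the body-surface area covered by
# the dressing and the days of wear, as reported in the abstract
# (mean 101 +/- 66 mL/%BSA/day). Inputs below are illustrative.

def exudate_rate(total_output_ml: float, covered_pct_bsa: float,
                 days: float) -> float:
    """Return mL per %BSA covered per day."""
    return total_output_ml / covered_pct_bsa / days

print(exudate_rate(total_output_ml=15150.0, covered_pct_bsa=30.0, days=5.0))
# -> 101.0
```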


Subject(s)
Burns/therapy , Fluid Therapy/methods , Negative-Pressure Wound Therapy/methods , Skin Transplantation/methods , Adolescent , Adult , Body Surface Area , Female , Humans , Length of Stay/statistics & numerical data , Male , Middle Aged , Retrospective Studies , Surgical Wound Infection/epidemiology , Trauma Severity Indices , Young Adult
13.
J Crit Care ; 29(6): 1130.e1-4, 2014 Dec.
Article in English | MEDLINE | ID: mdl-25035049

ABSTRACT

Although chlorhexidine gluconate (CHG) disks have been shown to help reduce the incidence of central line-associated bloodstream infections, their use can result in local skin necrosis. The effects of CHG disks on patients with complex skin pathology have not been studied. We report 6 cases of dermal necrosis associated with Biopatch (Ethicon Inc, Somerville, NJ) CHG disks in adults with complex skin pathology, including Stevens-Johnson syndrome (SJS), toxic epidermal necrolysis (TEN), graft-versus-host disease (GVHD), burns, and anasarca. All patients had a CHG disk placed at a central venous catheter insertion site. Ages ranged from 21 to 84 years. Discovery of the reaction ranged from 4 to 14 days after disk placement, and the resulting skin erosions required 2 to 10 weeks to reepithelialize. Patients with complicated skin disorders represent a rare subset of the critically ill who appear prone to CHG disk necrosis. Continuous contact with CHG under occlusive dressings is speculated to predispose SJS, TEN, GVHD, and burn patients to local chemical injury secondary to loss of the epithelial tissue barrier, decreased cohesion of the epidermal-dermal junction, and increased tissue permeability. In these patients, placing a CHG disk may present more risk than using alternative antimicrobial dressings.


Subject(s)
Anti-Infective Agents, Local/adverse effects , Catheter-Related Infections/prevention & control , Chlorhexidine/analogs & derivatives , Stevens-Johnson Syndrome/etiology , Adult , Aged , Aged, 80 and over , Bandages , Chlorhexidine/adverse effects , Critical Illness , Fatal Outcome , Female , Humans , Male , Middle Aged , Young Adult
14.
Paediatr Perinat Epidemiol ; 26(4): 316-27, 2012 Jul.
Article in English | MEDLINE | ID: mdl-22686383

ABSTRACT

BACKGROUND: Post-partum depression (PPD) is the most common complication of pregnancy in developed countries, affecting 10-15% of new mothers. Thinking has shifted from PPD per se toward a broader consideration of poor mental health, including anxiety, after giving birth. Some risk factors for poor mental health in the post-partum period can be identified prenatally; however, the prenatal screening tools developed to date have had poor sensitivity and specificity. The objective of this study was to develop a screening tool that uses information collected in the prenatal period to identify women at risk of distress, operationalized as elevated symptoms of depression and anxiety in the post-partum period. METHODS: Using data from the All Our Babies Study, a prospective cohort study of pregnant women living in Calgary, Alberta (N = 1578), we developed an integer score-based prediction rule for the prevalence of PPD, defined as scoring 10 or higher on the Edinburgh Postnatal Depression Scale (EPDS) at 4 months postpartum. RESULTS: The best-fit model included known risk factors for PPD: depression and stress in late pregnancy, history of abuse, and poor relationship quality with partner. Comparison of the screening tool with the EPDS administered in late pregnancy showed that our tool had significantly better sensitivity. Further validation of our tool was seen in its utility for identifying elevated symptoms of postpartum anxiety. CONCLUSION: This research heeds the call for further development and validation work using prenatally identified psychosocial factors to identify poor mental health in the post-partum period.
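As an illustration of an integer score-based prediction rule of the type described above, the sketch below sums hypothetical points for the four best-fit risk factors and computes sensitivity against the EPDS ≥ 10 outcome; the published tool's actual weights and threshold are not given in the abstract.

```python
# Hypothetical integer risk score over the four prenatal factors in the
# best-fit model. The point values and threshold are invented for
# illustration; they are not the published tool's weights.

def prenatal_risk_score(depression_late_pregnancy: bool,
                        high_stress: bool,
                        history_of_abuse: bool,
                        poor_partner_relationship: bool) -> int:
    return (3 * depression_late_pregnancy + 2 * high_stress
            + 2 * history_of_abuse + 1 * poor_partner_relationship)

def flagged(score: int, threshold: int = 4) -> bool:
    return score >= threshold

def sensitivity(flags: list[bool], outcomes: list[bool]) -> float:
    """Share of true cases (EPDS >= 10 at 4 months) that were flagged."""
    true_pos = sum(f and o for f, o in zip(flags, outcomes))
    return true_pos / sum(outcomes)

print(flagged(prenatal_risk_score(True, False, True, False)))  # True (5 >= 4)
```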


Subject(s)
Anxiety Disorders/diagnosis , Depression, Postpartum/diagnosis , Mothers/psychology , Postpartum Period , Pregnancy Complications , Adolescent , Adult , Alberta/epidemiology , Anxiety/psychology , Cohort Studies , Female , Humans , Mass Screening/methods , Middle Aged , Pregnancy , Prenatal Diagnosis/methods , Prospective Studies , Reproducibility of Results , Risk Assessment , Young Adult
16.
Commun Dis Intell Q Rep ; 33(2): 89-154, 2009 Jun.
Article in English | MEDLINE | ID: mdl-19877533

ABSTRACT

In 2007, 69 diseases and conditions were nationally notifiable in Australia. States and territories reported a total of 146,991 notifications of communicable diseases to the National Notifiable Diseases Surveillance System, an increase of 5% on the number of notifications in 2006. In 2007, the most frequently notified diseases were sexually transmissible infections (62,474 notifications, 43% of total notifications), gastrointestinal diseases (30,325 notifications, 21% of total notifications) and vaccine preventable diseases (25,347 notifications, 17% of total notifications). There were 19,570 notifications of bloodborne diseases; 6,823 notifications of vectorborne diseases; 1,762 notifications of other bacterial infections; 687 notifications of zoonoses and 3 notifications of quarantinable diseases.


Subject(s)
Communicable Diseases/epidemiology , Disease Notification/statistics & numerical data , Disease Outbreaks/statistics & numerical data , Adolescent , Adult , Age Distribution , Aged , Aged, 80 and over , Australia/epidemiology , Child , Child, Preschool , Communicable Disease Control , Female , Humans , Infant , Infant, Newborn , Male , Middle Aged , Population Surveillance , Sentinel Surveillance , Sex Distribution
17.
J Mol Med (Berl) ; 87(7): 703-12, 2009 Jul.
Article in English | MEDLINE | ID: mdl-19387601

ABSTRACT

Most patients with type 1 diabetes rely on multiple daily insulin injections to maintain blood glucose control. However, insulin injections carry the risk of inducing hypoglycemia and do not eliminate diabetic complications. We sought to develop and evaluate a regulatable cell-based system for delivery of insulin to treat diabetes. We generated two intestinal cell lines in which human insulin expression is controlled by mifepristone. Insulin mRNA expression was dependent on the mifepristone dose and incubation time and cells displayed insulin and C-peptide immunoreactivity and glucose-induced insulin release following mifepristone treatment. Cell transplantation followed by mifepristone administration reversed streptozotocin (STZ)-induced diabetes in mice, and this effect was dependent on the mifepristone dose delivered. These data support the notion that engineering regulatable insulin expression within a cell already equipped for regulated secretion may be efficacious for the treatment of insulin-dependent diabetes.


Subject(s)
Cell- and Tissue-Based Therapy/methods , Diabetes Mellitus, Experimental/therapy , Gene Expression Regulation/drug effects , Hormone Antagonists/pharmacology , Insulin/metabolism , Intestinal Mucosa/metabolism , Mifepristone/pharmacology , Animals , Cell Line, Tumor/transplantation , Humans , Intestines/cytology , Male , Mice , Mice, Inbred BALB C , Time Factors
18.
Cornea ; 26(5): 515-9, 2007 Jun.
Article in English | MEDLINE | ID: mdl-17525642

ABSTRACT

PURPOSE: To evaluate the use of corneal donor tissue deemed unsuitable for full-thickness penetrating keratoplasty (PK) for use in deep lamellar endothelial keratoplasty (DLEK) and to compare postoperative results to those of DLEK surgery using donor tissue that is suitable for PK. METHODS: Small-incision DLEK surgery was performed using 39 donor corneas unsuitable for PK. Thirty-five donors had anterior scars or opacities, 3 donors had pterygia within the 8-mm zone, and 1 had prior LASIK. All donor preparation was completed by manual stromal dissection. The DLEK surgical and postoperative courses were reviewed. Preoperative and 6-month postoperative results of this study group were compared with a control group consisting of the first 55 consecutive small-incision DLEK patients receiving donor corneas that had no criteria excluding them from use in PK. Four eyes in the study group and 1 eye in the control group had the confounding variables of an anterior-chamber lens or surgical vitrectomy with macular disease in the recipient eye. RESULTS: There was no significant difference in preoperative measurements of best spectacle-corrected visual acuity (BSCVA; P = 0.372), donor endothelial cell density (ECD; P = 0.749), or corneal topography [surface regularity index (SRI), P = 0.485; surface asymmetry index (SAI), P = 0.154] between the 2 groups. For the patients receiving corneas deemed unacceptable for PK, at 6 months after surgery the vision (P = 0.002) and corneal topography measurements improved significantly from before surgery (SRI, P < 0.001; SAI, P < 0.001), and there was no significant change in refractive astigmatism (P = 0.240). There was a significant difference in vision at 6 months postoperatively between the overall study group and the control group, with mean vision of 20/56 in the study group and 20/43 in the control group (P = 0.015). If eyes with known cystoid macular edema (CME) and vitrectomy are removed from each group, there is no significant difference in vision at 6 months between the study group and the control group (P = 0.110), with the average BSCVA of those receiving donor corneas unsuitable for PK equal to 20/48 (range, 20/25-20/200) and the average vision for those receiving PK-acceptable donor tissue equal to 20/43 (range, 20/20-20/80). The 6-month average refractive astigmatism of the study group was 1.12 ± 0.99 D (range, 0.00-4.00 D), and the average endothelial cell count was 2064 ± 396 cells/mm² (range, 1208-2957 cells/mm²). There was no significant difference in 6-month postoperative endothelial cell count (P = 0.443), refractive astigmatism (P = 0.567), or corneal topography (SRI, P = 0.332; SAI, P = 0.110) in study patients who received corneas unsuitable for PK compared with control patients who received corneas suitable for PK. CONCLUSIONS: Endothelial keratoplasty such as DLEK surgery with manual donor preparation broadens the donor pool by enabling corneas that cannot be used for PK to be used for selective endothelial transplantation without deleterious postoperative results.


Subject(s)
Corneal Transplantation/methods , Donor Selection/standards , Endothelium, Corneal/transplantation , Fuchs' Endothelial Dystrophy/surgery , Tissue Donors , Aged , Aged, 80 and over , Corneal Topography , Female , Guidelines as Topic , Humans , Keratoplasty, Penetrating/standards , Male , Middle Aged , Prospective Studies , Retrospective Studies , Visual Acuity
19.
Cornea ; 26(5): 543-5, 2007 Jun.
Article in English | MEDLINE | ID: mdl-17525648

ABSTRACT

PURPOSE: To determine whether the final corneal thickness after deep lamellar endothelial keratoplasty (DLEK) is correlated with visual performance. METHODS: One hundred fifty-five consecutive eyes without macular disease underwent DLEK surgery and had pachymetry recorded at 6 months postoperatively. The eyes were grouped according to visual acuity, and pachymetry was compared between groups: group 1 (20/20, 20/25, or 20/30), n = 38; group 2 (20/40 or 20/50), n = 79; group 3 (20/60, 20/70, or 20/80), n = 30; group 4 (20/100 or worse), n = 8. RESULTS: The mean pachymetry, SD, and range for each group were as follows: group 1, 0.571 ± 0.080 mm (range, 0.408-0.784 mm); group 2, 0.598 ± 0.080 mm (range, 0.437-0.816 mm); group 3, 0.605 ± 0.099 mm (range, 0.454-0.945 mm); group 4, 0.607 ± 0.120 mm (range, 0.410-0.781 mm). There was no significant correlation between vision and corneal thickness (P = 0.312), no statistical difference in pachymetry among the 4 groups (P = 0.323), and a negligible association between pachymetry and visual acuity (r = 0.03). CONCLUSIONS: Variance in corneal thickness after DLEK does not appear to influence visual results.


Subject(s)
Cornea/physiopathology , Corneal Transplantation/physiology , Endothelium, Corneal/transplantation , Visual Acuity/physiology , Endothelium, Corneal/physiopathology , Humans
20.
Ophthalmology ; 114(4): 631-9, 2007 Apr.
Article in English | MEDLINE | ID: mdl-17398317

ABSTRACT

PURPOSE: To report endothelial survival over a 2-year period after 2 techniques of deep lamellar endothelial keratoplasty (DLEK) in the treatment of endothelial dysfunction. DESIGN: Prospective, noncomparative, interventional case series. PARTICIPANTS: One hundred eyes of 88 patients with corneal edema. METHODS: One hundred consecutive eyes with endothelial failure were entered into a prospective study of endothelial keratoplasty, and the donor central endothelial cell density (ECD) was recorded postoperatively at 6 months (n = 98), 12 months (n = 96), and 24 months (n = 85) and then compared with the preoperative eye bank measurements. The subsets of eyes with large-incision DLEK (n = 36) and small-incision DLEK (n = 62) were also evaluated and compared. MAIN OUTCOME MEASURES: Preoperative and postoperative central ECDs were prospectively evaluated and the cell loss calculated for each postoperative time point. RESULTS: The average (± standard deviation) ECD at 6 months was 2140 ± 426 cells/mm², representing a mean cell loss from preoperative donor cell measurements of 25 ± 15%. At 12 months, ECD was 2090 ± 448 cells/mm² (26 ± 16% cell loss), and at 24 months, it was 1794 ± 588 cells/mm² (37 ± 27% cell loss). The additional cell loss from 1 to 2 years was significant (P < 0.001). In the subset of large-incision DLEK eyes (n = 36), the cell loss from preoperative levels was 23% at 6 months, 22% at 12 months, and 27% at 24 months. In the subset of small-incision DLEK eyes (n = 62), the cell loss was 25% at 6 months, 28% at 12 months, and 43% at 24 months. Cell loss after small-incision DLEK was significantly greater than after large-incision DLEK at the 12-month (P = 0.013) and 24-month (P < 0.001) postoperative measurements. CONCLUSIONS: Although the initial cell loss from DLEK surgery changes little from 6 to 12 months postoperatively, cell loss accelerates from 1 year to 2 years postoperatively. The small-incision DLEK technique, which involves folding of the donor tissue, results in significantly higher endothelial cell loss at 1 and 2 years than large-incision DLEK surgery, wherein the tissue is not folded.
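The cell-loss figures above are simple percentage declines from the preoperative donor ECD; a minimal sketch follows, with a preoperative value back-calculated from the reported 6-month numbers for illustration rather than taken from the study.

```python
# Percentage endothelial cell loss relative to the preoperative
# eye-bank ECD. The preoperative value below is back-calculated from
# the reported 6-month figures for illustration only.

def cell_loss_pct(preop_ecd: float, postop_ecd: float) -> float:
    return 100.0 * (preop_ecd - postop_ecd) / preop_ecd

PREOP_ECD = 2853.0  # hypothetical donor density, cells/mm^2
print(f"{cell_loss_pct(PREOP_ECD, 2140.0):.0f}% loss at 6 months")   # ~25%
print(f"{cell_loss_pct(PREOP_ECD, 1794.0):.0f}% loss at 24 months")  # ~37%
```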


Subject(s)
Corneal Diseases/surgery , Corneal Transplantation/methods , Endothelium, Corneal/pathology , Endothelium, Corneal/surgery , Graft Survival , Aged , Aged, 80 and over , Cell Count , Female , Follow-Up Studies , Humans , Male , Middle Aged , Prospective Studies