Results 1 - 20 of 36
1.
Avicenna J Med ; 14(1): 45-53, 2024 Jan.
Article in English | MEDLINE | ID: mdl-38694135

ABSTRACT

Background: Increased mortality rates among coronavirus disease 2019 (COVID-19)-positive patients admitted to intensive care units (ICUs) highlight a compelling need to establish predictive criteria for ICU admission. The aim of our study was to identify criteria for recognizing patients with COVID-19 at elevated risk for ICU admission. Methods: We identified patients who tested positive for COVID-19 and were hospitalized between March and May 2020. Patients' data were manually abstracted through review of electronic medical records. An ICU admission prediction model was derived from a random sample of half the patients using multivariable logistic regression. The model was validated in the remaining half of the patients using the c-statistic. Results: We identified 1,094 patients; 204 (18.6%) were admitted to the ICU. Correlates of ICU admission were age, body mass index (BMI), quick Sequential Organ Failure Assessment (qSOFA) score, arterial oxygen saturation to fraction of inspired oxygen ratio, platelet count, and white blood cell count. The c-statistic in the derivation subset (0.798, 95% confidence interval [CI]: 0.748, 0.848) and the validation subset (0.764, 95% CI: 0.706, 0.822) showed excellent comparability. At a 22% predicted probability of ICU admission, the estimated sensitivity in the derivation subset was 0.721 (95% CI: 0.637, 0.804) and the specificity was 0.763 (95% CI: 0.722, 0.804). Our pilot predictive model identified the combination of age, BMI, qSOFA score, and oxygenation status as significant predictors of ICU admission. Conclusion: ICU admission among patients with COVID-19 can be predicted by age, BMI, level of hypoxia, and severity of illness.
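As a purely illustrative sketch (not the authors' code or data), the derivation/validation workflow described in this abstract could be expressed as follows; the file name, column names, and 50/50 split handling are assumptions for illustration only.

```python
# Hypothetical sketch: fit a multivariable logistic regression on a random half
# of the cohort, report the c-statistic (ROC AUC) in both halves, and compute
# sensitivity/specificity at a 22% predicted-probability cutoff.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score, confusion_matrix

df = pd.read_csv("covid_admissions.csv")          # hypothetical abstracted chart data
predictors = ["age", "bmi", "qsofa", "sf_ratio",  # assumed column names
              "platelets", "wbc"]
X, y = df[predictors], df["icu_admission"]

# Random 50/50 split into derivation and validation subsets
X_der, X_val, y_der, y_val = train_test_split(X, y, test_size=0.5, random_state=0)

model = LogisticRegression(max_iter=1000).fit(X_der, y_der)

# c-statistic (ROC AUC) in each subset
for name, Xs, ys in [("derivation", X_der, y_der), ("validation", X_val, y_val)]:
    auc = roc_auc_score(ys, model.predict_proba(Xs)[:, 1])
    print(f"{name} c-statistic: {auc:.3f}")

# Sensitivity and specificity at a 22% predicted-probability threshold
pred = (model.predict_proba(X_der)[:, 1] >= 0.22).astype(int)
tn, fp, fn, tp = confusion_matrix(y_der, pred).ravel()
print("sensitivity:", tp / (tp + fn), "specificity:", tn / (tn + fp))
```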

2.
Spartan Med Res J ; 7(1): 30124, 2022.
Article in English | MEDLINE | ID: mdl-35291705

ABSTRACT

INTRODUCTION: Uncontrolled hypertension can result in severe clinical conditions such as stroke, chronic kidney disease, and congestive heart failure, especially in African American populations. To the knowledge of the authors, the effect of time sequence on blood pressure (BP) measured with an Automated Office Blood Pressure (AOBP) device has not been documented in an African American cohort. The objective of this study was to investigate the possible influence of the time sequence of measurement (pre- and post-physician visit) on BP readings in an African American cohort, in the presence or absence of a Medical Assistant (MA), via AOBP monitoring. METHODS: A two-phase, single-blinded, non-randomized trial was conducted at Michigan-based Ascension Providence Hospital with a convenience sample of hypertensive patients. BP readings were taken using two AOBP devices: an Omron 907 (Omron Corp., Kyoto, Japan) and a Welch Allyn (WA) Connex Spot Monitor (Welch Allyn, Inc., Skaneateles Falls, NY). Descriptive statistics were generated, and t-tests were performed. RESULTS: In Phase 1 (N = 148), the mean systolic/diastolic readings for the pre-physician visits (141/82 mmHg) were statistically significantly higher than the post-visit readings (134/80 mmHg) (p ≤ 0.02). Post-physician-visit readings from the two AOBP devices did not differ statistically (p = 0.72). In Phase 2 (n = 50), the presence of an MA resulted in significantly higher readings than when an MA was absent; however, the results of Phase 2 also supported the trend toward lower post-physician-visit BP found in Phase 1. CONCLUSION: Based on the consistency of these results, a post-physician-visit AOBP reading, in the presence or absence of an MA, may provide a more accurate BP measurement for deciding whether to treat hypertension in African American patients.
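As a hedged illustration only (the abstract says "T-tests were performed" without specifying the variant), a paired comparison of pre- versus post-visit systolic readings on the same patients could be computed as below; all values are hypothetical placeholders, not trial data.

```python
# Hypothetical sketch: paired t-test of pre- vs. post-physician-visit systolic
# AOBP readings, assuming one pre and one post reading per patient.
from scipy import stats

pre_visit_sbp  = [152, 138, 147, 141, 136, 158, 144, 133]   # hypothetical mmHg values
post_visit_sbp = [140, 134, 141, 138, 130, 149, 139, 131]

t_stat, p_value = stats.ttest_rel(pre_visit_sbp, post_visit_sbp)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```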

4.
Heliyon ; 7(12): e08566, 2021 Dec.
Article in English | MEDLINE | ID: mdl-34957338

ABSTRACT

BACKGROUND & OBJECTIVES: Race plays an important role in healthcare disparities, often resulting in worse health outcomes. It is unclear whether other patient factors and their interactions with race influence mortality in patients with COVID-19. We aimed to evaluate how multiple determinants of all-cause in-hospital mortality from COVID-19 were linked to race. METHODS: A retrospective observational study was conducted at two hospitals in metropolitan Detroit. We identified patients aged ≥18 years who had tested positive for COVID-19 and were admitted from March 9 through May 16, 2020. Multivariable logistic regression was performed to assess predictors of all-cause in-hospital mortality in COVID-19. RESULTS: We identified 1064 unique patients; 74% were African American (AA). All-cause in-hospital mortality was 21.7%, with the majority of deaths occurring among AA patients (65.4%, P = 0.002) and patients 80 years or older (52%, P < 0.0001). Based on race-gender interactions, AA women had lower all-cause mortality than AA men, white women, and white men. In multivariable logistic regression analysis, older age (80 years or older), dementia, and chronic kidney disease (CKD) were associated with worse all-cause in-hospital mortality. Adjusted for race and body mass index (BMI), the main odds ratios (OR) and 95% confidence intervals (CI) were: age 80 and older vs. <60 in females, OR = 7.4, 95% CI: 2.9, 18.7; in males, OR = 7.3, 95% CI: 3.3, 16.2; CKD, OR = 1.7, 95% CI: 1.2, 2.6; dementia, OR = 2.2, 95% CI: 1.5, 3.3. CONCLUSION: Gender significantly modified the association between race and COVID-19 mortality. African American females had the lowest all-cause in-hospital mortality risk compared with other gender-race groups.
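A minimal sketch, under assumed variable names and not the authors' analysis code, of how a mortality model with a race-by-gender interaction can be fit and its adjusted odds ratios with 95% CIs reported:

```python
# Hypothetical sketch: multivariable logistic regression for in-hospital death
# with a race x gender interaction, adjusted for age group, comorbidities, and BMI.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("covid_mortality.csv")  # hypothetical chart-review data; 'died' is 0/1

model = smf.logit(
    "died ~ C(age_group, Treatment('<60')) + dementia + ckd"
    " + C(race) * C(gender) + bmi",
    data=df,
).fit()

# Exponentiate coefficients and confidence limits to obtain odds ratios
or_table = np.exp(pd.concat([model.params, model.conf_int()], axis=1))
or_table.columns = ["OR", "2.5%", "97.5%"]
print(or_table)
```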

5.
Clin Breast Cancer ; 21(3): e220-e227, 2021 06.
Article in English | MEDLINE | ID: mdl-33168447

ABSTRACT

BACKGROUND: Genomic medicine has led to significant advances in the prevention and treatment of cancer. The National Comprehensive Cancer Network (NCCN) guidelines recommend BRCA1/2 screening in high-risk individuals; however, the guidelines have not incorporated differences within ethnic cohorts beyond Ashkenazi Jewish ethnicity. We analyzed the prevalence of BRCA1/2 mutations across ethnicities and identified high-risk personal characteristics and family history features, accounting for differences among ethnic cohorts beyond Ashkenazi Jewish ethnicity. PATIENTS AND METHODS: We reviewed data collected by a Michigan medical genetics clinic in a community-based hospital from 2008 to 2018. A retrospective chart analysis was conducted of 1090 patients who received genetic counseling regarding hereditary cancer syndromes. RESULTS: We found a statistically significantly higher prevalence of pathogenic BRCA1/2 mutations in African American patients, at 8.1%, compared with non-Ashkenazi Jewish white patients, at 3.6% (P = .02). African Americans had a mutational prevalence nearing that of the Ashkenazi Jewish population. CONCLUSION: Revision of the NCCN guidelines regarding hereditary cancer syndrome testing in various ethnic groups is imperative and overdue. Future studies are needed to identify health care disparities in and socioeconomic barriers to genetic testing.
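For illustration only (the abstract does not state which test produced P = .02, and the counts below are placeholders rather than the study's data), a comparison of mutation prevalence between two cohorts can be run as a two-proportion z-test:

```python
# Hypothetical sketch: two-proportion test comparing pathogenic BRCA1/2
# mutation prevalence between two groups of counseled patients.
from statsmodels.stats.proportion import proportions_ztest

carriers = [12, 20]    # hypothetical: carriers in group A vs. group B
tested   = [150, 550]  # hypothetical: patients tested in each group

stat, p_value = proportions_ztest(carriers, tested)
print(f"z = {stat:.2f}, p = {p_value:.3f}")
```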


Subject(s)
Black or African American/genetics , Breast Neoplasms/ethnology , Breast Neoplasms/genetics , Early Detection of Cancer/statistics & numerical data , Healthcare Disparities , Adult , Breast Neoplasms/prevention & control , Female , Genes, BRCA1 , Genes, BRCA2 , Genetic Counseling , Genetic Testing/statistics & numerical data , Humans , Middle Aged , Retrospective Studies , Risk Factors , United States
6.
Knee ; 27(6): 1746-1752, 2020 Dec.
Article in English | MEDLINE | ID: mdl-33197813

ABSTRACT

BACKGROUND: A multitude of chemical agents are currently used intra-articularly to decrease pain after orthopaedic procedures, including total knee arthroplasty. However, the possible deleterious effects of these injectable chemicals on chondrocyte viability have not been weighed against their potential benefits. Using a human osteoarthritic chondrocyte model, the purpose of this study was to assess the potential for cartilage damage caused by bupivacaine, Toradol, Duramorph, and acetaminophen used for surgical local anesthesia. METHODS: Human distal femur and proximal tibia cross-sections were obtained during total knee arthroplasty and divided into a control group and experimental groups treated with bupivacaine, Toradol, Duramorph, and acetaminophen, respectively. Chondrocytes obtained from enzymatically digested cartilage were cultured using a 3D alginate bead culture method to ensure lower rates of dedifferentiation. Chondrocyte bead cultures were exposed to the study chemicals. Gene expression and chondrocyte viability were measured by RT-PCR and flow cytometry, respectively. RESULTS: Compared with the untreated group, bupivacaine treatment led to the greatest cellular apoptosis, with 30.5 ± 11% dead cells (P = 0.000). Duramorph and acetaminophen did not result in a significant increase in cell death. Both bupivacaine (P = 0.000) and acetaminophen (P = 0.001) treatment led to increased caspase 3 gene expression compared with control. CONCLUSION: Our data demonstrated that Duramorph and Toradol were not cytotoxic to human chondrocytes and may be better alternatives to the frequently used and more cytotoxic bupivacaine. Acetaminophen did not result in increased cell death; however, it did show increased caspase 3 gene expression, and caution should be exercised.


Subject(s)
Acetaminophen/pharmacology , Bupivacaine/pharmacology , Cell Survival/drug effects , Chondrocytes/drug effects , Gene Expression/drug effects , Ketorolac Tromethamine/pharmacology , Morphine/pharmacology , Analgesics, Non-Narcotic/pharmacology , Analgesics, Opioid/pharmacology , Anesthetics, Local/pharmacology , Apoptosis , Case-Control Studies , Caspase 3/genetics , Caspase 3/metabolism , Cells, Cultured , Flow Cytometry , Humans , Knee Joint/cytology , Osteoarthritis, Knee/pathology , RNA, Messenger/metabolism , Reverse Transcriptase Polymerase Chain Reaction
7.
Ann Hepatobiliary Pancreat Surg ; 24(2): 156-161, 2020 May 31.
Article in English | MEDLINE | ID: mdl-32457260

ABSTRACT

BACKGROUNDS/AIMS: Distal pancreatic resections are intricate operations with potential for significant morbidity; there is controversy surrounding the appropriate setting with respect to surgeon and hospital volume. We report our distal pancreatectomy experience from a community-based teaching hospital. METHODS: This study includes all patients who underwent laparoscopic distal pancreatectomy (LDP) or open distal pancreatectomy (ODP) for benign and malignant lesions between June 2004 and October 2017. The two groups were compared for perioperative characteristics, parenchymal resection technique, and outcomes. RESULTS: A total of 138 patients underwent distal pancreatectomy during this period: 68 LDP and 70 ODP. Operative time (146 vs. 174 min), blood loss (139 vs. 395 ml), and mean length of stay (4.8 vs. 8.0 days) were significantly lower in the laparoscopic group. The 30-day Clavien Grade 2/3 morbidity rate was 13.7% (19/138) and the incidence of Grade B/C pancreatic fistula was 6.5% (9/138), with no difference between ODP and LDP. The 30-day mortality was 0.7% (1/138). Of the 138 resections, 61 had a malignancy on final pathology. The mean tumor diameter was greater for ODP (6.4 cm vs. 2.9 cm), but there was no significant difference in the mean number of harvested nodes (8.6 vs. 7.4). The cost of hospitalization, including readmissions and surgery, was significantly lower for LDP ($7,558 vs. $11,610). CONCLUSIONS: This series of distal pancreatectomies indicates a shorter hospital stay, less operative blood loss, and reduced cost in the LDP group, with comparable morbidity and oncologic outcomes between LDP and ODP. It highlights the feasibility and safety of these complex surgeries in a community setting.

8.
Sci Rep ; 10(1): 6038, 2020 Apr 02.
Article in English | MEDLINE | ID: mdl-32242047

ABSTRACT

An amendment to this paper has been published and can be accessed via a link at the top of the paper.

9.
Int Orthop ; 42(11): 2627-2632, 2018 11.
Article in English | MEDLINE | ID: mdl-30219966

ABSTRACT

PURPOSE: To examine the role of polymerization temperature on cement porosity and antibiotic elution, in order to optimize antibiotic release from antibiotic-laden cement (ABLC). METHODS: Elution profiles of vancomycin and tobramycin were examined for ABLC discs prepared with low- and high-dose antibiotic loading, cured at 8, 21, and 37 °C, and placed in phosphate-buffered saline (PBS) at 37 °C. Samples were collected at one, four, eight, 24, 72, 168, 336, and 1008 hours to quantify the antibiotic eluted. Porosity was determined by micro-CT analysis. RESULTS: ABLC porosity and antibiotic elution increased, with elution up to five times the amount eluted from room-temperature discs (p < 0.05). In the low-dose ABLC group, porosity was decreased but similar at 8 °C and 21 °C compared with cement cured at 37 °C (p < 0.001). In the high-dose ABLC group, porosities differed significantly among all curing temperatures (p < 0.02). CONCLUSIONS: Altering the polymerization temperature of ABLC led to more porous constructs, yielding increased antibiotic elution.


Subject(s)
Anti-Bacterial Agents/pharmacology , Bone Cements/chemistry , Polymerization , Anti-Bacterial Agents/chemistry , Porosity , Temperature , Tobramycin/chemistry , Tobramycin/pharmacology , Vancomycin/chemistry , Vancomycin/pharmacology
10.
J Neurosci Nurs ; 50(4): 188-192, 2018 Aug.
Article in English | MEDLINE | ID: mdl-29750679

ABSTRACT

Over the past 50 years, the Journal of Neuroscience Nursing (JNN) has grown from a neurosurgical focus to a broader neuroscience focus, alongside the professional nursing organization that it supports. Early stroke coverage in JNN focused on surgical treatment and nursing care for cranial conditions such as cerebral aneurysm, carotid disease, arteriovenous malformation, and artery bypass procedures. As medical science has advanced and new medications and treatment modalities have been successfully trialed, JNN has brought its readership information about recombinant tissue plasminogen activator, endovascular trials, and new assessment tools such as the National Institutes of Health Stroke Scale. JNN is at the forefront of publishing nursing research in the areas of stroke caregiver needs and community education for rapid treatment of stroke and stroke risk reduction. The journal has been timely and informative in keeping neuroscience nurses abreast of the changing world of stroke nursing.


Subject(s)
Anniversaries and Special Events , Endovascular Procedures/methods , Evidence-Based Nursing , Neuroscience Nursing/trends , Stroke , Humans , Stroke/nursing , Stroke/therapy , Tissue Plasminogen Activator/therapeutic use
11.
Orthopedics ; 41(3): e424-e431, 2018 May 01.
Article in English | MEDLINE | ID: mdl-29708567

ABSTRACT

The purpose of this study was to compare blood leukocyte profiles and metal ion concentrations between hip resurfacing arthroplasty (articular surface replacement) patients with and without revision. A total of 25 articular surface replacement patients were recruited (10 with stable implants and 15 undergoing revision). Blood concentrations of chromium (Cr) and cobalt (Co) were measured. Flow cytometry was used to quantify the subpopulations of leukocytes, including CD14+ monocytes, CD16+ monocytes, CD3+ T-lymphocytes, CD19+ B-lymphocytes, CD4+ helper T-cells, and CD45+RA memory vs naïve T-cells. Patients undergoing revision had higher blood Co (mean, 10.85 µg/L) and Cr (mean, 3.19 µg/L) levels than patients with stable implants (mean Co, 3.06 µg/L; mean Cr, 1.07 µg/L) (P<.05). The number of CD4+ helper T-cells was higher in patients with stable implants (mean, 842±311 cells/µL) than in patients undergoing revision (mean, 591±208 cells/µL) (P<.05). There was a significant association between total metal ion levels (Co+Cr) and the number of CD14+ monocytes (P=.045) and inflammatory CD16+ monocytes (P=.046). The authors observed that the increase in blood metal ions was associated with an increase in CD16+ monocytes. They believe that continued analysis of blood leukocyte profiles may be helpful in defining differences among failed articular surface replacement, stable articular surface replacement, and failed metal-on-polyethylene implants. [Orthopedics. 2018; 41(3):e424-e431.].


Subject(s)
Arthroplasty, Replacement, Hip/instrumentation , Chromium/blood , Cobalt/blood , Hip Prosthesis , Leukocytes/metabolism , Metal-on-Metal Joint Prostheses , Reoperation , Aged , Biomarkers/blood , Female , Flow Cytometry , Humans , Ions , Male , Middle Aged
12.
J Stroke Cerebrovasc Dis ; 27(7): 1897-1904, 2018 Jul.
Article in English | MEDLINE | ID: mdl-29571756

ABSTRACT

BACKGROUND: Early detection of dysphagia is critical to reducing hospital complications and length of stay in patients with various types of stroke. The aim of this study was to develop and evaluate the DePaul Hospital Swallow Screener (DHSS) tool for assessing dysphagia in patients with stroke. METHODS: This prospective observational study investigated patients admitted to a comprehensive stroke center. The DHSS is composed of a questionnaire containing 8 nonswallow items and a water swallow test. All patients admitted under a standard stroke protocol were screened by the nursing staff using the DHSS and then objectively evaluated by a speech-language pathologist using the Mann Assessment of Swallowing Ability (MASA). Validity measures were calculated, and reliability was assessed with Cohen's κ coefficient with associated 95% confidence intervals. RESULTS: A total of 224 patients completed the DHSS and had at least 1 MASA score. The overall Content Validity Index score for the DHSS was 0.92. Compared with the MASA dysphagia cutoff value, the DHSS had a specificity of 93% and a sensitivity of 69%; compared with the MASA aspiration risk cutoff value, the DHSS had a specificity of 90% and a sensitivity of 70%. Stratified analysis of those with any documented stroke (ischemic or hemorrhagic) compared with those admitted with transient ischemic attack or no stroke yielded similar sensitivity and specificity for both dysphagia and aspiration risk. CONCLUSION: The DHSS is a valid and reliable swallow screening tool with moderate agreement, high specificity, and reliable predictive values when compared with the MASA.
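As an illustrative sketch only (the arrays below are placeholders, not study data), the agreement statistics reported above, sensitivity, specificity, and Cohen's κ against the MASA reference, can be computed from per-patient pass/fail results like this:

```python
# Hypothetical sketch: agreement between a bedside swallow screen and a
# dichotomized reference assessment (1 = dysphagia flagged).
import numpy as np
from sklearn.metrics import cohen_kappa_score, confusion_matrix

screen    = np.array([1, 0, 1, 1, 0, 0, 1, 0])  # hypothetical DHSS results
reference = np.array([1, 0, 1, 0, 0, 0, 1, 1])  # hypothetical MASA results at its cutoff

tn, fp, fn, tp = confusion_matrix(reference, screen).ravel()
sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
kappa = cohen_kappa_score(reference, screen)

print(f"sensitivity={sensitivity:.2f} specificity={specificity:.2f} kappa={kappa:.2f}")
```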


Subject(s)
Deglutition Disorders/diagnosis , Deglutition Disorders/etiology , Deglutition , Respiratory Aspiration/diagnosis , Respiratory Aspiration/etiology , Stroke/complications , Aged , Female , Humans , Male , Middle Aged , Prospective Studies , Reproducibility of Results , Sensitivity and Specificity , Stroke/diagnosis , Surveys and Questionnaires , Water
13.
Gastroenterology Res ; 11(6): 416-421, 2018 Dec.
Article in English | MEDLINE | ID: mdl-30627265

ABSTRACT

BACKGROUND: The rate of inadequate bowel preparation in the general population is approximately 23%. As more individuals with developmental disabilities enter late adulthood, a concomitant rise in endoscopic procedures for this population, including screening colonoscopies, is anticipated. However, there are sparse data on the adequacy of bowel preparation in patients with developmental disabilities. METHODS: A retrospective analysis of 91 patients with developmental disabilities who underwent colonoscopy from 2006 to 2014 was performed. Bowel preparation adequacy from these procedures was evaluated together with other data, including age, developmental disability diagnoses, procedure type, indication, and setting. RESULTS: Mean age at the time of endoscopy was 52.6 ± 13.4 years, with an age range of 18 - 74 years. Inadequate bowel preparation was found in approximately 51% of documented cases. Outpatients were more likely to have adequate bowel preparation than inpatients, with an odds ratio of 2.75 (95% confidence interval: 1.14 - 6.62, P = 0.022). No other major factor identified had a statistically significant influence on the adequacy of bowel preparation. CONCLUSION: Over half of the patients with developmental disabilities undergoing colonoscopy in our study had inadequate bowel preparation, more than twice the rate reported for the general population. Furthermore, outpatients were 2.75 times more likely to have adequate bowel preparation than inpatients. Further studies are recommended to improve endoscopic practices for this patient population.
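A minimal sketch, with hypothetical counts rather than the study's 2x2 table, of how an odds ratio with a 95% confidence interval for adequate preparation by setting can be obtained:

```python
# Hypothetical sketch: odds ratio for adequate bowel preparation in outpatients
# vs. inpatients from a 2x2 contingency table, with a 95% confidence interval.
import numpy as np
from statsmodels.stats.contingency_tables import Table2x2

#                 adequate  inadequate
table = np.array([[30,       20],    # hypothetical outpatients
                  [15,       26]])   # hypothetical inpatients

t22 = Table2x2(table)
print("OR =", round(t22.oddsratio, 2),
      "95% CI =", tuple(round(x, 2) for x in t22.oddsratio_confint()))
```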

14.
Am J Surg ; 215(4): 577-580, 2018 Apr.
Article in English | MEDLINE | ID: mdl-28629609

ABSTRACT

Laparoscopic colectomy is associated with important early postoperative advantages. These procedures can, however, increase total operative duration. Our hypothesis is that increased operative duration is associated with postoperative complications that may outweigh the benefits of a minimally invasive approach. We analyzed data from the Michigan Surgical Quality Collaborative (MSQC), a statewide database of patients who have undergone colon or rectal resections. Colorectal procedures were divided into four groups by surgical approach: open, laparoscopic, robotic, and laparoscopic or robotic procedures converted to open. The sample was divided into three groups by operative duration: less than 2 h, between 2 and 4 h, and greater than 4 h, and the groups were compared on selected preoperative variables and outcomes. Small but significant differences in perioperative outcomes were noted in colectomies with an operative duration greater than 4 h. However, laparoscopic procedures exceeding 4 h were not associated with significant differences in perioperative outcomes.


Subject(s)
Colectomy/methods , Laparoscopy/methods , Operative Time , Robotic Surgical Procedures/methods , Aged , Female , Humans , Male , Michigan , Middle Aged , Postoperative Complications , Retrospective Studies , Treatment Outcome
15.
Eur J Endocrinol ; 177(1): K1-K6, 2017 Jul.
Article in English | MEDLINE | ID: mdl-28515208

ABSTRACT

OBJECTIVE: Autoimmune lymphocytic parathyroiditis and acquired hypocalciuric hypercalcemia associated with autoantibodies against the calcium-sensing receptor (anti-CaSR) are rare and poorly understood conditions. Here, we describe a patient with acquired parathyroid hormone (PTH)-dependent hypercalcemia with associated hypocalciuria, found to have true lymphocytic parathyroiditis on histopathology, and circulating anti-CaSR antibodies in serum. DESIGN AND METHODS: A 64-year-old woman was referred to our clinic for persistent hypercalcemia after a subtotal parathyroidectomy. She was normocalcemic until the age of 63 years when she was diagnosed with primary hyperparathyroidism. She underwent subtotal parathyroidectomy with appropriate intraoperative PTH decline. Two weeks post-parathyroidectomy, she presented with persistent hypercalcemia and hyperparathyroidism. Urine studies revealed an inappropriately low 24-h urine calcium (Ca)/creatinine clearance ratio. Surgical pathology was consistent with true lymphocytic parathyroiditis with lymphoid follicles. The presence of circulating anti-CaSR antibodies was detected by immunoprecipitation of CaSR by the patient's serum. After a 4-week course of prednisone, serum Ca and PTH normalized, and her anti-CaSR titers declined. She remains normocalcemic 10 months after the discontinuation of glucocorticoid therapy. We present this patient in the context of the relevant published literature on lymphocytic parathyroiditis and acquired hypocalciuric hypercalcemia related to anti-CaSR antibodies. CONCLUSIONS: Autoimmune lymphocytic parathyroiditis and acquired hypocalciuric hypercalcemia associated with anti-CaSR antibodies is a very rare yet important condition to be considered in a patient with acquired PTH-dependent hypercalcemia with inappropriate hypocalciuria. Although subtotal parathyroidectomy is unlikely to correct the hypercalcemia, this entity may respond to a short course of prednisone therapy.


Subject(s)
Autoantibodies/immunology , Glucocorticoids/therapeutic use , Hypercalcemia/etiology , Hyperparathyroidism, Primary/etiology , Receptors, Calcium-Sensing/immunology , Anti-Inflammatory Agents/therapeutic use , Calcium/blood , Diabetes Mellitus, Type 2/complications , Female , Humans , Hypercalcemia/immunology , Hyperparathyroidism, Primary/immunology , Hyperparathyroidism, Primary/therapy , Middle Aged , Parathyroid Hormone/blood , Parathyroidectomy , Prednisone/therapeutic use
16.
J Arthroplasty ; 32(4): 1272-1279, 2017 04.
Article in English | MEDLINE | ID: mdl-28065625

ABSTRACT

BACKGROUND: Monofilament and barbed monofilament sutures have been shown in in vitro models to have less bacterial adherence than braided suture. This study evaluated bacterial adherence to suture materials and tissue reactivity in an in vivo contaminated wound mouse model. METHODS: Staphylococcus aureus was used to create an in vivo contaminated wound model at 2 inoculum sizes (10⁶ colony-forming units [CFU] and 10⁸ CFU) using a mouse air pouch. Three types of commonly used absorbable suture were evaluated: braided, monofilament, and barbed monofilament. Bacterial adherence to suture was evaluated with suture culture, a photon-capturing camera system, and scanning electron microscopy. Tissue reactivity was assessed through histology and protein expression. RESULTS: The braided suture group with the high inoculum of S. aureus exhibited frank purulence and air pouch hypertrophy in all 8 mice. A significant difference was found between suture groups inoculated with 10⁸ CFU (P < .05) as measured by bacterial culture concentration using the optical density method. The braided suture hosted more bacteria than either monofilament (P < .005) or barbed monofilament suture (P < .005). No difference was appreciated between the monofilament and barbed monofilament groups. The Kruskal-Wallis test demonstrated a significant difference between groups in levels of tumor necrosis factor-α (P < .05) and interleukin-1 (P < .05). CONCLUSION: Our in vivo contaminated wound model demonstrated that barbed monofilament suture performed similarly to monofilament suture and better than braided suture in terms of bacterial adherence, biofilm formation, and tissue reactivity.
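For illustration only (the values are placeholders, not the study's measurements), the Kruskal-Wallis comparison of cytokine levels across the three suture groups mentioned above can be run as follows:

```python
# Hypothetical sketch: Kruskal-Wallis test comparing TNF-alpha levels across
# braided, monofilament, and barbed monofilament suture groups.
from scipy import stats

braided      = [41.2, 38.5, 44.0, 39.7, 42.3]  # hypothetical per-mouse values
monofilament = [22.1, 25.4, 20.8, 23.9, 21.6]
barbed_mono  = [23.0, 21.7, 24.5, 22.2, 20.9]

h_stat, p_value = stats.kruskal(braided, monofilament, barbed_mono)
print(f"H = {h_stat:.2f}, p = {p_value:.4f}")
```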


Subject(s)
Bacterial Adhesion , Staphylococcal Infections/prevention & control , Surgical Wound Infection/prevention & control , Sutures , Animals , Biofilms , Female , Interleukin-1/metabolism , Mice, Inbred BALB C , Microscopy, Electron, Scanning , Staphylococcal Infections/metabolism , Staphylococcal Infections/microbiology , Staphylococcus aureus , Surgical Wound Infection/metabolism , Suture Techniques , Tumor Necrosis Factor-alpha/metabolism
17.
Orthopedics ; 40(3): e436-e442, 2017 May 01.
Article in English | MEDLINE | ID: mdl-28135373

ABSTRACT

Interlocking nails coated with antibiotic-supplemented cement provide effective treatment of infected long bone nonunion, but the thicker coating on guidewires may provide greater antibacterial activity. This study compared the properties of cement cured on each construct by evaluating 2-cm segments of 8-mm interlocking nails and 3.5-mm guidewires coated with antibiotic-supplemented cement. Each construct (n=7 for each group) was coated with polymethylmethacrylate cement (Simplex; Stryker Orthopaedics, Mahwah, New Jersey) containing either 1 g tobramycin or 1 g vancomycin powder plus 2.2 g tobramycin powder. A No. 40 French polyvinyl chloride chest tube was used as a mold for all constructs. Segments were soaked in sterile phosphate-buffered saline, and entire aliquots were exchanged at various intervals over a 6-week period. Antibiotic concentration, antibacterial activity, cement curing temperature, and porosity were measured. At least half of the total elution of antibiotics occurred within the first 24 hours for all constructs. For the tobramycin-only cement, no differences between constructs were observed. For constructs containing both antibiotics, interlocking nails showed more antibiotic release than guidewires at most time points (P<.05-P<.001). Antibiotics were released for 6 weeks and continued to inhibit Staphylococcus aureus growth. Cement curing temperatures for interlocking nails were lower than those for guidewires (P<.05). Guidewires coated with cement containing tobramycin and vancomycin showed significantly greater porosity compared with the other 3 groups (P<.05), but the amount of antibiotic released did not directly relate to porosity for any construct type. Interlocking nails coated with antibiotic-supplemented cement may provide greater antibiotic delivery to infected long bone nonunion compared with guidewires. A thin mantle of cement may allow greater elution, possibly as a result of cooler exothermic reactions. [Orthopedics. 2017; 40(3):e436-e442.].


Subject(s)
Anti-Bacterial Agents/chemistry , Bone Cements/chemistry , Bone Nails , Bone Wires , Tobramycin/chemistry , Vancomycin/chemistry , Anti-Bacterial Agents/pharmacology , Polymethyl Methacrylate , Porosity , Staphylococcus aureus/drug effects , Temperature , Tobramycin/pharmacology , Vancomycin/pharmacology
18.
Gastroenterology Res ; 10(6): 329-333, 2017 Dec.
Article in English | MEDLINE | ID: mdl-29317939

ABSTRACT

BACKGROUND: Lymphocytic colitis (LC) is a chronic disorder characterized by watery diarrhea. This study sought to evaluate whether LC recurs after therapy and the time frame in which this occurs. Secondary objectives included length and type of therapy, drug-free intervals, and reasons for drug discontinuation. METHODS: A retrospective chart review was conducted of patients with biopsy-confirmed lymphocytic, collagenous, or microscopic colitis seen between January 1, 2008 and October 30, 2015. Patient-reported average bowel movements/day were reviewed, along with demographic data, dates of colonoscopy and follow-up, type and dose of medications used, and therapy start/stop dates. RESULTS: Patients with colonoscopically documented LC (n = 114) were predominantly female (88%) and Caucasian (97%), with a mean of five bowel movements/day. A total of 58/114 (51%) patients were placed on therapy. Patients taking budesonide saw bowel movements/day reduced from 4.7 to 2.4, compared with 5.8 to 2.8 for those given 5-aminosalicylic acid (5-ASA). First-line budesonide or 5-ASA failed in 12/58 (21%) of patients; other drugs also resulted in therapy changes. Thirty-five percent required a change in their initial therapy, and of those, 40% required a second change. Symptom exacerbations were documented during therapy for 19% of patients; therapy changes resulted in good response. CONCLUSIONS: Almost half of all LC cases (56/114) gradually improved without requiring therapy. Seventy-six percent of our treated patients responded well to budesonide when used as first-line therapy; similarly, 61.5% responded to 5-ASA. Budesonide was the drug of choice for flares. Tailoring drug therapy to each patient's needs appears to be the best current approach to managing LC.

19.
Sci Rep ; 6: 31486, 2016 08 11.
Article in English | MEDLINE | ID: mdl-27511713

ABSTRACT

In coastal environments, evaporation is an important driver of subsurface salinity gradients in marsh systems; however, it has not been addressed in the intertidal zone of sandy beaches. Here, we combined field data from an estuarine beach foreshore with numerical simulations to show that evaporation causes pore-water salinity in the upper intertidal zone to reach double that of seawater. We found that the increase in pore-water salinity depends mainly on air temperature and relative humidity, while tide and wave action dilutes a fraction of the high-salinity plume, resulting in a complex process. This is in contrast to previous studies that consider seawater the most saline source in a coastal aquifer system and therefore conclude that seawater infiltration always increases pore-water salinity through seawater-groundwater mixing dynamics. Our results demonstrate the combined effects of evaporation, tides, and waves on the subsurface salinity distribution of a beach face. We anticipate that our quantitative investigation will shed light on studies of salt-affected biological activity in the intertidal zone. It also informs our understanding of global warming: in particular, an increase in temperature does not only shift saltwater landward but also creates a different salinity distribution, with implications for intertidal biological zonation.

20.
Scand J Occup Ther ; 22(3): 173-80, 2015 May.
Article in English | MEDLINE | ID: mdl-25328060

ABSTRACT

BACKGROUND: A community-based occupational therapy program aims to provide client-centered and occupation-based interventions to at-risk youth. OBJECTIVE: This pilot study explores how at-risk youth experiencing psychosocial and environmental barriers to occupation respond to client-centered and occupation-based occupational therapy in the community. METHOD: One-on-one semi-structured interviews were conducted with five youth participants receiving individual therapy interventions through a community-based occupational therapy program. The transcript data were analyzed qualitatively. RESULTS: Three themes emerged: (i) client-centered and occupation-based OT interventions, (ii) the youths' increased self-advocacy, and (iii) the enhancement of youths' perception of their future. CONCLUSION AND SIGNIFICANCE: The youth in this study described OT interventions exemplifying client-centered and occupation-based therapy, a non-prescriptive approach that validates the individual and may prove especially effective in serving the at-risk youth population.


Subject(s)
Adolescent Behavior/psychology , Attitude to Health , Occupational Therapy/methods , Patient-Centered Care/methods , Professional-Patient Relations , Adolescent , Female , Humans , Interviews as Topic , Male , Pilot Projects , Psychology , Qualitative Research