Results 1 - 20 of 1,219
1.
J Emerg Med ; 66(4): e413-e420, 2024 Apr.
Article in English | MEDLINE | ID: mdl-38490894

ABSTRACT

BACKGROUND: Opioids are commonly prescribed for the management of acute orthopedic trauma pain, including nonoperative distal radius fractures. OBJECTIVES: This prospective study aimed to determine whether a clinical decision support intervention influenced prescribing decisions for patients with known risk factors. We also sought to quantify the frequency of opioid prescriptions for acute distal radius fractures treated nonoperatively. METHODS: We performed a prospective study at one large health care system. Using umbrella code S52.5, we identified all distal radius fractures treated nonoperatively, and the encounters were merged with the Prescription Reporting with Immediate Medication Mapping (PRIMUM) database to identify encounters with opioid prescriptions and patients with risk factors for opioid use disorder. We used multivariable logistic regression to determine patient characteristics associated with the prescription of an opioid. Among encounters that triggered the PRIMUM alert, we calculated the percentage in which the alert influenced the prescribing decision. RESULTS: Of 2984 encounters, 1244 (41.7%) included an opioid prescription. Increasing age was significantly associated with a higher likelihood of receiving an opioid prescription (p < 0.0001) after adjusting for other factors. Among encounters where the physician received an alert, those triggered by an early refill were more likely to influence physicians' opioid prescribing than those triggered by other risk factors (p = 0.0088). CONCLUSION: Approximately 90% of patients (106/118) continued to receive an opioid medication despite having a known risk factor for abuse. Additionally, we found older patients were more likely to be prescribed opioids for nonoperatively managed distal radius fractures.
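The headline prescribing rate, and the kind of 2x2-table odds ratio that a screen like this starts from, are simple to reproduce. A minimal sketch in Python; the 1244/2984 figure is from the abstract, but the 2x2 table counts are hypothetical, and the study's adjusted odds ratios come from a full multivariable logistic regression that is not shown here:

```python
def rate_pct(events, total):
    """Percentage of encounters with the outcome (e.g., an opioid prescription)."""
    return 100.0 * events / total

def odds_ratio(exp_yes, exp_no, unexp_yes, unexp_no):
    """Unadjusted odds ratio from a 2x2 table; the study's adjusted ORs
    instead come from multivariable logistic regression."""
    return (exp_yes * unexp_no) / (exp_no * unexp_yes)

# Reported in the abstract: 1244 of 2984 encounters included an opioid.
print(round(rate_pct(1244, 2984), 1))  # 41.7

# Hypothetical table: 30/50 older vs. 20/50 younger patients prescribed.
print(round(odds_ratio(30, 20, 20, 30), 2))  # 2.25
```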


Subject(s)
Acute Pain , Decision Support Systems, Clinical , Wrist Fractures , Humans , Analgesics, Opioid/therapeutic use , Prospective Studies , Drug Prescriptions , Practice Patterns, Physicians' , Acute Pain/drug therapy
2.
Phytopathology ; 114(1): 155-163, 2024 Jan.
Article in English | MEDLINE | ID: mdl-37335121

ABSTRACT

Spring dead spot (SDS) (Ophiosphaerella spp.) is a soilborne disease of warm-season turfgrasses grown where winter dormancy occurs. The edaphic factors that influence where SDS epidemics occur are not well defined. A study was conducted during the spring of 2020 and repeated in the spring of 2021 on four 'TifSport' hybrid bermudagrass (Cynodon dactylon × transvaalensis) golf course fairways expressing SDS symptoms in Cape Charles, VA, U.S.A. SDS within each fairway was mapped from aerial imagery collected in the spring of 2019 with a 20 MP CMOS 4k true color sensor mounted on a DJI Phantom 4 Pro drone. Three disease intensity zones were designated from the maps (low, moderate, high) based on the density of SDS patches in an area. Disease incidence and severity, soil samples, surface firmness, thatch depth, and organic matter measurements were taken from 10 plots within each disease intensity zone from each of the four fairways (n = 120). Multivariate pairwise correlation analyses (P < 0.1) and best subset stepwise regression analyses were conducted to determine which edaphic factors most influenced the SDS epidemic within each fairway and each year. Edaphic factors that correlated with an increase in SDS or were selected for the best fitting model varied across holes and years. However, in certain cases, soil pH and thatch depth were predictors for an increase in SDS. No factors were consistently associated with SDS occurrence, but results from this foundational study of SDS epidemics can guide future research to relate edaphic factors to SDS disease development.
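The pairwise correlation screen described above can be illustrated with a plain Pearson coefficient. The soil pH and patch-count values below are hypothetical, and the study's best subset stepwise regression step is not sketched:

```python
import math

def pearson_r(x, y):
    """Pearson correlation between two plot-level measurements."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)

# Hypothetical plot-level data: soil pH vs. SDS patch count.
soil_ph = [5.2, 5.6, 6.0, 6.4, 6.8]
patches = [3, 5, 8, 9, 12]
print(round(pearson_r(soil_ph, patches), 3))
```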


Subject(s)
Ascomycota , Plant Diseases , Seasons , Cynodon , Soil
3.
J Opioid Manag ; 19(3): 247-255, 2023.
Article in English | MEDLINE | ID: mdl-37145927

ABSTRACT

OBJECTIVE: Opioid-related adverse drug events continue to occur. This study aimed to characterize the patient population receiving naloxone to inform future intervention efforts. DESIGN: We describe a case series of patients who received naloxone in the hospital during a 16-week time frame in 2016. Data were collected on other administered medications, reason for admission to the hospital, pre-existing diagnoses, comorbidities, and demographics. SETTING: Twelve hospitals within a large healthcare system. PATIENTS: 46,952 patients were admitted during the study period. 31.01 percent (n = 14,558) of patients received opioids, of whom 158 received naloxone. INTERVENTION: Administration of naloxone. MAIN OUTCOME OF INTEREST: Sedation assessment via the Pasero Opioid-Induced Sedation Scale (POSS) and administration of sedating medications. RESULTS: A POSS score was documented prior to opioid administration in 93 (58.9 percent) patients. Less than half of patients had a POSS documented prior to naloxone administration, with 36.8 percent documented in the 4 hours prior. 58.2 percent of patients received multimodal pain therapy with other nonopioid medications. Most patients received more than one sedating medication concurrently (n = 142, 89.9 percent). CONCLUSIONS: Our findings highlight areas for intervention to prevent opioid oversedation. Investing in electronic clinical decision support mechanisms, such as sedation assessment, could detect patients at risk for oversedation and ultimately prevent the need for naloxone. Coordinated order sets for pain management can reduce the percentage of patients receiving multiple sedating medications and promote the use of multimodal pain management in efforts to reduce opioid reliance while optimizing pain control.


Subject(s)
Analgesics, Opioid , Naloxone , Humans , Analgesics, Opioid/therapeutic use , Workflow , Retrospective Studies , Pain/drug therapy , Narcotic Antagonists
4.
Clin Oncol (R Coll Radiol) ; 35(6): e352-e361, 2023 06.
Article in English | MEDLINE | ID: mdl-37031075

ABSTRACT

AIMS: Clinical equipoise exists regarding treatment of early-stage lung cancer, as trials comparing stereotactic body radiation therapy (SBRT) and surgical resection are unavailable. Given the potential differences in treatment effectiveness and side-effects, we sought to determine the associations between treatment type, decision regret and depression. MATERIALS AND METHODS: A multicentre, prospective study of patients with stage IA-IIA non-small cell lung cancer (NSCLC) with planned treatment with SBRT or surgical resection was conducted. Decision regret and depression were measured using the Decision Regret Scale (DRS) and the Patient Health Questionnaire-4 (PHQ-4), respectively, at 3, 6 and 12 months post-treatment. Mixed linear regression modelling examined associations between treatment and decision regret, adjusting for patient sociodemographics. RESULTS: Among 211 study participants with early-stage lung cancer, 128 (61%) patients received SBRT and 83 (39%) received surgical resection. The mean age was 73 years (standard deviation = 8); 57% were female; 79% were White non-Hispanic. In the entire cohort at 3 months post-treatment, 72 (34%) and 57 (27%) patients had mild and severe decision regret, respectively. Among patients who received SBRT or surgery, 71% and 46% of patients, respectively, experienced at least mild decision regret at 3 months. DRS scores increased at 6 months and decreased slightly at 12 months of follow-up in both groups. Higher DRS scores were associated with SBRT treatment (adjusted mean difference = 4.18, 95% confidence interval 0.82 to 7.54) and depression (adjusted mean difference = 3.49, 95% confidence interval 0.52 to 6.47). Neither patient satisfaction with their provider nor decision-making role concordance was associated with DRS scores. CONCLUSIONS: Most early-stage lung cancer patients experienced at least mild decision regret, which was associated with SBRT treatment and depression symptoms. These findings suggest patients with early-stage lung cancer may not be receiving optimal treatment decision-making support; opportunities therefore exist for improved patient-clinician communication.
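The reported adjusted mean differences can be sanity-checked against their confidence intervals with normal-approximation arithmetic. The standard error below is back-solved from the published interval and is an assumption, not a value from the paper:

```python
def ci95(estimate, se, z=1.96):
    """Normal-approximation 95% confidence interval for an estimate."""
    return (round(estimate - z * se, 2), round(estimate + z * se, 2))

# SE back-solved from the reported interval: (upper - lower) / (2 * 1.96).
se_sbrt = (7.54 - 0.82) / (2 * 1.96)
print(ci95(4.18, se_sbrt))  # (0.82, 7.54)
```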


Subject(s)
Carcinoma, Non-Small-Cell Lung , Lung Neoplasms , Radiosurgery , Humans , Female , Aged , Male , Carcinoma, Non-Small-Cell Lung/radiotherapy , Carcinoma, Non-Small-Cell Lung/surgery , Carcinoma, Non-Small-Cell Lung/pathology , Lung Neoplasms/radiotherapy , Lung Neoplasms/surgery , Lung Neoplasms/pathology , Prospective Studies , Treatment Outcome , Radiosurgery/adverse effects , Emotions , Neoplasm Staging
5.
J Subst Use Addict Treat ; 151: 208986, 2023 08.
Article in English | MEDLINE | ID: mdl-36822268

ABSTRACT

OBJECTIVE: Opioids and stimulants are increasingly implicated in overdose deaths, particularly among minoritized groups. We examined daily opioid and cocaine co-use, nonfatal overdoses, and naloxone carrying among minoritized people who inject drugs (PWID). METHODS: The study derived data from 499 PWID in Baltimore City, MD, recruited using street-based outreach between 2016 and 2019. Participants reported overdoses; sociodemographic characteristics; and use of nonmedical prescription opioids, heroin, cocaine, and naloxone. RESULTS: Among the participants, the mean age was 46, 34 % were female, 64 % self-identified as Black, and 53 % experienced recent homelessness. Black PWID, compared to White PWID, were as likely to use opioids and cocaine daily but were 61 % less likely to have naloxone. After controlling for sociodemographic characteristics, women (aOR:1.88, 95%CI: 1.14, 3.11), persons experiencing homelessness (aOR:3.07, 95%CI: 1.79, 5.24), and those who experienced a recent overdose (aOR:2.14, 95%CI: 1.29, 3.58) were significantly more likely to use opioids and any form of cocaine every day. In a subanalysis of only female PWID, females engaged in sex work (aOR:2.27, 95%CI: 1.02, 5.07) and females experiencing recent homelessness (aOR:5.82, 95%CI: 2.50, 13.52) were significantly more likely to use opioids and cocaine daily. Furthermore, females (aOR:1.69, 95%CI:1.03, 2.77), persons experiencing homelessness (aOR:1.94, 95%CI:1.16, 3.24), and those with higher educational attainment (aOR:2.06, 95%CI:1.09, 3.91) were more likely to often/always carry naloxone, while Black PWID were less likely to have naloxone (aOR:0.39, 95%CI:0.22, 0.69). CONCLUSIONS: These findings highlight the need for targeted naloxone distribution and other harm-reduction interventions among minoritized groups in urban areas.


Subject(s)
Cocaine-Related Disorders , Cocaine , Drug Overdose , Opioid-Related Disorders , Substance Abuse, Intravenous , Humans , Female , Male , Analgesics, Opioid , Opioid-Related Disorders/epidemiology , Drug Overdose/epidemiology , Naloxone/therapeutic use , Cocaine/therapeutic use
6.
Arch Osteoporos ; 18(1): 12, 2022 12 17.
Article in English | MEDLINE | ID: mdl-36527534

ABSTRACT

Multinational reports suggest Ireland has one of the greatest illness burdens related to osteoporosis. Hospital care represents the costliest portion of health services. We found that public hospital bed days for fragility fractures in Ireland increased by 43% between 2008 and 2017, exceeding those for other common diseases. INTRODUCTION: Recent multinational reports suggest Ireland has one of the greatest illness burdens related to osteoporosis, manifesting clinically as fragility fractures (FF). International reports show that FF incidence, rate of hospital admission and cost are similar to or greater than those for breast cancer, myocardial infarction and stroke. Studies addressing the illness burden of osteoporosis in Ireland are few, and none compares fragility fractures to other common chronic diseases. METHODS: A retrospective analysis of national administrative data for all public hospital admissions was performed for adults aged 50 years and older from January 2008 to December 2017. RESULTS: In 2017, public hospital bed days for FF totalled 249,887, outnumbering those for chronic obstructive pulmonary disease (COPD): 131,897; six solid cancers (CA): 118,098; myocardial infarction (MI): 83,477; and diabetes mellitus (DM): 31,044. Bed days for FF increased by 43% between 2008 and 2017, in contrast to a 32%, 28% and 31% reduction for CA, MI and DM, respectively, and a 12% increase for COPD. Public hospital bed days for FF in 2016 were greater than for MI, stroke, atrial fibrillation and chest pain combined, but less than for COPD, pneumonia and lower respiratory tract infection combined. CONCLUSION: Osteoporotic fractures represent a large and rapidly increasing illness burden amongst older Irish adults, with substantial care requirements and a resulting burden on the healthcare system. Urgent action is needed to address this public health issue and the services for those at risk of fracture.


Subject(s)
Diabetes Mellitus , Myocardial Infarction , Osteoporosis , Osteoporotic Fractures , Pulmonary Disease, Chronic Obstructive , Stroke , Adult , Humans , Middle Aged , Aged , Aged, 80 and over , Osteoporotic Fractures/epidemiology , Osteoporotic Fractures/etiology , Retrospective Studies , Osteoporosis/epidemiology , Osteoporosis/complications , Pulmonary Disease, Chronic Obstructive/epidemiology , Pulmonary Disease, Chronic Obstructive/complications , Stroke/epidemiology , Stroke/complications , Hospitals, Public , Myocardial Infarction/epidemiology , Myocardial Infarction/complications
7.
BMC Cardiovasc Disord ; 22(1): 96, 2022 03 09.
Article in English | MEDLINE | ID: mdl-35264114

ABSTRACT

BACKGROUND: It is unclear whether genetic variants identified from single nucleotide polymorphisms (SNPs) strongly associated with coronary heart disease (CHD) in genome-wide association studies (GWAS), or a genetic risk score (GRS) derived from them, can help stratify risk of recurrent events in patients with CHD. METHODS: Study subjects were enrolled at the close-out of the LIPID randomised controlled trial of pravastatin vs placebo. Entry to the trial had required a history of acute coronary syndrome 3-36 months previously, and patients were in the trial for a mean of 36 months. Patients who consented to a blood sample were genotyped with a custom designed array chip with SNPs chosen from known CHD-associated loci identified in previous GWAS. We evaluated outcomes in these patients over the following 10 years. RESULTS: Over the 10-year follow-up of the cohort of 4932 patients, 1558 deaths, 898 cardiovascular deaths, 727 CHD deaths and 375 cancer deaths occurred. There were no significant associations between individual SNPs and outcomes before or after adjustment for confounding variables and for multiple testing. A previously validated 27 SNP GRS derived from SNPs with the strongest associations with CHD also did not show any independent association with recurrent major cardiovascular events. CONCLUSIONS: Genetic variants based on individual single nucleotide polymorphisms strongly associated with coronary heart disease in genome wide association studies or an abbreviated genetic risk score derived from them did not help risk profiling in this well-characterised cohort with 10-year follow-up. Other approaches will be needed to incorporate genetic profiling into clinically relevant stratification of long-term risk of recurrent events in CHD patients.
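A weighted genetic risk score of the kind evaluated here is a dosage-weighted sum over risk alleles. The three SNP weights below are hypothetical stand-ins for per-SNP log odds ratios, not values from the study, which used a previously validated 27-SNP score:

```python
def genetic_risk_score(dosages, weights):
    """Sum of per-SNP risk-allele dosages (0, 1 or 2) times per-SNP
    effect weights (e.g., log odds ratios from a discovery GWAS)."""
    if len(dosages) != len(weights):
        raise ValueError("one weight per SNP required")
    return sum(d * w for d, w in zip(dosages, weights))

# Hypothetical 3-SNP example (the study used a 27-SNP score).
print(genetic_risk_score([0, 1, 2], [0.5, 0.25, 0.125]))  # 0.5
```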


Subject(s)
Coronary Disease , Genome-Wide Association Study , Coronary Disease/diagnosis , Coronary Disease/genetics , Genetic Predisposition to Disease , Genotype , Humans , Polymorphism, Single Nucleotide , Risk Factors
8.
Article in English | MEDLINE | ID: mdl-35063884

ABSTRACT

Docosahexaenoic acid (DHA) intake was estimated in pregnant women between 12- and 20-weeks' gestation using the National Cancer Institute's (NCI) Diet History Questionnaire-II (DHQ-II) and a 7-question screener designed to capture DHA intake (DHA Food Frequency Questionnaire, DHA-FFQ). Results from both methods were compared to red blood cell phospholipid DHA (RBC-DHA) weight percent of total fatty acids. DHA intake from the DHA-FFQ was more highly correlated with RBC-DHA (rs=0.528) than the DHQ-II (rs=0.352). Moreover, the DHA-FFQ allowed us to obtain reliable intake data from 1355 of 1400 participants. The DHQ-II provided reliable intake for only 847 of 1400, because many participants only partially completed it and it was not validated for Hispanic participants. Maternal age, parity, and socioeconomic status (SES) were also significant predictors of RBC-DHA. When included with estimated intake from the DHA-FFQ, the model accounted for 36% of the variation in RBC-DHA.
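The reported rs values are Spearman rank correlations. A minimal implementation for data without ties; the intake and RBC-DHA values below are hypothetical:

```python
def spearman_rho(x, y):
    """Spearman rank correlation, assuming no tied values."""
    def rank(v):
        ordered = sorted(v)
        return [ordered.index(a) + 1 for a in v]
    rx, ry = rank(x), rank(y)
    n = len(x)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))

# Hypothetical: estimated DHA intake (mg/day) vs. RBC-DHA (weight %).
intake = [40, 120, 75, 300, 180]
rbc_dha = [3.1, 4.9, 3.6, 6.2, 4.4]
print(spearman_rho(intake, rbc_dha))  # 0.9 (one rank swap from perfect)
```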


Subject(s)
Diet , Pregnant Women , Docosahexaenoic Acids , Erythrocytes , Fatty Acids , Female , Humans , Pregnancy , Surveys and Questionnaires , United States
9.
Climacteric ; 25(4): 369-375, 2022 08.
Article in English | MEDLINE | ID: mdl-34694941

ABSTRACT

OBJECTIVE: The aim of this study is to analyze the association between coronary artery vitamin D receptor (VDR) expression and systemic coronary artery atherosclerosis (CAA) risk factors. METHODS: Female cynomolgus monkeys (n = 39) consumed atherogenic diets containing the women's equivalent of 1000 IU/day of vitamin D3. After 32 months consuming the diets, each monkey underwent surgical menopause. After 32 postmenopausal months, CAA and VDR expression were quantified in the left anterior descending coronary artery. Plasma 25OHD3, lipid profiles and serum monocyte chemotactic protein-1 (MCP-1) were measured. RESULTS: In postmenopausal monkeys receiving atherogenic diets, serum MCP-1 was significantly elevated compared with baseline (482.2 ± 174.2 pg/ml vs. 349.1 ± 163.2 pg/ml, respectively; p < 0.001; d = 0.79) and at the start of menopause (363.4 ± 117.2 pg/ml; p < 0.001; d = 0.80). Coronary VDR expression was inversely correlated with serum MCP-1 (p = 0.042). Additionally, the change of postmenopausal MCP-1 (from baseline to necropsy) was significantly reduced in the group with higher, compared to below the median, VDR expression (p = 0.038). The combination of plasma 25OHD3 and total plasma cholesterol/high-density lipoprotein cholesterol was subsequently broken into low-risk, moderate-risk and high-risk groups; as the risk increased, the VDR quantity decreased (p = 0.04). CAA was not associated with various atherogenic diets. CONCLUSION: Coronary artery VDR expression was inversely correlated with markers of CAA risk and inflammation, including MCP-1, suggesting that systemic and perhaps local inflammation in the artery may be associated with reduced arterial VDR expression.


Subject(s)
Atherosclerosis , Coronary Artery Disease , Receptors, Calcitriol/metabolism , Atherosclerosis/complications , Coronary Artery Disease/etiology , Female , Humans , Inflammation , Risk Factors , Vitamin D
11.
Safety (Basel) ; 7(2)2021 Jun 03.
Article in English | MEDLINE | ID: mdl-34552980

ABSTRACT

Young adults enrolled in collegiate agricultural programs are a critical audience for agricultural health and safety training. Understanding the farm tasks that young adults engage in is necessary for tailoring health and safety education. The project analyzed evaluation survey responses from the Gear Up for Ag Health and Safety™ program, including reported agricultural tasks, safety concerns, frequency of discussing health and safety concerns with healthcare providers, safety behaviors, and future career plans. The most common tasks reported included operation of machinery and grain-handling. Most participants intended to work on a family-owned agricultural operation or for an agribusiness/cooperative following graduation. Reported safety behaviors (hearing protection, eye protection, and sunscreen use when performing outdoor tasks) differed by gender and education type. Male community college and university participants reported higher rates of "near-misses" and crashes when operating equipment on the roadway. One-third of participants reported discussing agricultural health and safety issues with their medical provider, while 72% were concerned about the health and safety of their family and co-workers in agriculture. These findings provide guidance for better development of agricultural health and safety programs addressing this population-future trainings should be uniquely tailored, accounting for gender and educational differences.

12.
J Hosp Infect ; 114: 117-125, 2021 Aug.
Article in English | MEDLINE | ID: mdl-33930487

ABSTRACT

BACKGROUND: Healthcare workers (HCWs) are at the front line of the ongoing coronavirus 2019 (COVID-19) pandemic. Comprehensive evaluation of the seroprevalence of severe acute respiratory syndrome coronavirus-2 (SARS-CoV-2) among HCWs in a large healthcare system could help to identify the impact of epidemiological factors and the presence of symptoms on the immune response to the infection over time. AIM: To determine the seroprevalence of SARS-CoV-2-specific antibodies among HCWs, identify associated epidemiological factors and study antibody kinetics. METHODS: A longitudinal evaluation of the seroprevalence and epidemiology of SARS-CoV-2-specific antibodies was undertaken in approximately 30,000 HCWs in the largest healthcare system in Connecticut, USA. FINDINGS: At baseline, the prevalence of SARS-CoV-2 antibody among 6863 HCWs was 6.3% [95% confidence interval (CI) 5.7-6.9%], and was highest among patient care support (16.7%), medical assistants (9.1%) and nurses (8.2%), and lower for physicians (3.8%) and advanced practice providers (4.5%). Seroprevalence was significantly higher among African Americans [odds ratio (OR) 3.26 compared with Caucasians, 95% CI 1.77-5.99], in participants with at least one symptom of COVID-19 (OR 3.00, 95% CI 1.92-4.68), and in those reporting prior quarantine (OR 3.83, 95% CI 2.57-5.70). No symptoms were reported in 24% of seropositive participants. Among the 47% of participants who returned for a follow-up serological test, the seroreversion rate was 39.5% and the seroconversion rate was 2.2%. The incidence of re-infection in the seropositive group was zero. CONCLUSION: Although there is a decline in the immunoglobulin G antibody signal over time, 60.5% of seropositive HCWs had maintained their seroconversion status after a median of 5.5 months.
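The baseline prevalence and its interval can be reproduced with a Wald (normal-approximation) confidence interval. The positive count of 432 is inferred from the reported 6.3% of 6863 and is an assumption, as is the choice of interval method:

```python
import math

def wald_ci(positives, n, z=1.96):
    """Prevalence estimate with a normal-approximation 95% CI."""
    p = positives / n
    half = z * math.sqrt(p * (1 - p) / n)
    return p, p - half, p + half

# 432 positives inferred from 6.3% of 6863 HCWs (assumption).
p, lo, hi = wald_ci(432, 6863)
print(round(100 * p, 1), round(100 * lo, 1), round(100 * hi, 1))  # 6.3 5.7 6.9
```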


Subject(s)
Antibodies, Viral/blood , COVID-19 , SARS-CoV-2 , Adult , COVID-19/immunology , Connecticut/epidemiology , Female , Health Personnel , Humans , Kinetics , Male , Middle Aged , SARS-CoV-2/immunology , Seroepidemiologic Studies
14.
J Intellect Disabil Res ; 65(4): 340-347, 2021 04.
Article in English | MEDLINE | ID: mdl-33443319

ABSTRACT

BACKGROUND: There are currently no validated methods for energy intake assessment in adolescents with intellectual and developmental disabilities (IDD). The purpose of this study was to determine the feasibility of collecting 3-day image-assisted food records (IARs) and doubly labelled water (TDEE_DLW) data in adolescents with IDD and to obtain preliminary estimates of validity and reliability for energy intake estimated by IAR. METHODS: Adolescents with IDD completed a 14-day assessment of mean daily energy expenditure using doubly labelled water. Participants were asked to complete 3-day IARs twice during the 14-day period. To complete the IAR, participants were asked to fill out a hard copy food record over three consecutive days (two weekdays/one weekend day) and to take before and after digital images of all foods and beverages consumed using an iPad tablet provided by the study. Energy intake from the IAR was calculated using Nutrition Data System for Research. Mean differences, intraclass correlations and Bland-Altman limits of agreement were performed. RESULTS: Nineteen adolescents with IDD, mean age 15.1 years, n = 6 (31.6%) female and n = 6 (31.6%) ethnic/racial minorities, enrolled in the trial. Participants successfully completed their 3-day food records and self-collected doubly labelled water urine samples for 100% of required days. Images were captured for 67.4 ± 30.1% of all meals recorded at assessment 1 and 72.3 ± 29.5% at assessment 2. The energy intake measured by IAR demonstrated acceptable test-retest reliability (intraclass correlation = 0.70). On average, IAR underestimated total energy intake by -299 ± 633 kcal/day (mean per cent error = -9.6 ± 22.2%); however, there was a large amount of individual variability in differences between the IAR and TDEE_DLW (range = -1703 to 430). CONCLUSIONS: The collection of IAR and TDEE_DLW is feasible in adolescents with IDD.
While future validation studies are needed, the preliminary estimates obtained by this study suggest that in adolescents with IDD, the IAR method has acceptable reliability and may underestimate energy intake by ~9%.
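Bland-Altman limits of agreement, as used in this validation, are the mean difference ± 1.96 SD of the per-participant differences. A sketch with hypothetical IAR-minus-expenditure differences in kcal/day:

```python
import math

def bland_altman_limits(diffs):
    """Mean difference and 95% limits of agreement (mean ± 1.96 * SD)."""
    n = len(diffs)
    mean = sum(diffs) / n
    sd = math.sqrt(sum((d - mean) ** 2 for d in diffs) / (n - 1))
    return mean, mean - 1.96 * sd, mean + 1.96 * sd

# Hypothetical per-participant differences (kcal/day).
mean, lower, upper = bland_altman_limits([-450, -300, -150, -600, 0])
print(round(mean), round(lower), round(upper))  # -300 -765 165
```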


Subject(s)
Developmental Disabilities , Water , Adolescent , Child , Diet Records , Energy Intake , Ethnic and Racial Minorities , Feasibility Studies , Female , Humans , Male , Reproducibility of Results
15.
Ann Am Thorac Soc ; 18(2): 336-346, 2021 02.
Article in English | MEDLINE | ID: mdl-32936675

ABSTRACT

Expert recommendations to discuss prognosis and offer palliative options for critically ill patients at high risk of death are variably heeded by intensive care unit (ICU) clinicians. How to best promote such communication to avoid potentially unwanted aggressive care is unknown. The PONDER-ICU (Prognosticating Outcomes and Nudging Decisions with Electronic Records in the ICU) study is a 33-month pragmatic, stepped-wedge cluster randomized trial testing the effectiveness of two electronic health record (EHR) interventions designed to increase ICU clinicians' engagement of critically ill patients at high risk of death and their caregivers in discussions about all treatment options, including care focused on comfort. We hypothesize that the quality of care and patient-centered outcomes can be improved by requiring ICU clinicians to document a functional prognostic estimate (intervention A) and/or to provide justification if they have not offered patients the option of comfort-focused care (intervention B). The trial enrolls all adult patients admitted to 17 ICUs in 10 hospitals in North Carolina with a preexisting life-limiting illness and acute respiratory failure requiring continuous mechanical ventilation for at least 48 hours. Eligibility is determined using a validated algorithm in the EHR. The sequence in which hospitals transition from usual care (control), to intervention A or B and then to combined interventions A + B, is randomly assigned. The primary outcome is hospital length of stay. Secondary outcomes include other clinical outcomes, palliative care process measures, and nurse-assessed quality of dying and death. Clinical trial registered with clinicaltrials.gov (NCT03139838).


Subject(s)
Critical Illness , Intensive Care Units , Adult , Critical Illness/therapy , Electronics , Humans , Palliative Care , Randomized Controlled Trials as Topic , Respiration, Artificial
16.
J Neurosci Methods ; 346: 108922, 2020 12 01.
Article in English | MEDLINE | ID: mdl-32946912

ABSTRACT

BACKGROUND: The Allen Institute recently built a set of high-throughput experimental pipelines to collect comprehensive in vivo surveys of physiological activity in the visual cortex of awake, head-fixed mice. Developing these large-scale, industrial-like pipelines posed many scientific, operational, and engineering challenges. NEW METHOD: Our strategies for creating a cross-platform reference space to which all pipeline datasets were mapped required development of 1) a robust headframe, 2) a reproducible clamping system, and 3) data-collection systems that are built, and maintained, around precise alignment with a reference artifact. RESULTS: When paired with our pipeline clamping system, our headframe exceeded deflection and reproducibility requirements. By leveraging our headframe and clamping system we were able to create a cross-platform reference space to which multi-modal imaging datasets could be mapped. COMPARISON WITH EXISTING METHODS: Together, the Allen Brain Observatory headframe, surgical tooling, clamping system, and system registration strategy create a unique system for collecting large amounts of standardized in vivo datasets over long periods of time. Moreover, the integrated approach to cross-platform registration allows for multi-modal datasets to be collected within a shared reference space. CONCLUSIONS: Here we report the engineering strategies that we implemented when creating the Allen Brain Observatory physiology pipelines. All of the documentation related to headframe, surgical tooling, and clamp design has been made freely available and can be readily manufactured or procured. The engineering strategy, or components of the strategy, described in this report can be tailored and applied by external researchers to improve data standardization and stability.


Subject(s)
Brain , Head , Animals , Brain/diagnostic imaging , Histological Techniques , Mice , Reproducibility of Results , Wakefulness
17.
PLoS One ; 15(2): e0225019, 2020.
Article in English | MEDLINE | ID: mdl-32097413

ABSTRACT

Small animal imaging has become essential in evaluating new cancer therapies as they are translated from the preclinical to clinical domain. However, preclinical imaging faces unique challenges that emphasize the gap between mouse and man. One example is the difference in breathing patterns and breath-holding ability, which can dramatically affect tumor burden assessment in lung tissue. As part of a co-clinical trial studying immunotherapy and radiotherapy in sarcomas, we are using micro-CT of the lungs to detect and measure metastases as a metric of disease progression. To effectively utilize metastatic disease detection as a metric of progression, we have addressed the impact of respiratory gating during micro-CT acquisition on improving lung tumor detection and volume quantitation. Accuracy and precision of lung tumor measurements with and without respiratory gating were studied by performing experiments with in vivo images, simulations, and a pocket phantom. When performing test-retest studies in vivo, the variance in volume calculations was 5.9% in gated images and 15.8% in non-gated images, compared to 2.9% in post-mortem images. Sensitivity of detection was examined in images with simulated tumors, demonstrating that reliable sensitivity (true positive rate (TPR) ≥ 90%) was achievable down to 1.0 mm3 lesions with respiratory gating, but was limited to ≥ 8.0 mm3 in non-gated images. Finally, a clinically-inspired "pocket phantom" was used during in vivo mouse scanning to aid in refining and assessing the gating protocols. Application of respiratory gating techniques reduced variance of repeated volume measurements and significantly improved the accuracy of tumor volume quantitation in vivo.
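The repeat-scan variance and detection sensitivity figures reported here are straightforward to compute. The variance metric below is a coefficient of variation, which is an assumption about how the paper's percentages were defined, and the volumes and lesion counts are hypothetical:

```python
import math

def percent_cv(volumes):
    """Coefficient of variation (%) across repeated volume measurements."""
    n = len(volumes)
    mean = sum(volumes) / n
    sd = math.sqrt(sum((v - mean) ** 2 for v in volumes) / (n - 1))
    return 100.0 * sd / mean

def true_positive_rate(detected, present):
    """Detection sensitivity: detected lesions / true lesions."""
    return detected / present

# Hypothetical test-retest tumor volumes (mm^3) from gated scans.
print(round(percent_cv([10.2, 10.8, 9.9]), 1))
print(true_positive_rate(18, 20))  # 0.9
```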


Subject(s)
Lung Neoplasms/diagnostic imaging , Respiratory-Gated Imaging Techniques/methods , X-Ray Microtomography/methods , Animals , Data Accuracy , Disease Models, Animal , Lung Volume Measurements , Mice , Mice, Inbred C57BL , Mice, Transgenic , Phantoms, Imaging , Sensitivity and Specificity , X-Ray Microtomography/instrumentation
18.
Int J Food Sci Nutr ; 71(1): 116-121, 2020 Feb.
Article in English | MEDLINE | ID: mdl-31032680

ABSTRACT

The Nutrition Literacy Assessment Instrument (NLit) measures nutrition literacy, including a subscale for the ability to interpret nutrition fact panels (NFP). A recent redesign of the NFP in the US was issued to improve usability. This study aimed to determine the reliability of the NLit subscale using two NFP versions. A 35-item survey was administered to 48 attendees with very low incomes. Surveys included previously validated NLit numeracy questions referencing the Current NFP (C-NFP), demographic and financial literacy questions, and the same NLit numeracy questions referencing the New NFP (N-NFP). NLit numeracy scores for the C-NFP and N-NFP were strongly correlated (r = 0.842, p < .001), and the N-NFP showed excellent reliability (Cronbach's α = 0.815). Mean NLit numeracy scores for the C-NFP and N-NFP were 53.5% and 55.5%, respectively (p = .437). Exchanging the N-NFP for the C-NFP in the NLit maintains strong reliability. Similar numeracy scores between the C-NFP and N-NFP suggest the redesign may not be easier to read.
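Cronbach's alpha, reported here as the reliability measure, can be computed from item variances and the variance of total scores. A minimal sketch; the item scores below are hypothetical, not survey data from the study:

```python
def cronbach_alpha(items):
    """items: one list of scores per questionnaire item, respondents aligned.
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total scores))."""
    k = len(items)
    n = len(items[0])

    def var(v):
        m = sum(v) / len(v)
        return sum((x - m) ** 2 for x in v) / (len(v) - 1)

    item_var_sum = sum(var(item) for item in items)
    totals = [sum(item[i] for item in items) for i in range(n)]
    return k / (k - 1) * (1 - item_var_sum / var(totals))

# Two perfectly consistent items give alpha = 1.0.
print(cronbach_alpha([[1, 2, 3], [1, 2, 3]]))  # 1.0
```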


Subject(s)
Food Labeling , Nutrition Assessment , Nutritional Status , Adolescent , Adult , Cross-Sectional Studies , Female , Health Knowledge, Attitudes, Practice , Humans , Male , Middle Aged , Nutrition Surveys , Reproducibility of Results , Young Adult
19.
J Neurophysiol ; 123(1): 356-366, 2020 01 01.
Article in English | MEDLINE | ID: mdl-31747332

ABSTRACT

Wide-field calcium imaging is often used to measure brain dynamics in behaving mice. With a large field of view and a high sampling rate, wide-field imaging can monitor activity from several distant cortical areas simultaneously, revealing cortical interactions. Interpretation of wide-field images is complicated, however, by the absorption of light by hemoglobin, which can substantially affect the measured fluorescence. One approach to separating hemodynamics and calcium signals is to use multiwavelength backscatter recordings to measure light absorption by hemoglobin. Following this approach, we develop a spatially detailed regression-based method to estimate hemodynamics. This Spatial Model is based on a linear form of the Beer-Lambert relationship but is fit at every pixel in the image and does not rely on the estimation of physical parameters. In awake mice of three transgenic lines, the Spatial Model offers improved separation of hemodynamics and changes in GCaMP fluorescence. The improvement is pronounced near blood vessels and, in contrast with the Beer-Lambert equations, can remove vascular artifacts along the sagittal midline and in general permit more accurate fluorescence-based determination of neuronal activity across the cortex. NEW & NOTEWORTHY This paper addresses a well-known and strong source of contamination in wide-field calcium-imaging data: hemodynamics. To guide researchers toward the best method to separate calcium signals from hemodynamics, we compare the performance of several methods in three commonly used mouse lines and present a novel regression model that outperforms the other techniques we consider.
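The core idea described above — regress each pixel's fluorescence time series on the multiwavelength backscatter signals at that pixel, then keep the residual as the hemodynamics-corrected signal — can be sketched as follows. This is a minimal illustration of per-pixel least-squares regression, assuming simple array shapes; the variable names and shapes are assumptions, not the authors' implementation.

```python
import numpy as np

def spatial_model_correct(fluor, backscatter):
    """Per-pixel linear regression of fluorescence on multiwavelength
    backscatter; returns the residual as the corrected signal.

    fluor:       (T, H, W)    fluorescence time series
    backscatter: (T, H, W, C) backscatter at C wavelengths
    """
    T, H, W = fluor.shape
    corrected = np.empty_like(fluor)
    for i in range(H):
        for j in range(W):
            # Design matrix: backscatter channels plus an intercept,
            # fit independently at this pixel.
            X = np.column_stack([backscatter[:, i, j, :], np.ones(T)])
            beta, *_ = np.linalg.lstsq(X, fluor[:, i, j], rcond=None)
            corrected[:, i, j] = fluor[:, i, j] - X @ beta
    return corrected
```

Fitting a separate regression at every pixel is what distinguishes a spatially detailed model from applying one global Beer-Lambert correction: absorption near large vessels and along the midline can differ sharply from the surrounding tissue.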


Subject(s)
Behavior, Animal/physiology , Calcium-Binding Proteins , Calcium , Cerebral Cortex/diagnostic imaging , Green Fluorescent Proteins , Hemodynamics/physiology , Neuroimaging , Animals , Female , Fluorescence , Male , Mice , Mice, Transgenic , Microscopy, Electron , Models, Theoretical , Neuroimaging/methods , Neuroimaging/standards , Pattern Recognition, Visual/physiology
20.
J Comp Pathol ; 172: 27-30, 2019 Oct.
Article in English | MEDLINE | ID: mdl-31690411

ABSTRACT

A 7-year-old neutered male domestic shorthair cat was presented with chronic lameness of the right forelimb. A cystic bony lesion was identified in the distal right humerus and amputation was performed. The epiphyseal trabecular bone of the capitulum and trochlea was replaced by a tan to pink, expansile mass surrounded by a thin rim of cortical bone. Microscopically, the tumour was composed of a bland, osteoid-producing spindle cell population within a well-vascularized fibrous stroma. The radiographic and histological features were consistent with osteoblastoma. Osteoblastoma and the related osteoid osteoma are uncommon, benign osteoblastic tumours that are rarely reported in animals. These tumours should be considered as differential diagnoses for slow-growing, cystic bony lesions in cats.


Subject(s)
Humerus/pathology , Neoplasms, Bone Tissue/veterinary , Osteoblastoma , Animals , Cat Diseases/pathology , Cat Diseases/surgery , Cats , Diagnosis, Differential , Humerus/surgery , Male , Neoplasms, Bone Tissue/diagnosis , Neoplasms, Bone Tissue/surgery , Osteoblastoma/diagnosis , Osteoblastoma/pathology , Osteoblastoma/surgery