Results 1 - 20 of 26
1.
J Rural Health ; 2024 Jul 02.
Article in English | MEDLINE | ID: mdl-38953158

ABSTRACT

PURPOSE: To investigate the enduring disparities in adverse COVID-19 events between urban and rural communities in the United States, focusing on the effects of SARS-CoV-2 vaccination and therapeutic advances on patient outcomes. METHODS: Using National COVID Cohort Collaborative (N3C) data from 2021 to 2023, this retrospective cohort study examined COVID-19 hospitalization, inpatient death, and other adverse events. Populations were categorized into urban, urban-adjacent rural (UAR), and nonurban-adjacent rural (NAR). Adjustments included demographics, variant-dominant waves, comorbidities, region, and SARS-CoV-2 treatment and vaccination. Statistical methods included Kaplan-Meier survival estimates, multivariable logistic, and Cox regression. FINDINGS: The study included 3,018,646 patients, with rural residents constituting 506,204. These rural dwellers were older, had more comorbidities, and were less vaccinated than their urban counterparts. Adjusted analyses revealed higher hospitalization odds in UAR and NAR (aOR 1.07 [1.05-1.08] and 1.06 [1.03-1.08]), greater inpatient death hazard (aHR 1.30 [1.26-1.35] UAR and 1.37 [1.30-1.45] NAR), and greater risk of other adverse events compared to urban dwellers. Delta increased, while Omicron decreased, inpatient adverse events relative to pre-Delta, with rural disparities persisting throughout. Treatment effectiveness and vaccination were similarly protective across all cohorts, but dexamethasone post-ventilation was effective only in urban areas. Nirmatrelvir/ritonavir and molnupiravir better protected rural residents against hospitalization. CONCLUSIONS: Despite advancements in treatment and vaccinations, disparities in adverse COVID-19 outcomes persist between urban and rural communities. The effectiveness of some therapeutic agents appears to vary based on rurality, suggesting a nuanced relationship between treatment and geographic location while highlighting the need for targeted rural health care strategies.

2.
Cardiooncology ; 9(1): 36, 2023 Oct 06.
Article in English | MEDLINE | ID: mdl-37803479

ABSTRACT

OBJECTIVE: To determine the impact of acute SARS-CoV-2 infection on patients with concomitant active cancer and CVD. METHODS: The researchers extracted and analyzed data from the National COVID Cohort Collaborative (N3C) database between January 1, 2020, and July 22, 2022. They included only patients with acute SARS-CoV-2 infection, defined as a positive PCR test within 21 days before and 5 days after the day of index hospitalization. Active cancers were defined as a last cancer drug administered within 30 days of index admission. The "Cardioonc" group consisted of patients with CVD and active cancers. The cohort was divided into four groups: (1) CVD (-), (2) CVD (+), (3) Cardioonc (-), and (4) Cardioonc (+), where (-) or (+) denotes acute SARS-CoV-2 infection status. The primary outcome of the study was major adverse cardiovascular events (MACE), including acute stroke, acute heart failure, myocardial infarction, or all-cause mortality. The researchers analyzed the outcomes by different phases of the pandemic and performed competing-risk analysis with other MACE components and death as competing events. RESULTS: The study analyzed 418,306 patients, of whom 74%, 10%, 15.7%, and 0.3% were in the CVD (-), CVD (+), Cardioonc (-), and Cardioonc (+) groups, respectively. The Cardioonc (+) group had the most MACE events in all four phases of the pandemic. Compared to CVD (-), the Cardioonc (+) group had an odds ratio of 1.66 for MACE. Even during the Omicron era, the increased risk for MACE in the Cardioonc (+) group compared to CVD (-) remained statistically significant. Competing-risk analysis showed that all-cause mortality was significantly higher in the Cardioonc (+) group and limited other MACE events from occurring. When the researchers examined specific cancer types, patients with colon cancer had higher MACE rates.
CONCLUSION: The study found that patients with both CVD and active cancer suffered relatively worse outcomes when they had acute SARS-CoV-2 infection during the early and Alpha surges in the United States. These findings highlight the need for improved management strategies and further research to better understand the impact of the virus on vulnerable populations during the COVID-19 pandemic.

3.
Microorganisms ; 11(3)2023 Feb 28.
Article in English | MEDLINE | ID: mdl-36985188

ABSTRACT

Angiotensin-converting enzyme 2 (ACE2), first discovered in 2000, serves as an important counterregulatory enzyme to the angiotensin II-mediated vasoconstrictive, pro-inflammatory, and pro-fibrotic actions of the renin-angiotensin system (RAS). Conversion of angiotensin II to the peptide angiotensin 1-7 (ANG 1-7) exerts protective vasodilatory, anti-inflammatory, and anti-fibrotic actions through interaction with the MasR receptor. There are many important considerations regarding the role of ACE2 in the pathogenesis and sequelae of COVID-19. Early in 2020, at the beginning of the pandemic, ACE2 was recognized as a cell membrane-bound and soluble binding site for the viral spike protein, facilitating entry into tissue cells expressing ACE2, such as those of the lungs, heart, gut, and kidneys. Several mechanisms alter the magnitude of circulating and membrane-bound ACE2 (e.g., SARS-CoV-2 infection, viral variants, patient characteristics, chronic disease states, and the degree of cell surface expression of ACE2), and these mechanisms influence the severity of disease and associated complications (e.g., respiratory failure, systemic inflammatory response syndrome, acute myocarditis, acute kidney injury). Several medications alter ACE2 receptor expression, but whether these medications can influence the course of the disease and improve outcomes is unclear. In this review, we discuss what is known about the interrelation of SARS-CoV-2 and ACE2, the factors that may contribute to the variability of ACE2 expression, and potential contributors to the severity of COVID-19.

4.
BMC Emerg Med ; 22(1): 14, 2022 01 24.
Article in English | MEDLINE | ID: mdl-35073849

ABSTRACT

BACKGROUND: Patients requiring emergent warfarin reversal (EWR) have been prescribed three-factor prothrombin complex concentrate (PCC3) and four-factor prothrombin complex concentrate (PCC4) to reverse the anticoagulant effects of warfarin. There is no existing systematic review and meta-analysis of studies directly comparing PCC3 and PCC4. METHODS: The primary objective of this systematic review and meta-analysis was to determine the effectiveness of achieving the study-defined target INR goal after PCC3 or PCC4 administration. Secondary objectives were to determine differences in safety endpoints: thromboembolic events (TE) and survival during the patients' hospital stay. Random-effects meta-analysis models were used to estimate the odds ratios (OR) and the heterogeneity associated with the outcomes. The Newcastle-Ottawa Scale was used to assess study quality, and Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines were followed. RESULTS: Ten full-text manuscripts and five abstracts provided data for the primary and secondary outcomes. Patients requiring EWR had more than three times the odds of reversal to goal INR when given PCC4 compared to PCC3 (OR = 3.61, 95% CI: 1.97-6.60, p < 0.001). There was no clinically meaningful or statistically significant difference between the PCC4 and PCC3 groups in TE (OR = 1.56, 95% CI: 0.83-2.91, p = 0.17) or in survival during hospital stay (OR = 1.34, 95% CI: 0.81-2.23, p = 0.25). CONCLUSION: PCC4 is more effective than PCC3 in meeting specific predefined INR goals and has a similar safety profile in patients requiring emergent reversal of the anticoagulant effects of warfarin.
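The pooled odds ratios reported above come from a random-effects meta-analysis; a standard way to combine study-level odds ratios is the DerSimonian-Laird estimator. The Python sketch below is a generic illustration with hypothetical study inputs, not the review's actual data or code; the function name is invented for this example.

```python
from math import exp, log, sqrt

def pool_random_effects(odds_ratios, ci_lowers, ci_uppers, z=1.96):
    """DerSimonian-Laird random-effects pooling of study odds ratios.

    Each study's log-OR standard error is recovered from its 95% CI
    width. Returns the pooled OR and its 95% CI.
    """
    y = [log(or_) for or_ in odds_ratios]              # study log odds ratios
    se = [(log(u) - log(l)) / (2 * z)                  # SE from CI width
          for l, u in zip(ci_lowers, ci_uppers)]
    w = [1 / s ** 2 for s in se]                       # fixed-effect weights
    y_fe = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
    q = sum(wi * (yi - y_fe) ** 2 for wi, yi in zip(w, y))   # Cochran's Q
    k = len(y)
    tau2 = max(0.0, (q - (k - 1)) /
               (sum(w) - sum(wi ** 2 for wi in w) / sum(w)))  # between-study var
    w_re = [1 / (s ** 2 + tau2) for s in se]           # random-effects weights
    y_re = sum(wi * yi for wi, yi in zip(w_re, y)) / sum(w_re)
    se_re = sqrt(1 / sum(w_re))
    return exp(y_re), exp(y_re - z * se_re), exp(y_re + z * se_re)

# Hypothetical study-level inputs (OR, 95% CI lower, 95% CI upper):
pooled_or, ci_lo, ci_hi = pool_random_effects(
    [3.2, 4.1, 2.8], [1.5, 2.0, 1.1], [6.8, 8.4, 7.1])
```

Because the pooled log-OR is a weighted average of the study log-ORs, the pooled OR always falls between the smallest and largest study OR.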


Subject(s)
Anticoagulants , Warfarin , Anticoagulants/adverse effects , Blood Coagulation Factors , Hemorrhage , Humans , International Normalized Ratio , Retrospective Studies , Warfarin/adverse effects
5.
Pancreas ; 50(6): 841-846, 2021 07 01.
Article in English | MEDLINE | ID: mdl-34347732

ABSTRACT

OBJECTIVE: Gastrointestinal bleeding (GIB) is an uncommon complication after abdominal surgery. Given the unique risks in the total pancreatectomy with islet autotransplant (TPIAT) population, we aimed to describe this population's incidence of postoperative GIB. METHODS: Prospectively collected data on patients who underwent a TPIAT from 2001 to 2018 at the University of Minnesota were reviewed for postoperative GIB. Each GIB patient was matched to a control patient and compared on medical, medication, and social history and on clinical outcomes. RESULTS: Sixty-eight patients developed a GIB (12.4%) at a median of 17 months after surgery. Etiologies included the following: anastomotic ulcer (35%), Clostridium difficile (4%), gastric or duodenal ulcers (9%), esophagitis/gastritis (10%), hemorrhoids (3%), inflammatory bowel disease (4%), Mallory-Weiss tears (1%), and unknown (29%). During diagnostic workup, 87% had an endoscopic procedure and 3% underwent imaging. Seven patients required an operation (10%), 1 required an open embolization (1%), and 13 required endoscopic treatment (19%). Patients with a GIB were more likely to die (15% vs 5%, P = 0.055). CONCLUSIONS: Twelve percent of patients developed a GIB after TPIAT. One-third of those had an undefined etiology despite endoscopy. The need for intervention was high (30%).


Subject(s)
Gastrointestinal Hemorrhage/diagnosis , Islets of Langerhans Transplantation/methods , Pancreatectomy/methods , Postoperative Complications/diagnosis , Adult , Female , Gastrointestinal Hemorrhage/etiology , Humans , Islets of Langerhans Transplantation/adverse effects , Male , Middle Aged , Pancreatectomy/adverse effects , Postoperative Complications/etiology , Prospective Studies , Retrospective Studies , Risk Assessment/methods , Risk Assessment/statistics & numerical data , Risk Factors , Transplantation, Autologous , Young Adult
6.
BMC Emerg Med ; 20(1): 93, 2020 11 26.
Article in English | MEDLINE | ID: mdl-33243152

ABSTRACT

BACKGROUND: Prothrombin complex concentrates (PCC) are prescribed for emergent warfarin reversal (EWR). The comparative effectiveness and safety among PCC products are not fully understood. METHODS: Patients in an academic Level I trauma center who received PCC3 or PCC4 for EWR were identified. Patient characteristics, PCC dose and time of dose, pre- and post-INR values and times of measurement, fresh frozen plasma and vitamin K doses, and patient outcomes were collected. Patients whose pre-PCC International Normalized Ratio (INR) was measured > 6 h before the PCC dose, or whose pre- and post-PCC INR measurements were > 12 h apart, were excluded. The primary outcome was achieving an INR ≤ 1.5 post PCC. Secondary outcomes were the change in INR over time, post-PCC INR, thromboembolic events (TE), and death during hospital stay. Logistic regression modelled the primary outcome with and without a propensity score adjustment accounting for age, sex, actual body weight, dose, initial INR value, and time between INR measurements. Data are reported as median (IQR) or n (%), with p < 0.05 considered significant. RESULTS: Eighty patients were included (PCC3 = 57, PCC4 = 23). More PCC4 patients achieved the goal INR (87.0% vs. 31.6%, odds ratio (OR) = 14.4, 95% CI: 3.80-54.93, p < 0.001). This result remained true after adjusting for possible confounders (AOR = 10.7, 95% CI: 2.17-51.24, p < 0.001). The post-PCC INR was lower in the PCC4 group (1.3 (1.3-1.5) vs. 1.7 (1.5-2.0)). The INR change was greater for PCC4 (2.3 (1.3-3.3) vs. 1.1 (0.6-2.0), p = 0.003). Death during hospital stay (p = 0.52) and TE (p = 1.00) were not significantly different. CONCLUSIONS: PCC4 was associated with higher achievement of the goal INR than PCC3, in both the unadjusted and propensity score-adjusted analyses.


Subject(s)
Anticoagulants/adverse effects , Blood Coagulation Factors/administration & dosage , Warfarin/adverse effects , Aged , Aged, 80 and over , Body Weight , Emergencies , Female , Humans , International Normalized Ratio , Male , Middle Aged , Plasma , Propensity Score , Retrospective Studies , Time Factors , Trauma Centers , Vitamin K/administration & dosage
7.
Am J Pharm Educ ; 83(9): 7349, 2019 11.
Article in English | MEDLINE | ID: mdl-31871357

ABSTRACT

Objective. To determine whether the number of patient encounters during advanced pharmacy practice experiences (APPEs) relates to student self-assessment of patient care skills using entrustable professional activities (EPAs). Methods. During 12-week acute care/institutional (AC/INST) APPEs, 15-week combined community pharmacy and ambulatory care (CPAC) APPEs, and three 5-week AC/INST or CPAC elective APPEs, fourth-year pharmacy students completed patient tracking surveys. Students documented the number of encounters, type of care provided, primary and secondary diagnoses, and special dosing/population considerations. Students also completed self-assessment surveys for 12 EPAs, rating their ability to perform each EPA on a four-point scale (1 = still developing this skill; 4 = can do this independently) at the start of and after each APPE semester. Results. Data were collected from May 2016 through April 2017. During this time, 165 students completed APPEs and reported 79,717 encounters. No significant correlation was found between the total number of encounters and EPA scores. The baseline EPA mean score was 3.1, and the semester 3 EPA mean score was 3.7. Mean student-reported EPA scores increased over time, with some EPAs increasing more quickly than others. Conclusion. Tracking student patient encounters provided insight into the quantity and variety of patients and conditions seen and the level of care provided by students during APPEs. Mean scores on EPAs increased over time with increased exposure to patients. Patient tracking can be used to inform the curriculum by identifying potential gaps in both didactic and experiential education.


Subject(s)
Clinical Competence , Education, Pharmacy/methods , Students, Pharmacy , Ambulatory Care/standards , Community Pharmacy Services/standards , Curriculum , Educational Measurement , Humans , Self-Assessment , Surveys and Questionnaires , Time Factors
8.
Pancreas ; 48(10): 1329-1333, 2019.
Article in English | MEDLINE | ID: mdl-31688597

ABSTRACT

OBJECTIVES: To determine the rate of portal vein thrombosis (PVT) based on pharmacologic prophylaxis protocol and the impact of PVT on islet graft function after total pancreatectomy with islet autotransplantation (TPIAT). METHODS: We compared the incidence of PVT, postsurgical bleeding, and thrombotic complications in patients undergoing TPIAT between 2001 and 2018 at the University of Minnesota who received either unfractionated heparin (UFH) or enoxaparin for postoperative PVT prophylaxis. Six-month and 1-year graft function was compared between patients who developed PVT and those who did not. RESULTS: Twelve patients (6.6%) developed a PVT, which resolved by 6 months after TPIAT in 10 patients. There was no statistically significant difference in PVT rate between patients who received UFH or enoxaparin for prophylaxis (P = 0.54). Patients who received enoxaparin developed other thrombotic complications more often (6% vs 0%, P = 0.02). Islet graft function did not differ in patients who developed PVT versus those who did not. CONCLUSIONS: There was no difference between enoxaparin or UFH prophylaxis in preventing PVT, but there may be a higher incidence of other thrombotic complications with enoxaparin. In the setting of routine screening and anticoagulation therapy, PVT is a self-limited process.


Subject(s)
Islets of Langerhans Transplantation/adverse effects , Pancreatectomy/adverse effects , Portal Vein , Postoperative Complications/prevention & control , Venous Thrombosis/prevention & control , Adult , Enoxaparin/therapeutic use , Female , Heparin/therapeutic use , Humans , Male , Middle Aged , Retrospective Studies , Transplantation, Autologous
9.
Phytopathology ; 109(7): 1157-1170, 2019 Jul.
Article in English | MEDLINE | ID: mdl-30860431

ABSTRACT

As complete host resistance in soybean has not been achieved, Sclerotinia stem rot (SSR), caused by Sclerotinia sclerotiorum, continues to be of major economic concern for farmers. Thus, chemical control remains a prevalent disease management strategy. Pesticide evaluations were conducted in Illinois, Iowa, Michigan, Minnesota, New Jersey, and Wisconsin from 2009 to 2016, for a total of 25 site-years (n = 2,057 plot-level data points). These studies were used in network meta-analyses to evaluate the impact of 10 popular pesticide active ingredients and seven common application timings on SSR control and yield benefit, compared with not treating with a pesticide. Boscalid and picoxystrobin frequently offered the best reductions in disease severity and the best yield benefit (P < 0.0001). Pesticide applications (one- or two-spray programs) made during the bloom period provided significant reductions in disease severity index (DIX) (P < 0.0001) and led to significant yield benefits (P = 0.0009). Data from these studies were also used in nonlinear regression analyses to determine the effect of DIX on soybean yield. A three-parameter logistic model was found to best describe soybean yield loss (pseudo-R2 = 0.309). In modern soybean cultivars, yield loss due to SSR does not occur until 20 to 25% DIX, and considerable yield loss (-697 kg ha-1, or -10 bu acre-1) is observed at 68% DIX. Further analyses identified several pesticides and programs that resulted in a greater than 60% probability of return on investment under high disease levels.
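A three-parameter logistic fit of yield loss against DIX, as described above, can be sketched with ordinary nonlinear least squares. The functional form, parameter values, and synthetic data below are assumptions for illustration only; they are not the study's fitted model or data.

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic3(dix, asym, rate, midpoint):
    """Three-parameter logistic curve: yield loss as a function of the
    disease severity index (DIX, %). `asym` is the upper asymptote
    (maximum loss), `rate` the steepness, `midpoint` the inflection DIX."""
    return asym / (1.0 + np.exp(-rate * (dix - midpoint)))

# Synthetic plot-level data standing in for field observations;
# the "true" parameters (800, 0.12, 45) are invented for illustration.
rng = np.random.default_rng(0)
dix = rng.uniform(0, 100, 200)
loss = logistic3(dix, 800, 0.12, 45) + rng.normal(0, 40, 200)

# Fit the curve and compute a pseudo-R2 (1 - SSres/SStot).
params, _ = curve_fit(logistic3, dix, loss, p0=[700, 0.1, 50])
pred = logistic3(dix, *params)
pseudo_r2 = 1 - np.sum((loss - pred) ** 2) / np.sum((loss - loss.mean()) ** 2)
```

With a sigmoid of this shape, predicted loss stays near zero until DIX approaches the inflection region, which mirrors the study's observation that loss only appears above roughly 20 to 25% DIX.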


Subject(s)
Ascomycota , Glycine max/growth & development , Pesticides , Ascomycota/growth & development , Illinois , Iowa , Michigan , Minnesota , Plant Diseases/microbiology , Wisconsin
10.
Plant Dis ; 102(12): 2592-2601, 2018 12.
Article in English | MEDLINE | ID: mdl-30334675

ABSTRACT

In soybean, Sclerotinia sclerotiorum apothecia are the sources of primary inoculum (ascospores) critical for Sclerotinia stem rot (SSR) development. We recently developed logistic regression models to predict the presence of apothecia in irrigated and nonirrigated soybean fields. In 2017, small-plot trials were established to validate two weather-based models (one for irrigated fields and one for nonirrigated fields) to predict SSR development. Additionally, apothecial scouting and disease monitoring were conducted in 60 commercial fields in three states between 2016 and 2017 to evaluate model accuracy across the growing region. Site-specific air temperature, relative humidity, and wind speed data were obtained through the Integrated Pest Information Platform for Extension and Education (iPiPE) and Dark Sky weather networks. Across all locations, iPiPE-driven model predictions during the soybean flowering period (R1 to R4 growth stages) explained end-of-season disease observations with an accuracy of 81.8% using a probability action threshold of 35%. Dark Sky data, incorporating bias corrections for weather variables, explained end-of-season disease observations with 87.9% accuracy (in 2017 commercial locations in Wisconsin) using a 40% probability threshold. Overall, these validations indicate that these two weather-based apothecial models, using either weather data source, provide disease risk predictions that both reduce unnecessary chemical application and accurately advise applications at critical times.
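A probability action threshold like the 35% or 40% used above amounts to a simple cut on model-predicted probabilities, after which predictions are scored against end-of-season observations. This hypothetical sketch shows that thresholding and accuracy calculation; the function names and example values are illustrative, not from the validation study.

```python
def apply_action_threshold(probs, threshold=0.35):
    """Flag site-days whose model-predicted apothecia probability meets
    or exceeds the action threshold (i.e., a spray would be advised)."""
    return [p >= threshold for p in probs]

def accuracy(predicted, observed):
    """Fraction of predictions matching end-of-season disease observations."""
    return sum(p == o for p, o in zip(predicted, observed)) / len(observed)

# Hypothetical predicted probabilities and observed disease outcomes:
flags = apply_action_threshold([0.20, 0.40, 0.50, 0.10])
acc = accuracy(flags, [False, True, False, False])
```

Raising the threshold trades missed outbreaks for fewer unnecessary applications, which is why the two weather data sources above were validated at slightly different thresholds.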


Subject(s)
Ascomycota/physiology , Fungicides, Industrial/pharmacology , Glycine max/microbiology , Plant Diseases/statistics & numerical data , Algorithms , Ascomycota/drug effects , Flowers/microbiology , Fruiting Bodies, Fungal , Logistic Models , Plant Diseases/microbiology , Regression Analysis , Spores, Fungal , Weather , Wisconsin
12.
Plant Dis ; 102(1): 73-84, 2018 Jan.
Article in English | MEDLINE | ID: mdl-30673449

ABSTRACT

Sclerotinia stem rot (SSR) epidemics in soybean, caused by Sclerotinia sclerotiorum, are currently responsible for annual yield reductions in the United States of up to 1 million metric tons. In-season disease management is largely dependent on chemical control but its efficiency and cost-effectiveness depends on both the chemistry used and the risk of apothecia formation, germination, and further dispersal of ascospores during susceptible soybean growth stages. Hence, accurate prediction of the S. sclerotiorum apothecial risk during the soybean flowering period could enable farmers to improve in-season SSR management. From 2014 to 2016, apothecial presence or absence was monitored in three irrigated (n = 1,505 plot-level observations) and six nonirrigated (n = 2,361 plot-level observations) field trials located in Iowa (n = 156), Michigan (n = 1,400), and Wisconsin (n = 2,310), for a total of 3,866 plot-level observations. Hourly air temperature, relative humidity, dew point, wind speed, leaf wetness, and rainfall were also monitored continuously, throughout the season, at each location using high-resolution gridded weather data. Logistic regression models were developed for irrigated and nonirrigated conditions using apothecial presence as a binary response variable. Agronomic variables (row width) and weather-related variables (defined as 30-day moving averages, prior to apothecial presence) were tested for their predictive ability. In irrigated soybean fields, apothecial presence was best explained by row width (r = -0.41, P < 0.0001), 30-day moving averages of daily maximum air temperature (r = 0.27, P < 0.0001), and daily maximum relative humidity (r = 0.16, P < 0.05). In nonirrigated fields, apothecial presence was best explained by using moving averages of daily maximum air temperature (r = -0.30, P < 0.0001) and wind speed (r = -0.27, P < 0.0001). 
These models correctly predicted (overall accuracy of 67 to 70%) apothecial presence during the soybean flowering period for four independent datasets (n = 1,102 plot-level observations or 30 daily mean observations).


Subject(s)
Ascomycota/physiology , Crop Production/methods , Glycine max , Plant Diseases/microbiology , Weather , Ascomycota/growth & development , Iowa , Logistic Models , Michigan , Risk , Glycine max/growth & development , Spores, Fungal/physiology , Wisconsin
13.
Curr Pharm Teach Learn ; 9(6): 962-965, 2017 Nov.
Article in English | MEDLINE | ID: mdl-29233392

ABSTRACT

INTRODUCTION: There are ongoing assessment and improvement activities related to strategies to improve the quality of education in the complex and resource-intensive area of experiential education (EE). One undescribed approach for design and delivery of EE programs for schools and colleges, with reliance on volunteer preceptors, is to utilize clinical practice faculty in formal partnerships with EE leadership to enhance curriculum and assessment. COMMENTARY AND IMPLICATIONS: Clinical practice faculty, who possess practice setting expertise, can serve as course directors for advanced pharmacy practice experience (APPE) rotations. In this role, they can collaborate with EE faculty and staff to create more course-specific expectations, learning objectives, and criteria for APPE rotation experiences. This model could increase consistency for students and preceptors, using an approach that is analogous to content experts serving as course directors in didactic curriculum. This commentary explores the potential of this strategy to increase quality and consistency in EE.


Subject(s)
Faculty, Pharmacy/psychology , Problem-Based Learning/standards , Work Engagement , Curriculum/standards , Curriculum/trends , Education, Pharmacy/standards , Education, Pharmacy/trends , Humans , Program Development/methods , Students, Pharmacy/psychology , Workforce
14.
BMC Health Serv Res ; 17(1): 127, 2017 02 11.
Article in English | MEDLINE | ID: mdl-28187730

ABSTRACT

BACKGROUND: Ischemic stroke is a risk associated with atrial fibrillation (AF) and is estimated to occur five times more often in afflicted patients than in those without AF. Antithrombotic therapy is recommended for the prevention of ischemic stroke. Risk stratification tools for predicting stroke in patients with AF, such as the CHADS2 and, more recently, the CHA2DS2-VASc, have been developed to determine the level of stroke risk and assist clinicians in the selection of antithrombotic therapy. Warfarin, for stroke prevention in AF, is the most commonly prescribed anticoagulant in North America. The purpose of this study was to examine the utility of using the CHADS2 score levels (low and high), in contrast to the CHA2DS2-VASc, when examining the outcome of warfarin prescriptions for adult patients with AF. The CHA2DS2-VASc tool was not widely used in 2010, when the data analyzed were collected; only since 2014 have the CHA2DS2-VASc criteria been recommended to guide anticoagulant treatment in updated AF treatment guidelines. METHODS: Bivariate and multivariate data analysis strategies were used to analyze 2010 National Ambulatory Medical Care Survey (NAMCS) data. NAMCS is designed to collect data on the use and provision of ambulatory care services nationwide. The study population for this research was US adults with a diagnosis of AF; warfarin prescription was the dependent variable. The study population was 7,669,844 AF patients. RESULTS: Bivariate analysis revealed that 25.1% of AF patients with a high CHADS2 score had received a warfarin prescription, as had 18.8% of those with a high CHA2DS2-VASc score.
Logistic regression analysis showed that patients with AF had higher odds of having a warfarin prescription if they had a high CHADS2 score, were Caucasian, lived in a zip code where < 20% of the population had a university education, and lived in a zip code where < 10% of the population were living in households with incomes below the federal poverty level. Further, the analysis showed that patients with AF had lower odds of having a warfarin prescription if they were ≥ 65 years of age, female, or had health insurance. CONCLUSIONS: Overall, warfarin appears to be under-prescribed for patients with AF regardless of the risk stratification system used. Based on the key findings of our study, opportunities for intervention are present to improve guideline adherence in alignment with risk stratification for stroke prevention. Interprofessional health care teams can provide improved medical management of stroke prevention for patients with AF. These teams should include primary care providers (physicians, physician assistants, and nurse practitioners), nurses (RN, LPN), and pharmacists (PharmD, RPh).
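Both risk stratification tools discussed above are simple additive point scores, which is what makes them practical at the point of care. The sketch below implements the standard published point assignments; the function names and argument conventions are illustrative, not from the study.

```python
def chads2(chf, hypertension, age, diabetes, stroke_tia):
    """CHADS2 score: 1 point each for congestive heart failure,
    hypertension, age >= 75, and diabetes; 2 points for prior
    stroke/TIA."""
    return (int(chf) + int(hypertension) + (age >= 75)
            + int(diabetes) + 2 * int(stroke_tia))

def cha2ds2_vasc(chf, hypertension, age, diabetes, stroke_tia,
                 vascular_disease, female):
    """CHA2DS2-VASc score: extends CHADS2 with vascular disease,
    age 65-74 (1 point), and female sex, and upgrades age >= 75
    to 2 points."""
    score = (int(chf) + int(hypertension) + int(diabetes)
             + int(vascular_disease) + int(female)
             + 2 * int(stroke_tia))
    if age >= 75:
        score += 2
    elif age >= 65:
        score += 1
    return score
```

Because CHA2DS2-VASc adds criteria, a patient's CHA2DS2-VASc score is always at least their CHADS2 score minus the age recount, which is why the two tools classify different fractions of patients as high risk.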


Subject(s)
Atrial Fibrillation/drug therapy , Guideline Adherence , Stroke/etiology , Warfarin/administration & dosage , Aged , Anticoagulants/therapeutic use , Atrial Fibrillation/complications , Cardiovascular Diseases/drug therapy , Female , Humans , Male , Middle Aged , Multivariate Analysis , North America , Risk Assessment/methods , Risk Factors
15.
J Investig Med ; 65(1): 15-22, 2017 01.
Article in English | MEDLINE | ID: mdl-27619555

ABSTRACT

The National Institutes of Health's concept of team science is a means of addressing complex clinical problems by applying conceptual and methodological approaches from multiple disciplines and health professions. The ultimate goal is improved quality of patient care with an emphasis on better population health outcomes. Collaborative research practice occurs when researchers from more than one health-related profession engage in scientific inquiry to jointly create and disseminate new knowledge to clinical and research health professionals in order to provide the highest quality of patient care and improve population health outcomes. Training of clinicians and researchers is necessary to produce clinically relevant evidence upon which to base patient care for disease management and empirically guided team-based patient care. In this study, we hypothesized that team science is an example of effective and impactful interprofessional collaborative research practice. To assess this hypothesis, we examined the contemporary literature on the science of team science (SciTS) produced in the past 10 years (2005-2015) and related the SciTS to the overall field of interprofessional collaborative practice, of which collaborative research practice is a subset. A modified Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) approach was employed to analyze the SciTS literature in light of the general question: Is team science an example of interprofessional collaborative research practice? After completing a systematic review of the SciTS literature, the posed hypothesis was accepted, and we concluded that team science is a dimension of interprofessional collaborative practice.


Subject(s)
Cooperative Behavior , Interprofessional Relations , Research , Science , Humans
16.
Am J Pharm Educ ; 80(4): 57, 2016 May 25.
Article in English | MEDLINE | ID: mdl-27293224

ABSTRACT

The profession of pharmacy is facing a shifting health system context that holds both opportunity and risk. If the profession is to advance, pharmacists must be recognized as consistent members of the health care team in all clinical settings, contributing at the fullest extent of their licensure and education. One part of achieving this broad goal is to implement a new way of defining and assessing pharmacy practice skills, such as entrustable professional activities (EPAs). Assessment of professional tasks and practice activities with EPAs has been successfully implemented in medical education for assessing trainee preparation for practice. This EPA model is being applied to pharmacy education to develop an assessment framework across the advanced pharmacy practice experience (APPE) curriculum. The APPE course directors, practice faculty members, and the Office of Experiential Education collaboratively defined a set of universal EPAs critical for pharmacists in any practice setting, which would be assessed in all practice experience types.


Subject(s)
Clinical Competence/standards , Pharmaceutical Services/standards , Pharmacists/standards , Professional Role , Trust , Education, Pharmacy/methods , Education, Pharmacy/standards , Humans
17.
J Surg Res ; 201(1): 181-7, 2016 Mar.
Article in English | MEDLINE | ID: mdl-26850200

ABSTRACT

BACKGROUND: We report our experience dosing and monitoring enoxaparin with anti-factor Xa activity (anti-FXaA) levels for venous thromboembolism prophylaxis in trauma patients (TP). MATERIALS AND METHODS: TP receiving standard, non-weight-based doses of enoxaparin administered every 12 h for venous thromboembolism prophylaxis, with peak anti-FXaA levels measured, were prospectively monitored and evaluated, and those whose first anti-FXaA level was ≥ 0.2 IU/mL were compared with those whose level was < 0.2 IU/mL. Anti-FXaA levels and enoxaparin dose (mg/kg actual body weight) were evaluated for correlation. RESULTS: Of the 51 TP included, the initial anti-FXaA level was < 0.2 IU/mL in 37 (72.5%), whose dose was lower than that of patients within the target range (0.38 [0.32-0.42] mg/kg versus 0.45 [0.39-0.48] mg/kg, P = 0.003). Thirty-seven TP achieved an anti-FXaA level ≥ 0.2 IU/mL (23 requiring dose increases) at a dose of 0.49 [0.44-0.54] mg/kg. Correlation between dose and anti-FXaA levels was noted for the initial 51 anti-FXaA levels (r = 0.360, P = 0.009) and for all 103 anti-FXaA levels (r = 0.556, P < 0.001). CONCLUSIONS: Non-weight-based enoxaparin dosing did not achieve target anti-FXaA levels in most TP. Higher anti-FXaA levels correlated with larger weight-based enoxaparin doses. Weight-based enoxaparin dosing (i.e., 0.5 mg/kg subcutaneously every 12 h) would better achieve target anti-FXaA levels.
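The weight-based regimen suggested in the conclusion above (0.5 mg/kg every 12 h) is a straightforward calculation. The sketch below assumes rounding to a 10 mg syringe increment for practicality; that increment, and the function name, are illustrative assumptions, not part of the study.

```python
def weight_based_enoxaparin_dose(weight_kg, mg_per_kg=0.5, increment_mg=10):
    """Weight-based prophylactic enoxaparin dose (0.5 mg/kg every 12 h),
    rounded to the nearest syringe increment. The 10 mg rounding
    increment is an assumption for illustration, not from the study."""
    return round(weight_kg * mg_per_kg / increment_mg) * increment_mg
```

For example, an 80 kg patient would receive 40 mg per dose, versus the fixed doses that left most patients in this cohort below the 0.2 IU/mL anti-FXaA target.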


Subject(s)
Anticoagulants/administration & dosage , Drug Monitoring/methods , Enoxaparin/administration & dosage , Factor Xa/metabolism , Venous Thromboembolism/prevention & control , Adult , Female , Humans , Male , Middle Aged , Retrospective Studies
20.
Pest Manag Sci ; 71(12): 1649-56, 2015 Dec.
Article in English | MEDLINE | ID: mdl-25582896

ABSTRACT

BACKGROUND: Multiple applications of pyrethroid insecticides are used to manage European corn borer, Ostrinia nubilalis Hübner, in snap bean, but new diamide insecticides may reduce application frequency. In a 2 year small-plot study, O. nubilalis control was evaluated by applying cyantraniliprole (diamide) and bifenthrin (pyrethroid) insecticides at one of three phenological stages (bud, bloom and pod formation) of snap bean development. Co-application of these insecticides with either herbicides or fungicides was also examined as a way to reduce the total number of sprays during a season. RESULTS: Cyantraniliprole applications timed either during bloom or during pod formation controlled O. nubilalis better than similar timings of bifenthrin. Co-applications of insecticides with fungicides controlled O. nubilalis as well as insecticide applications alone. Insecticides applied either alone or with herbicides during bud stage did not control this pest. CONCLUSION: Diamides are an alternative to pyrethroids for the management of O. nubilalis in snap bean. Adoption of diamides by snap bean growers could improve the efficiency of production by reducing the number of sprays required each season.


Subject(s)
Insecticides , Moths , Pest Control/methods , Animals , Fabaceae/growth & development , Fabaceae/parasitology , Fungicides, Industrial , Herbicides , Pyrazoles , Pyrethrins , ortho-Aminobenzoates