Results 1 - 20 of 25
1.
JMIR Res Protoc; 11(10): e36607, 2022 Oct 20.
Article in English | MEDLINE | ID: mdl-36264626

ABSTRACT

BACKGROUND: Older adults with cognitive impairment have more emergency department visits and 30-day readmissions and are more likely to die after visiting the emergency department than people without cognitive impairment. Emergency department providers frequently do not identify cognitive impairment. Use of cognitive screening tools, along with better understanding of root causes for emergency department visits, could equip health care teams with the knowledge needed to develop individually tailored care management strategies for post-emergency department care. By identifying and directly addressing patients' and informal caregivers' (or care partners') psychosocial and health care needs, such strategies could reduce the need for repeat acute care. We have used the terms "caregiver" and "care partner" interchangeably. OBJECTIVE: We aimed to describe the protocol for a randomized controlled trial of a new care management intervention, the Program of Intensive Support in Emergency Departments for Care Partners of Cognitively Impaired Patients (POISED) trial, compared with usual care. We described the research design, intervention, outcome measures, data collection techniques, and analysis plans. METHODS: Emergency department patients who were aged ≥75 years and screened positive for cognitive impairment via either the Mini-Cog or the proxy-reported Short Informant Questionnaire on Cognitive Decline in the Elderly, with a planned discharge to home, were recruited to participate with their identified informal (family or friend) caregiver in the 2-site POISED randomized controlled trial at New York University Langone Health and Indiana University. The intervention group received 6 months of care management from the POISED Care Team of registered nurses and specialty-trained paraprofessionals, who performed root cause analyses, administered standardized assessments, provided advice, recommended appropriate referrals, and, when applicable, implemented dementia-specific comorbid condition protocols. The control group received care as recommended at emergency department discharge (usual care) and was given information about resources for further cognitive assessment. The primary outcome is repeat emergency department use; secondary outcomes include caregiver activation for patient health care management and caregiver depression, anxiety, and experience of social support, as important predisposing and time-varying enabling and need characteristics. Data were collected from questionnaires and patients' electronic health records. RESULTS: Recruitment was conducted between March 2018 and May 2021. Study findings will be published in peer-reviewed journals and presented to peer audiences, decision makers, stakeholders, and other interested persons. CONCLUSIONS: The POISED intervention is a promising approach to tailoring care management based on root causes for emergency department admission of patients with cognitive impairment, with the aim of reducing readmissions. This trial will provide insights for caregivers and emergency department and primary care providers on appropriate, personalized, and proactive treatment plans for older adults with cognitive impairment. The findings will be relevant to audiences concerned with quality of life for individuals with cognitive impairment and their caregivers. TRIAL REGISTRATION: ClinicalTrials.gov NCT03325608; https://clinicaltrials.gov/ct2/show/NCT03325608. INTERNATIONAL REGISTERED REPORT IDENTIFIER (IRRID): DERR1-10.2196/36607.

2.
Theor Appl Genet; 134(9): 2749-2766, 2021 Sep.
Article in English | MEDLINE | ID: mdl-34117909

ABSTRACT

KEY MESSAGE: Polygenic genome-wide association mapping identified two regions of the cowpea genome associated with different components of resistance to its major post-harvest pest, the seed beetle Callosobruchus maculatus. Cowpea (Vigna unguiculata) is an important grain and fodder crop in arid and semi-arid regions of Africa, Asia, and South America, where the cowpea seed beetle, Callosobruchus maculatus, is a serious post-harvest pest. Development of cultivars resistant to C. maculatus population growth in storage could increase grain yield and quality and reduce reliance on insecticides. Here, we use a MAGIC (multi-parent, advanced-generation intercross) population of cowpea consisting of 305 recombinant inbred lines (RILs) to identify genetic variants associated with resistance to seed beetles. Because inferences regarding the genetic basis of resistance may depend on the source of the pest or the assay protocol, we used two divergent geographic populations of C. maculatus and two complementary assays to measure several aspects of resistance. Using polygenic genome-wide association mapping models, we found that the cowpea RILs harbor substantial additive-genetic variation for most resistance measures. Variation in several components of resistance, including larval development time and survival, was largely explained by one or several linked loci on chromosome 5. A second region on chromosome 8 explained increased seed resistance via the induction of early-exiting larvae. Neither of these regions contained genes previously associated with resistance to insects that infest grain legumes. We found some evidence of gene-gene interactions affecting resistance, but epistasis did not contribute substantially to resistance variation in this mapping population. The combination of mostly high heritabilities and a relatively consistent and simple genetic architecture increases the feasibility of breeding for enhanced resistance to C. maculatus.
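
For readers unfamiliar with the mechanics of association mapping, the sketch below shows a minimal single-marker scan of the kind that underlies GWAS: regress each phenotype on marker dosage and record the p-value per marker. It is a simplified stand-in for the polygenic models used in the study; the genotype matrix, phenotype vector, and causal marker are all simulated, not the cowpea data.

```python
# Minimal single-marker association scan (illustrative only; the study used
# polygenic GWAS models). Genotypes and phenotypes here are simulated.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n_ril, n_marker = 305, 1000                # 305 RILs, as in the MAGIC population
G = rng.integers(0, 3, (n_ril, n_marker)).astype(float)  # marker dosages 0/1/2
y = 0.5 * G[:, 42] + rng.normal(size=n_ril)  # phenotype with one causal marker

pvals = np.empty(n_marker)
for j in range(n_marker):
    # regress phenotype on dosage at marker j
    pvals[j] = stats.linregress(G[:, j], y).pvalue
print("strongest association at marker", pvals.argmin())
```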


Subject(s)
Chromosomes, Plant/genetics; Coleoptera/physiology; Disease Resistance/immunology; Genetic Variation; Plant Diseases/immunology; Plant Proteins/metabolism; Vigna/genetics; Animals; Chromosome Mapping/methods; Disease Resistance/genetics; Gene Expression Regulation, Plant; Genome-Wide Association Study; Plant Diseases/genetics; Plant Diseases/parasitology; Plant Proteins/genetics; Vigna/growth & development; Vigna/parasitology
3.
Evol Appl; 13(10): 2597-2609, 2020 Dec.
Article in English | MEDLINE | ID: mdl-33294011

ABSTRACT

Environmental stress can have a profound effect on inbreeding depression. Quantifying this effect is of particular importance in threatened populations, which are often simultaneously subject to both inbreeding and environmental stress. But while the prevalence of inbreeding-stress interactions is well known, the importance and broader applicability of such interactions in conservation are not clearly understood. We used seed beetles, Callosobruchus maculatus, as a model system to quantify how environmental stressors (here host quality and temperature stress) interact with inbreeding as measured by changes in the magnitude of inbreeding depression, δ, as well as the relative importance of inbreeding-stress interactions to overall fitness. We found that while both environmental stressors caused substantial inbreeding-stress interactions as measured by change in δ, the relative importance of these interactions to overall survival was modest. This suggests that assessing inbreeding-stress interactions within the framework of δ alone may give an inaccurate representation of the relevance of interactions to population persistence. Furthermore, we found that the effect of environmental stress on fitness, but not inbreeding depression, varied strongly among populations. These results suggest that the outcomes of inbreeding-stress interactions are not easily generalized, an important consideration in conservation settings.
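
As a point of reference, inbreeding depression is conventionally quantified as δ = 1 − (w_inbred / w_outbred), where w is mean fitness (here, survival). The snippet below illustrates how an inbreeding-stress interaction shows up as a change in δ across environments; the survival values are hypothetical, not the study's data.

```python
# delta = 1 - (mean fitness of inbred / mean fitness of outbred).
# Hypothetical survival rates; a larger delta under stress signals an
# inbreeding-stress interaction.
def inbreeding_depression(w_inbred, w_outbred):
    return 1.0 - w_inbred / w_outbred

print(inbreeding_depression(0.60, 0.80))  # benign host:    delta = 0.25
print(inbreeding_depression(0.30, 0.60))  # stressful host: delta = 0.50
```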

4.
Environ Entomol; 49(4): 938-946, 2020 Aug 20.
Article in English | MEDLINE | ID: mdl-32484545

ABSTRACT

The ability to adapt to a novel host plant may vary among insect populations with different genetic histories, and colonization of a marginal host may be facilitated by genetic admixture of disparate populations. We assembled populations of the seed beetle, Callosobruchus maculatus (F.), from four continents, and compared their ability to infest two hosts, lentil and pea. We also formed two cross-continent hybrids (Africa × N.A. and Africa × S.A.). In pre-selection assays, survival was only ~3% in lentil and ~40% in pea. For three replicate populations per line, colonization success on lentil was measured as cumulative exit holes after 75-175 d. On pea, we estimated the change in larval survival after five generations of selection. Females in all lines laid few eggs on lentil, and survival of F1 larvae was uniformly <5%. Subsequently, however, the lines diverged considerably in population growth. Performance on lentil was highest in the Africa × N.A. hybrid, which produced far more adults (mean > 11,000) than either parental line. At the other extreme, Asian populations on lentil appeared to have gone extinct. The Africa × N.A. line also exhibited the highest survival on pea, and again performed better than either parent line. However, no line displayed a rapid increase in survival on pea, as is sometimes observed on lentil. Our results demonstrate that geographic populations can vary substantially in their responses to the same novel resource. In addition, genetic admixtures (potentially caused by long-distance transport of infested seeds) may facilitate colonization of an initially poor host.


Subject(s)
Coleoptera; Animals; Coleoptera/genetics; Female; Larva/genetics; Ovum
5.
Genes (Basel); 11(4), 2020 Apr 08.
Article in English | MEDLINE | ID: mdl-32276323

ABSTRACT

Genes that affect adaptive traits have been identified, but our knowledge of the genetic basis of adaptation in a more general sense (across multiple traits) remains limited. We combined population-genomic analyses of evolve-and-resequence experiments, genome-wide association mapping of performance traits, and analyses of gene expression to fill this knowledge gap and shed light on the genomics of adaptation to a marginal host (lentil) by the seed beetle Callosobruchus maculatus. Using population-genomic approaches, we detected modest parallelism in allele frequency change across replicate lines during adaptation to lentil. Mapping populations derived from each lentil-adapted line revealed a polygenic basis for two host-specific performance traits (weight and development time), which had low to modest heritabilities. We found less evidence of parallelism in genotype-phenotype associations across these lines than in allele frequency changes during the experiments. Differential gene expression caused by differences in recent evolutionary history exceeded that caused by immediate rearing host. Together, the three genomic datasets suggest that genes affecting traits other than weight and development time are likely to be the main causes of parallel evolution and that detoxification genes (especially cytochrome P450s and beta-glucosidase) could be especially important for colonization of lentil by C. maculatus.
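
One simple way to quantify the "modest parallelism" described here is to correlate per-locus allele-frequency changes between replicate lines: a correlation near 1 indicates parallel evolution, near 0 indicates idiosyncratic change. The sketch below uses simulated frequencies and only illustrates the idea, not the study's actual pipeline.

```python
# Parallelism as the cross-line correlation of per-locus allele-frequency
# change (delta-p). All data here are simulated for illustration.
import numpy as np

rng = np.random.default_rng(7)
n_loci = 5000
shared = rng.normal(0.0, 0.02, n_loci)             # component common to both lines
dp_line1 = shared + rng.normal(0.0, 0.05, n_loci)  # line-specific change
dp_line2 = shared + rng.normal(0.0, 0.05, n_loci)

r = np.corrcoef(dp_line1, dp_line2)[0, 1]
print(f"cross-line correlation of delta-p: {r:.2f}")  # modest parallelism
```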


Subject(s)
Coleoptera/genetics; Fabaceae/parasitology; Host-Parasite Interactions/genetics; Selection, Genetic; Adaptation, Physiological/genetics; Animals; Coleoptera/pathogenicity; Gene Frequency/genetics; Genomics; Larva/parasitology; Phenotype; Seeds/parasitology
6.
J Econ Entomol; 112(5): 2418-2424, 2019 Sep 23.
Article in English | MEDLINE | ID: mdl-31081895

ABSTRACT

Cowpea, Vigna unguiculata (L.) Walp., serves as a major source of dietary protein in many tropical and subtropical regions around the world. To identify loci associated with agronomically desirable traits, eight elite cowpea cultivars were systematically inter-crossed for eight generations to yield 305 recombinant inbred lines. Here, we investigated whether these founder parents also possess resistance to the seed beetle Callosobruchus maculatus (F.), a highly destructive post-harvest pest. We estimated larval survival in seeds, egg-to-adult development time, adult mass at emergence, and seed acceptance for oviposition. Survival varied significantly among cowpea cultivars, but the pattern was complicated by an unexpected source of mortality; on three cultivars, mature larvae in a substantial fraction of seeds (20-36%) exited seeds prematurely and consequently failed to molt into viable adults. Even after such seeds were excluded from the analysis, survival in the remaining seeds varied from 49 to 92% across the eight parents. Development time and body mass also differed among hosts, with particularly slow larval development on three closely related cultivars. Egg-laying females readily accepted all cultivars except one with a moderately rugose seed coat. Overall, suitability ranks of the eight cultivars depended on the beetle trait considered; the cultivar that received the most eggs (IT82E-18) also conferred low survival. However, one cultivar (IT93K-503-1) was a relatively poor host for all traits. Given the magnitude of variation among parental cultivars, future assays of genotyped recombinant progeny can identify genomic regions and candidate genes associated with resistance to seed beetles.


Subject(s)
Coleoptera; Vigna; Animals; Female; Larva; Oviposition; Seeds
7.
Mol Ecol; 28(9): 2136-2154, 2019 May.
Article in English | MEDLINE | ID: mdl-30963641

ABSTRACT

Rapid adaptation can prevent extinction when populations are exposed to extremely marginal or stressful environments. Factors that affect the likelihood of evolutionary rescue from extinction have been identified, but much less is known about the evolutionary dynamics (e.g., rates and patterns of allele frequency change) and genomic basis of successful rescue, particularly in multicellular organisms. We conducted an evolve-and-resequence experiment to investigate the dynamics of evolutionary rescue at the genetic level in the cowpea seed beetle, Callosobruchus maculatus, when it is experimentally shifted to a stressful host plant, lentil. Low survival (~1%) at the onset of the experiment caused population decline. But adaptive evolution quickly rescued the population, with survival rates climbing to 69% by the F5 generation and 90% by the F10 generation. Population genomic data showed that rescue likely was caused by rapid evolutionary change at multiple loci, with many alleles fixing or nearly fixing within five generations of selection on lentil. Selection on these loci was only moderately consistent in time, but parallel evolutionary changes were evident in sublines formed after the lentil line had passed through a bottleneck. By comparing estimates of selection and genomic change on lentil across five independent C. maculatus lines (the new lentil-adapted line, three long-established lines and one case of failed evolutionary rescue), we found that adaptation on lentil occurred via somewhat idiosyncratic evolutionary changes. Overall, our results suggest that evolutionary rescue in this system can be caused by very strong selection on multiple loci driving rapid and pronounced genomic change.
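
To get intuition for "very strong selection," a back-of-envelope selection coefficient can be recovered from allele-frequency change using the haploid selection recursion p_t/(1−p_t) = (p_0/(1−p_0))(1+s)^t. The paper fits fuller Bayesian models; the function below, with hypothetical frequencies, only illustrates the scale of s implied by near-fixation within five generations.

```python
# Haploid-selection approximation: solve (1+s)^t = odds(p_t)/odds(p_0) for s.
# Frequencies are hypothetical; the study estimated selection with Bayesian models.
def estimate_s(p0, pt, t):
    odds_ratio = (pt / (1.0 - pt)) / (p0 / (1.0 - p0))
    return odds_ratio ** (1.0 / t) - 1.0

# e.g. an allele rising from 0.10 to 0.90 within five generations on lentil
print(f"implied s ~ {estimate_s(0.10, 0.90, 5):.2f}")  # ~1.4: very strong selection
```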


Subject(s)
Coleoptera/genetics; Selection, Genetic; Adaptation, Physiological/genetics; Animals; Bayes Theorem; Biological Evolution; Gene Frequency; Genetic Drift; Genetic Fitness; Lens Plant; Linkage Disequilibrium; Models, Genetic; Polymorphism, Single Nucleotide; Seeds
8.
Contemp Clin Trials; 72: 137-145, 2018 Sep.
Article in English | MEDLINE | ID: mdl-30125731

ABSTRACT

Nearly 85% of patients who present to the emergency department (ED) with acute heart failure (AHF) are hospitalized. Once hospitalized, 27% of patients are re-hospitalized or die within 30 days of discharge. Attempts to improve outcomes with novel therapies have all failed. The evidence for existing AHF therapies is poor: no currently used AHF treatment is known to improve long-term outcomes. ED treatment is largely the same today as 40 years ago. Admitting patients who could have avoided hospitalization may contribute to adverse outcomes. Hospitalization is not benign; patients enter a vulnerable phase post-discharge, at increased risk for morbidity and mortality. When hospitalization can be shortened or avoided completely, certain risks can be mitigated, including the risk of medication errors, in-hospital falls, delirium, nosocomial infections, and other iatrogenic complications. Additionally, patients would prefer to be home, not hospitalized. Furthermore, hospitalization and re-hospitalization for AHF predominantly affect patients of lower socioeconomic status (SES). Avoiding hospitalization in patients who do not require admission may improve outcomes and quality of life while reducing costs. Short-stay unit (SSU: <24 h, also referred to as an 'observation unit') management of AHF may be effective for lower-risk patients. However, to date there have been only small studies or retrospective analyses of SSU management for AHF patients. In addition, SSU management has been considered 'cheating' for hospitals trying to avoid 30-day readmission penalties, as SSU or observation-unit stays do not count as admissions. However, more recent analyses demonstrate that differential use of observation status has not led to decreases in readmission, suggesting this concern may be misplaced. We therefore propose a robust clinical effectiveness trial to demonstrate the effectiveness of this patient-centered strategy.


Subject(s)
Clinical Observation Units; Heart Failure/therapy; Hospitalization; Acute Disease; Cost-Benefit Analysis; Disease Management; Emergency Service, Hospital; Humans; Mortality; Patient Outcome Assessment; Patient Readmission; Quality of Life; Treatment Outcome
9.
Environ Entomol; 47(5): 1194-1202, 2018 Oct 03.
Article in English | MEDLINE | ID: mdl-30052864

ABSTRACT

Cosmopolitan pests can consist of geographic populations that differ in their current host ranges or in their ability to colonize a novel host. We compared the responses of cowpea-adapted seed beetle populations (Callosobruchus maculatus [F.] (Coleoptera: Chrysomelidae: Bruchinae)) from Africa, North America, and South America to four novel legumes: chickpea, lentil, mung bean, and pea. We also qualitatively compared these results to those obtained earlier for an Asian population. For each host, we measured larval survival to adult emergence and used both no-choice and choice tests to estimate host acceptance. The pattern of larval survival was similar among populations: high or moderately high survival on cowpea, mung bean, and chickpea, intermediate survival on pea, and very low survival on lentil. One exception was unusually high survival of African larvae on pea, and there was modest variation among populations for survival on lentil. The African population was also an outlier with respect to host acceptance; under no-choice conditions, African females showed a much greater propensity to accept the two least preferred hosts, chickpea and lentil. However, greater acceptance of these hosts by African females was not evident in choice tests. Inferences about population differences in host acceptance can thus strongly depend on experimental protocol. Future selection experiments can be used to determine whether the observed population differences in initial performance will affect the probability of producing self-sustaining populations on a marginal crop host.


Subject(s)
Coleoptera/physiology; Fabaceae/parasitology; Animals; Choice Behavior; Female; Host Specificity; Larva/physiology; Male; Seeds/parasitology
10.
Clin Nephrol; 88(10): 181-192, 2017 Oct.
Article in English | MEDLINE | ID: mdl-28818188

ABSTRACT

BACKGROUND: Current estimates suggest 6,500 undocumented end-stage renal disease (ESRD) patients in the United States are ineligible for scheduled hemodialysis and require emergent dialysis. In order to remain in compliance with Emergency Medicaid, an academic health center altered its emergency dialysis criteria from those emphasizing interdialytic interval to a set emphasizing numerical thresholds. We report the impact of this administrative change on the biochemical parameters, utilization, and adverse outcomes in an undocumented patient cohort. METHODS: This retrospective case series examines 19 undocumented ESRD patients during a 6-month transition divided into three 2-month periods (P1, P2, P3). In P1, patients received emergent dialysis based on interdialytic interval and clinical judgment. In P2 (early transition) and P3 (equilibrium), patients were dialyzed according to strict numerical criteria coupled with clinical judgment. RESULTS: Emergent criteria-based dialysis (P2 and P3) was associated with increased potassium, blood urea nitrogen (BUN), and acidosis as compared to P1 (p < 0.05). Overnight hospitalizations were more common in P2 and P3 (p < 0.05). More frequent adverse events were noted in P2 as compared to P1 and P3, with an odds ratio (OR) for the composite endpoint (intubation, bacteremia, myocardial infarction, intensive care unit admission) of 48 (5.9 - 391.2) and 16.5 (2.5 - 108.6), respectively. Per-patient reimbursement-to-cost ratios increased during criteria-based dialysis periods (P1: 1.49, P2: 2.3, P3: 2.49). DISCUSSION: Strict adherence to criteria-based dialysis models increases biochemical abnormalities while improving Medicaid reimbursement for undocumented immigrants. Alternatives to emergent dialysis are required which minimize cost, while maintaining dignity, safety, and quality of life.
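
The composite-endpoint odds ratios reported here come from standard 2×2-table arithmetic; the helper below shows the calculation with a Woolf (log-OR) confidence interval. The counts are hypothetical placeholders, not the cohort's data.

```python
# Odds ratio and 95% CI from a 2x2 table via the Woolf (log) method.
# a, b = events / non-events in the exposed group; c, d = in the unexposed.
# Counts below are hypothetical, not the study's.
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)  # SE of log(OR)
    return or_, math.exp(math.log(or_) - z * se), math.exp(math.log(or_) + z * se)

print(odds_ratio_ci(12, 8, 2, 18))  # e.g. adverse events, P2 vs P1
```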


Subject(s)
Emergencies; Kidney Failure, Chronic/therapy; Renal Dialysis/methods; Undocumented Immigrants; Adult; Female; Hospitalization; Humans; Intensive Care Units; Male; Middle Aged; Quality of Life; Retrospective Studies; United States; Young Adult
11.
Dementia (London); 16(3): 329-343, 2017 Apr.
Article in English | MEDLINE | ID: mdl-26112165

ABSTRACT

PURPOSE OF THE STUDY: The study objective was to understand providers' perceptions regarding identifying and treating older adults with delirium, a common complication of acute illness in persons with dementia, in the pre-hospital and emergency department environments. DESIGN AND METHODS: The authors conducted structured focus group interviews with separate groups of emergency medical services staff, emergency nurses, and emergency physicians. Recordings of each session were transcribed, coded, and analyzed for themes, with representative supporting quotations identified. RESULTS: Providers shared that the busy emergency department environment was the largest challenge to delirium recognition and treatment. When describing delirium, participants frequently detailed hyperactive features of delirium rather than hypoactive features. Participants shared that they employed no clear diagnostic strategy for identifying the condition and that they used heterogeneous approaches to treat the condition. To improve care for older adults with delirium, emergency nurses identified the need for more training around the management of the condition. Emergency medical services providers identified the need for more support in managing agitated patients when in transport to the hospital and more guidance from emergency physicians on what information to collect from the patient's home environment. Emergency physicians felt that delirium care would be improved if they could have baseline mental status data on their patients and if they had access to a simple, accurate diagnostic tool for the condition. IMPLICATIONS: Emergency medical services providers, emergency nurses, and emergency physicians frequently encounter delirious patients, but do not employ clear diagnostic strategies for identifying the condition and have varying levels of comfort in managing the condition. Clear steps should be taken to improve delirium care in the emergency department, including the development of mechanisms to communicate patients' baseline mental status, the adoption of a systematized approach to recognizing delirium, and the institution of a standardized method to treat the condition when identified.


Subject(s)
Attitude of Health Personnel; Delirium/diagnosis; Delirium/therapy; Delirium/nursing; Delirium/psychology; Emergency Medical Services; Focus Groups; Humans; Physicians/psychology
12.
Evolution; 70(6): 1249-64, 2016 Jun.
Article in English | MEDLINE | ID: mdl-27130550

ABSTRACT

Trade-offs have often been invoked to explain the evolution of ecological specialization. Phytophagous insects have been especially well studied, but there has been little evidence that resource-based trade-offs contribute to the evolution of host specialization in this group. Here, we combine experimental evolution and partial genome resequencing of replicate seed beetle selection lines to test the trade-off hypothesis and measure the repeatability of evolution. Bayesian estimates of selection coefficients suggest that rapid adaptation to a poor host (lentil) was mediated by standing genetic variation at multiple genetic loci and involved many of the same variants in replicate lines. Sublines that were then switched back to the ancestral host (mung bean) showed a more gradual and variable (less repeatable) loss of adaptation to lentil. We were able to obtain estimates of variance effective population sizes from genome-wide differences in allele frequencies within and between lines. These estimates were relatively large, which suggests that the contribution of genetic drift to the loss of adaptation following reversion was small. Instead, we find that some alleles that were favored on lentil were selected against during reversion on mung bean, consistent with the genetic trade-off hypothesis.
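
The variance effective population sizes mentioned here are typically obtained with a temporal method: drift inflates the standardized variance in allele-frequency change, F_c, by roughly t/(2Ne) over t generations, so Ne can be estimated as t/(2 F_c). The sketch below simulates drift and inverts that relationship; it omits the sampling-error corrections a real analysis would need, and all numbers are simulated, not the study's.

```python
# Temporal (Nei-Tajima style) Ne estimate: F_c ~ t / (2 Ne), so Ne ~ t / (2 F_c).
# Pure simulation; real analyses must also correct for finite sample sizes.
import numpy as np

rng = np.random.default_rng(3)
p = rng.uniform(0.1, 0.9, 2000)      # starting frequencies at 2000 loci
t, Ne_true = 10, 500
pt = p.copy()
for _ in range(t):                   # t generations of binomial drift
    pt = rng.binomial(2 * Ne_true, pt) / (2 * Ne_true)

z = (p + pt) / 2.0
Fc = np.mean((p - pt) ** 2 / (z * (1.0 - z)))
print(f"estimated Ne ~ {t / (2 * Fc):.0f} (true {Ne_true})")
```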


Subject(s)
Adaptation, Biological; Coleoptera/genetics; Genome, Insect; Polymorphism, Single Nucleotide; Animals; Bayes Theorem; Food Chain; Gene Frequency; Models, Genetic; Sequence Analysis, DNA
13.
J Am Med Dir Assoc; 17(6): 541-6, 2016 Jun 01.
Article in English | MEDLINE | ID: mdl-27052563

ABSTRACT

OBJECTIVES: To describe emergency department (ED) utilization among long-stay nursing home residents with different levels of dementia severity. DESIGN: Retrospective cohort study. SETTING: Public health system. PARTICIPANTS: A total of 4491 older adults (age 65 years and older) who were long-stay nursing home residents. MEASUREMENTS: Patient demographics, dementia severity, comorbidities, ED visits, ED disposition decisions, and discharge diagnoses. RESULTS: Forty-seven percent of all long-stay nursing home residents experienced at least 1 transfer to the ED over the course of a year. At their first ED transfer, 36.4% of the participants were admitted to the hospital, whereas 63.1% of those who visited the ED were not. The median time to first ED visit was 258 days for participants with advanced-stage dementia, 250 days for participants with early- to moderate-stage dementia, and 202 days for participants with no dementia (P = .0034). Multivariate proportional hazards modeling showed that age, race, number of comorbidities, number of hospitalizations in the year prior, and do-not-resuscitate status all significantly influenced participants' time to first ED visit (P < .05 for all). After accounting for these effects, dementia severity (P = .66), years in the nursing home before qualification (P = .46), and gender (P = .36) lost their significance. CONCLUSIONS: This study confirms high rates of transfer of long-stay nursing home residents, with nearly one-half of the participants experiencing at least 1 ED visit over the course of a year. Although dementia severity is not a predictor of time to ED use in our analyses, other factors that influence ED use are readily identifiable. Nursing home providers should be aware of these factors when developing strategies that meet patient care goals and avoid transfer from the nursing home to the ED.
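
The time-to-first-ED-visit analysis described here is a Cox proportional hazards model. A minimal sketch using the lifelines library is shown below; the column names and data are hypothetical stand-ins, not the study's cohort.

```python
# Minimal Cox proportional hazards fit with lifelines (pip install lifelines).
# Data and column names are hypothetical; event = ED visit, 0 = censored.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.DataFrame({
    "days_to_first_ed_visit": [258, 250, 202, 365, 90, 140, 310, 45],
    "visited_ed":             [1,   1,   1,   0,   1,  1,   0,   1],
    "age":                    [82,  79,  88,  75,  91, 84,  77,  93],
    "n_comorbidities":        [3,   5,   2,   1,   6,  4,   2,   7],
})

cph = CoxPHFitter()
cph.fit(df, duration_col="days_to_first_ed_visit", event_col="visited_ed")
cph.print_summary()  # hazard ratios for age and comorbidity count
```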


Subject(s)
Dementia/physiopathology; Emergency Service, Hospital/statistics & numerical data; Nursing Homes; Aged; Aged, 80 and over; Comorbidity; Female; Humans; Long-Term Care; Male; Retrospective Studies; Severity of Illness Index
14.
Alzheimer Dis Assoc Disord; 30(1): 35-40, 2016.
Article in English | MEDLINE | ID: mdl-26523710

ABSTRACT

Although persons with dementia are frequently hospitalized, relatively little is known about the health profile, patterns of health care use, and mortality rates for patients with dementia who access care in the emergency department (ED). We linked data from our hospital system with Medicare and Medicaid claims, Minimum Data Set, and Outcome and Assessment Information Set data to evaluate 175,652 ED visits made by 10,354 individuals with dementia and 15,020 individuals without dementia over 11 years. Survival rates after ED visits and associated charges were examined. Patients with dementia visited the ED more frequently, were hospitalized more often than patients without dementia, and had an increased odds of returning to the ED within 30 days of an index ED visit compared with persons who never had a dementia diagnosis (odds ratio, 2.29; P<0.001). Survival rates differed significantly between patients by dementia status (P<0.001). Mean Medicare payments for ED services were significantly higher among patients with dementia. These results show that older adults with dementia are frequent ED visitors who have greater comorbidity, incur higher charges, are admitted to hospitals at higher rates, return to EDs at higher rates, and have higher mortality after an ED visit than patients without dementia.


Subject(s)
Dementia; Emergency Service, Hospital/statistics & numerical data; Aged; Aged, 80 and over; Comorbidity; Dementia/mortality; Female; Health Care Costs/statistics & numerical data; Humans; Male; Medicaid/statistics & numerical data; Medicare/statistics & numerical data; Middle Aged; Patient Readmission/statistics & numerical data; Survival Rate; United States
15.
Dementia (London); 15(5): 913-30, 2016 Sep.
Article in English | MEDLINE | ID: mdl-25128821

ABSTRACT

PURPOSE OF THE STUDY: Cognitive impairment (CI) is one of several factors known to influence hospitalization, hospital length of stay, and rehospitalization among older adults. Redesigning care delivery systems sensitive to the influence of CI may reduce acute care utilization while improving care quality. To develop a foundation of fundamental needs for health care redesign, we conducted focus groups with inpatient and outpatient providers to identify barriers, facilitators, and suggestions for improvements in care delivery for patients with CI. DESIGN AND METHODS: Focus group sessions were conducted with providers to identify their approach to caring for cognitively impaired hospitalized adults; obstacles and facilitators to providing this care; and suggestions for improving the care process. Using a thematic analysis, two reviewers analyzed these transcripts to develop codes and themes. RESULTS: Seven themes emerged from the focus group transcripts. These were: (1) reflections on serving the cognitively impaired population; (2) descriptions of perceived barriers to care; (3) strategies that improve or facilitate caring for hospitalized older adults; (4) the importance of fostering a hospital friendly to the needs of older adults; (5) the need for educating staff, patients, and caregivers; (6) the central role of good communication; and (7) steps needed to provide more effective care. IMPLICATIONS: Providing effective acute care services to older adults with CI is an important challenge in health care reform. An understanding derived from the perspective of multiple professional disciplines is an important first step. Future research will build on this preliminary study in developing new acute care models for patients with CI.


Subject(s)
Attitude of Health Personnel; Cognitive Dysfunction/therapy; Needs Assessment/organization & administration; Quality of Health Care/organization & administration; Health Services Accessibility/organization & administration; Hospitalization; Humans
16.
Ann Emerg Med; 63(5): 551-560.e2, 2014 May.
Article in English | MEDLINE | ID: mdl-24355431

ABSTRACT

Older adults who visit emergency departments (EDs) often experience delirium, but it is infrequently recognized. A systematic review was therefore conducted to identify which delirium screening tools have been used in ED-based epidemiologic studies of delirium, whether there is a validated set of screening instruments to identify delirium among older adults in the ED or prehospital environments, and what the ideal schedule is for performing a delirium evaluation during an older adult's visit. MEDLINE/EMBASE, Cochrane, PsycINFO, and CINAHL databases were searched from inception through February 2013 for original, English-language research articles reporting on the assessment of older adults' mental status for delirium. Twenty-two articles met all study inclusion criteria. Overall, 7 screening instruments were identified, though only 1 has undergone initial validation for use in the ED environment and a second instrument is currently undergoing such validation. Minimal information was identified to suggest the ideal scheduling of a delirium assessment process to maximize recognition of this condition in the ED. Study results indicate that several delirium screening tools have been used in investigations in the ED, though validation of these instruments for this particular environment has been minimal to date. The ideal interval(s) during which a delirium screening process should take place has yet to be determined. Research will be needed both to validate delirium screening instruments for investigation and clinical care in the ED and to define the ideal timing and form of the delirium assessment process for older adults.


Subject(s)
Delirium/diagnosis; Emergency Service, Hospital; Aged; Geriatric Assessment; Humans; Mass Screening; Neuropsychological Tests
17.
Am J Emerg Med; 31(10): 1495-500, 2013 Oct.
Article in English | MEDLINE | ID: mdl-24035046

ABSTRACT

OBJECTIVE: Many patients discharged from the emergency department (ED) require urgent follow-up with specialty providers. We hypothesized that a unique specialty referral mechanism that minimized barriers would increase follow-up compliance over reported and historical benchmarks. METHODS: Retrospective review of all patients requiring urgent (within 1 month) specialty referrals in 2010 from a safety net hospital ED to dermatology, otolaryngology, neurology, neurosurgery, ophthalmology, urology, plastic surgery, general surgery, or vascular surgery clinics. After specialist input, all patients received a specific follow-up appointment before ED discharge via a dedicated scheduling service. The requirement for payment at the follow-up visit was waived. RESULTS: Of the 1174 patients receiving referrals, 85.6% scheduled an appointment and 80.1% kept that appointment. After logistic regression analysis, the factors that remained significantly associated (P < .05) with appointment-keeping compliance were specialty clinic type (dermatology, 61.5%, to ophthalmology, 98.0%), insurance status (other payer, 87.5%; commercial, 82.8%; Medicaid, 77.9%; Medicare, 85.7%; charity care program, 88.1%; self-pay, 73.0%), age (<18 years, 80.1%; 18-34 years, 75.0%; 35-49 years, 79.2%; 50-64 years, 85.9%; >64 years, 93.9%), and mean length of time between the ED visit and the clinic appointment (kept, 10.5 days; not kept, 14.3 days). The specialty clinic (neurology, 72.8%, to vascular surgery, 100%; P < .001) was also significantly associated with the likelihood that patients would complete the appointment-making process. Race/ethnicity was not associated with either scheduling or keeping an appointment. CONCLUSION: A referral process that minimizes barriers can achieve an 80% follow-up compliance rate. Age, insurance, specialty type, and time to appointment are associated with noncompliance.
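
The adjusted associations reported above come from a logistic regression of appointment-keeping on patient and referral characteristics. A minimal sketch with statsmodels is shown below; the variables and records are hypothetical, not the study's data.

```python
# Minimal logistic regression of kept-appointment status on two predictors,
# using statsmodels. All records here are hypothetical.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(5)
n = 200
days_to_appt = rng.uniform(1, 30, n)             # time from ED visit to clinic
age = rng.uniform(18, 90, n)
logit = 2.0 - 0.12 * days_to_appt + 0.01 * age   # simulated true relationship
kept = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = sm.add_constant(np.column_stack([days_to_appt, age]))
model = sm.Logit(kept, X).fit(disp=0)
print(np.exp(model.params[1:]))  # odds ratios per day and per year of age
```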


Subject(s)
Emergency Service, Hospital; Medicine/organization & administration; Quality Improvement/organization & administration; Referral and Consultation/organization & administration; Adolescent; Adult; Age Factors; Aged; Continuity of Patient Care/organization & administration; Female; Humans; Insurance Coverage/statistics & numerical data; Insurance, Health/statistics & numerical data; Male; Middle Aged; Outcome and Process Assessment, Health Care; Patient Compliance/statistics & numerical data; Quality of Health Care/organization & administration; Retrospective Studies; Time Factors; Young Adult
18.
Environ Entomol; 42(4): 733-42, 2013 Aug.
Article in English | MEDLINE | ID: mdl-23905736

ABSTRACT

Geographic populations of a widespread species can differ in their ability to adapt to a novel environment because they possess different amounts of the requisite genetic variation. We compared responses to the same novel host in ecologically and genetically divergent populations of the seed beetle Callosobruchus maculatus (F.). Populations from Africa and Asia had been derived from and maintained on different legume hosts. In preselection assays, both populations exhibited lower survival, slower development, and smaller size on a third host (adzuki bean), and the difference in performance between the ancestral and novel hosts was especially high for the African population. Replicate lines of each population were switched to adzuki bean or maintained on the ancestral host, and beetle performance was measured on both hosts after 12 generations. Survival on adzuki bean increased substantially in the adzuki-bean lines of the African population, but improved only slightly in the Asian lines. Similarly, only the African adzuki-bean lines exhibited significantly faster development on adzuki bean. Improved performance on adzuki bean did not simultaneously reduce performance on the ancestral host. Together with previous studies, these results confirm that populations of C. maculatus often possess sufficient standing genetic variation for rapid adaptation to a novel host, but the magnitude of the response may depend on the source population. Although international trade in grain legumes can expand beetle host ranges and produce unusual biotypes, the consistent absence of strong genetic trade-offs in larval performance or adult oviposition across hosts makes it unlikely that this insect would form distinct host races.


Subject(s)
Coleoptera/physiology; Fabaceae/physiology; Herbivory; Adaptation, Biological; Animals; Burkina Faso; Coleoptera/genetics; Coleoptera/growth & development; Diet; Female; Genetic Variation; Geography; India; Larva/genetics; Larva/growth & development; Larva/physiology; Male
19.
J Emerg Med; 45(4): e127-31, 2013 Oct.
Article in English | MEDLINE | ID: mdl-23845521

ABSTRACT

BACKGROUND: Fresh human cadavers provide an effective model for procedural training. Currently, there are no realistic models to teach fascial compartment pressure measurement. OBJECTIVES: We created a human cadaver fascial compartment pressure measurement model and studied its feasibility with a pre-post design. METHODS: Three faculty members, following instructions from a common procedure textbook, used a standard handheld intra-compartment pressure monitor (Stryker(®), Kalamazoo, MI) to measure baseline pressures ("unembalmed") in the anterior, lateral, deep posterior, and superficial posterior compartments of the lower legs of a fresh human cadaver. The right femoral artery was then identified by superficial dissection, cannulated distally towards the lower leg, and connected to a standard embalming machine. After a 5-min infusion, the same three faculty members re-measured pressures ("embalmed") of the same compartments on the cannulated right leg. Unembalmed and embalmed readings for each compartment, and baseline readings for each leg, were compared using a two-sided paired t-test. RESULTS: The mean baseline compartment pressures did not differ between the right and left legs. Using the embalming machine, compartment pressure readings increased significantly over baseline for three of four fascial compartments; all in mm Hg (±SD): anterior from 40 (±9) to 143 (±44) (p = 0.08); lateral from 22 (±2.5) to 160 (±4.3) (p < 0.01); deep posterior from 34 (±7.9) to 161 (±15) (p < 0.01); superficial posterior from 33 (±0) to 140 (±13) (p < 0.01). CONCLUSION: We created a novel and measurable fascial compartment pressure measurement model in a fresh human cadaver using a standard embalming machine. Set-up is minimal and the model can be incorporated into teaching curricula.
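
The unembalmed-versus-embalmed comparison above is a two-sided paired t-test; the snippet below reproduces the mechanics with scipy. The three paired readings are illustrative stand-ins for the three raters' measurements of a single compartment, not the study's raw data.

```python
# Two-sided paired t-test on before/after compartment-pressure readings
# (one pair per rater). Values are illustrative, not the study's raw data.
from scipy import stats

unembalmed = [22, 20, 24]     # lateral-compartment-style baselines, mm Hg
embalmed   = [160, 156, 164]  # readings after the embalming-machine infusion

t_stat, p_value = stats.ttest_rel(unembalmed, embalmed)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```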


Subject(s)
Anterior Compartment Syndrome/diagnosis; Education, Medical/methods; Fascia; Manometry; Cadaver; Embalming; Humans; Pressure
20.
Genetica; 136(1): 179-87, 2009 May.
Article in English | MEDLINE | ID: mdl-19039667

ABSTRACT

Independent populations subjected to similar environments often exhibit convergent evolution. An unresolved question is the frequency with which such convergence reflects parallel genetic mechanisms. We examined the convergent evolution of egg-laying behavior in the seed-feeding beetle Callosobruchus maculatus. Females avoid ovipositing on seeds bearing conspecific eggs, but the degree of host discrimination varies among geographic populations. In a previous experiment, replicate lines switched from a small host to a large one evolved reduced discrimination after 40 generations. We used line crosses to determine the genetic architecture underlying this rapid response. The most parsimonious genetic models included dominance and/or epistasis for all crosses. The genetic architecture underlying reduced discrimination in two lines was not significantly different from the architecture underlying differences between geographic populations, but the architecture underlying the divergence of a third line differed from all others. We conclude that convergence of this complex trait may in some cases involve parallel genetic mechanisms.


Subject(s)
Coleoptera/genetics; Evolution, Molecular; Oviposition/genetics; Animals; Coleoptera/physiology; Crosses, Genetic; Female; Genetic Variation; Genetics, Population; Male; Oviposition/physiology