Results 1 - 20 of 64
1.
Injury ; 54(10): 110981, 2023 Oct.
Article in English | MEDLINE | ID: mdl-37596120

ABSTRACT

INTRODUCTION: Suicide claims many lives globally each year, and for every person who dies by suicide, many more attempt it. A national shortage of psychiatrists may prevent many individuals from receiving timely mental health care. For many individuals, the primary entry point into the healthcare system is the emergency department. The trauma service frequently treats patients with severe self-inflicted injuries, and for many this is not the first time. This represents an opportunity for intervention to disrupt the cycle and prevent future deaths. METHODS: We conducted a retrospective chart review of all patients with self-inflicted injuries admitted to the trauma surgery service between 2012 and 2021. All patients older than 10 years were included. RESULTS: Four hundred forty-one patients were admitted for self-injurious behavior during the study period. The majority of patients (71.9%) had a pre-existing mental health disorder. Fifty-six patients suffered fatal injuries; the majority of these patients were White (78.6%) and male (80.3%), and most fatal injuries were inflicted by gunshot (71.4%). Nearly one third of patients with self-inflicted injuries had a history of self-injurious behavior, with an average of 2.7 attempts (SD: ±3.8). CONCLUSIONS: This public health crisis demands interdisciplinary and innovative solutions. Telemedicine may help buttress access to adequate mental health care. More research is needed to identify the barriers individuals encounter in accessing mental health care, both pre- and post-crisis; by identifying these gaps, we can collaboratively bridge them to prevent preventable deaths.


Subject(s)
Self-Injurious Behavior , Suicide , Male , Humans , Child , Trauma Centers , Retrospective Studies , Self-Injurious Behavior/epidemiology , Emergency Service, Hospital
2.
Br J Cancer ; 129(1): 81-93, 2023 07.
Article in English | MEDLINE | ID: mdl-37137996

ABSTRACT

BACKGROUND: People with severe mental illness (SMI) in England are 2.5 times more likely to die prematurely from cancer. Lower participation in screening may be a contributing factor. METHODS: Clinical Practice Research Datalink data for 1.71 million, 1.34 million and 2.50 million adults were assessed (using multivariate logistic regression) for possible associations between SMI and participation in bowel, breast and cervical screening, respectively. RESULTS: Screening participation was lower among adults with SMI than without for bowel (42.11% vs. 58.89%), breast (48.33% vs. 60.44%) and cervical screening (64.15% vs. 69.72%; all p < 0.001). Participation was lowest in those with schizophrenia (bowel, breast, cervical: 33.50%, 42.02%, 54.88%), followed by other psychoses (41.97%, 45.57%, 61.98%) and then bipolar disorder (49.94%, 54.35%, 69.69%; all p < 0.001, except cervical screening in bipolar disorder, p > 0.05). Participation was lowest among people with SMI who live in the most deprived quintile of areas (bowel, breast, cervical: 36.17%, 40.23%, 61.47%) or are of Black ethnicity (34.68%, 38.68%, 64.80%). The higher levels of deprivation and ethnic diversity associated with SMI did not explain the lower participation in screening. CONCLUSIONS: In England, participation in cancer screening is low among people with SMI. Support should be targeted to ethnically diverse and socioeconomically deprived areas, where SMI prevalence is greatest.
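
As an illustration of the kind of multivariate logistic regression described above, the sketch below fits screening participation against SMI group with adjustment for deprivation and ethnicity. The column names and the data-loading step are hypothetical; the CPRD extract itself is not public.

```python
# Minimal sketch of the multivariate logistic regression described above.
# Column names and the data source are hypothetical; the CPRD data used by
# the authors are not publicly available.
import pandas as pd
import statsmodels.formula.api as smf

def fit_screening_model(df: pd.DataFrame):
    """Adjusted odds of screening participation by SMI diagnosis group."""
    return smf.logit(
        "screened ~ C(smi_group) + C(deprivation_quintile) + C(ethnicity)",
        data=df,
    ).fit(disp=False)

# Example usage (one row per screening-eligible adult):
# df = pd.read_csv("bowel_screening_cohort.csv")   # hypothetical extract
# print(fit_screening_model(df).summary())
```

Separate models of this form, one per screening programme, would yield adjusted comparisons analogous to those reported for bowel, breast and cervical screening.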


Subject(s)
Mental Disorders , Uterine Cervical Neoplasms , Female , Adult , Humans , Early Detection of Cancer , Cross-Sectional Studies , Uterine Cervical Neoplasms/diagnosis , Uterine Cervical Neoplasms/epidemiology , Uterine Cervical Neoplasms/complications , Mental Disorders/epidemiology , Mental Disorders/complications , Primary Health Care
3.
Proteomes ; 11(1)2023 Feb 09.
Article in English | MEDLINE | ID: mdl-36810563

ABSTRACT

For potato crops, host resistance is currently the most effective and sustainable tool to manage diseases caused by the plasmodiophorid Spongospora subterranea. Arguably, zoospore root attachment is the most critical phase of infection; however, the underlying mechanisms remain unknown. This study investigated the potential role of root-surface cell-wall polysaccharides and proteins in cultivars resistant/susceptible to zoospore attachment. We first compared the effects of enzymatic removal of root cell-wall proteins, N-linked glycans and polysaccharides on S. subterranea attachment. Subsequent analysis of peptides released by trypsin shaving (TS) of root segments identified 262 proteins that were differentially abundant between cultivars. These were enriched in root-surface-derived peptides but also included intracellular proteins, e.g., proteins associated with glutathione metabolism and lignin biosynthesis, which were more abundant in the resistant cultivar. Comparison with whole-root proteomic analysis of the same cultivars identified 226 proteins specific to the TS dataset, of which 188 were significantly different. Among these, the pathogen-defence-related cell-wall protein stem 28 kDa glycoprotein and two major latex proteins were significantly less abundant in the resistant cultivar. A further major latex protein was reduced in the resistant cultivar in both the TS and whole-root datasets. In contrast, three glutathione S-transferase proteins were more abundant in the resistant cultivar (TS-specific), while the protein glucan endo-1,3-beta-glucosidase was increased in both datasets. These results imply a particular role for major latex proteins and glucan endo-1,3-beta-glucosidase in regulating zoospore binding to potato roots and susceptibility to S. subterranea.

4.
Acta Trop ; 239: 106837, 2023 Mar.
Article in English | MEDLINE | ID: mdl-36657506

ABSTRACT

Aedes aegypti is one of the most dominant mosquito species in the urban areas of Miami-Dade County, Florida, and is responsible for local arbovirus transmission. Since August 2016, mosquito traps have been placed throughout the county to improve surveillance and guide mosquito control and arbovirus outbreak response. In this paper, we develop a deterministic mosquito population model, estimate model parameters using local entomological and temperature data, and calibrate the model against mosquito trap data from 2017 to 2019. We further use the model to compare Ae. aegypti populations and evaluate the impact of rainfall intensity in different urban built environments. Our results show that rainfall affects the breeding sites and the abundance of Ae. aegypti more significantly in tourist areas than in residential places. In addition, we apply the model to quantitatively assess the effectiveness of vector control strategies in Miami-Dade County.
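
A minimal sketch of a deterministic mosquito model of this general kind is shown below: a two-stage (aquatic/adult) system whose breeding-site capacity depends on rainfall. The stage structure, rate constants and rainfall response are illustrative assumptions, not the authors' formulation or fitted parameters.

```python
# Simplified two-stage Aedes aegypti model (aquatic stage A, adults M) in the
# spirit of the deterministic model described above. All rate constants and
# the rainfall-dependent carrying capacity K are illustrative assumptions.
from scipy.integrate import solve_ivp

def mosquito_rhs(t, y, egg_rate, maturation, aquatic_death, adult_death, K):
    A, M = y
    dA = egg_rate * M * (1.0 - A / K) - (maturation + aquatic_death) * A
    dM = maturation * A - adult_death * M
    return [dA, dM]

def simulate(rainfall_mm, days=120):
    K = 500.0 + 20.0 * rainfall_mm   # hypothetical rainfall -> breeding-site link
    return solve_ivp(mosquito_rhs, (0.0, days), [100.0, 50.0],
                     args=(10.0, 0.1, 0.05, 0.07, K))

wet, dry = simulate(rainfall_mm=80.0), simulate(rainfall_mm=10.0)
print(f"adults after 120 days: wet {wet.y[1, -1]:.0f} vs dry {dry.y[1, -1]:.0f}")
```

Comparing the simulated adult population under wet and dry scenarios mirrors, in toy form, the paper's comparison of rainfall effects across built environments.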


Subject(s)
Aedes , Arboviruses , Animals , Mosquito Vectors , Mosquito Control/methods , Models, Theoretical , Cell Proliferation
5.
Arch Orthop Trauma Surg ; 143(7): 4411-4424, 2023 Jul.
Article in English | MEDLINE | ID: mdl-36462060

ABSTRACT

BACKGROUND: Extensor mechanism rupture is a severe complication with an incidence of 0.1-2.5% after total knee arthroplasty (TKA). Achilles tendon allograft (ATA) and extensor mechanism allograft (EMA) in TKA surgery have yielded mixed clinical results. Our systematic review aims to identify the proportion of failures in extensor mechanism reconstruction after TKA using allograft and to evaluate clinical and functional outcomes and the most common complications. Furthermore, we performed a meta-analysis of studies dealing with isolated patellar tendon ruptures to assess the failure rate, surgical complications, and clinical findings (extensor lag and knee range of motion) of extensor mechanism reconstruction using either ATA or EMA grafts. METHODS: A systematic review of the literature was performed following the PRISMA guidelines, including studies dealing with the use of EMA and ATA for extensor mechanism rupture following TKA. The Coleman Methodology Score and the MINORS score were used to assess the quality of the studies. A meta-analysis was performed to evaluate the failure rate, complications, and clinical findings (extensor lag and knee range of motion) of ATA and EMA treatments in isolated patellar tendon ruptures. RESULTS: A total of 238 patients (245 knees), with a mean age ranging from 54 to 74 years, who underwent extensor mechanism reconstruction with an allograft were identified in the 18 included studies. We analysed 166 patellar tendon ruptures, 29 quadriceps tendon ruptures, and 29 patellar fractures. A chronic injury was described in the majority of included cases. ATA and whole EMA were used in 89 patients (92 knees) and 149 patients (153 knees), respectively. The overall failure rate was 23% (23% for EMA and 24% for ATA). The most common complication was extensor lag (≥ 20°). The overall incidence of postoperative infection was 7%. Eleven of 14 included papers reported a mean postoperative knee flexion of more than 100°. The percentage of patients requiring walking aids was 55% and 34.5% for ATA and EMA, respectively. The failure rate after extensor mechanism reconstruction in isolated patellar tendon ruptures was 27%, with no statistical difference between EMA and ATA in failure rate or clinical outcomes. CONCLUSIONS: Extensor mechanism reconstruction with allograft represents a valid treatment option in patients with acute or chronic rupture following total knee arthroplasty. Persistent extensor lag is the most common complication. EMA is associated with a lower frequency of patients requiring walking aids at last follow-up, although it has clinical and functional outcomes similar to ATA. In patellar tendon ruptures, ATA has a success rate comparable to EMA. LEVEL OF EVIDENCE: Level IV, therapeutic study. TRIAL REGISTRATION: PROSPERO 2019 CRD42019141574.
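
For readers unfamiliar with pooling proportions, the sketch below shows one standard way such an overall failure rate can be obtained: fixed-effect, inverse-variance pooling on the logit scale. The per-study counts are placeholders only, since the abstract reports pooled figures rather than study-level data, and the paper's actual meta-analytic method may differ.

```python
# Fixed-effect inverse-variance pooling of failure proportions on the logit
# scale. The (failures, knees) tuples are placeholder values, not the actual
# per-study counts from the 18 included papers.
import math

studies = [(3, 14), (5, 22), (2, 9), (7, 30), (4, 17)]   # hypothetical

weights, weighted_logits = [], []
for failures, total in studies:
    p = failures / total
    logit = math.log(p / (1 - p))
    var = 1 / failures + 1 / (total - failures)   # approximate logit variance
    weights.append(1 / var)
    weighted_logits.append(logit / var)

pooled_logit = sum(weighted_logits) / sum(weights)
pooled_rate = 1 / (1 + math.exp(-pooled_logit))
print(f"pooled failure rate ≈ {pooled_rate:.1%}")
```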


Subject(s)
Achilles Tendon , Arthroplasty, Replacement, Knee , Knee Injuries , Patellar Ligament , Tendon Injuries , Humans , Middle Aged , Aged , Arthroplasty, Replacement, Knee/adverse effects , Patellar Ligament/surgery , Achilles Tendon/transplantation , Knee Joint/surgery , Tendon Injuries/surgery , Tendon Injuries/etiology , Knee Injuries/surgery , Rupture/surgery , Rupture/etiology , Range of Motion, Articular , Allografts/surgery , Treatment Outcome
6.
J Math Biol ; 85(1): 6, 2022 07 07.
Article in English | MEDLINE | ID: mdl-35796836

ABSTRACT

In this paper, we use an integrodifference equation model and pairwise invasion analysis to determine which dispersal strategies are evolutionarily stable strategies (ESS, also known as evolutionarily steady strategies) when there is spatial heterogeneity and possibly seasonal variation in habitat suitability. In that case there are both advantages and disadvantages to dispersing. We begin with the case where all spatial locations can support a viable population, and then consider the case where there are non-viable regions in the habitat. If the viable regions vary seasonally, and the viable regions in summer and winter do not overlap, dispersal may be essential for sustaining a population. Our findings generally align with previous findings in the literature based on other modeling frameworks, namely that dispersal strategies associated with ideal free distributions are evolutionarily stable. In the case where only part of the habitat can sustain a population, we show that a partial-occupation ideal free distribution, which occupies only the viable region, is associated with a dispersal strategy that is evolutionarily stable. As in some previous works, the proofs of these results make use of properties of line sum symmetric functions, which are analogous to those of line sum symmetric matrices but applied to integral operators.
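
A discretized caricature of an integrodifference model of this type is sketched below, with a Gaussian dispersal kernel, Beverton-Holt local growth and a habitat whose viable core is surrounded by non-viable edges; all of these choices are assumptions of the sketch rather than the specific model analysed in the paper.

```python
# One generation of an integrodifference model, N_{t+1}(x) = \int k(x, y) f(N_t(y), y) dy,
# discretized on a 1-D grid. Kernel, growth function and habitat profile are
# illustrative assumptions, not the model analysed in the paper.
import numpy as np

x = np.linspace(0.0, 10.0, 400)
dx = x[1] - x[0]
r = np.where((x > 3.0) & (x < 7.0), 2.0, 0.5)   # viable core, non-viable edges

def grow(n):
    return r * n / (1.0 + n)                    # Beverton-Holt local dynamics

def disperse(n, sigma=0.5):
    kernel = np.exp(-((x[:, None] - x[None, :]) ** 2) / (2.0 * sigma**2))
    kernel /= kernel.sum(axis=0, keepdims=True) * dx   # each source's offspring integrate to 1
    return kernel @ (n * dx)

n = np.full_like(x, 0.1)
for _ in range(200):
    n = disperse(grow(n))
print(f"total equilibrium population ≈ {n.sum() * dx:.2f}")
```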


Subject(s)
Biological Evolution , Models, Biological , Ecosystem , Population Dynamics , Seasons
7.
J Med Screen ; 29(4): 224-230, 2022 12.
Article in English | MEDLINE | ID: mdl-35578552

ABSTRACT

OBJECTIVE: Despite several interventions to increase participation in England, most colorectal cancers (CRCs) are diagnosed outside of the screening programme. The aims of this study were to better understand why most CRCs are diagnosed externally, the extent to which this is due to suboptimal uptake of screening, and the extent to which it is due to other factors, such as false-negative test results. SETTING / METHODS: We performed a clinical audit of 1011 patients diagnosed with CRC at St Mark's Hospital (Harrow, UK) between January 2017 and December 2020. Data on the diagnostic pathway and screening history of individuals were extracted from the bowel cancer screening system and assessed using descriptive statistics. RESULTS: 446/1011 (44.1%) patients diagnosed with CRC were eligible for screening at the time of diagnosis. Of these, only 115/446 (25.8%) were diagnosed through screening. Among those diagnosed via non-screening pathways, 210/331 (63.4%) had never taken part in screening, 31/331 (9.4%) had taken part but were not up to date, and 89/331 (26.9%) had taken part and were up-to-date (of these, 82/89 [92.2%] had received a normal or weak positive test result, and 5/89 [5.6%] had received a positive result and declined colonoscopy). CONCLUSION: Nearly two-thirds of screening eligible patients diagnosed through a non-screening pathway had never taken part in screening. This represents the single largest source of inefficiency within the screening programme, followed by missed findings and inconsistent participation. Given the improved outcomes associated with screen-detected cancers, there is a strong public health mandate to encourage participation.


Subject(s)
Colorectal Neoplasms , Early Detection of Cancer , Humans , Colonoscopy , Colorectal Neoplasms/diagnosis , Colorectal Neoplasms/epidemiology , Early Detection of Cancer/methods , Mass Screening/methods , Occult Blood , Retrospective Studies
8.
Am J Infect Control ; 50(12): 1333-1338, 2022 12.
Article in English | MEDLINE | ID: mdl-35131347

ABSTRACT

BACKGROUND: Ventilator-associated pneumonia (VAP) is considered the most common hospital-acquired infection in critical care settings and a leading cause of death in intensive care units (ICUs). The objective of this study was to assess whether specimen collection impacted diagnosis and whether implementation of a VAP bundle would decrease rates at our center. METHODS: This single-center study was a retrospective chart review of electronic medical records from 2017 to 2020. A pre-/post-intervention comparison was performed following implementation of a unit-wide VAP bundle and nursing education. Descriptive statistics were calculated; continuous variables were analyzed with independent-group t-tests, and categorical variables were analyzed with chi-squared tests. RESULTS: VAP rates decreased in the post-implementation period (20.8%, n = 74 vs 12.2%, n = 15; P = .03). There were no significant differences in the profile of patients who acquired VAP (i.e., males 79.7% vs 86.7%, blunt injuries 63.5% vs 86.7%, and severity scores 24.8 vs 25.1, pre- vs post-implementation, respectively; all P-values greater than .05). DISCUSSION/CONCLUSIONS: A reduction in VAP rates was achieved by implementing a standardized, evidence-based prevention protocol. Further research is warranted, as studies have noted that patients requiring mechanical ventilation are at greater risk for VAP than other ICU patients due to the nature of their injuries and the increased risk of prolonged mechanical ventilation (≥ 21 days).
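
The pre/post comparison of VAP rates can be illustrated with a two-proportion chi-squared test, as sketched below. The VAP counts (74 and 15) come from the abstract, but the denominators are back-calculated from the reported rates and are therefore approximations rather than figures stated in the paper.

```python
# Two-proportion comparison of VAP rates before vs after bundle roll-out.
# VAP counts (74 and 15) are from the abstract; the denominators are
# back-calculated from the reported rates (74/0.208 ≈ 356, 15/0.122 ≈ 123)
# and are therefore approximations.
from scipy.stats import chi2_contingency

pre_vap, pre_total = 74, 356
post_vap, post_total = 15, 123

table = [
    [pre_vap, pre_total - pre_vap],
    [post_vap, post_total - post_vap],
]
chi2, p, dof, _ = chi2_contingency(table)
print(f"pre {pre_vap/pre_total:.1%} vs post {post_vap/post_total:.1%}, "
      f"chi2 = {chi2:.2f}, p = {p:.3f}")
```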


Subject(s)
Pneumonia, Ventilator-Associated , Male , Humans , Pneumonia, Ventilator-Associated/prevention & control , Retrospective Studies , Intensive Care Units , Critical Care/methods , Documentation
9.
BMC Anesthesiol ; 22(1): 43, 2022 02 09.
Article in English | MEDLINE | ID: mdl-35139802

ABSTRACT

BACKGROUND: Loss of resistance (LOR) for epidural catheter placement has been utilized for almost a century. LOR is a subjective endpoint associated with a high failure rate. Nerve stimulation (NS) has been described as an objective method for confirming placement of an epidural catheter. We hypothesized that the addition of NS to LOR would improve the success of epidural catheter placement. METHODS: One hundred patients were randomized to thoracic epidural analgesia (TEA) utilizing LOR alone or loss of resistance plus nerve stimulation (LOR + NS). The primary endpoint was the rate of success, defined as loss of sensation following a test dose. Secondary endpoints included performance time. An intention-to-treat analysis was planned, but a per-protocol analysis was also performed to investigate the success rate when stimulation was achieved. RESULTS: In the intention-to-treat analysis there was no difference in success rates (90% vs 82% [LOR + NS vs LOR alone]; P = 0.39). Procedural time was longer in the LOR + NS group (33.9 ± 12.8 vs 24.0 ± 8.0 min; P < 0.001). The per-protocol analysis found a statistically higher success rate for the LOR + NS group than for the LOR-alone group (98% vs. 82%; P = 0.017) when only patients in whom stimulation was achieved were included. CONCLUSIONS: Addition of the NS technique did not statistically improve the success rate of epidural placement in the intention-to-treat analysis and was associated with a longer procedural time. In the per-protocol analysis, the statistically higher success rate among patients in whom stimulation was obtained highlights the potential benefit of adding NS to LOR. TRIAL REGISTRATION: ClinicalTrials.gov identifier NCT03087604, registered 3/22/2017; Institutional Review Board Wake Forest School of Medicine IRB00039522; Food and Drug Administration Investigational Device Exemption: G160273.
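
A sketch of the intention-to-treat comparison is shown below using Fisher's exact test. Equal arms of 50 patients are assumed from the 1:1 randomisation of 100 patients, and the success counts (45 vs 41) are back-calculated from the reported 90% and 82% rates, so both are approximations; the trial's own analysis may have used a different test.

```python
# Sketch of the intention-to-treat comparison of epidural success rates.
# Arm sizes of 50 are assumed from the randomisation of 100 patients; the
# success counts are back-calculated from the reported rates (approximate).
from scipy.stats import fisher_exact

lor_ns = {"success": 45, "failure": 5}
lor_alone = {"success": 41, "failure": 9}

table = [[lor_ns["success"], lor_ns["failure"]],
         [lor_alone["success"], lor_alone["failure"]]]
odds_ratio, p_value = fisher_exact(table)
print(f"ITT success 90% vs 82%, Fisher exact p = {p_value:.2f}")
```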


Subject(s)
Analgesia, Epidural/instrumentation , Analgesia, Epidural/methods , Electric Stimulation/methods , Epidural Space , Female , Humans , Male , Middle Aged , Prospective Studies , Single-Blind Method , Thoracic Vertebrae
10.
J Vasc Access ; 23(3): 348-352, 2022 May.
Article in English | MEDLINE | ID: mdl-33541202

ABSTRACT

BACKGROUND: Pandemics create challenges for medical centers and call for innovative adaptations: caring for patients during an unusually high census, distributing stress and work hours among providers, reducing the likelihood of transmission to health care workers, and maximizing resource utilization. METHODS: We describe the development of a multidisciplinary vascular access team created to improve frontline providers' workflow by placing central venous and arterial catheters. We report the development, organization, and processes that resulted in the rapid formation and deployment of this team, along with notable clinical issues encountered, which might serve as a basis for future quality improvement and investigation. This was a retrospective, single-center descriptive study in a large, quaternary academic medical center in a major city. The COVID-19 vascular access team included physicians with specialized experience in placing invasive catheters whose usual clinical schedules had been reduced through deferment of elective cases. The target population included patients with confirmed or suspected COVID-19 in the medical ICU (MICU) needing invasive catheter placement. The line team placed all invasive catheters on patients in the MICU with suspected or confirmed COVID-19. RESULTS AND CONCLUSIONS: Primary data collected were the number and type of catheters placed, the time of team member exposure to potentially infected patients, and any complications over the first three weeks. Secondary outcomes pertained to workflow enhancement and quality improvement. A total of 145 invasive catheters were placed in 67 patients. Of these 67 patients, 90% received arterial catheters, 64% central venous catheters, and 25% hemodialysis catheters. None of the central venous or hemodialysis catheters was associated with early complications. Arterial line malfunction due to thrombosis was the most frequent complication. Division of labor through specialized expert procedural teams is feasible during a pandemic and offloads frontline providers while potentially conferring safety benefits.


Subject(s)
COVID-19 , Catheterization, Central Venous , Central Venous Catheters , Catheterization, Central Venous/adverse effects , Catheterization, Central Venous/methods , Critical Illness , Humans , Pandemics , Retrospective Studies
11.
Gut ; 71(1): 25-33, 2022 01.
Article in English | MEDLINE | ID: mdl-33741641

ABSTRACT

OBJECTIVE: Although gastric per-oral endoscopic myotomy (G-POEM) is considered a promising technique for the management of refractory gastroparesis, high-quality evidence is limited. We prospectively investigated the efficacy and safety of G-POEM in unselected patients with refractory gastroparesis. DESIGN: In five tertiary centres, patients with symptomatic gastroparesis refractory to standard medical therapy and confirmed by impaired gastric emptying were included. The primary endpoint was clinical success at 12 months, defined as at least a one-point decrease in the Gastroparesis Cardinal Symptom Index (GCSI) score with a ≥25% decrease in two subscales. The GCSI score and subscales, adverse events (AEs) and the 36-Item Short Form quality-of-life questionnaire were evaluated at baseline and at 1, 3, 6 and 12 months after G-POEM. A gastric emptying study was performed before and 3 months after the procedure. RESULTS: Of 80 enrolled patients, 75 (94%) completed 12-month follow-up. Clinical success at 12 months was 56% (95% CI, 44.8 to 66.7). The GCSI score (including subscales) improved moderately after G-POEM (p<0.05). In a regression model, a baseline GCSI score >2.6 (OR=3.23, p=0.04) and baseline gastric retention >20% at 4 hours (OR=3.65, p=0.03) were independent predictors of clinical success at 12 months, as was early response to G-POEM at 1 month after therapy (OR 8.75, p<0.001). Mild procedure-related AEs occurred in 5 (6%) patients. CONCLUSION: G-POEM is a safe procedure but showed only modest overall effectiveness in the treatment of refractory gastroparesis. Further studies are required to identify the best candidates for G-POEM; unselective use of this procedure should be discouraged. TRIAL REGISTRATION NUMBER: ClinicalTrials.gov Registry NCT02732821.
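
The clinical-success definition can be expressed as a simple rule, as in the sketch below: at least a one-point drop in total GCSI score together with a ≥25% decrease in two (or more) subscales. The field names and example scores are illustrative, not trial data.

```python
# Hypothetical check of the clinical-success definition described above: a
# decrease of at least 1 point in total GCSI score plus a >=25% decrease in
# two (or more) of its subscales. Field names and values are illustrative.
def clinical_success(baseline: dict, month12: dict) -> bool:
    subscales = ("nausea_vomiting", "fullness_satiety", "bloating")
    total_drop = baseline["total"] - month12["total"] >= 1.0
    improved = sum(
        (baseline[s] - month12[s]) / baseline[s] >= 0.25
        for s in subscales if baseline[s] > 0
    )
    return total_drop and improved >= 2

print(clinical_success(
    {"total": 3.4, "nausea_vomiting": 3.0, "fullness_satiety": 3.8, "bloating": 3.5},
    {"total": 2.1, "nausea_vomiting": 2.0, "fullness_satiety": 2.6, "bloating": 3.3},
))
```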


Subject(s)
Gastroparesis/surgery , Pyloromyotomy , Female , Humans , Male , Middle Aged , Prospective Studies , Quality of Life
12.
Front Neurol ; 12: 679918, 2021.
Article in English | MEDLINE | ID: mdl-34456844

ABSTRACT

Objective: The aim of this study was to evaluate the evolution of globus pallidus internus (GPi) deep brain stimulation (DBS) targeting. Methods: This retrospective, single-center study included patients implanted with GPi DBS leads for dystonia or Parkinson's disease (PD) between 2004 and 2018 at the University of Florida Fixel Institute for Neurological Diseases. Each patient underwent a high-resolution targeting study on the day prior to surgery, which was fused with a high-resolution CT scan acquired on the day of the procedure. The intraoperative target location was selected using a digitized 3D Schaltenbrand-Bailey atlas. All patients underwent a high-resolution, non-contrast head CT scan approximately one month after lead implantation, and the neuroanatomical lead position was measured accurately after fusion of the pre-operative and post-operative imaging studies. Results: We analyzed 253 PD patients with 352 leads and 80 dystonia patients with 141 leads. Over the 15 years of follow-up, lead locations in the PD group migrated more laterally (β = 0.09, p < 0.0001), posteriorly (β = 0.04, p < 0.05), and dorsally (β = 0.07, p < 0.001), whereas leads in the dystonia group did not significantly change position aside from a trend in the dorsal direction (β = 0.06, p = 0.053). Conclusion: The evolving target likely results from multiple factors, including improvements in targeting techniques and in intraoperative and post-operative clinical feedback. Our findings demonstrate the potential importance of a systematic post-operative DBS lead measurement protocol to ensure quality control and to inform and optimize DBS programming.
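
The reported per-axis slopes (β) correspond to regressing each lead coordinate on implant year; a sketch of that calculation is shown below. The coordinate values are toy numbers invented solely to make the snippet runnable, since the study data are not public.

```python
# Sketch of the per-axis trend analysis: regress a post-operative lead
# coordinate on implant year and read off the slope (the β reported above).
# The arrays below are toy values, not the study's measurements.
import numpy as np
from scipy.stats import linregress

years = np.array([2004, 2006, 2008, 2010, 2012, 2014, 2016, 2018], dtype=float)
lateral_mm = np.array([19.8, 20.0, 20.1, 20.4, 20.6, 20.9, 21.0, 21.2])  # toy values

fit = linregress(years, lateral_mm)
print(f"lateral drift ≈ {fit.slope:.2f} mm/year (p = {fit.pvalue:.3f})")
```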

13.
J Burn Care Res ; 42(2): 182-185, 2021 03 04.
Article in English | MEDLINE | ID: mdl-33200770

ABSTRACT

The increasing trend of admissions due to recreational fires prompted a 5-year review. We performed a retrospective chart review of pediatric burn injuries from campfires or bonfires treated at a single medical center's burn unit. The study included children aged 0 to 15 years admitted or transferred from January 2012 to December 2016 with first-, second-, and/or third-degree burns from bonfires. These patients sustained burns from active fires as well as from contact with post-fire embers. There were 289 pediatric admissions, of which 66 (22.8%) were associated with recreational fires. The mean annual number of admissions for campfire or bonfire burns was 13 ± 0.98. The mean age was 4 ± 2.47 years. The gender distribution was 21 female and 45 male pediatric patients under the age of 15. From the available data, 8 (12%) of these burns occurred at home in the backyard and 16 (24%) at a public camp or park. The most common injury mechanism was direct contact with hot coals and embers (65%). Falls into open flame accounted for 23% (n = 15) of injuries, and flash flames accounted for 12% (n = 8). The presence of supervision was unknown in 56% of cases; lack of supervision was a documented factor in 14% of the study population. By gaining a better understanding of the type of injury, mechanism of injury, and demographics of recreational fire burn victims, policy and awareness campaigns were instituted in an effort to reduce the incidence of recreational fire burns.


Subject(s)
Accidents/statistics & numerical data , Burns/epidemiology , Burns/therapy , Camping/statistics & numerical data , Fires/statistics & numerical data , Adolescent , Burn Units , Child , Child, Preschool , Female , Foot Injuries/epidemiology , Foot Injuries/therapy , Hand Injuries/epidemiology , Hand Injuries/therapy , Humans , Length of Stay , Male , Retrospective Studies , Risk Factors
14.
Front Microbiol ; 11: 576646, 2020.
Article in English | MEDLINE | ID: mdl-33193192

ABSTRACT

Despite continued efforts to improve biosecurity protocols, Campylobacter continues to be detected in the majority of commercial chicken flocks across Europe. Using an extensive data set of Campylobacter prevalence within a chicken breeder flock collected over more than a year, we present multiple Bayesian models to explore the dynamics of the spread of Campylobacter in response to seasonal variation, species specificity, bird health, and total colonization prevalence. These models indicated that birds within the flock varied greatly in their response to bacterial challenge, and that this variation had a large impact on the overall prevalence of different Campylobacter species. Campylobacter jejuni appeared more frequently in the summer, while Campylobacter coli persisted for a longer duration, amplified by the most susceptible birds in the flock. Our study suggests that the strains of Campylobacter that appear most frequently likely possess no demographic advantage, but are instead amplified by the health status of the birds that ingest them.
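
As a much-reduced illustration of Bayesian prevalence estimation in this setting, the sketch below computes a Beta-Binomial posterior for within-flock prevalence in a single sampling week. The actual models were considerably richer (seasonal, species-specific, with bird-level susceptibility), and the counts and Beta(1, 1) prior here are assumptions.

```python
# Minimal Bayesian sketch: Beta-Binomial posterior for the within-flock
# prevalence of one Campylobacter species in one sampling week. The counts
# and the flat Beta(1, 1) prior are illustrative assumptions.
from scipy.stats import beta

positives, sampled = 14, 60                      # hypothetical weekly swab results
posterior = beta(1 + positives, 1 + sampled - positives)

lo, hi = posterior.ppf([0.025, 0.975])
print(f"posterior mean prevalence {posterior.mean():.2f}, 95% CrI [{lo:.2f}, {hi:.2f}]")
```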

15.
OTO Open ; 4(2): 2473974X20932506, 2020.
Article in English | MEDLINE | ID: mdl-32537556

ABSTRACT

OBJECTIVE: To review new devices and drugs relevant to otolaryngology-head and neck surgery that were approved by the US Food and Drug Administration (FDA) in 2019. DATA SOURCES: Approval notifications for 2019 were extracted from the ENT (ear, nose, and throat) and general and plastic surgery sections of the FDA's medical devices and therapeutics listings. REVIEW METHODS: New therapeutics and medical devices identified from the query were analyzed by members of the American Academy of Otolaryngology-Head and Neck Surgery's Medical Devices and Drugs Committee. Technologies were assessed by 2 independent reviewers to ascertain relevance to otolaryngology, prioritized, and classified to subspecialty field with critical review based on extant scientific literature. CONCLUSIONS: Query of the FDA drug and device database returned 105 ENT devices (50 cleared, 55 with premarket approval, and 0 de novo), 543 general and plastic surgery devices (372 cleared, 170 with premarket approval, and 1 de novo), and 46 new otolaryngology-relevant drug approvals that occurred in 2019. Advances spanned all subspecialty areas with otology predominating, primarily due to hearing-related technologies. While scientific evidence was available for all new devices, there was significant heterogeneity in rigor of supporting scientific data. IMPLICATIONS FOR PRACTICE: Technological and pharmaceutical innovation is an important catalyst for advances in the surgical specialties. Familiarity with new devices and therapeutics in otolaryngology-head and neck surgery ensures that clinicians keep abreast of developments with potential to improve prevailing standards of care.

16.
AoB Plants ; 12(2): plz048, 2020 Apr.
Article in English | MEDLINE | ID: mdl-32346468

ABSTRACT

Although dispersal is generally viewed as a crucial determinant for the fitness of any organism, our understanding of its role in the persistence and spread of plant populations remains incomplete. Generalizing and predicting dispersal processes are challenging due to context dependence of seed dispersal, environmental heterogeneity and interdependent processes occurring over multiple spatial and temporal scales. Current population models often use simple phenomenological descriptions of dispersal processes, limiting their ability to examine the role of population persistence and spread, especially under global change. To move seed dispersal ecology forward, we need to evaluate the impact of any single seed dispersal event within the full spatial and temporal context of a plant's life history and environmental variability that ultimately influences a population's ability to persist and spread. In this perspective, we provide guidance on integrating empirical and theoretical approaches that account for the context dependency of seed dispersal to improve our ability to generalize and predict the consequences of dispersal, and its anthropogenic alteration, across systems. We synthesize suitable theoretical frameworks for this work and discuss concepts, approaches and available data from diverse subdisciplines to help operationalize concepts, highlight recent breakthroughs across research areas and discuss ongoing challenges and open questions. We address knowledge gaps in the movement ecology of seeds and the integration of dispersal and demography that could benefit from such a synthesis. With an interdisciplinary perspective, we will be able to better understand how global change will impact seed dispersal processes, and potential cascading effects on plant population persistence, spread and biodiversity.

17.
J Math Biol ; 80(1-2): 3-37, 2020 01.
Article in English | MEDLINE | ID: mdl-30392106

ABSTRACT

We study the evolutionary stability of dispersal strategies, including but not limited to those that can produce ideal free population distributions (that is, distributions where all individuals have equal fitness and there is no net movement of individuals at equilibrium). The environment is assumed to be variable in space but constant in time. We assume that there is a separation of timescales, so that dispersal occurs on a fast timescale, evolution occurs on a slow timescale, and population dynamics and interactions occur on an intermediate timescale. Starting with advection-diffusion models for dispersal without population dynamics, we use the large-time limits of profiles for population distributions, together with the distribution of resources in the environment, to calculate growth and interaction coefficients in logistic and Lotka-Volterra ordinary differential equations describing population dynamics. We then use a pairwise invasibility analysis approach motivated by adaptive dynamics to study the evolutionary and/or convergence stability of strategies determined by various assumptions about the advection and diffusion terms in the original advection-diffusion dispersal models. Among other results, we find that those strategies which can produce an ideal free distribution are evolutionarily stable.
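
The logic of pairwise invasibility analysis can be illustrated with a toy two-patch, discrete-time stand-in for the advection-diffusion setting: a "strategy" is the fraction of the population settling in each patch, the resident is run to equilibrium, and a rare mutant invades if its growth rate there exceeds one. In this toy model the ideal-free split equalises fitness across patches, and the sketch reproduces the qualitative result that no strategy out-grows it. All model choices below are assumptions of the illustration, not the paper's models.

```python
# Toy pairwise-invasibility check: strategies are patch-allocation fractions,
# local growth is Beverton-Holt, and a rare mutant invades a resident at
# equilibrium if its per-capita growth rate exceeds one.
import numpy as np

r = np.array([3.0, 2.0])          # patch-specific intrinsic growth factors
a = np.array([1.0, 1.0])          # crowding coefficients

def per_capita_growth(density):
    return r / (1.0 + a * density)                    # Beverton-Holt

def resident_equilibrium(p, n0=1.0, steps=2000):
    n = n0
    for _ in range(steps):
        n = n * np.dot(p, per_capita_growth(p * n))   # settle as (p1, p2), grow, pool
    return n

def invasion_growth(mutant, resident):
    n_star = resident_equilibrium(resident)
    return float(np.dot(mutant, per_capita_growth(resident * n_star)))

ifd = (r - 1.0) / np.sum(r - 1.0)   # ideal-free split: equal fitness in both patches
print(invasion_growth(np.array([0.9, 0.1]), ifd))     # ≈ 1.0: no strategy out-grows the IFD
print(invasion_growth(ifd, np.array([0.5, 0.5])))     # > 1.0: the IFD strategy invades others
```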


Subject(s)
Biological Evolution , Models, Biological , Spatio-Temporal Analysis , Animals , Population Dynamics/statistics & numerical data , Population Dynamics/trends , Time Factors
18.
J Math Biol ; 80(1-2): 61-92, 2020 01.
Article in English | MEDLINE | ID: mdl-30783745

ABSTRACT

Many types of organisms disperse through heterogeneous environments as part of their life histories. For various models of dispersal, including reaction-advection-diffusion models in continuously varying environments, it has been shown by pairwise invasibility analysis that dispersal strategies which generate an ideal free distribution are evolutionarily steady strategies (ESS, also known as evolutionarily stable strategies) and are neighborhood invader strategies (NIS). That is, populations using such strategies can both invade and resist invasion by populations using strategies that do not produce an ideal free distribution. (The ideal free distribution arises from the assumption that organisms inhabiting heterogeneous environments should move to maximize their fitness, which allows a mathematical characterization in terms of fitness equalization.) Classical reaction diffusion models assume that landscapes vary continuously. Landscape ecologists consider landscapes as mosaics of patches where individuals can make movement decisions at sharp interfaces between patches of different quality. We use a recent formulation of reaction-diffusion systems in patchy landscapes to study dispersal strategies by using methods inspired by evolutionary game theory and adaptive dynamics. Specifically, we use a version of pairwise invasibility analysis to show that in patchy environments, the behavioral strategy for movement at boundaries between different patch types that generates an ideal free distribution is both globally evolutionarily steady (ESS) and is a global neighborhood invader strategy (NIS).


Subject(s)
Biological Evolution , Models, Biological , Adaptation, Physiological , Animals , Ecosystem , Game Theory , Movement , Population Dynamics
19.
AoB Plants ; 11(5): plz042, 2019 Oct.
Article in English | MEDLINE | ID: mdl-31579119

ABSTRACT

The distribution and abundance of plants across the world depends in part on their ability to move, which is commonly characterized by a dispersal kernel. For seeds, the total dispersal kernel (TDK) describes the combined influence of all primary, secondary and higher-order dispersal vectors on the overall dispersal kernel for a plant individual, population, species or community. Understanding the role of each vector within the TDK, and their combined influence on the TDK, is critically important for being able to predict plant responses to a changing biotic or abiotic environment. In addition, fully characterizing the TDK by including all vectors may affect predictions of population spread. Here, we review existing research on the TDK and discuss advances in empirical, conceptual modelling and statistical approaches that will facilitate broader application. The concept is simple, but few examples of well-characterized TDKs exist. We find that significant empirical challenges exist, as many studies do not account for all dispersal vectors (e.g. gravity, higher-order dispersal vectors), inadequately measure or estimate long-distance dispersal resulting from multiple vectors and/or neglect spatial heterogeneity and context dependence. Existing mathematical and conceptual modelling approaches and statistical methods allow fitting individual dispersal kernels and combining them to form a TDK; these will perform best if robust prior information is available. We recommend a modelling cycle to parameterize TDKs, where empirical data inform models, which in turn inform additional data collection. Finally, we recommend that the TDK concept be extended to account for not only where seeds land, but also how that location affects the likelihood of establishing and producing a reproductive adult, i.e. the total effective dispersal kernel.
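
One way to see how a TDK can be assembled from vector-specific kernels is sketched below: primary vectors contribute as a mixture weighted by the fraction of seeds they handle, while a secondary vector acting after a primary one contributes a convolution of the two kernels. The kernel shapes, weights and parameters are illustrative assumptions only.

```python
# Sketch of combining vector-specific kernels into a total dispersal kernel
# (TDK): a weighted mixture over primary pathways, with sequential (higher-
# order) dispersal represented as a convolution. All shapes, weights and
# parameters are illustrative assumptions.
import numpy as np

x = np.linspace(-200.0, 200.0, 2001)       # dispersal distance (m)
dx = x[1] - x[0]

def gaussian(x, sigma):
    return np.exp(-x**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))

gravity = gaussian(x, sigma=2.0)            # short-distance primary vector
bird = gaussian(x, sigma=60.0)              # long-distance primary vector
ant = gaussian(x, sigma=5.0)                # secondary vector after gravity

# Seeds dropped by gravity may then be moved by ants: convolve the kernels.
gravity_then_ant = np.convolve(gravity, ant, mode="same") * dx

# Mixture over pathways (weights are hypothetical seed-fate fractions).
tdk = 0.5 * gravity_then_ant + 0.3 * gravity + 0.2 * bird
print("TDK integrates to", round(tdk.sum() * dx, 3))
```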

20.
AoB Plants ; 11(3): plz020, 2019 Jun.
Article in English | MEDLINE | ID: mdl-31198528

ABSTRACT

When climatic or environmental conditions change, plant populations must either adapt to these new conditions, or track their niche via seed dispersal. Adaptation of plants to different abiotic environments has mostly been discussed with respect to physiological and demographic parameters that allow local persistence. However, rapid modifications in response to changing environmental conditions can also affect seed dispersal, both via plant traits and via their dispersal agents. Studying such changes empirically is challenging, due to the high variability in dispersal success resulting from environmental heterogeneity, and the substantial phenotypic variability of dispersal-related traits of seeds and their dispersers. The exact mechanisms that drive rapid changes are often not well understood, but the ecological implications of these processes are essential determinants of dispersal success, and deserve more attention from ecologists, especially in the context of adaptation to global change. We outline the evidence for rapid changes in seed dispersal traits by discussing variability due to plasticity or genetics broadly, and describe the specific traits and biological systems in which variability in dispersal is being studied, before discussing some of the potential underlying mechanisms. We then address future research needs and propose a simulation model that incorporates phenotypic plasticity in seed dispersal. We close with a call to action and encourage ecologists and biologists to embrace the challenge of better understanding rapid changes in seed dispersal and their consequences for the response of plant populations to global change.
