Results 1 - 20 of 72
1.
J Affect Disord ; 356: 137-144, 2024 Jul 01.
Article in English | MEDLINE | ID: mdl-38593941

ABSTRACT

BACKGROUND: This study aims to understand the mechanisms contributing to the elevated risk of depression among sexual minority older adults compared to heterosexuals. Specifically, the role of loneliness as a potential mediator is investigated to inform targeted interventions for preventing depression in sexual minority populations. METHODS: Data from the English Longitudinal Study of Ageing (ELSA), focusing on adults aged over 50, were analysed. Sexual orientation (sexual minority or heterosexual) and loneliness scores (UCLA scale) were assessed at wave six (2010-2011), while depressive symptoms (CES-D) were assessed at wave seven (2013-2014). Linear regression models and mediation analyses, using the g-computation formula and adjusted for confounders, were conducted. RESULTS: The sample included 6794 participants, with 478 (7.0%) identifying as sexual minorities. After adjustments, sexual minorities scored higher on depressive symptoms at wave seven (mean difference [MD]: 0.23, 95% CI 0.07 to 0.39) and on loneliness at wave six (MD: 0.27, 95% CI 0.08 to 0.46). Loneliness was positively associated with depressive symptoms (coefficient: 0.27, 95% CI 0.26 to 0.29). In mediation analyses, loneliness explained 15% of the association between sexual orientation and subsequent depressive symptoms. LIMITATIONS: The dataset captured sexual behaviour rather than sexual desire and identity, potentially skewing representation of sexual minorities. Additionally, transgender older adults were not included because of the limited gender diversity reported within the ELSA dataset. CONCLUSIONS: Loneliness appears to be a significant modifiable mechanism contributing to the heightened risk of depressive symptoms in sexual minority older adults compared with their heterosexual counterparts.


Subject(s)
Depression , Loneliness , Sexual and Gender Minorities , Humans , Loneliness/psychology , Male , Female , Aged , Depression/psychology , Depression/epidemiology , Prospective Studies , Sexual and Gender Minorities/psychology , Sexual and Gender Minorities/statistics & numerical data , Middle Aged , Longitudinal Studies , Sexual Behavior/psychology , Heterosexuality/psychology , Heterosexuality/statistics & numerical data , England , Aged, 80 and over
2.
Health Technol Assess ; 28(16): 1-93, 2024 Mar.
Article in English | MEDLINE | ID: mdl-38551135

ABSTRACT

Background: Guidelines for sepsis recommend treating those at highest risk within 1 hour. The emergency care system can only achieve this if sepsis is recognised and prioritised. Ambulance services can use prehospital early warning scores alongside paramedic diagnostic impression to prioritise patients for treatment or early assessment in the emergency department. Objectives: To determine the accuracy, impact and cost-effectiveness of using early warning scores alongside paramedic diagnostic impression to identify sepsis requiring urgent treatment. Design: Retrospective diagnostic cohort study and decision-analytic modelling of operational consequences and cost-effectiveness. Setting: Two ambulance services and four acute hospitals in England. Participants: Adults transported to hospital by emergency ambulance, excluding episodes with injury, mental health problems, cardiac arrest, direct transfer to specialist services, or no vital signs recorded. Interventions: Twenty-one early warning scores used alongside paramedic diagnostic impression, categorised as sepsis, infection, non-specific presentation, or other specific presentation. Main outcome measures: Proportion of cases prioritised at the four hospitals; diagnostic accuracy for the sepsis-3 definition of sepsis and receiving urgent treatment (primary reference standard); daily number of cases with and without sepsis prioritised at a large and a small hospital; the minimum treatment effect associated with prioritisation at which each strategy would be cost-effective, compared to no prioritisation, assuming willingness to pay £20,000 per quality-adjusted life-year gained. Results: Data from 95,022 episodes involving 71,204 patients across four hospitals showed that most early warning scores operating at their pre-specified thresholds would prioritise more than 10% of cases when applied to non-specific attendances or all attendances. 
Data from 12,870 episodes at one hospital identified 348 (2.7%) with the primary reference standard. The National Early Warning Score, version 2 (NEWS2), had the highest area under the receiver operating characteristic curve when applied only to patients with a paramedic diagnostic impression of sepsis or infection (0.756, 95% confidence interval 0.729 to 0.783) or sepsis alone (0.655, 95% confidence interval 0.63 to 0.68). None of the strategies provided high sensitivity (> 0.8) with acceptable positive predictive value (> 0.15). NEWS2 provided combinations of sensitivity and specificity that were similar or superior to all other early warning scores. Applying NEWS2 to paramedic diagnostic impression of sepsis or infection with thresholds of > 4, > 6 and > 8 respectively provided sensitivities and positive predictive values (95% confidence interval) of 0.522 (0.469 to 0.574) and 0.216 (0.189 to 0.245), 0.447 (0.395 to 0.499) and 0.274 (0.239 to 0.313), and 0.314 (0.268 to 0.365) and 0.333 (0.284 to 0.386). The mortality relative risk reduction from prioritisation at which each strategy would be cost-effective exceeded 0.975 for all strategies analysed. Limitations: We estimated accuracy using a sample of older patients at one hospital. Reliable evidence was not available to estimate the effectiveness of prioritisation in the decision-analytic modelling. Conclusions: No strategy is ideal, but using NEWS2 in patients with a paramedic diagnostic impression of infection or sepsis could identify one-third to half of sepsis cases without prioritising unmanageable numbers. No other score provided clearly superior accuracy to NEWS2. Research is needed to develop a better definition, diagnosis and treatments for sepsis. Study registration: This study is registered as Research Registry (reference: researchregistry5268). 
Funding: This award was funded by the National Institute for Health and Care Research (NIHR) Health Technology Assessment programme (NIHR award ref: 17/136/10) and is published in full in Health Technology Assessment; Vol. 28, No. 16. See the NIHR Funding and Awards website for further award information.
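The sensitivity and positive predictive value pairs reported for each NEWS2 threshold follow from a standard two-by-two classification against the reference standard. A minimal sketch of that calculation; the counts below are illustrative placeholders, not the study's data:

```python
# Sensitivity and positive predictive value (PPV) from 2x2 counts,
# as used to evaluate each NEWS2 threshold strategy.
# tp: flagged and sepsis; fn: sepsis but not flagged; fp: flagged without sepsis.

def sensitivity(tp: int, fn: int) -> float:
    """Proportion of true sepsis cases that the strategy flags."""
    return tp / (tp + fn)

def ppv(tp: int, fp: int) -> float:
    """Proportion of flagged cases that truly have sepsis."""
    return tp / (tp + fp)

# Illustrative counts only (not taken from the study):
tp, fn, fp = 182, 166, 660

print(round(sensitivity(tp, fn), 3))  # -> 0.523
print(round(ppv(tp, fp), 3))          # -> 0.216
```

Raising the threshold trades sensitivity for positive predictive value, which is why the > 4, > 6 and > 8 strategies move along that curve.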


Sepsis is a life-threatening condition in which an abnormal response to infection causes heart, lung or kidney failure. People with sepsis need urgent treatment. They need to be prioritised at the emergency department rather than waiting in the queue. Paramedics attempt to identify people with possible sepsis using an early warning score (based on simple measurements, such as blood pressure and heart rate) alongside their impression of the patient's diagnosis. They can then alert the hospital to assess the patient quickly. However, an inaccurate early warning score might miss cases of sepsis or unnecessarily prioritise people without sepsis. We aimed to measure how accurately early warning scores identified people with sepsis when used alongside paramedic diagnostic impression. We collected data from 71,204 people whom two ambulance services transported to four different hospitals in 2019. We recorded paramedic diagnostic impressions and calculated early warning scores for each patient. At one hospital, we linked ambulance records to hospital records and identified who had sepsis. We then calculated the accuracy of using the scores alongside diagnostic impression to diagnose sepsis. Finally, we used modelling to predict how many patients (with and without sepsis) paramedics would prioritise using different strategies based on early warning scores and diagnostic impression. We found that none of the currently available early warning scores were ideal. When they were applied to all patients, they prioritised too many people. When they were only applied to patients whom the paramedics thought had infection, they missed many cases of sepsis. The NEWS2 score, which ambulance services already use, was as good as or better than all the other scores we studied. We found that using the NEWS2 score in people with a paramedic impression of infection could achieve a reasonable balance between prioritising too many patients and avoiding missing patients with sepsis.


Subject(s)
Early Warning Score , Emergency Medical Services , Sepsis , Adult , Humans , Cost-Benefit Analysis , Retrospective Studies , Sepsis/diagnosis
3.
J Biomech ; 162: 111902, 2024 Jan.
Article in English | MEDLINE | ID: mdl-38103314

ABSTRACT

The uncontrolled manifold (UCM) analysis has gained broad application in biomechanics and neuroscience for investigating the structure of motor variability in functional tasks. The UCM uses inter-trial analysis to partition the variance of elemental variables (e.g., finger forces, joint angles) into components that affect (VORT) and do not affect (VUCM) a performance variable (e.g., total force, end-effector position). However, to facilitate the translation of UCM into clinical settings, it is crucial to demonstrate the reliability of the UCM estimates: VORT, VUCM, and their normalized difference, ΔV. This study aimed to determine the test-retest reliability of these estimates using the intraclass correlation coefficient (ICC3,K), Bland-Altman plots, the standard error of measurement (SEM), and the minimal detectable change (MDC). Fifteen healthy individuals (24.8 ± 1.2 years old) performed a finger coordination task, with sessions separated by one hour, one day, and one week. Excellent reliability was found for VORT (ICC3,K = 0.97) and VUCM (ICC3,K = 0.92), whereas good reliability was observed for ΔV (ICC3,K = 0.84). Bland-Altman plots revealed no systematic differences. SEM% values were 24.57%, 26.80% and 12.49% for VORT, VUCM and ΔV, respectively, while the normalized MDC% values were 68.12%, 74.30% and 34.61% for VORT, VUCM and ΔV, respectively. Our results support the use of UCM as a reliable method for investigating the structure of movement variability. The excellent measurement properties make the UCM a promising tool for tracking changes in motor behavior over time (i.e., effects of interventions in prospective studies).
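The SEM and MDC figures above are linked by standard reliability formulas: SEM = SD·√(1 − ICC) and MDC95 = 1.96·√2·SEM ≈ 2.77·SEM, each divided by the group mean to give SEM% and MDC% (consistent with the reported values, since 2.77 × 24.57% ≈ 68.1%). A minimal sketch; the mean, SD and ICC inputs below are hypothetical, not the study's raw data:

```python
import math

def sem(sd: float, icc: float) -> float:
    """Standard error of measurement from between-subject SD and ICC."""
    return sd * math.sqrt(1.0 - icc)

def mdc95(sem_value: float) -> float:
    """Minimal detectable change at 95% confidence for a test-retest design."""
    return 1.96 * math.sqrt(2.0) * sem_value

# Hypothetical inputs: group mean 1.0, between-subject SD 0.30, ICC 0.92
s = sem(0.30, 0.92)
print(round(100 * s / 1.0, 1))          # SEM% relative to the mean -> 8.5
print(round(100 * mdc95(s) / 1.0, 1))   # MDC% relative to the mean -> 23.5
```

Any observed change smaller than the MDC% is indistinguishable from measurement noise, which is why the lower MDC% for ΔV makes it the most sensitive of the three estimates for tracking change over time.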


Subject(s)
Fingers , Movement , Humans , Reproducibility of Results , Prospective Studies , Biomechanical Phenomena
5.
Emerg Med J ; 40(11): 768-776, 2023 Nov.
Article in English | MEDLINE | ID: mdl-37673643

ABSTRACT

BACKGROUND: Ambulance services need to identify and prioritise patients with sepsis for early hospital assessment. We aimed to determine the accuracy of early warning scores alongside paramedic diagnostic impression to identify sepsis that required urgent treatment. METHODS: We undertook a retrospective diagnostic cohort study involving adult emergency medical cases transported to Sheffield Teaching Hospitals ED by Yorkshire Ambulance Service in 2019. We used routine ambulance service data to calculate 21 early warning scores and categorise paramedic diagnostic impressions as sepsis, infection, non-specific presentation or other presentation. We linked cases to hospital records and identified those meeting the sepsis-3 definition who received urgent hospital treatment for sepsis (reference standard). Analysis determined the accuracy of strategies that combined early warning scores at varying thresholds for positivity with paramedic diagnostic impression. RESULTS: We linked 12 870/24 955 (51.6%) cases and identified 348/12 870 (2.7%) with a positive reference standard. None of the strategies provided sensitivity greater than 0.80 with positive predictive value greater than 0.15. The area under the receiver operating characteristic curve for the National Early Warning Score, version 2 (NEWS2) applied to patients with a diagnostic impression of sepsis or infection was 0.756 (95% CI 0.729, 0.783). No other early warning score provided clearly superior accuracy to NEWS2. Paramedic impression of sepsis or infection had sensitivity of 0.572 (0.519, 0.623) and positive predictive value of 0.156 (0.137, 0.176). NEWS2 thresholds of >4, >6 and >8 applied to patients with a diagnostic impression of sepsis or infection, respectively, provided sensitivities and positive predictive values of 0.522 (0.469, 0.574) and 0.216 (0.189, 0.245), 0.447 (0.395, 0.499) and 0.274 (0.239, 0.313), and 0.314 (0.268, 0.365) and 0.333 (0.284, 0.386). 
CONCLUSION: No strategy is ideal but using NEWS2 alongside paramedic diagnostic impression of infection or sepsis could identify one-third to half of sepsis cases without prioritising unmanageable numbers. No other score provided clearly superior accuracy to NEWS2. TRIAL REGISTRATION NUMBER: researchregistry5268, https://www.researchregistry.com/browse-the-registry%23home/registrationdetails/5de7bbd97ca5b50015041c33/.


Subject(s)
Early Warning Score , Emergency Medical Services , Sepsis , Humans , Adult , Cohort Studies , Retrospective Studies , ROC Curve , Sepsis/diagnosis , Hospital Mortality
7.
BJU Int ; 131(6): 755-762, 2023 06.
Article in English | MEDLINE | ID: mdl-36495480

ABSTRACT

OBJECTIVE: To identify clinicopathological or radiological factors that may predict a diagnosis of upper urinary tract urothelial cell carcinoma (UTUC) to inform which patients can proceed directly to radical nephroureterectomy (RNU) without the delay for diagnostic ureteroscopy (URS). PATIENTS AND METHODS: All consecutive patients investigated for suspected UTUC in a high-volume UK centre between 2011 and 2017 were identified through retrospective analysis of surgical logbooks and a prospectively maintained pathology database. Details on clinical presentation, radiological findings, and URS/RNU histopathology results were evaluated. Multivariate regression analysis was performed to evaluate predictors of a final diagnosis of UTUC. RESULTS: In all, 260 patients were investigated, of whom 230 (89.2%) underwent URS. RNU was performed in 131 patients (50.4%), of whom 25 (9.6%) proceeded directly without URS - all of whom had a final histopathological diagnosis of UTUC - and 15 (11.5%) underwent RNU after URS despite no conclusive histopathological confirmation of UTUC. Major surgery was avoided in 77 patients (33.5%) where a benign or alternative diagnosis was made on URS, and 14 patients (6.1%) underwent nephron-sparing surgery. Overall, 178 patients (68.5%) had a final diagnosis of UTUC confirmed on URS/RNU histopathology. On multivariate logistic regression analysis, a presenting complaint of visible haematuria (hazard ratio [HR] 5.17, confidence interval [CI] 1.91-14.0; P = 0.001), a solid lesion reported on imaging (HR 37.8, CI = 11.7-122.1; P < 0.001) and a history of smoking (HR 3.07, CI 1.35-6.97; P = 0.007), were predictive of a final diagnosis of UTUC. From this cohort, 51 (96.2%) of 53 smokers who presented with visible haematuria and who had a solid lesion on computed tomography urogram had UTUC on final histopathology. 
CONCLUSION: We identified specific factors that may help clinicians select patients who can reliably proceed to RNU without the delay of diagnostic URS. These findings may inform a prospective multicentre analysis including additional variables such as urinary cytology.


Subject(s)
Carcinoma, Transitional Cell , Kidney Neoplasms , Ureteral Neoplasms , Urinary Bladder Neoplasms , Humans , Carcinoma, Transitional Cell/diagnosis , Carcinoma, Transitional Cell/surgery , Ureteroscopy/methods , Hematuria/etiology , Retrospective Studies , Prospective Studies , Ureteral Neoplasms/diagnosis , Ureteral Neoplasms/surgery , Ureteral Neoplasms/pathology , Kidney Neoplasms/diagnosis , Kidney Neoplasms/surgery
8.
J Hazard Mater ; 443(Pt B): 130136, 2023 02 05.
Article in English | MEDLINE | ID: mdl-36444046

ABSTRACT

Manure can be a source of antibiotic resistance genes (ARGs) that enter the soil. However, previous studies assessing ARG persistence in soil have generally lacked continuity over sampling times, consistency of location, and assessment of the impact of discontinuing manure application. We evaluated both short- and long-term ARG accumulation dynamics in soil with a 40-year known history of manure use. Manure application caused a greater abundance of tetracycline, macrolide, and sulfonamide ARGs in the soil. There was an initial spike in ARG abundance resulting from manure bacteria harbouring ARGs being introduced to soil, followed by resident soil bacteria out-competing them, which led to ARG dissipation within a year. However, over four decades, annual manure application caused linear or exponential ARG accumulation, and the bacteria associated with ARGs differed from those observed in the short term. Eleven years after discontinuing manure application, most soil ARG levels had declined but remained elevated. We systematically explored the historical accumulation of ARGs in manured soil, and provide insight into factors that affect their persistence.


Subject(s)
Manure , Soil , Anti-Bacterial Agents/pharmacology , Drug Resistance, Microbial/genetics , Macrolides
9.
Sci Total Environ ; 852: 158402, 2022 Dec 15.
Article in English | MEDLINE | ID: mdl-36055500

ABSTRACT

In orchard systems, organic amendments and cover crops may enhance soil organic carbon (SOC) and total nitrogen (STN) stocks, but a comprehensive understanding of these practices at the global scale has been lacking. This study reports a worldwide meta-analysis of 131 peer-reviewed publications to quantify potential SOC and STN accumulation in orchard soils induced by organic fertilization and cover cropping. Annual gains of 3.73 Mg C/ha and 0.38 Mg N/ha were realized with the introduction of organic fertilizer, while cover crop management led to annual increases of 2.00 Mg C/ha and 0.20 Mg N/ha. The SOC and STN accumulation rates depended mostly on climatic conditions and initial SOC and STN content. The SOC and STN accumulated fastest during the first three years of cover crop implementation, at 2.98 Mg C/ha/yr and 0.25 Mg N/ha/yr, and declined thereafter. Organic fertilization caused significantly more annual SOC and STN accumulation at higher (400-800 mm) than lower (<400 mm) rainfall levels. When cover cropping continued for more than five years, SOC accumulated fastest with <800 mm of mean annual rainfall. Organic fertilization led to faster SOC accumulation with mean annual temperature between 15 and 20 °C than >20 °C. Organic amendments led to the slowest SOC accumulation rate when the initial SOC concentration was <10 g C/kg. This study provides policy makers and orchard managers science-based evidence to help guide adaptive management practices that build SOC stocks, improve soil conditions and enhance resilience of orchard systems to climate change.


Subject(s)
Carbon , Soil , Carbon/analysis , Nitrogen/analysis , Fertilizers/analysis , Agriculture , Carbon Sequestration , Fertilization
10.
Genes (Basel) ; 13(4)2022 03 24.
Article in English | MEDLINE | ID: mdl-35456385

ABSTRACT

Crop phenotyping experiments have long struggled to have a reliable control treatment that excludes frost and associated freezing damage to plants. Previous attempts used a barrier, such as a removable shelter or cloth, to exclude frost. However, these methods were labour intensive and varied in their effectiveness. An automated diesel heater was used to protect field plots of wheat (Triticum aestivum L.) from frost damage. In 2018 and 2019 there were 22 and 33 radiation frost events, respectively, from July to October at the field site. The heater maintained canopy air temperature above freezing (>0 °C) for the duration of the frost (~6-8 h). Heated plots had 2-3 °C warmer minimum canopy air temperatures. Cold and chilling damage was still present in heated plots and represented 20-30% floret sterility; freezing damage in non-heated plots accounted for an additional 10-30% floret sterility. Grain mapping revealed that grain set in the apical spikelets is most affected by frost damage, and that proximal florets (G1 and G2) contribute the most to grain yield, but distal florets (G3 and G4) are important contributors to grain yield when sterility in proximal florets occurs. These results demonstrate that a plot heater is a useful tool to study frost-induced freezing damage in cereal crops, by preventing freezing damage in heated field plots for direct comparison to naturally frosted plots. This approach could be used to develop improved damage functions for crop simulation models through a dose- and timing-response experiment for natural frost incidence on cereal crops in field plots.


Subject(s)
Infertility , Triticum , Edible Grain , Freezing , Temperature
11.
J Environ Manage ; 301: 113820, 2022 Jan 01.
Article in English | MEDLINE | ID: mdl-34583281

ABSTRACT

Soil salinization is a widespread problem affecting global food production. Phytoremediation is emerging as a viable and cost-effective technology to reclaim salt-affected soil. However, its efficiency is not clear due to the uncertainty of plant responses in saline soils. The main objective of this paper is to propose a phytoremediation dynamic model (PDM) for salt-affected soil within the process-based biogeochemical denitrification-decomposition (DNDC) model. The PDM represents two salinity processes of phytoremediation: plant salt uptake and salt-affected biomass growth. The salt-soil-plant interaction is simulated as a coupled mass balance equation of plant water and salt uptake. The salt-extraction ability of a plant combines salt uptake efficiency (F) and transpiration rate. For water-filled pore space (WFPS), the statistical measures RMSE, MAE, and R2 are 2.57, 2.14, and 0.49 during the calibration period, and 2.67, 2.34, and 0.56 during the validation period, respectively. For soil salinity, RMSE, MAE, and R2 are 0.02, 0.02, and 0.92 during the calibration period, and 0.06, 0.04, and 0.68 during the validation period, respectively, which are reasonably good for further scenario analysis. Over the four years, cumulative salt uptake varied based on weather conditions. At the optimal salt uptake efficiency (F = 20), cumulative salt uptake from soil was 16-90% for alfalfa, 11-70% for barley, and 10-80% for spring wheat. At the lowest salt uptake efficiency (F = 40), by contrast, cumulative salt uptake was nearly zero for all crops. Although barley has the highest peak transpiration flux, alfalfa and spring wheat have greater cumulative salt uptake because their peak transpiration fluxes occurred more frequently than in barley. For salt-tolerant crops, biomass growth depends on their threshold soil salinity, which determines their ability to take up salt without affecting biomass growth. To phytoremediate salt-affected soil, salt-tolerant crops with longer durations of crop physiological stages should be used, but their phytoremediation effectiveness will depend on weather conditions and the soil environment.


Subject(s)
Salinity , Soil , Biodegradation, Environmental , Crops, Agricultural , Denitrification , Water
12.
J Happiness Stud ; 23(5): 1887-1900, 2022.
Article in English | MEDLINE | ID: mdl-34840523

ABSTRACT

Although considerable research has examined the traits and features involved in living a good life (Baumeister et al. in J Posit Psychol 8(6):505-516, 2013; Ryan et al. in Self-determination theory: Basic psychological needs in motivation, development, and wellness, Guilford Press, 2006; Wong in Can Psychol/Psychol Can 52(2):69-81, 2011), little research has examined personal philosophies of the good life and the motivational outcomes associated with these views. Through a prospective longitudinal study across one academic year, we examined whether perceiving oneself to be living coherently with personal conceptions of the good life was associated with greater autonomous goal motivation and, subsequently, goal progress and greater subjective well-being (SWB) over time. We hypothesized that perceiving oneself as living coherently in terms of one's own philosophy of flourishing relates to greater volition, goal progress and happiness. Our results suggest that when individuals assess themselves as following their own philosophy of the good life, they tend to experience greater autonomous motivation, goal progress and SWB. Implications for personality coherence and Self-Determination Theory are discussed.

14.
J Environ Qual ; 50(6): 1452-1463, 2021 Nov.
Article in English | MEDLINE | ID: mdl-34331709

ABSTRACT

Supplementing beef cattle with 3-nitrooxypropanol (3-NOP) decreases enteric methane production, but it is unknown if fertilizing soil with 3-NOP manure influences soil health. We measured soil health indicators 2 yr after manure application to a bromegrass (Bromus L.) and alfalfa (Medicago sativa L.) mixed crop. Treatments were: composted conventional manure (without supplements); stockpiled conventional manure; composted manure from cattle supplemented with 3-NOP; stockpiled 3-NOP manure; composted manure from cattle supplemented with 3-NOP and monensin (3-NOP+Mon), a supplement that improves digestion; stockpiled 3-NOP+Mon manure; inorganic fertilizer (150 kg N ha-1 and 50 kg P ha-1); and an unamended control. Select chemical (K+, Mg2+, Mn+, Zn+, pH, and Olsen-P), biological (soil organic matter, active C, respiration, and extractable protein), physical (wet aggregate stability, bulk density, total porosity, and macro-, meso-, and micro-porosity), and hydraulic (saturation, field capacity, wilting point, water holding capacity, and hydraulic conductivity) variables were measured. The inclusion of monensin decreased soil Zn+ concentrations by 70% in stockpiled 3-NOP+Mon compared with stockpiled conventional manure. Active C and protein in composted conventional manure were 37 and 92% higher compared with stockpiled manure, respectively, but did not vary between 3-NOP treatments. 3-Nitrooxypropanol did not significantly alter other soil health indicators. Our results suggest that composted and stockpiled 3-NOP manure can be used as a nutrient source for forage crops without requiring changes to current manure management because it has minimal influence on soil health.


Subject(s)
Fertilizers , Manure , Animals , Cattle , Propanols , Soil
15.
Emerg Med J ; 38(8): 587-593, 2021 Aug.
Article in English | MEDLINE | ID: mdl-34083427

ABSTRACT

BACKGROUND: The WHO and National Institute for Health and Care Excellence recommend various triage tools to assist decision-making for patients with suspected COVID-19. We aimed to compare the accuracy of triage tools for predicting severe illness in adults presenting to the ED with suspected COVID-19. METHODS: We undertook a mixed prospective and retrospective observational cohort study in 70 EDs across the UK. We collected data from people attending with suspected COVID-19 and used presenting data to determine the results of assessment with the WHO algorithm, National Early Warning Score version 2 (NEWS2), CURB-65, CRB-65, Pandemic Modified Early Warning Score (PMEWS) and the swine flu adult hospital pathway (SFAHP). We used 30-day outcome data (death or receipt of respiratory, cardiovascular or renal support) to determine prognostic accuracy for adverse outcome. RESULTS: We analysed data from 20 891 adults, of whom 4611 (22.1%) died or received organ support (primary outcome), with 2058 (9.9%) receiving organ support and 2553 (12.2%) dying without organ support (secondary outcomes). C-statistics for the primary outcome were: CURB-65 0.75; CRB-65 0.70; PMEWS 0.77; NEWS2 (score) 0.77; NEWS2 (rule) 0.69; SFAHP (6-point rule) 0.70; SFAHP (7-point rule) 0.68; WHO algorithm 0.61. All triage tools showed worse prediction for receipt of organ support and better prediction for death without organ support. At the recommended threshold, PMEWS and the WHO criteria showed good sensitivity (0.97 and 0.95, respectively) at the expense of specificity (0.30 and 0.27, respectively). The NEWS2 score showed similar sensitivity (0.96) and specificity (0.28) when a lower threshold than recommended was used. CONCLUSION: CURB-65, PMEWS and the NEWS2 score provide good but not excellent prediction for adverse outcome in suspected COVID-19, and predicted death without organ support better than receipt of organ support. 
PMEWS, the WHO criteria and NEWS2 (using a lower threshold than usually recommended) provide good sensitivity at the expense of specificity. TRIAL REGISTRATION NUMBER: ISRCTN56149622.


Subject(s)
COVID-19/therapy , Emergency Service, Hospital , Pneumonia, Viral/therapy , Triage/methods , Aged , COVID-19/epidemiology , Early Warning Score , Female , Humans , Male , Middle Aged , Pandemics , Pneumonia, Viral/epidemiology , Pneumonia, Viral/virology , Predictive Value of Tests , Prognosis , Prospective Studies , Retrospective Studies , SARS-CoV-2 , United Kingdom
16.
Resuscitation ; 164: 130-138, 2021 07.
Article in English | MEDLINE | ID: mdl-33961960

ABSTRACT

AIMS: We aimed to describe the characteristics and outcomes of adults admitted to hospital with suspected COVID-19 according to their DNACPR decisions, and identify factors associated with DNACPR decisions. METHODS: We undertook a secondary analysis of 13,977 adults admitted to hospital with suspected COVID-19 and included in the Pandemic Respiratory Infection Emergency System Triage (PRIEST) study. We recorded presenting characteristics and outcomes (death or organ support) up to 30 days. We categorised patients as early DNACPR (before or on the day of admission) or late/no DNACPR (no DNACPR or occurring after the day of admission). We undertook descriptive analysis comparing these groups and multivariable analysis to identify independent predictors of early DNACPR. RESULTS: We excluded 1249 with missing DNACPR data, and identified 3929/12748 (31%) with an early DNACPR decision. They had higher mortality (40.7% v 13.1%) and lower use of any organ support (11.6% v 15.7%), but received a range of organ support interventions, with some being used at rates comparable to those with late or no DNACPR (e.g. non-invasive ventilation 4.4% v 3.5%). On multivariable analysis, older age (p < 0.001), active malignancy (p < 0.001), chronic lung disease (p < 0.001), limited performance status (p < 0.001), and abnormal physiological variables were associated with increased recording of early DNACPR. Asian ethnicity was associated with reduced recording of early DNACPR (p = 0.001). CONCLUSIONS: Early DNACPR decisions were associated with recognised predictors of adverse outcome, and were inversely associated with Asian ethnicity. Most people with an early DNACPR decision survived to 30 days and many received potentially life-saving interventions. REGISTRATION: ISRCTN registry, ISRCTN28342533, http://www.isrctn.com/ISRCTN28342533.


Subject(s)
COVID-19 , Cardiopulmonary Resuscitation , Adult , Aged , Clergy , Cohort Studies , Ethnicity , Humans , Pandemics , Resuscitation Orders , SARS-CoV-2 , Triage
17.
BMC Geriatr ; 21(1): 119, 2021 02 11.
Article in English | MEDLINE | ID: mdl-33573589

ABSTRACT

BACKGROUND: Understanding whether an intervention was delivered as intended, particularly for complex interventions, should be underpinned by good-quality fidelity assessment. We present the findings from a fidelity assessment embedded as part of a trial of a complex community-based psychosocial intervention, Journeying through Dementia (JtD). The intervention was designed to equip individuals with the knowledge and skills to successfully self-manage, maintain independence, and live well with dementia, and involves both group and individual sessions. The methodological challenges of developing a conceptual framework for fidelity assessment and creating and applying purposely designed measures derived from this framework are discussed to inform future studies. METHODS: A conceptual fidelity framework was created out of core components of the intervention (including the intervention manual and training for delivery), associated trial protocols and pre-defined fidelity standards and criteria against which intervention delivery and receipt could be measured. Fidelity data collection tools were designed and piloted for reliability and usability. Data collection in four selected sites (fidelity sites) was via non-participatory observations of the group aspect of the intervention, attendance registers and interventionist (facilitator and supervisor) self-report. RESULTS: Interventionists from all four fidelity sites attended intervention training. The majority of group participants at the four sites (71%) received the therapeutic dose of 10 out of 16 sessions. Weekly group meeting attendance (including at 'out of venue' sessions) was excellent at 80%. Additionally, all but one individual session was attended by the participants who completed the intervention. It proved feasible to create tools derived from the fidelity framework to assess in-venue group aspects of this complex intervention. 
Results of fidelity assessment of the observed groups were good, with substantial inter-rater reliability between researchers (kappa = 0.68, 95% CI 0.58-0.78). Self-report by interventionists concurred with researcher assessments. CONCLUSIONS: There was good fidelity to training and delivery of the group aspect of the intervention at four sites. However, the methodological challenges of assessing all aspects of this complex intervention could not be overcome due to practicalities, assessment methods, and ethical considerations. Questions remain regarding how we can assess fidelity in community-based complex interventions without impacting upon intervention or trial delivery. TRIAL REGISTRATION: ISRCTN17993825.
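The inter-rater reliability above is reported as Cohen's kappa, which corrects raw agreement for agreement expected by chance. As a minimal illustrative sketch using hypothetical fidelity ratings (not the study's data), the statistic can be computed as:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: observed agreement corrected for chance agreement."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Observed agreement: proportion of items both raters scored identically.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement: expected from each rater's marginal label frequencies.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    p_e = sum(freq_a[k] * freq_b.get(k, 0) for k in freq_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical ratings (1 = component delivered as intended, 0 = not)
a = [1, 1, 0, 1, 1, 0, 1, 1, 1, 0]
b = [1, 1, 0, 1, 0, 0, 1, 1, 1, 1]
print(round(cohens_kappa(a, b), 3))  # → 0.524
```

A kappa of 0.68, as reported here, falls in the range conventionally described as "substantial" agreement.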


Subject(s)
Dementia , Psychosocial Intervention , Dementia/diagnosis , Dementia/therapy , Humans , Reproducibility of Results , Self Report
18.
Clin Interv Aging ; 16: 231-244, 2021.
Article in English | MEDLINE | ID: mdl-33574660

ABSTRACT

OBJECTIVE: To identify the barriers and facilitators to the implementation of a complex psychosocial intervention through a study exploring the experiences of participants, carers and interventionists during a trial. METHODS: Individual semi-structured interviews were conducted with participants, their carers, and interventionists from a sample of recruiting sites that took part in the Journeying through Dementia randomized controlled trial (RCT). Interview data were transcribed and analysed using framework analysis. Co-researcher data analysis workshops were also conducted to explore researcher interpretations of the data through the lens of those with lived experience of dementia. Triangulation enabled comparison of findings from the interviews with findings from the co-researcher workshops. RESULTS: Three main themes emerged from the interview data: being prepared; intervention engagement; and participation and outcomes from engagement. From these themes, a number of factors that can moderate delivery and receipt of the intervention as intended were identified. These were: context and environment; readiness, training, skills and competencies of the workforce; and identifying meaningful participation and relationships. CONCLUSION: This study highlighted that the observed benefit of the intervention was nuanced for each individual. Mechanisms of change were influenced by a range of individual, social and contextual factors. Future research should therefore consider how best to identify and measure the multifaceted interplay of mechanisms of change in complex interventions. TRIAL REGISTRATION: ISRCTN17993825.


Subject(s)
Caregivers/psychology , Dementia/psychology , Dementia/therapy , Psychosocial Intervention/methods , Aged , Aged, 80 and over , Female , Humans , Interviews as Topic , Male , Qualitative Research
19.
PLoS One ; 16(1): e0245840, 2021.
Article in English | MEDLINE | ID: mdl-33481930

ABSTRACT

OBJECTIVES: We aimed to derive and validate a triage tool, based on clinical assessment alone, for predicting adverse outcome in acutely ill adults with suspected COVID-19 infection. METHODS: We undertook a mixed prospective and retrospective observational cohort study in 70 emergency departments across the United Kingdom (UK). We collected presenting data from 22445 people attending with suspected COVID-19 between 26 March 2020 and 28 May 2020. The primary outcome was death or organ support (respiratory, cardiovascular, or renal) by record review at 30 days. We split the cohort into derivation and validation sets, developed a clinical score based on the coefficients from multivariable analysis using the derivation set, and estimated the discriminant performance using the validation set. RESULTS: We analysed 11773 derivation and 9118 validation cases. Multivariable analysis identified that age, sex, respiratory rate, systolic blood pressure, oxygen saturation/inspired oxygen ratio, performance status, consciousness, history of renal impairment, and respiratory distress were retained in analyses restricted to ten or fewer predictors. We used findings from multivariable analysis and clinical judgement to develop a score based on the NEWS2 score, age, sex, and performance status. This had a c-statistic of 0.80 (95% confidence interval 0.79-0.81) in the validation cohort and predicted adverse outcome with sensitivity 0.98 (0.97-0.98) and specificity 0.34 (0.34-0.35) for scores above four points. CONCLUSION: A clinical score based on NEWS2, age, sex, and performance status predicts adverse outcome with good discrimination in adults with suspected COVID-19 and can be used to support decision-making in emergency care. REGISTRATION: ISRCTN registry, ISRCTN28342533, http://www.isrctn.com/ISRCTN28342533.
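The validation metrics above (sensitivity and specificity at a cutoff, plus the c-statistic measuring discrimination) can be illustrated with a short sketch. The scores and outcomes below are hypothetical, not the study's data; the paper's cutoff of "scores above four points" is used as the threshold:

```python
def sens_spec(scores, outcomes, threshold):
    """Sensitivity and specificity of the rule 'score > threshold predicts
    adverse outcome', given per-patient scores and true outcomes (1/0)."""
    tp = sum(1 for s, y in zip(scores, outcomes) if s > threshold and y == 1)
    fn = sum(1 for s, y in zip(scores, outcomes) if s <= threshold and y == 1)
    tn = sum(1 for s, y in zip(scores, outcomes) if s <= threshold and y == 0)
    fp = sum(1 for s, y in zip(scores, outcomes) if s > threshold and y == 0)
    return tp / (tp + fn), tn / (tn + fp)

def c_statistic(scores, outcomes):
    """Probability that a randomly chosen adverse-outcome patient scores
    higher than a randomly chosen non-adverse patient (ties count half)."""
    pos = [s for s, y in zip(scores, outcomes) if y == 1]
    neg = [s for s, y in zip(scores, outcomes) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical triage scores and 30-day outcomes (1 = adverse outcome)
scores = [2, 3, 5, 6, 7, 1, 4, 8, 5, 2]
outcomes = [0, 0, 1, 1, 1, 0, 0, 1, 0, 0]
sens, spec = sens_spec(scores, outcomes, 4)
```

The study's pattern (sensitivity 0.98, specificity 0.34) reflects a threshold chosen to miss very few true adverse outcomes at the cost of many false positives, a common trade-off in emergency triage.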


Subject(s)
COVID-19/diagnosis , Adult , Aged , Aged, 80 and over , COVID-19/epidemiology , COVID-19/pathology , Female , Humans , Male , Middle Aged , Prognosis , Prospective Studies , Retrospective Studies , Risk Assessment , SARS-CoV-2/isolation & purification , Severity of Illness Index , United Kingdom/epidemiology
20.
Eur Urol Focus ; 7(4): 835-842, 2021 Jul.
Article in English | MEDLINE | ID: mdl-32381397

ABSTRACT

BACKGROUND: Late relapse (LR) in testicular cancer is defined as disease recurrence more than 2 yr after primary treatment. Optimal management for this rare group is unknown. OBJECTIVE: To identify prognostic factors relevant to outcomes in a large LR series following primary treatment with platinum-based chemotherapy. DESIGN, SETTING, AND PARTICIPANTS: We performed a retrospective analysis of all patients treated for advanced testicular cancer within the Anglian Germ Cell Cancer Network between 1995 and 2016. We identified 53 cases of LR following initial treatment for metastatic disease with platinum-based chemotherapy, and collected data on patient and tumour characteristics, treatments, and outcomes. OUTCOME MEASUREMENTS AND STATISTICAL ANALYSIS: Progression-free survival (PFS) and overall survival (OS) were calculated for all patients. Survival curves were plotted according to the Kaplan-Meier method and univariate analysis of descriptive variables was performed using the log-rank method. RESULTS AND LIMITATIONS: Across the cohort, PFS at 36 mo was 41% and OS was 61%. Multiple factors were correlated with PFS. Use of dose-intense or high-dose chemotherapy was associated with better PFS compared to conventional-dose chemotherapy (PFS 48 vs 9.8 mo; p=0.0036). Resection of residual disease post-relapse chemotherapy was associated with better PFS (hazard ratio 3.46; p=0.0076). There was a nonsignificant trend towards worse PFS in very late (>7 yr) relapses. The study is limited by its retrospective nature and selection bias cannot be excluded. CONCLUSIONS: This study provides new insight into prognostic factors in LR. It confirms that surgery is critical to optimal outcomes, and suggests that dose-intense or high-dose chemotherapy in multisite nonresectable disease should be considered wherever feasible. PATIENT SUMMARY: We studied patients with testicular cancer that recurred at least 2 yr after initial treatment with chemotherapy. We found that patients who are able to have surgery to remove cancer and who have more intensive chemotherapy may be more likely to live longer.
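The survival curves in this analysis were plotted with the Kaplan-Meier method, which handles censored patients (those still progression-free at last follow-up). As a minimal sketch with made-up follow-up times (not the study's data), the product-limit estimator multiplies, at each event time, the fraction of at-risk patients who survive past that time:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimate.
    times: follow-up time (months) per patient.
    events: 1 = progression/death observed, 0 = censored at that time.
    Returns (time, S(t)) pairs at each event time."""
    data = sorted(zip(times, events))  # ties become adjacent after sorting
    n_at_risk = len(data)
    s = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        d = sum(e for tt, e in data if tt == t)   # events at time t
        c = sum(1 for tt, _ in data if tt == t)   # everyone leaving at t
        if d > 0:
            s *= (n_at_risk - d) / n_at_risk      # product-limit step
            curve.append((t, s))
        n_at_risk -= c
        i += c  # skip past the tied times
    return curve

# Hypothetical follow-up: events at 6, 12, 27, 32 mo; censored at 20, 40 mo
km = kaplan_meier([6, 12, 20, 27, 32, 40], [1, 1, 0, 1, 1, 0])
```

Note how the censored patient at 20 mo contributes to the at-risk denominator up to that point without producing a drop in the curve; the log-rank test used for the univariate comparisons then compares observed versus expected events between groups across these event times.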


Subject(s)
Neoplasms, Germ Cell and Embryonal , Testicular Neoplasms , Humans , Male , Neoplasm Recurrence, Local/drug therapy , Neoplasms, Germ Cell and Embryonal/drug therapy , Neoplasms, Germ Cell and Embryonal/surgery , Retrospective Studies , Testicular Neoplasms/drug therapy , Testicular Neoplasms/surgery