Results 1 - 20 of 248
1.
Phys Med Biol ; 2024 Sep 20.
Article in English | MEDLINE | ID: mdl-39303742

ABSTRACT

OBJECTIVE: To develop a 2D MR acceleration method utilizing Principal Component Analysis (PCA) in a hybrid fashion for rapid real-time applications. APPROACH: Retrospective testing was performed on 10 lung, 10 liver and 10 prostate 3T MRI data sets for image quality and target contourability. K-space is sampled by acquiring the central (low-frequency) data in every frame, while the high-frequency data are incoherently undersampled such that all of k-space is acquired in a pre-determined number of frames. First, principal components (PCs) representing intra-frame correlations between central and outer k-space data are used to estimate unsampled data in the frame of interest. Then, to add further stability, PCs representing time-domain fluctuations within a reconstruction window of the most recent frames are fit to outer k-space data (including the above estimates) to obtain final estimates for the frame of interest. Accelerated reconstructions between 3x and 8x were tested for image quality and contourability, along with the optimal number of PCs for fitting. MAIN RESULTS: Image quality did not deteriorate significantly at higher acceleration rates, and the images were of sufficient quality to contour a target using auto-contouring software at all tested acceleration rates and sites. SSIM values were ≥ 0.91 at all accelerations tested, and Dice coefficients at the different sites were ≥ 0.89 even at 8x acceleration, which is on par with or better than intra-observer variation. SIGNIFICANCE: This method appears to produce improved image quality and contourability compared to previous PCA methods while also allowing a greater number of PCs to be used in reconstruction. The method can be run using a simple single-channel coil and does not require significant computing power to meet real-time interventional standards (reconstruction times ~50 ms/frame on an Intel i5 CPU).
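The reconstruction described above lends itself to a compact illustration. The sketch below (Python/NumPy) is not the authors' implementation; it only shows the core idea of the intra-frame step: principal components learned from frames in which k-space is fully known are least-squares fit to the always-acquired central samples of a new frame to estimate its unsampled outer k-space. All names, shapes, and the training setup are illustrative assumptions.

```python
import numpy as np

# Illustrative sketch of PCA-based k-space completion (assumed shapes and names,
# not the paper's code). Each training frame is a flattened complex k-space vector.

def fit_pca(training_frames, n_components):
    """Learn principal components from frames whose k-space is fully known.

    training_frames: (n_frames, n_kspace) complex array.
    Returns the mean frame and the top n_components PCs.
    """
    mean = training_frames.mean(axis=0)
    centered = training_frames - mean
    # SVD of the centered data; rows of vh are the principal components.
    _, _, vh = np.linalg.svd(centered, full_matrices=False)
    return mean, vh[:n_components]

def estimate_frame(sampled_values, sampled_idx, mean, pcs):
    """Fit PC weights to the acquired (central) samples of a new frame,
    then reconstruct the full k-space frame from those weights."""
    a = pcs[:, sampled_idx].T                      # (n_sampled, n_pcs)
    b = sampled_values - mean[sampled_idx]         # centered measurements
    weights, *_ = np.linalg.lstsq(a, b, rcond=None)
    full = mean + weights @ pcs                    # estimate all of k-space
    full[sampled_idx] = sampled_values             # keep acquired data exactly
    return full
```

In the paper's hybrid scheme a second, temporal PCA fit over a sliding window of recent frames then refines these estimates; that step is omitted from this sketch.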

2.
Anat Rec (Hoboken) ; 2024 Aug 05.
Article in English | MEDLINE | ID: mdl-39099296

ABSTRACT

Turbinals are key bony elements of the mammalian nasal cavity, involved in heat and moisture conservation as well as olfaction. While turbinals are well known in some groups, their diversity is poorly understood at the scale of placental mammals, which span 21 orders. Here, we investigated the turbinal bones and associated lamellae for one representative of each extant order of placental mammals. We segmented and isolated each independent turbinal and lamella and found considerable variation in the number, size, and shape of turbinals. Turbinal count varies widely, from zero in the La Plata dolphin (Pontoporia blainvillei) to about 110 in the African bush elephant (Loxodonta africana). Multiple turbinal losses and additional gains took place along the phylogeny of placental mammals. Some changes are clearly attributable to ecological adaptation, while others are probably related to phylogenetic inertia. In addition, this work highlights the problem of turbinal nomenclature in some placental orders with numerous and highly complex turbinals, for which homologies are extremely difficult to resolve. This work therefore underscores the importance of developmental studies to better clarify turbinal homology and nomenclature and provides a standardized comparative framework for further research.

3.
J Stroke Cerebrovasc Dis ; 33(11): 107968, 2024 Aug 29.
Article in English | MEDLINE | ID: mdl-39214434

ABSTRACT

BACKGROUND: In-hospital falls are frequent post-stroke medical complications and will remain a concern because it may not be possible to prevent all of them. We aimed to i) compare admission clinical and sociodemographic characteristics between fallers and non-fallers, ii) determine fall characteristics, and iii) compare length of stay (LOS), discharge functional independence, ambulation and destination between fallers and non-fallers. METHODS: A matched case-control study comparing individuals (n = 302) who fell during inpatient post-acute rehabilitation, matched (on time to admission, age and motor Functional Independence Measure (mFIM)) to individuals (n = 302) who did not fall, all admitted within 3 months post-injury to a single center between 2008 and 2023. Ambulation was assessed using the Functional Ambulation Category (FAC). RESULTS: Mean age at admission was 50±8 years. No baseline differences were seen between groups in the proportion of patients with aphasia, diabetes, dyslipidemia, hypertension, neglect, atrial fibrillation, dysphagia, dominant side affected, medication for depression, FAC assessment, body mass index or educational level. A first fall occurred in the first week in 22.2% of fallers and within the first three weeks in 54.3%. Most falls occurred in the patients' rooms (75.1%), mostly due to distractions (55.3%) and transferring without help (32.4%); 18% occurred in the bathroom, and fallers were alone in 68.6% of cases. Fallers had an 8-day longer mean LOS compared to non-fallers, yet there were no differences in discharge mFIM or FAC scores. While non-fallers had a higher proportion of poor mFIM outcomes (28.5% vs. 17.9%) and no ambulation (20.7% vs. 12.4%), fallers showed greater mFIM gains (26 vs. 22 points). Discharge destinations were similar across both groups. CONCLUSIONS: Despite no baseline differences, fallers experienced longer stays with comparable independence and ambulation scores at discharge. Most falls occurred in patients' rooms during unsupervised activities. Preventive recommendations have been provided to address these risks and enhance patient safety.

4.
Sensors (Basel) ; 24(14)2024 Jul 19.
Article in English | MEDLINE | ID: mdl-39066092

ABSTRACT

(1) Background: Restoring arm and hand function is one of the priorities of people with cervical spinal cord injury (cSCI). Noninvasive electromagnetic neuromodulation is a current approach that aims to improve upper-limb function in individuals with SCI. The aim of this study is to review updated information on the different applications of noninvasive electromagnetic neuromodulation techniques that focus on restoring upper-limb functionality and motor function in people with cSCI. (2) Methods: The Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines were used to structure the search protocol. A systematic review of the literature was performed in three databases: the Cochrane Library, PubMed, and the Physiotherapy Evidence Database (PEDro). (3) Results: Twenty-five studies were included: four on transcranial magnetic stimulation (TMS), four on transcranial direct current stimulation (tDCS), two on transcutaneous spinal cord stimulation (tSCS), ten on functional electrical stimulation (FES), four on transcutaneous electrical nerve stimulation (TENS), and one on neuromuscular stimulation (NMS). A meta-analysis could not be completed due to a lack of common motor or functional evaluations. We therefore conducted a narrative review of the results, which indicated that noninvasive electromagnetic neuromodulation combined with rehabilitation at the cerebral or spinal cord level significantly improved upper-limb functionality and motor function in cSCI subjects. Results were significant compared with the control group when tSCS, FES, TENS, and NMS were applied. (4) Conclusions: To perform a meta-analysis and contribute to more evidence, randomized controlled trials with standardized outcome measures for the upper extremities in cSCI are needed, even though significant improvement was reported in each noninvasive electromagnetic neuromodulation study.


Subject(s)
Spinal Cord Injuries , Transcranial Magnetic Stimulation , Upper Extremity , Humans , Spinal Cord Injuries/physiopathology , Spinal Cord Injuries/rehabilitation , Spinal Cord Injuries/therapy , Upper Extremity/physiopathology , Transcranial Magnetic Stimulation/methods , Peripheral Nervous System/physiopathology , Central Nervous System/physiopathology , Central Nervous System/radiation effects , Central Nervous System/physiology , Transcutaneous Electric Nerve Stimulation/methods , Transcranial Direct Current Stimulation/methods , Cervical Cord/injuries
5.
Environ Entomol ; 2024 Jul 02.
Article in English | MEDLINE | ID: mdl-38956828

ABSTRACT

The twolined spittlebug, Prosapia bicincta (Say), is a major economic pest of forage grass and turfgrass. Prosapia bicincta was first detected in rangelands on Hawai'i Island in 2016 and has since spread to an estimated 72,000 ha in the North and South Kona districts. This study aimed to quantify P. bicincta abundance, plant associations, and impacts on groundcover over time. Monthly surveys of P. bicincta nymphs and adults were conducted from February 2018 to September 2022 along 17 established 100-m transects at 4 ranches located in Kona, Hawai'i Island, spanning an elevation gradient from 519 to 1,874 m above sea level (a.s.l.). Monitoring revealed that P. bicincta occurs from 519 to 1,679 m a.s.l., primarily in Kikuyu grass (Cenchrus clandestinus (Hochst. ex Chiov.) Morrone; Poales: Poaceae) pastures. Peaks in P. bicincta abundance coincided with the wet season, with most activity occurring from April to October and little to no activity between November and March. Mid-elevation (1,000-1,300 m) transects had significantly higher mean P. bicincta abundance (126 nymphs/m2) relative to low (500-999 m; 64 nymphs/m2) and high elevations (>1,300 m; 20 nymphs/m2). Sites with the highest abundance of P. bicincta also showed the greatest decrease in mean grass cover (30%), with grass replaced by forbs, bare ground, and shrubs. Grasses accounted for 72% of total P. bicincta detections, with the remaining detections on legumes (16%), sedges (6%), and forbs (6%). Twenty new P. bicincta plant associations were found. This information will help improve the effectiveness of management to suppress populations below economic thresholds.

6.
Front Immunol ; 15: 1395427, 2024.
Article in English | MEDLINE | ID: mdl-39007135

ABSTRACT

Systemic lupus erythematosus (SLE, lupus) is a debilitating, multisystem autoimmune disease that can affect any organ in the body. The disease is characterized by circulating autoantibodies that accumulate in organs and tissues, triggering an inflammatory response that can cause permanent damage leading to significant morbidity and mortality. Lyn, a member of the Src family of non-receptor protein tyrosine kinases, is strongly implicated in SLE: remarkably, mice either lacking Lyn or expressing a gain-of-function mutation in Lyn develop spontaneous lupus-like disease due to altered signaling in B lymphocytes and myeloid cells, suggesting that its expression or activation state plays a critical role in maintaining tolerance. The past 30 years of research have begun to elucidate the role of Lyn in a dual signaling network of activating and inhibitory immunoreceptors and related targets, including interactions with the interferon regulatory factor family in the toll-like receptor pathway. Gain-of-function mutations in Lyn have now been identified in human cases and, like the mouse models, cause severe systemic autoinflammation. Studies of Lyn in SLE patients have presented mixed findings, which may reflect the heterogeneity of disease processes in SLE, with impairment or enhancement of Lyn function in different patient subsets offering a possible means of stratification. In this review, we present an overview of the phosphorylation and protein-binding targets of Lyn in B lymphocytes and myeloid cells, highlighting the structural domains of the protein that are involved in its function, and provide an update on studies of Lyn in SLE patients.


Subject(s)
Lupus Erythematosus, Systemic , Signal Transduction , src-Family Kinases , Lupus Erythematosus, Systemic/immunology , Lupus Erythematosus, Systemic/genetics , src-Family Kinases/metabolism , src-Family Kinases/genetics , Humans , Animals , B-Lymphocytes/immunology , Mice
7.
Eur Spine J ; 2024 Jun 09.
Article in English | MEDLINE | ID: mdl-38852115

ABSTRACT

PURPOSE: Existing literature on pediatric traumatic spinal cord injury (PTSCI) demonstrates large worldwide variations in characteristics, incidence, time periods and etiology. Epidemiological studies addressing injuries to the total spine conducted in Southern European regions are remarkably scarce; we therefore aimed to investigate long-term trends in etiology, fracture location and type, single or multiple fractures, associated lesions and neurological status in Catalonia, Spain. METHODS: We conducted a retrospective observational study. We analyzed post-acute patients after PTSCI, aged 0-17, admitted with neurological deficits between 1986 and 2022 to a specialized hospital in Catalonia. Neurological deficits were assessed using the American Spinal Injury Association Impairment Scale (AIS). RESULTS: Two hundred and forty-nine children were included: 174 (69.9%) boys and 75 (30.1%) girls; mean age was 13.9 years (range, 2 months to 17 years). Two hundred and four children (82%) had ≥ 1 spinal fractures, 66 (26.5%) had dislocations and 8 (3.2%) had SCIWORA. Fractures were multilevel contiguous in 108 (43.4%) cases. Fracture types comprised 81 vertebral compression fractures (32.5%), 22 burst fractures (8.8%), 7 odontoid fractures (2.8%) and 4 teardrop fractures (1.6%). There were ≥ 1 associated lesions in 112 cases (45%): in the limbs in 23 cases (9.2%), the thorax or abdomen in 59 (23.7%) and the skull or face in 81 (32.5%). In 44 cases (39% of the 112) there were multiple lesions. Locations comprised the cervical spine in 105 cases (42%), thoracic spine in 124 (49%), lumbar spine in 18 (7%), and sacrum in 2 (0.8%). Road traffic accidents (RTAs) were the main etiology (62.2%) over the whole period; however, from 2016 onwards, RTAs dropped below the rate of falls and sports injuries. The most common sites of injury in those aged 9 years or older were the cervical (41.1%) and thoracic (50.7%) regions. Those aged 8 or under were far more likely to sustain a complete SCI (80.0%) or an accompanying traumatic brain injury (45.0%), likely due to higher numbers of pedestrian-versus-car RTAs. A significant peak in cases during 2006-2010 (20.1%) was identified, followed by a marked drop during 2011-2015 (8.8%). A clear shift in trend is observed between 2016 and 2022 regarding age at injury (an increase in those aged 9 years or older), etiology (an increase in falls and sports injuries versus RTAs), AIS grade (an increase in incomplete lesions, AIS B-D, versus AIS A), severity (an increase in tetraplegia versus paraplegia) and location (an increase in cervical versus lumbar and thoracic injuries). CONCLUSIONS: A shift in trend is observed in the past 7 years regarding age at injury (an increase in those older than 9), etiology (an increase in falls and sports injuries versus RTAs), AIS grade (an increase in incomplete lesions, AIS B-D, versus AIS A), severity (an increase in tetraplegia versus paraplegia) and location (an increase in cervical injuries). LEVEL OF EVIDENCE: IV.

8.
Article in English | MEDLINE | ID: mdl-38724224

ABSTRACT

Advanced cirrhosis confers a significant symptom burden and has a 50% 2-year mortality rate in those with decompensated disease. There is increasing demand for supportive and palliative care (SAPC) for these patients, yet no consensus on the best model of delivery. It is necessary to identify the needs of such patients and their carers, and to evaluate whether they are being met. A literature search was conducted using key words pertaining to adult patients with liver cirrhosis and their SAPC needs. Study quality was assessed and findings grouped by theme. Fifty-one full texts were selected for inclusion: 8 qualitative studies, 33 quantitative studies, 7 systematic reviews, 2 mixed-methods studies and 1 Delphi study. Key findings were grouped into three main themes: SAPC needs, access to SAPC and models of care. Patients with cirrhosis have a significant psychological and physical symptom burden, with many unmet needs. These data failed to identify the best service model of care. The impact of specialist palliative care (SPC) referral was limited by small numbers and late referrals. With the majority of studies conducted in the USA, it is unclear how well these findings translate to other healthcare systems. Comparison between hepatology-led services and SPC was limited by inconsistent outcome measures, which prevented pooling of data sets. These data also included limited evaluation of patient-reported outcome measures. We propose the development of a core outcome set to ensure consistent and meaningful evaluation of the SAPC needs of patients with advanced non-malignant liver cirrhosis.

9.
NeuroRehabilitation ; 54(3): 457-472, 2024.
Article in English | MEDLINE | ID: mdl-38640178

ABSTRACT

BACKGROUND: Most studies focus on the risk factors associated with the development of pressure ulcers (PUs) during the acute phase or community care for individuals with spinal cord injury (SCI). OBJECTIVES: This study aimed to i) compare clinical and demographic characteristics of inpatients after SCI with PUs acquired during rehabilitation vs inpatients without PUs, ii) evaluate an existing PU risk assessment tool, and iii) identify predictors of a first PU. METHODS: Individuals (n = 1,135) admitted between 2008 and 2022 to a rehabilitation institution within 60 days after SCI were included. Admission Functional Independence Measure (FIM), American Spinal Injury Association Impairment Scale (AIS) and mEntal state, Mobility, Incontinence, Nutrition, Activity (EMINA) scores were assessed. Kaplan-Meier curves and Cox proportional hazards models were fitted. RESULTS: Overall incidence of PUs was 8.9%. Of these, 40.6% occurred in the first 30 days, 47.5% were sacral, and 66.3% were Stage II. Patients with PUs were older and more often had traumatic injuries (67.3%), AIS grade A (54.5%), lower motor FIM (mFIM) scores and mechanical ventilation. We identified specific mFIM items that increase EMINA specificity. The adjusted Cox model identified male sex, age at injury, AIS grade, mFIM and diabetes as PU predictors (C-index = 0.749). CONCLUSION: Inpatients can benefit from combined assessments (EMINA + mFIM) and clinical features scarcely addressed in previous studies to prevent PUs.
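As a rough illustration of the survival modelling reported above, the sketch below fits a Cox proportional hazards model with the same kinds of predictors (sex, age at injury, AIS grade, motor FIM, diabetes) using the lifelines library on simulated data. The column names and data are hypothetical placeholders; the study's actual modelling choices are not reproduced here.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

# Simulated stand-in data (not the study cohort); columns are illustrative.
rng = np.random.default_rng(0)
n = 300
df = pd.DataFrame({
    "male": rng.integers(0, 2, n),
    "age_at_injury": rng.normal(50, 15, n),
    "ais_a": rng.integers(0, 2, n),            # AIS grade A vs other, as a 0/1 flag
    "mfim_admission": rng.normal(40, 12, n),
    "diabetes": rng.integers(0, 2, n),
})
# Simulated time to first pressure ulcer, censored at 120 days of follow-up.
risk = 0.02 * df["age_at_injury"] + 0.8 * df["ais_a"] - 0.03 * df["mfim_admission"]
time = rng.exponential(scale=60 / np.exp(risk - risk.mean()))
df["days"] = np.minimum(time, 120)
df["pu_event"] = (time <= 120).astype(int)

cph = CoxPHFitter()
cph.fit(df, duration_col="days", event_col="pu_event")
cph.print_summary()                          # hazard ratios per predictor
print("C-index:", cph.concordance_index_)    # discrimination, analogous to the reported 0.749
```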


Subject(s)
Inpatients , Pressure Ulcer , Spinal Cord Injuries , Humans , Spinal Cord Injuries/rehabilitation , Spinal Cord Injuries/complications , Pressure Ulcer/etiology , Male , Female , Middle Aged , Adult , Inpatients/statistics & numerical data , Aged , Risk Factors , Incidence , Retrospective Studies , Risk Assessment
10.
BMC Zool ; 9(1): 10, 2024 Apr 29.
Article in English | MEDLINE | ID: mdl-38685130

ABSTRACT

BACKGROUND: Mammalian skeletons are largely formed before birth. Heterochronic changes in skeletal formation can be investigated by comparing the order of ossification of different elements of the skeleton. Due to the challenge of collecting prenatal specimens of viviparous taxa, opportunistically collected museum specimens provide the best material for studying prenatal skeletal development across many mammalian species. Previous studies have investigated ossification sequences in a range of mammalian species, but little is known about the pattern of bone formation in Carnivora. Carnivorans have diverse ecologies, diets, and biomechanical specializations and are well suited for investigating questions in evolutionary biology. Currently, developmental data on carnivorans are largely limited to domesticated species. To expand the available data on carnivoran skeletal development, we used micro-computed tomography (micro-CT) to non-invasively evaluate the degree of ossification in all prenatal carnivoran specimens housed in the Harvard Museum of Comparative Zoology. By coding the presence or absence of bones in each specimen, we constructed ossification sequences for each species. Parsimov-based genetic inference (PGi) was then used to identify heterochronic shifts between carnivoran lineages and reconstruct the ancestral ossification sequence of Carnivora. RESULTS: We used micro-CT to study the prenatal ossification sequence in six carnivoran species: Eumetopias jubatus (Steller sea lion, n = 6), Herpestes javanicus (small Indian mongoose, n = 1), Panthera leo (lion, n = 1), Urocyon cinereoargenteus (gray fox, n = 1), Ursus arctos arctos (Eurasian brown bear, n = 1), and Viverricula indica (small Indian civet, n = 5). Because the available specimens were collected at relatively late developmental stages, few heterochronic shifts were identified. Ossification sequences of the feliform species showed complete agreement with the domestic cat. In caniforms, the bear and fox ossification sequences largely matched the dog, but numerous heterochronic shifts were identified in the sea lion. CONCLUSIONS: We used museum specimens to generate cranial and postcranial micro-CT data on six species split between the two major carnivoran clades, Caniformia and Feliformia. Our data suggest that the ossification sequences of the domestic dog and cat are likely good models for terrestrial caniforms and feliforms, respectively, but not for pinnipeds.

11.
Frontline Gastroenterol ; 15(2): 104-109, 2024 Mar.
Article in English | MEDLINE | ID: mdl-38486664

ABSTRACT

Introduction: Liver disease deaths are rising, but specialist palliative care services for hepatology are limited, and expansion across the NHS is required. Methods: We surveyed clinicians, patients and carers to design an 'ideal' service. Using standard NHS tariffs, we calculated the cost of this service. In hospitals where specialist palliative care was available for liver disease, patient-level costs and bed utilisation in the last year of life (LYOL) were compared between those seen by specialist palliative care before death and those not. Results: The 'ideal' service was described. Costs were calculated as whole-time equivalents for a minimal service, which could be scaled up. In a hospital with an existing service, patients seen by specialist palliative care had associated costs of £14,728 in the LYOL, compared with £18,558 for those dying without. The savings more than balanced the costs of introducing the service. Average bed days per patient in the LYOL were reduced (19.4 vs 25.7), as were intensive care unit bed days (1.1 vs 1.8). Despite this, time from first admission in the LYOL to death was similar in both groups (6 months for the specialist palliative care group vs 5 for those not referred). Conclusions: We have produced a template business case for an 'ideal' advanced liver disease support service, which self-funds and saves many bed days. The model can be easily adapted for local use in other trusts. We describe the methodology for calculating patient-level costs and the required service size, and present a financially compelling argument to expand a service to meet a growing need.

12.
Top Stroke Rehabil ; 31(6): 604-614, 2024 Sep.
Article in English | MEDLINE | ID: mdl-38375551

ABSTRACT

BACKGROUND: Most research focuses on impairments in body function and structure, with relatively few studies exploring their social impact. OBJECTIVES: 1) compare characteristics of individuals who were blue-collar vs. white-collar workers before stroke, 2) identify clinical, functional, and job-related factors associated with return to work within 1 year after discharge, 3) identify specific ADL items (assessed at rehabilitation discharge) as return-to-work predictors, and 4) identify return-to-work causal mediators. METHODS: Retrospective observational cohort study analyzing adult patients with stroke admitted to rehabilitation between 2007 and 2021, including baseline Barthel Index (BI) and return-to-work assessments between 2008 and 2022. Kaplan-Meier survival curves and Cox proportional hazards models were applied. Causal mediation analyses using 1000 bootstrapped simulations were performed. RESULTS: A total of 802 individuals were included (14.6% returned to work); 53.6% were blue-collar and 46.4% white-collar workers. Blue-collar workers showed a significantly higher proportion of ischemic stroke, diabetes, dyslipidemia, and hypertension. Individuals not returning to work showed a higher proportion of blue-collar work, dominant-side involvement and aphasia, as well as lower BI scores and longer length of stay (LOS). Multivariable Cox proportional hazards analysis identified age at injury, aphasia, hypertension, and total discharge BI score as predictors (C-index = 0.74). Univariable Cox models identified three independent BI items at all levels of independence: bathing (C-index = 0.58), grooming (C-index = 0.56) and feeding (C-index = 0.59). BI efficiency (gain/LOS) was a causal mediator. CONCLUSION: Blue-collar workers showed a higher proportion of risk factors and comorbidities. Novel factors, predictors, and a return-to-work mediator were identified.


Subject(s)
Return to Work , Stroke Rehabilitation , Stroke , Adult , Aged , Female , Humans , Male , Middle Aged , Cohort Studies , Inpatients , Occupations , Retrospective Studies , Return to Work/statistics & numerical data , Stroke/physiopathology
13.
Phys Eng Sci Med ; 47(2): 551-561, 2024 Jun.
Article in English | MEDLINE | ID: mdl-38285272

ABSTRACT

Bolus is commonly used to improve dose distributions in radiotherapy, in particular when dose to the skin must be optimised, such as in breast or head and neck cancer. We document four years of experience with 3D-printed bolus at a large cancer centre and review the quality assurance (QA) program developed to support it. More than 2000 boluses were produced between November 2018 and February 2023 using fused deposition modelling (FDM) printing with polylactic acid (PLA) on up to five Raise 3D printers. Bolus is designed in the radiotherapy treatment planning system (Varian Eclipse), exported to an STL file and pre-processed. After initially checking each bolus with CT scanning, we now produce standard quality control (QC) wedges every month and whenever a major change in the printing process occurs. A database records every bolus printed along with manufacturing details. It takes about 3 days from designing the bolus in the planning system to delivering it to treatment. A 'premium' PLA material (Spidermaker) was found to be best in terms of homogeneity and CT number consistency (80 ± 8 HU). Most boluses were produced for photon beams (93.6%), with the rest used for electrons. We process about 120 kg of PLA per year, with a typical bolus weighing less than 500 g and the majority of boluses being 5 mm thick. Print times are proportional to bolus weight, with about 24 h required to deposit 500 g of material. 3D printing using FDM produces smooth and reproducible boluses. Quality control is essential but can be streamlined.


Subject(s)
Printing, Three-Dimensional , Humans , Quality Assurance, Health Care/standards , Quality Control , Radiotherapy Planning, Computer-Assisted , Tomography, X-Ray Computed , Radiotherapy Dosage , Polyesters/chemistry
14.
Sci Total Environ ; 912: 168668, 2024 Feb 20.
Article in English | MEDLINE | ID: mdl-38007116

ABSTRACT

The limited sources of freshwater supply are a significant concern today. Exploiting alternative sources, especially seawater, has been the focus, but purifying seawater is energy-intensive. Integrating desalination with renewable energy is a proposed solution, but it comes with high costs and environmental risks during construction. Hence, this study presents a framework to enhance the modeling, optimization, and evaluation of green water-power cogeneration systems to achieve the sustainability goals of cities and societies. An improved division algorithm (DA) determines the optimal component sizes based on criteria such as minimal energy demand, reduced environmental and resource damage, low total life cycle cost (TLCC), and high reliability. The optimization considers varying loss of power supply probability (LPSP) levels (0%, 2%, 5%, and 10%). The environmental assessment uses a life cycle assessment (LCA) approach with IMPACT 2002+ and cumulative energy demand (CED) calculations. The study models the green cogeneration systems based on the weather conditions, water demand, and power requirements of Al Lulu Island, Abu Dhabi, UAE. The system comprises photovoltaic panels, wind turbines, tidal generators, and backup systems (fuel cells). Results reveal that TLCC ranges from $186,263 to $486,876 at the highest LPSP. The solar-tidal configuration offers the lowest TLCC ($186,263), while replacing solar with wind energy increases TLCC by 160%. The wind-tidal configuration has the lowest specific environmental impact (1020 mPt/yr) and cumulative energy demand (39.06 GJ/yr) at the highest LPSP. In contrast, the solar-tidal-wind configuration inflicts the most damage, with 62.63 GJ/yr and 1794 mPt/yr at the highest LPSP. The findings indicate that the DA converges faster (100 iterations) than the genetic algorithm (1000 iterations), particle swarm optimization (400 iterations), and artificial bee swarm optimization (300 iterations). The study underscores the solar-tidal configuration as the optimal choice across multiple criteria, offering a promising solution for freshwater supply and environmental sustainability on Al Lulu Island.
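The reliability constraint used above, loss of power supply probability (LPSP), is conventionally computed as the fraction of total demand that the system fails to serve over the simulated horizon. The snippet below is a minimal illustration of that calculation with made-up hourly profiles; it is not taken from the study.

```python
import numpy as np

# Illustrative LPSP calculation under the common definition:
#   LPSP = (sum of unmet load) / (sum of total load) over the simulated horizon.
# The hourly profiles below are placeholders, not the study's data.
hours = 24
load_kw = np.full(hours, 100.0)                                   # demand profile
generation_kw = 80 + 40 * np.sin(np.linspace(0, np.pi, hours))    # combined PV + wind + tidal output

unmet_kw = np.clip(load_kw - generation_kw, 0, None)              # shortfall in each hour
lpsp = unmet_kw.sum() / load_kw.sum()
print(f"LPSP = {lpsp:.1%}")  # a candidate sizing is accepted if LPSP <= the target level (e.g. 2%)
```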

15.
PM R ; 16(8): 815-825, 2024 Aug.
Article in English | MEDLINE | ID: mdl-38155582

ABSTRACT

BACKGROUND: Telerehabilitation in spinal cord injury (teleSCI) is a growing field that can improve access to care and health outcomes in patients with spinal cord injury (SCI). The clinical effectiveness of teleSCI is not known. OBJECTIVES: To compare independence in activities of daily living and mobility capacity in patients following teleSCI and matched controls undergoing traditional rehabilitation. DESIGN: Matched case-control study. SETTING: TeleSCI occurring in home setting (cases) versus traditional rehabilitation on inpatient unit (controls). PARTICIPANTS: Forty-two consecutive patients with SCI followed with teleSCI were compared to 42 historical rehabilitation inpatients (controls) matched for age, time since injury to rehabilitation admission, level of injury (paraplegia/tetraplegia), complete or incomplete injury, and etiology (traumatic/nontraumatic). The teleSCI group (n = 42) was also compared to the complete cohort of historical controls (n = 613). INTERVENTIONS: The teleSCI group followed home-based telerehabilitation (3.5 h/day, 5 days/week, 67 days average duration) and historical controls followed in-person rehabilitation. MAIN OUTCOME MEASURE(S): The Functional Independence Measure (FIM), the Spinal Cord Independence Measure (SCIM) and the Walking Index for Spinal Cord Injury (WISCI). We formally compared gains, efficiency and effectiveness. International Standards for Neurological Classification of Spinal Cord Injury and the American Spinal Injury Association Impairment Scale (AIS) were used. RESULTS: The teleSCI group (57.1% nontraumatic, 71.4% paraplegia, 73.8% incomplete, 52.4% AIS grade D) showed no significant differences compared with historical controls in AIS grades, neurological levels, duration, gains, efficiency and effectiveness in FIM, SCIM, or WISCI, although the teleSCI cohort had significantly higher admission FIM scores compared with the complete cohort of historical controls. CONCLUSIONS: TeleSCI may provide similar improvements in mobility and functional outcomes as traditional rehabilitation in medically stable patients (predominantly with paraplegia and motor incomplete SCI) when provided with appropriate support and equipment.


Subject(s)
Activities of Daily Living , Recovery of Function , Spinal Cord Injuries , Telerehabilitation , Humans , Spinal Cord Injuries/rehabilitation , Spinal Cord Injuries/physiopathology , Female , Male , Adult , Middle Aged , Case-Control Studies , Treatment Outcome , Disability Evaluation , Retrospective Studies
16.
Insects ; 14(12)2023 Dec 03.
Article in English | MEDLINE | ID: mdl-38132597

ABSTRACT

Eurytoma erythrinae Gates & Delvare (Hymenoptera: Eurytomidae) is an important biological control agent of the erythrina gall wasp (EGW), Quadrastichus erythrinae Kim (Hymenoptera: Eulophidae), an invasive species likely originating in eastern Africa that threatens Erythrina trees in Hawaii and worldwide. Thousands of Erythrina trees in Hawaii have succumbed to EGW since 2005, dying within a few years of infestation. The endemic wiliwili tree, Erythrina sandwicensis, an important component of Hawaii's dry forests and one of the few deciduous native trees, was severely impacted by this wasp. Early in the EGW invasion it became evident that the endemic species might be driven to extinction, and exploration programs for natural enemies of the EGW started in December 2005. East Africa was selected as the starting point for natural enemy exploration owing to its high native Erythrina species richness. Several gall formers were found in Tanzania, and a putative color-variant type of Q. erythrinae was detected in association with three ectoparasitoids. During January 2006, the dominant parasitoid of this gall former was introduced to Hawaii and described as the new species E. erythrinae. It was also found in Ghana and South Africa attacking other gall wasp species on Erythrina. Eurytoma erythrinae is a voracious ectoparasitoid, feeding as a predator on 1-5 adjacent EGW immatures to complete its development. Host-specificity studies that included seven nontarget gall-forming species showed no evidence of attraction or parasitism by this parasitoid. Mean ± SEM longevity of host-deprived females (40.4 ± 2.2 days) was significantly higher than that of males (20.5 ± 1.1 days), and host feeding enhanced the longevity of ovipositing females (51.3 ± 1.5 days). Female E. erythrinae is synovigenic, with a high egg-maturation rate. Peak fecundity (105-239 offspring/female), host-feeding biology, a short life cycle (18.4 ± 0.1 days), and synchronization with the host were additional desirable attributes of this species. The parasitoid was approved for field release in Hawaii in November 2008. A total of 3998 wasps were distributed on six Hawaiian Islands, with establishment in less than a year. Impacts on high-density infestations of EGW were sufficient to prevent tree deaths. Limited rates of parasitism on low-density galled leaves, flowers, and seedpods necessitated consideration of releasing a second parasitoid, Aprostocetus nitens Prinsloo & Kelly (Hymenoptera: Eulophidae). We report on the reproductive characteristics and host specificity of E. erythrinae that could be of importance for classical biocontrol programs in areas with an EGW problem.

17.
Aliment Pharmacol Ther ; 58(11-12): 1217-1229, 2023 12.
Article in English | MEDLINE | ID: mdl-37781965

ABSTRACT

BACKGROUND: The prevalence, prediction and impact of acute kidney injury (AKI) in alcohol-related hepatitis (AH) are uncertain. AIMS: We aimed to determine AKI incidence and its association with mortality, and to evaluate serum biomarkers and the modifying effects of prednisolone and pentoxifylline, in the largest AH cohort to date. METHODS: Participants in the Steroids or Pentoxifylline for Alcoholic Hepatitis trial with day zero (D0) creatinine available were included. AKI was defined by modified International Club of Ascites criteria, and incident AKI as day 7 (D7) AKI without D0-AKI. Survival was compared by Kaplan-Meier analysis, mortality associations by Cox regression, associations with AKI by binary logistic regression, and biomarkers by AUROC analyses. RESULTS: D0-AKI was present in 198/1051 (19%) participants; incident AKI developed in a further 119/571 (21%) with available data. Participants with D0-AKI had higher 90-day mortality than those without (32% vs. 25%, p = 0.008), as did participants with incident AKI compared to those without D0-AKI or incident AKI (47% vs. 25%, p < 0.001). Incident AKI was associated with day 90 (D90) mortality adjusted for age and discriminant function (AHR 2.15, 1.56-2.97, p < 0.001); D0-AKI was not. Prednisolone therapy reduced incident AKI (AOR 0.55, 0.36-0.85, p = 0.007) but not mortality. D0 bilirubin and IL-8 combined, miR-6826-5p, and miR-6811-3p predicted incident AKI (AUROCs 0.726, 0.821, 0.770, p < 0.01). CONCLUSIONS: Incident AKI is associated with 90-day mortality independent of liver function. Prednisolone therapy was associated with reduced incident AKI. IL-8 and several miRNAs are potential biomarkers to predict AKI. Novel therapies to prevent incident AKI should be evaluated in AH to reduce mortality.
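For context on the biomarker evaluation above, the snippet below shows the standard way an AUROC is computed for a single candidate predictor of incident AKI; the data are simulated placeholders, not trial data.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

# Simulated outcome labels and biomarker values (illustrative only).
rng = np.random.default_rng(1)
incident_aki = rng.integers(0, 2, 200)                      # 1 = developed AKI by day 7
biomarker = rng.normal(loc=incident_aki * 0.8, scale=1.0)   # tends higher in AKI cases

# AUROC measures how well the biomarker ranks cases above non-cases (0.5 = chance).
print("AUROC:", roc_auc_score(incident_aki, biomarker))
```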


Subject(s)
Acute Kidney Injury , Hepatitis, Alcoholic , MicroRNAs , Pentoxifylline , Humans , MicroRNAs/genetics , Hepatitis, Alcoholic/diagnosis , Hepatitis, Alcoholic/drug therapy , Interleukin-8 , Patient Acuity , Prednisolone/adverse effects , Biomarkers
18.
J Stroke Cerebrovasc Dis ; 32(10): 107267, 2023 Oct.
Article in English | MEDLINE | ID: mdl-37579640

ABSTRACT

OBJECTIVES: To compare independence in activities of daily living (ADLs) in post-acute patients with stroke following tele-rehabilitation with matched in-person controls. MATERIALS AND METHODS: Matched case-control study. A total of 35 consecutive patients with stroke who followed tele-rehabilitation were compared to 35 historical in-person patients (controls) matched for age, functional independence at admission and time since injury to rehabilitation admission (<60 days). The tele-rehabilitation group was also compared to the complete cohort of historical controls (n = 990). Independence in ADLs was assessed using the Functional Independence Measure (FIM) and the Barthel Index (BI). We formally compared FIM and BI gains (discharge score minus admission score), efficiency (gains divided by length of stay) and effectiveness ((discharge score - admission score) / (maximum score - admission score)). We also analyzed the minimal clinically important difference (MCID) for the FIM and BI. RESULTS: The groups showed no significant differences in type of stroke (ischemic or hemorrhagic), location, severity, age at injury, length of stay, body mass index, diabetes, dyslipidemia, hypertension, aphasia, neglect, affected side of the body, dominance or educational level. The groups showed no significant differences in gains, efficiency or effectiveness using either the FIM or the BI. We identified significant differences in two specific BI items (feeding and transfer) in favor of the in-person group. No differences were observed in the proportion of patients who achieved the MCID. CONCLUSIONS: No significant differences were seen between total ADL scores for tele-rehabilitation and in-person rehabilitation. Future research should analyze a combined rehabilitation approach that utilizes both models.
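The outcome definitions given above (gain, efficiency, effectiveness) translate directly into simple calculations. The sketch below applies them to hypothetical FIM scores, assuming the maximum FIM score of 126; the numbers are illustrative only.

```python
def rehab_outcomes(admission, discharge, length_of_stay, max_score):
    """Gain, efficiency, and effectiveness as defined in the abstract above."""
    gain = discharge - admission                      # discharge score minus admission score
    efficiency = gain / length_of_stay                # gain per day of stay
    effectiveness = gain / (max_score - admission)    # fraction of possible improvement achieved
    return gain, efficiency, effectiveness

# Hypothetical example: FIM 62 at admission, 98 at discharge, 45-day stay.
print(rehab_outcomes(admission=62, discharge=98, length_of_stay=45, max_score=126))
# -> (36, 0.8, 0.5625)
```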


Subject(s)
Stroke Rehabilitation , Stroke , Telerehabilitation , Humans , Infant , Activities of Daily Living , Case-Control Studies , Stroke/diagnosis , Stroke/therapy , Recovery of Function , Treatment Outcome , Retrospective Studies
19.
Insects ; 14(7)2023 Jul 03.
Article in English | MEDLINE | ID: mdl-37504609

ABSTRACT

Coffee berry borer (CBB) is the most serious insect pest of coffee worldwide, causing significant reductions in yield and quality. Following the introduction of CBB to Puerto Rico (2007) and Hawaii (2010), researchers, extension agents, industry representatives, and coffee growers have worked together to develop an integrated pest management (IPM) program that is both effective and economically feasible for each island. Since the introduction of the IPM program in Hawaii, research efforts have led to a better understanding of CBB population dynamics, as well as optimized monitoring, cultural practices, and commercial Beauveria bassiana applications. As a result of these efforts, a substantial reduction in average CBB infestation and an increase in coffee yields and quality have been documented in Hawaii over the last decade. However, significant challenges remain in addressing high production and labor costs, limited availability of labor, and a lack of training for field workers in both regions. Although considerable effort has gone into research to support CBB IPM in Hawaii and Puerto Rico, the adoption of these strategies by coffee farmers needs to be increased. More diversified methods of outreach and education are needed to reach growers in rural, isolated areas. Significant gaps exist in the ability and willingness of growers and workers to access and digest information online, emphasizing the importance of on-farm workshops and farmer-to-farmer teaching. Additional methods of training are needed to help coffee farmers and field workers learn how to properly conduct cultural controls and optimize the use of biological control agents such as B. bassiana.

20.
Nat Commun ; 14(1): 4425, 2023 07 21.
Article in English | MEDLINE | ID: mdl-37479710

ABSTRACT

The evolution of endothermy in vertebrates has been a major research topic in recent decades, tackled by a myriad of research disciplines including paleontology, anatomy, physiology, and evolutionary and developmental biology. The ability of most mammals to maintain a relatively constant and high body temperature is considered a key adaptation, enabling them to successfully colonize new habitats and harsh environments. It has been proposed that in mammals the anterior nasal cavity, which houses the maxilloturbinal, plays a pivotal role in body temperature maintenance via a bony system supporting an epithelium involved in heat and moisture conservation. The presence and relative size of the maxilloturbinal have been proposed to reflect the endothermic condition and basal metabolic rate in extinct vertebrates. Using a comprehensive dataset of µCT-derived maxilloturbinals spanning most mammalian orders, we show that there is no evidence relating the origin of endothermy to the development of some turbinal bones. Indeed, we demonstrate that neither corrected basal metabolic rate nor body temperature significantly correlates with the relative surface area of the maxilloturbinal. Instead, we identify important variations in the relative surface area, morpho-anatomy, and complexity of the maxilloturbinal across the mammalian phylogeny and species ecology.


Subject(s)
Acclimatization , Mammals , Animals , Basal Metabolism , Body Temperature , Ecology