Results 1 - 20 of 116
1.
Sci Rep ; 14(1): 10431, 2024 05 07.
Article in English | MEDLINE | ID: mdl-38714841

ABSTRACT

Reverse zoonotic respiratory diseases threaten great apes across Sub-Saharan Africa. Studies of wild chimpanzees have identified the causative agents of most respiratory disease outbreaks as "common cold" paediatric human pathogens, but reverse zoonotic transmission pathways have remained unclear. Between May 2019 and August 2021, we conducted a prospective cohort study of 234 children aged 3-11 years in communities bordering Kibale National Park, Uganda, and 30 adults who were forest workers and regularly entered the park. We collected 2047 respiratory symptom surveys to quantify clinical severity and simultaneously collected 1989 nasopharyngeal swabs approximately monthly for multiplex viral diagnostics. Throughout the study, we also collected 445 faecal samples from 55 wild chimpanzees living nearby in Kibale in social groups that have experienced repeated, and sometimes lethal, epidemics of human-origin respiratory viral disease. We characterized respiratory pathogens in each cohort and examined statistical associations between PCR positivity for detected pathogens and potential risk factors. Children exhibited high incidence rates of respiratory infections, whereas incidence rates in adults were far lower. COVID-19 lockdown in 2020-2021 significantly decreased respiratory disease incidence in both people and chimpanzees. Human respiratory infections peaked in June and September, corresponding to when children returned to school. Rhinovirus, which caused a 2013 outbreak that killed 10% of chimpanzees in a Kibale community, was the most prevalent human pathogen throughout the study and the only pathogen present at each monthly sampling, even during COVID-19 lockdown. Rhinovirus was also the pathogen most likely to be carried asymptomatically by adults. Although we did not detect human respiratory pathogens in the chimpanzees during the cohort study, we detected human metapneumovirus in two chimpanzees from a February 2023 outbreak, and these viruses were genetically similar to viruses detected in study participants in 2019. Our data suggest that respiratory pathogens circulate in children and that adults become asymptomatically infected during high-transmission times of year. These asymptomatic adults may then unknowingly carry the pathogens into the forest and infect chimpanzees. This conclusion, in turn, implies that intervention strategies based on respiratory symptoms in adults are unlikely to be effective for reducing reverse zoonotic transmission of respiratory viruses to chimpanzees.


Subject(s)
Common Cold , Pan troglodytes , Animals , Humans , Child , Female , Male , Child, Preschool , Common Cold/epidemiology , Common Cold/virology , Adult , Uganda/epidemiology , Prospective Studies , Zoonoses/epidemiology , Zoonoses/virology , COVID-19/epidemiology , COVID-19/virology , COVID-19/transmission , Ape Diseases/epidemiology , Ape Diseases/virology , Respiratory Tract Infections/epidemiology , Respiratory Tract Infections/virology , Respiratory Tract Infections/veterinary , Rhinovirus/isolation & purification , Rhinovirus/genetics , SARS-CoV-2/isolation & purification , Incidence
2.
Front Plant Sci ; 15: 1398903, 2024.
Article in English | MEDLINE | ID: mdl-38751840

ABSTRACT

Sugarcane smut and Pachymetra root rot are two serious diseases of sugarcane, with susceptible infected crops losing over 30% of yield. A heritable component to both diseases has been demonstrated, suggesting selection could improve disease resistance. Genomic selection could accelerate gains even further, enabling early selection of resistant seedlings for breeding and clonal propagation. In this study we evaluated four types of algorithms for genomic prediction of clonal performance for disease resistance: genomic best linear unbiased prediction (GBLUP), including extensions to model dominance and epistasis; Bayesian methods, including BayesC and BayesR; machine learning methods, including random forest, multilayer perceptron (MLP), modified convolutional neural networks (CNN), and attention networks designed to capture epistasis across the genome-wide markers; and simple hybrid methods. The hybrid methods first used BayesR/GWAS to identify a subset of 1000 markers with moderate to large marginal additive effects and then used attention networks to derive predictions from these effects and their interactions. The hypothesis for the hybrid approach was that using a subset of markers more likely to have an effect would enable better estimation of interaction effects than when there is an extremely large number of possible interactions, especially with our limited data set size. To evaluate the methods, we applied both random five-fold cross-validation and a structured PCA-based cross-validation that separated 4702 sugarcane clones (which had disease phenotypes and genotypes at 26K genome-wide SNP markers) by genomic relationship. The Bayesian methods (BayesR and BayesC) gave the highest prediction accuracy, followed closely by the hybrid methods with attention networks. The hybrid methods with attention networks gave the lowest variation in prediction accuracy across validation folds (and the lowest MSE), which may be a criterion worth considering in practical breeding programs. This suggests that hybrid methods incorporating the attention mechanism could be useful for genomic prediction of clonal performance, particularly where non-additive effects may be important.
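
For readers unfamiliar with the additive baseline used in this study, the sketch below shows one minimal way a GBLUP prediction could be set up: a VanRaden genomic relationship matrix built from SNP genotypes and genomic values obtained from the mixed-model solution. All sizes, genotypes, phenotypes, and variance components are simulated placeholders, not the sugarcane data or the authors' implementation, and the Bayesian, machine-learning, and attention-based variants are not reproduced.

```python
# Minimal GBLUP sketch (toy data, not the authors' pipeline): a VanRaden
# genomic relationship matrix is built from SNP genotypes and genomic values
# are obtained from the mixed-model solution. Variance components are assumed
# here; in practice they would be estimated by REML, and accuracy would be
# assessed by five-fold cross-validation as in the study.
import numpy as np

rng = np.random.default_rng(0)
n, m = 500, 2000                                    # clones, SNP markers (toy sizes)
M = rng.integers(0, 3, size=(n, m)).astype(float)   # genotypes coded 0/1/2
y = rng.normal(size=n)                              # stand-in for smut/Pachymetra scores

# VanRaden genomic relationship matrix
p = M.mean(axis=0) / 2                              # allele frequencies
Z = M - 2 * p
G = Z @ Z.T / (2 * np.sum(p * (1 - p)))

# GBLUP: y = 1*mu + u + e with u ~ N(0, G * sigma_u^2), e ~ N(0, I * sigma_e^2)
sigma_u2, sigma_e2 = 0.4, 0.6                       # assumed variance components
lam = sigma_e2 / sigma_u2
Vinv = np.linalg.inv(G + lam * np.eye(n))
ones = np.ones(n)
mu = (ones @ Vinv @ y) / (ones @ Vinv @ ones)       # generalised least squares mean
u_hat = G @ Vinv @ (y - mu)                         # genomic values of the clones

print("in-sample correlation:", round(float(np.corrcoef(u_hat, y)[0, 1]), 2))
```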

3.
Front Pediatr ; 12: 1379131, 2024.
Article in English | MEDLINE | ID: mdl-38756971

ABSTRACT

Introduction: Respiratory illness is the most common childhood disease globally, especially in developing countries. Previous studies have detected viruses in approximately 70-80% of respiratory illnesses. Methods: In a prospective cohort study of 234 young children (ages 3-11 years) and 30 adults (ages 22-51 years) in rural Western Uganda sampled monthly from May 2019 to August 2021, only 24.2% of nasopharyngeal swabs collected during symptomatic disease had viruses detectable by multiplex PCR diagnostics and metagenomic sequencing. In the remaining 75.8% of swabs from symptomatic participants, we measured detection rates of the respiratory bacteria Haemophilus influenzae, Moraxella catarrhalis, and Streptococcus pneumoniae by quantitative PCR. Results: 100% of children tested positive for at least one bacterial species. Detection rates of H. influenzae, M. catarrhalis, and S. pneumoniae were 87.2%, 96.8%, and 77.6% in children and 10.0%, 36.7%, and 13.3% in adults, respectively. In children, 20.8% and 70.4% were coinfected with two and three pathogens, respectively; in adults, 6.7% were coinfected with three pathogens but none were coinfected with two. Detection of any of the three pathogens was not associated with season or respiratory symptom severity, although parsing detection status by symptoms was complicated by the fact that children experienced symptoms in 80.3% of monthly samplings, whereas adults reported symptoms only 26.6% of the time. Pathobiont colonization in children in Western Uganda was significantly more frequent than in children living in high-income countries, including in a study of age-matched US children that utilized identical diagnostic methods. Detection rates were, however, comparable to rates in children living in other Sub-Saharan African countries. Discussion: Overall, our results demonstrate that nonviral colds contribute significantly to the respiratory disease burden among children in rural Uganda and that high rates of respiratory pathobiont colonization may play a role. These conclusions have implications for respiratory health interventions in the area, such as increasing childhood immunization rates and decreasing air pollutant exposure.

4.
Front Pediatr ; 12: 1336009, 2024.
Article in English | MEDLINE | ID: mdl-38650995

ABSTRACT

Introduction: Respiratory disease is a major cause of morbidity and mortality in the developing world, but prospective studies of temporal patterns and risk factors are rare. Methods: We studied people in rural Western Uganda, where respiratory disease is pervasive. We followed 30 adults (ages 22-51 years; 534 observations) and 234 children (ages 3-11 years; 1,513 observations) between May 2019 and July 2022 and collected monthly data on their respiratory symptoms, for a total of 2,047 case records. We examined associations between demographic and temporal factors and respiratory symptom severity. Results: The timing of our study (before, during, and after the emergence of COVID-19) allowed us to document the effects of public health measures instituted in the region. Incidence rates of respiratory symptoms before COVID-19 lockdown were 568.4 cases per 1,000 person-months in children and 254.2 cases per 1,000 person-months in adults. These rates were 2.6 times higher than the 2019 global average for children but comparable for adults. Younger children (ages 3-6 years) had the highest frequencies and severities of respiratory symptoms. Study participants were most likely to experience symptoms in February, a seasonal pattern not previously documented. Incidence and severity of symptoms in children decreased markedly during COVID-19 lockdown, illustrating the broad effects of public health measures on the incidence of respiratory disease. Discussion: Our results demonstrate that patterns of respiratory disease in settings such as Western Uganda resemble patterns in developed economies in some ways (age-related factors) but not in others (higher incidence in children and a distinct seasonal pattern). Factors such as indoor air quality, health care access, timing of school trimesters, and seasonal effects (rainy/dry seasons) likely contribute to the differences observed.
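
As a small aside on the arithmetic, the reported rates are cases per 1,000 person-months of observation. The snippet below shows the calculation with hypothetical counts chosen only to reproduce the reported pre-lockdown child rate; the paper reports rates, not these raw inputs.

```python
# Worked example of the incidence-rate arithmetic used above. The counts are
# hypothetical, chosen only to reproduce the reported pre-lockdown child rate;
# the paper reports rates and person-time, not these raw inputs.
def incidence_per_1000_person_months(cases: int, person_months: float) -> float:
    """Cases per 1,000 person-months of observation."""
    return 1000 * cases / person_months

# 860 symptomatic episodes over 1,513 child-months of follow-up would give
# ~568.4 cases per 1,000 person-months, matching the reported child rate.
print(round(incidence_per_1000_person_months(860, 1513), 1))
```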

5.
Leukemia ; 2024 Apr 05.
Article in English | MEDLINE | ID: mdl-38580835

ABSTRACT

We examined the prevalence of and risk factors for pre-frailty, and its association with subsequent mortality, after blood or marrow transplantation (BMT). Study participants were drawn from the BMT Survivor Study (BMTSS) and included 3346 individuals who underwent BMT between 1974 and 2014 at one of three transplant centers and survived ≥2 years post-BMT. Participants completed the BMTSS survey at a median of 9 years from BMT and were followed for subsequent mortality for a median of 5 years after survey completion. Closest-age and same-sex biological siblings also completed the survey. Previously published self-reported indices (exhaustion, weakness, low energy expenditure, slowness, unintentional weight loss) classified participants as non-frail (0-1 indices) or pre-frail (2 indices). The National Death Index was used to determine vital status and cause of death. Overall, 626 (18.7%) BMT survivors were pre-frail. BMT survivors had 3.2-fold higher odds of being pre-frail (95% CI = 1.9-5.3) compared to siblings. Compared to non-frail survivors, pre-frail survivors had higher hazards of all-cause mortality (adjusted hazard ratio [aHR] = 1.6, 95% CI = 1.4-2.0). Female sex, pre-BMT radiation, smoking, lack of exercise, anxiety, and severe/life-threatening chronic health conditions were associated with pre-frailty. The novel association between pre-frailty and subsequent mortality provides a rationale for interventions, as pre-frail individuals may transition back to a robust state.
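
A minimal sketch of how this kind of frailty scoring and mortality model could be assembled is shown below, assuming the five self-reported indices are available as 0/1 flags alongside follow-up time and vital status. The data are simulated, lifelines is only one possible library for the Cox model (not necessarily what the BMTSS analysis used), and the published analysis adjusted for additional covariates not shown here.

```python
# Sketch of the frailty scoring and mortality model, on simulated data.
# Under the survey scoring, 0-1 positive indices = non-frail and 2 = pre-frail
# (>=3 would be frail under the usual phenotype and is not modelled here).
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

INDICES = ["exhaustion", "weakness", "low_energy", "slowness", "weight_loss"]
rng = np.random.default_rng(6)
n = 2000

df = pd.DataFrame(rng.binomial(1, 0.2, size=(n, len(INDICES))), columns=INDICES)
df["pre_frail"] = (df[INDICES].sum(axis=1) >= 2).astype(int)

# Simulated follow-up: pre-frailty raises the mortality hazard ~1.65-fold.
hazard = 0.05 * np.exp(0.5 * df["pre_frail"])
time_to_death = rng.exponential(1 / hazard)
df["died"] = (time_to_death < 6).astype(int)        # administrative censoring at 6 y
df["time"] = np.minimum(time_to_death, 6.0)

cph = CoxPHFitter()
cph.fit(df[["time", "died", "pre_frail"]], duration_col="time", event_col="died")
print(cph.hazard_ratios_)   # the study reported aHR ~1.6 after covariate adjustment
```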

6.
Cleft Palate Craniofac J ; : 10556656241241200, 2024 Mar 21.
Article in English | MEDLINE | ID: mdl-38515321

ABSTRACT

OBJECTIVE: To determine if the elastic chain premaxillary retraction (ECPR) appliance increases inter-medial and inter-lateral canthal dimensions in patients with bilateral complete cleft lip and palate (BCLP). DESIGN: Retrospective cohort study. SETTING: Specialized tertiary care facility. PATIENTS, PARTICIPANTS: 126 patients with BCLP; 75 had ECPR, 51 had no pre-surgical manipulation. INTERVENTIONS: Three-dimensional facial photographs were obtained prior to insertion of the appliance (T0), post-appliance therapy prior to appliance removal/labial repair (T1), and several months after labial repair (T2) for a longitudinal ECPR group, and were obtained after age 4 years (T3) for a non-longitudinal ECPR group and for the non-ECPR group. MAIN OUTCOME MEASURES: Inter-medial and inter-lateral canthal dimensions (en-en, ex-ex) were determined for all groups/time-points. Measurements were compared between groups and to norms. RESULTS: The mean en-en and ex-ex were 32.6 ± 3.2 mm and 84.4 ± 6.3 mm for the ECPR group and 33.5 ± 3.1 mm and 86.7 ± 7.2 mm for the non-ECPR group at T3. Inter-medial and inter-lateral canthal dimensions were significantly greater than normal (P < .05) in both groups; there was no significant difference between groups (P > .05). The mean en-en and ex-ex for the longitudinal ECPR group were 27.5 ± 2.4 mm and 66.7 ± 3.7 mm at T0, 29.6 ± 2.4 mm and 70.4 ± 2.9 mm at T1, and 29.2 ± 2.3 mm and 72.3 ± 3.8 mm at T2. en-en and ex-ex increased significantly from T0 to T1 (P < .05), decreased at T2 (P > .05), and were significantly larger than normal at all time-points (P < .05). CONCLUSIONS: Inter-medial and inter-lateral canthal dimensions increased after ECPR but returned to the baseline growth trajectory. These dimensions were above normal at all time-points. There was no difference between those who did and did not have dentofacial orthopedic manipulation.

7.
Sci Rep ; 14(1): 4419, 2024 02 23.
Article in English | MEDLINE | ID: mdl-38388834

ABSTRACT

The skin is the primary feeding site of ticks that infest livestock animals such as cattle. The highly specialised functions of skin at the molecular level may be a factor contributing to variation in susceptibility to tick infestation, but these functions remain poorly defined. The aim of this study was to investigate the bovine skin transcriptomic profiles of tick-naïve and tick-infested cattle and to uncover the gene expression networks that influence contrasting phenotypes of host resistance to ticks. RNA-Seq data were obtained from skin of Brangus cattle with high (n = 5) and low (n = 6) host resistance at 0 and 12 weeks following artificial tick challenge with Rhipicephalus australis larvae. No differentially expressed genes were detected pre-infestation between the high and low resistance groups, but at 12 weeks there were 229 differentially expressed genes (DEGs; FDR < 0.05), of which 212 were the target of at least 1866 transcription factors (TFs) expressed in skin. Regulatory impact factor (RIF) analysis identified 158 significant TFs (P < 0.05), of which GRHL3 and DTX1 were also DEGs in the experiment. Gene term enrichment showed that the significant TFs and DEGs were enriched in processes related to immune response and in biological pathways related to host response to infectious diseases. Type 1 interferon-stimulated genes, including MX2, ISG15, MX1, and OAS2, were upregulated in low host resistance steers after repeated tick challenge, suggesting that dysregulated wound healing and chronic inflammatory skin processes contribute to host susceptibility to ticks. The present study provides an assessment of the bovine skin transcriptome before and after repeated tick challenge and shows that up-regulation of pro-inflammatory genes is a prominent feature in the skin of tick-susceptible animals. In addition, the identification of transcription factors with high regulatory impact provides insights into the potentially meaningful gene-gene interactions involved in the variation in phenotypes of bovine host resistance to ticks.


Subject(s)
Cattle Diseases , Rhipicephalus , Tick Infestations , Animals , Cattle , Rhipicephalus/genetics , Disease Susceptibility , Tick Infestations/genetics , Tick Infestations/veterinary , Transcriptome , Inflammation/genetics , Transcription Factors/genetics , Cattle Diseases/genetics
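
The differential-expression step in the study above can be illustrated with a toy example: per-gene tests between the high- and low-resistance groups with Benjamini-Hochberg control of the false discovery rate at 0.05. The counts below are simulated, and a simple t-test on log counts stands in for the negative-binomial RNA-Seq models and the RIF analysis actually used.

```python
# Toy illustration of the differential-expression step: per-gene tests between
# high- and low-resistance groups with Benjamini-Hochberg FDR control at 0.05.
# Counts are simulated; a t-test on log counts replaces the negative-binomial
# RNA-Seq models and the RIF analysis used in the study.
import numpy as np
from scipy.stats import ttest_ind
from statsmodels.stats.multitest import multipletests

rng = np.random.default_rng(1)
genes, n_high, n_low = 5000, 5, 6                 # group sizes match the study design
high = rng.poisson(100, size=(genes, n_high))     # toy counts, high-resistance steers
low = rng.poisson(100, size=(genes, n_low))       # toy counts, low-resistance steers

pvals = ttest_ind(np.log2(high + 1), np.log2(low + 1), axis=1).pvalue
reject, qvals, _, _ = multipletests(pvals, alpha=0.05, method="fdr_bh")
print("DEGs at FDR < 0.05:", int(reject.sum()))   # the study reported 229 at 12 weeks
```
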
8.
J Laryngol Otol ; 138(6): 667-671, 2024 Jun.
Article in English | MEDLINE | ID: mdl-38369910

ABSTRACT

OBJECTIVE: This study analyses outcomes for 660 patients managed via a novel telescopic pathway for suspected head and neck cancer referrals. METHOD: Data were collected prospectively between January 2021 and December 2022, capturing all two-week-wait referrals triaged as low risk and managed via a nurse-led clinic for nasendoscopic examination and consultant-led remote assessment. RESULTS: In total, 660 patients were included. There were six head and neck cancers diagnosed, giving a conversion rate of 0.9 per cent. Mean (standard deviation) time to informing the patient whether they did or did not have cancer (28-day faster diagnosis standard) was 28.6 days (20.2), with no significant difference observed in patients imaged prior to review (p = 0.63). No missed cancers were detected in the follow-up period. CONCLUSION: Low-risk head and neck cancer referrals can be safely managed in a nurse-led clinic for recorded examination with asynchronous consultant-led management. Further work is required to ensure adherence to the new faster diagnosis standard.


Subject(s)
Head and Neck Neoplasms , Referral and Consultation , Humans , Head and Neck Neoplasms/therapy , Head and Neck Neoplasms/diagnosis , Referral and Consultation/statistics & numerical data , Prospective Studies , Male , Female , Middle Aged , Time Factors , Waiting Lists , Aged , Triage/methods , Adult , Remote Consultation/methods
9.
Leukemia ; 38(3): 601-609, 2024 Mar.
Article in English | MEDLINE | ID: mdl-38374408

ABSTRACT

We determined the risk of late morbidity and mortality after autologous blood or marrow transplantation (BMT) for lymphoma performed before age 40. The cohort included autologous BMT recipients who had survived ≥2 years after transplantation (N = 583; Hodgkin lymphoma [HL] = 59.9%, non-Hodgkin lymphoma [NHL] = 40.1%) and a comparison cohort (N = 1070). Participants self-reported sociodemographics and chronic health conditions. A severity score (grade 3 [severe], 4 [life threatening], or 5 [fatal]) was assigned to the conditions using CTCAE v5.0. Logistic regression estimated the odds of grade 3-4 conditions in survivors vs. comparison subjects. Proportional subdistribution hazards models identified predictors of grade 3-5 conditions among BMT recipients. Median age at BMT was 30.0 years (range: 2.0-40.0), and median follow-up was 9.8 years (2.0-32.1). Survivors had 3-fold higher adjusted odds of grade 3-4 conditions (95% CI = 2.3-4.1) vs. comparison subjects. Factors associated with grade 3-5 conditions among BMT recipients included age at BMT (>30 years: adjusted hazard ratio [aHR] = 2.31; 95% CI = 1.27-4.19; reference: ≤21 years), pre-BMT radiation (aHR = 1.52; 95% CI = 1.13-2.03; reference: non-irradiated), and year of BMT (≥2000: aHR = 0.54; 95% CI = 0.34-0.85; reference: <1990). The 25-year cumulative incidence of relapse-related and non-relapse-related mortality was 18.2% and 25.9%, respectively. The high risk for late morbidity and mortality after autologous BMT for lymphoma performed at age <40 years calls for long-term anticipatory risk-based follow-up.


Subject(s)
Bone Marrow Transplantation , Lymphoma , Child , Humans , Adolescent , Young Adult , Adult , Bone Marrow Transplantation/adverse effects , Bone Marrow , Neoplasm Recurrence, Local , Lymphoma/therapy , Transplantation, Autologous/adverse effects , Morbidity
10.
Am Surg ; 90(5): 1023-1029, 2024 May.
Article in English | MEDLINE | ID: mdl-38073251

ABSTRACT

BACKGROUND: Cancer centers provide superior care but are less accessible to rural populations. Health systems that integrate a cancer center may provide broader access to quality surgical care, but penetration to rural hospitals is unknown. METHODS: Cancer center data were linked to health system data to describe health systems based on whether they included at least one accredited cancer center. Health systems with and without cancer centers were compared based on rural hospital presence. Bivariate tests and multivariable logistic regression were used with results reported as P-values and odds ratios (OR) with 95% confidence intervals (CIs). RESULTS: Ninety percent of cancer centers are in a health system, and 72% of health systems (434/607) have a cancer center. Larger health systems (P = .03) with more trainees (P = .03) more often have cancer centers but are no more likely to include rural hospitals (11% vs 6%, P = .43; adjusted OR .69, 95% CI .28-1.70). The minority of cancer centers not in health systems (N = 95) more often serve low complexity patient populations (P = .02) in non-metropolitan areas (P = .03). DISCUSSION: Health systems with rural hospitals are no more likely to have a cancer center. Ongoing health system integration will not necessarily expand rural patients' access to surgical care under existing health policy infrastructure and incentives.


Subject(s)
Hospitals, Rural , Neoplasms , Humans , Quality of Health Care , Rural Population
11.
Plant Genome ; 17(1): e20417, 2024 Mar.
Article in English | MEDLINE | ID: mdl-38066702

ABSTRACT

Genomic selection in sugarcane faces challenges due to limited genomic tools and high genomic complexity, particularly because of its high and variable ploidy. The classification of genotypes for single nucleotide polymorphisms (SNPs) becomes difficult due to the wide range of possible allele dosages. Previous genomic studies in sugarcane used pseudo-diploid genotyping, grouping all heterozygotes into a single class. In this study, we investigate the use of continuous genotypes as a proxy for allele dosage in genomic prediction models. The hypothesis is that continuous genotypes could better reflect allele dosage at SNPs linked to mutations affecting target traits, resulting in phenotypic variation. The dataset included genotypes of 1318 clones at 58K SNP markers, with about 26K markers retained after standard quality control. Predictions for tonnes of cane per hectare (TCH), commercial cane sugar (CCS), and fiber content (Fiber) were made using parametric, non-parametric, and Bayesian methods. Continuous genotypes increased accuracy by 5%-7% for CCS and Fiber. The pseudo-diploid parametrization performed better for TCH. Reproducing kernel Hilbert space (RKHS) models with a Gaussian kernel and with the AK4 kernel (arc-cosine kernel with hidden layer 4) outperformed the other methods for TCH and CCS, suggesting that non-additive effects might influence these traits. The prevalence of low-dosage markers in the study may have limited the benefits of approximating allele-dosage information with continuous genotypes in genomic prediction models. Continuous genotypes simplify genomic prediction in polyploid crops, allowing additional markers to be used without adhering to pseudo-diploid inheritance. The approach can particularly benefit high-ploidy species or emerging crops with unknown ploidy.


Subject(s)
Saccharum , Saccharum/genetics , Bayes Theorem , Genotype , Phenotype , Genomics
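
The sketch below illustrates, on simulated data, what RKHS prediction from continuous genotype dosages looks like with a Gaussian kernel, the parametrisation discussed in the study above. The bandwidth heuristic and ridge penalty are assumptions, and the AK4 arc-cosine kernel and Bayesian models from the study are not reproduced.

```python
# RKHS genomic prediction with a Gaussian kernel on continuous genotype
# dosages, sketched on simulated data with assumed hyperparameters.
import numpy as np

rng = np.random.default_rng(2)
n_train, n_test, m = 300, 50, 1000
X = rng.uniform(0, 2, size=(n_train + n_test, m))      # continuous dosages in [0, 2]
beta = np.zeros(m)
beta[:20] = rng.normal(size=20)                        # 20 causal markers (toy)
y = X @ beta + rng.normal(scale=2.0, size=n_train + n_test)

Xtr, Xte = X[:n_train], X[n_train:]
ytr, yte = y[:n_train], y[n_train:]

def sq_dists(A, B):
    """Pairwise squared Euclidean distances between rows of A and B."""
    return (A**2).sum(1)[:, None] + (B**2).sum(1)[None, :] - 2 * A @ B.T

h = np.median(sq_dists(Xtr, Xtr))                      # bandwidth heuristic (assumption)
K = np.exp(-sq_dists(Xtr, Xtr) / h)                    # train x train Gaussian kernel
Ks = np.exp(-sq_dists(Xte, Xtr) / h)                   # test x train kernel

lam = 1.0                                              # ridge penalty (assumption)
alpha = np.linalg.solve(K + lam * np.eye(n_train), ytr)
pred = Ks @ alpha
print("prediction accuracy:", round(float(np.corrcoef(pred, yte)[0, 1]), 2))
```
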
12.
Cancer ; 130(5): 803-815, 2024 03 01.
Article in English | MEDLINE | ID: mdl-37880912

ABSTRACT

BACKGROUND: Blood or marrow transplantation (BMT) survivors carry a high burden of morbidity, yet health care utilization by this vulnerable population remains understudied. Patterns and predictors of various domains of health care utilization in long-term BMT survivors were evaluated. METHODS: Study participants were drawn from the Bone Marrow Transplant Survivor Study (BMTSS). Patients transplanted between 1974 and 2014 at one of three transplant centers who had survived ≥2 years after BMT and were aged ≥18 years at the time of the study were included. A BMTSS survey served as the source of data for health care utilization, sociodemographics, and chronic health conditions. Domains of health care utilization in the 2 years preceding study participation included routine checkups, BMT-related visits, transplant/cancer center visits, emergency room (ER) visits, hospitalizations, and high health care utilization (≥7 physician visits during the 2 years before the study). Clinical characteristics and therapeutic exposures were abstracted from medical records. RESULTS: In this cohort of 3342 BMT survivors (52% allogeneic), the prevalence of health care utilization declined over time since BMT for both allogeneic and autologous BMT survivors, such that among those who had survived ≥20 years, only 49%-53% had undergone routine checkups, 37%-38% reported BMT-related visits, and 28%-29% reported transplant/cancer center visits. The presence of severe/life-threatening conditions and chronic graft-vs-host disease increased the odds of health care utilization across all domains. Lower education, lack of insurance, and Hispanic ethnicity were associated with a lower prevalence of routine checkups and/or transplant/cancer center visits. Lower income increased the odds of ER visits but reduced the odds of hospitalizations or high health care utilization. CONCLUSIONS: This study identified vulnerable populations of long-term BMT survivors who would benefit from specialized risk-based anticipatory care to reduce high health care utilization, ER visits, and hospitalizations.


Subject(s)
Bone Marrow , Hematopoietic Stem Cell Transplantation , Humans , Adolescent , Adult , Bone Marrow Transplantation , Survivors , Chronic Disease , Patient Acceptance of Health Care
13.
JAMA Neurol ; 81(2): 163-169, 2024 Feb 01.
Article in English | MEDLINE | ID: mdl-38147345

ABSTRACT

Importance: Cerebral amyloid angiopathy (CAA) is a common cause of spontaneous intracerebral hemorrhage in older patients. Although other types of intracranial hemorrhage can occur in conjunction with CAA-related intracerebral hemorrhage, the association between CAA and other subtypes of intracranial hemorrhage, particularly in the absence of intracerebral hemorrhage, remains poorly understood. Objective: To determine whether CAA is an independent risk factor for isolated nontraumatic subdural hemorrhage (SDH). Design, Setting, and Participants: A population-based cohort study was performed using a 2-stage analysis of prospectively collected data in the UK Biobank cohort (discovery phase, 2006-2022) and the All of Us Research Program cohort (replication phase, 2018-2022). Participants included those who contributed at least 1 year of data while they were older than 50 years, in accordance with the diagnostic criteria for CAA. Participants with prevalent intracranial hemorrhage were excluded. Data were analyzed from October 2022 to October 2023. Exposure: A diagnosis of CAA, identified using the International Classification of Diseases, Tenth Revision, Clinical Modification (ICD-10-CM) diagnosis code. Main Outcomes and Measures: The outcome was an isolated nontraumatic SDH, identified using ICD-10-CM codes. Two identical analyses were performed separately in the 2 cohorts. First, the risk of SDH in patients with and without CAA was assessed using Cox proportional hazards models, adjusting for demographic characteristics, cardiovascular comorbidities, and antithrombotic medication use. Second, multivariable logistic regression was used to study the association between CAA and SDH. Results: The final analytical sample comprised 487 223 of the total 502 480 individuals in the UK Biobank cohort and 158 008 of the total 372 082 individuals in the All of Us cohort. Among the 487 223 participants in the discovery phase of the UK Biobank, the mean (SD) age was 56.5 (8.1) years, and 264 195 (54.2%) were female. There were 649 cases of incident SDH. Of the 126 participants diagnosed with CAA, 3 (2.4%) developed SDH. In adjusted Cox regression analyses, participants with CAA had an increased risk of having an SDH compared with those without CAA (hazard ratio [HR], 8.0; 95% CI, 2.6-24.8). Multivariable logistic regression analysis yielded higher odds of SDH among participants with CAA (odds ratio [OR], 7.6; 95% CI, 1.8-20.4). Among the 158 008 participants in the All of Us cohort, the mean (SD) age was 63.0 (9.5) years, and 89 639 (56.7%) were female. The findings were replicated in All of Us, in which 52 participants had CAA and 320 had an SDH. All of Us participants with CAA had an increased risk of having an SDH compared with those without CAA (HR, 4.9; 95% CI, 1.2-19.8). In adjusted multivariable logistic regression analysis, CAA was associated with higher odds of SDH (OR, 5.2; 95% CI, 0.8-17.6). Conclusions and Relevance: In 2 large, heterogeneous cohorts, CAA was associated with increased risk of SDH. These findings suggest that CAA may be a novel risk factor for isolated nontraumatic SDH.


Subject(s)
Cerebral Amyloid Angiopathy , Population Health , Humans , Female , Aged , Middle Aged , Male , Cohort Studies , Hematoma, Subdural/epidemiology , Cerebral Amyloid Angiopathy/complications , Cerebral Amyloid Angiopathy/epidemiology , Cerebral Hemorrhage/complications , Magnetic Resonance Imaging/adverse effects
14.
Front Plant Sci ; 14: 1260517, 2023.
Article in English | MEDLINE | ID: mdl-38023905

ABSTRACT

Mate-allocation strategies in breeding programs can improve progeny performance by harnessing non-additive genetic effects. These approaches prioritise predicted progeny merit over parental breeding value, making them particularly appealing for clonally propagated crops such as sugarcane. We conducted a comparative analysis of mate-allocation strategies, comparing schemes that utilise non-additive and heterozygosity effects to maximise clonal performance with schemes that consider only additive effects to optimise breeding value. Using phenotypic and genotypic data from a population of 2,909 clones evaluated in final assessment trials of Australian sugarcane breeding programs, we focused on three important traits: tonnes of cane per hectare (TCH), commercial cane sugar (CCS), and Fibre. By simulating families from all 1,225 possible crosses, with 50 progeny each, we predicted the breeding and clonal values of progeny using two models: GBLUP (considering additive effects only) and extended-GBLUP (incorporating additive, non-additive, and heterozygosity effects). Integer linear programming was used to identify the optimal mate allocation among selected parents. Compared to breeding value-based approaches, mate-allocation strategies based on clonal performance yielded substantial improvements, with predicted progeny values increasing by 57% for TCH, 12% for CCS, and 16% for Fibre. Our simulation study highlights the effectiveness of mate-allocation approaches that exploit non-additive and heterozygosity effects, resulting in superior clonal performance. However, there was a notable decline in additive gain, particularly for TCH, likely due to significant epistatic effects. When selecting crosses based on clonal performance for TCH, the inbreeding coefficient of progeny was significantly lower than under random mating, underscoring the advantages of leveraging non-additive and heterozygosity effects in mitigating inbreeding depression. Thus, mate-allocation strategies are recommended in clonally propagated crops to enhance clonal performance and reduce the negative impacts of inbreeding.
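
To make the mate-allocation idea concrete, the simplified sketch below reduces the problem to an assignment problem: assuming the selected parents are split into two pools and each parent is used in exactly one cross, the pairing that maximises total predicted progeny value can be found with the Hungarian algorithm. The study itself solved an integer linear program with predicted progeny values from extended-GBLUP; the values here are random placeholders.

```python
# Simplified mate-allocation sketch (assignment-problem version, not the
# integer linear program used in the study). Predicted progeny values are
# random placeholders standing in for extended-GBLUP predictions
# (additive + non-additive + heterozygosity effects).
import numpy as np
from scipy.optimize import linear_sum_assignment

rng = np.random.default_rng(3)
n_per_pool = 10
# progeny_value[i, j] = predicted mean clonal value (e.g. TCH) of crossing
# pool-A parent i with pool-B parent j.
progeny_value = rng.normal(loc=100, scale=10, size=(n_per_pool, n_per_pool))

rows, cols = linear_sum_assignment(progeny_value, maximize=True)
for i, j in zip(rows, cols):
    print(f"cross A{i} x B{j}: predicted progeny value {progeny_value[i, j]:.1f}")
print("total predicted value:", round(progeny_value[rows, cols].sum(), 1))
```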

15.
Genet Sel Evol ; 55(1): 71, 2023 Oct 16.
Article in English | MEDLINE | ID: mdl-37845626

ABSTRACT

BACKGROUND: It has been challenging to implement genomic selection in multi-breed tropical beef cattle populations. If commercial (often crossbred) animals could be used in the reference population for these genomic evaluations, this could allow for very large reference populations. In tropical beef systems, such animals often have no pedigree information. Here, we investigate potential models for such data, using marker heterozygosity (to model heterosis) and breed composition derived from genetic markers as covariates in the model. Models treated breed effects as either fixed or random, and included genomic best linear unbiased prediction (GBLUP) and BayesR. A tropically adapted beef cattle dataset of 29,391 purebred, crossbred and composite commercial animals was used to evaluate the models. RESULTS: Treating breed effects as random, in an approach analogous to genetic groups, allowed partitioning of the genetic variance into within-breed and across-breed components (even with a large number of breeds), and estimation of within-breed and across-breed genomic estimated breeding values (GEBV). We demonstrate that moderately accurate (0.30-0.43) GEBV can be calculated using these models. Treating breed effects as random gave more accurate GEBV than treating breed as fixed. A simple GBLUP model in which no breed effects were fitted gave the same accuracy (and correlations of GEBV very close to 1) as a model in which within-breed GEBV and (random) across-breed GEBV were both included. When GEBV were predicted for herds with no data in the reference population, BayesR resulted in the highest accuracy, with a 3% accuracy improvement averaged across traits, especially when the validation population was less related to the reference population. Estimates of heterosis from our models were in line with previous estimates from beef cattle. A method for estimating the number of effective breed comparisons for each breed combination, accumulated across contemporary groups, is presented. CONCLUSIONS: When no pedigree is available, breed composition and heterosis for inclusion in multi-breed genomic evaluation can be estimated from genotypes. When GEBV were predicted for herds with no data in the reference population, BayesR resulted in the highest accuracy.


Subject(s)
Genome , Polymorphism, Single Nucleotide , Animals , Cattle/genetics , Genomics/methods , Genotype , Phenotype , Models, Genetic
16.
Blood Adv ; 7(22): 7028-7044, 2023 11 28.
Article in English | MEDLINE | ID: mdl-37682779

ABSTRACT

We examined the association between risky health behaviors (smoking, heavy alcohol consumption, and lack of vigorous physical activity) and all-cause and cause-specific late mortality after blood or marrow transplantation (BMT), to understand the role played by potentially modifiable risk factors. Study participants were drawn from the BMT Survivor Study (BMTSS) and included patients who received transplantation between 1974 and 2014, had survived ≥2 years after BMT, and were aged ≥18 years at study entry. Survivors provided information on sociodemographic characteristics, chronic health conditions, and health behaviors. The National Death Index was used to determine survival and cause of death. Multivariable regression analyses determined the association between risky health behaviors and all-cause mortality (Cox regression) and nonrecurrence-related mortality (NRM; subdistribution hazard regression), after adjusting for relevant sociodemographic and clinical variables and therapeutic exposures. Overall, 3866 participants completed the BMTSS survey and were followed for a median of 5 years to death or 31 December 2021; 856 participants (22.1%) died after survey completion. Risky health behaviors were associated with an increased hazard of all-cause mortality (adjusted hazard ratio [aHR] former smoker, 1.2; aHR current smoker, 1.7; reference, nonsmoker; aHR heavy drinker, 1.4; reference, nonheavy drinker; and aHR no vigorous activity, 1.2; reference, vigorous activity) and NRM (aHR former smoker, 1.3; aHR current smoker, 1.6; reference, nonsmoker; aHR heavy drinker, 1.4; reference, nonheavy drinker; and aHR no vigorous activity, 1.2; reference, vigorous activity). The association between potentially modifiable risky health behaviors and late mortality offers opportunities for the development of interventions to improve both the quality and quantity of life after BMT.


Subject(s)
Bone Marrow , Health Risk Behaviors , Humans , Adolescent , Adult , Risk Factors
17.
Plant Genome ; 16(4): e20390, 2023 Dec.
Article in English | MEDLINE | ID: mdl-37728221

ABSTRACT

Sugarcane has a complex, highly polyploid genome with multi-species ancestry. Additive models for genomic prediction of clonal performance might not capture interactions between genes and alleles from different ploidies and ancestral species. As such, genomic prediction in sugarcane presents an interesting case for machine learning (ML) methods, which are purportedly able to deal with high levels of complexity in prediction. Here, we investigated deep learning (DL) neural networks, including multilayer perceptrons (MLP) and convolutional neural networks (CNN), and an ensemble machine learning approach, random forest (RF), for genomic prediction in sugarcane. The data set comprised 2912 sugarcane clones scored for 26,086 genome-wide single nucleotide polymorphism markers, with final assessment trial data for total cane harvested (TCH), commercial cane sugar (CCS), and fiber content (Fiber). The clones in the latest trial (2017) were used as a validation set. We compared the prediction accuracy of these methods to genomic best linear unbiased prediction (GBLUP) extended to include dominance and epistatic effects. The prediction accuracies from GBLUP models were up to 0.37 for TCH, 0.43 for CCS, and 0.48 for Fiber, while the optimized ML models had prediction accuracies of 0.35 for TCH, 0.38 for CCS, and 0.48 for Fiber. Both the RF and DL neural network models had predictive ability comparable to that of the additive GBLUP model but were less accurate than the extended GBLUP model.


Subject(s)
Saccharum , Saccharum/genetics , Plant Breeding , Genomics/methods , Machine Learning , Polyploidy
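
A toy version of the comparison made in this study is sketched below: ridge regression (closely related to SNP-BLUP/GBLUP) versus a random forest on simulated marker data, with prediction accuracy measured as the correlation between predicted and observed values. The split, hyperparameters, and library choices are assumptions; the study's forward validation instead held out the clones in the 2017 trial.

```python
# Toy comparison of an additive baseline (ridge regression) with a random
# forest on simulated marker data; not the authors' models or tuning.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(4)
n, m = 600, 2000                                    # clones, SNP markers (toy sizes)
X = rng.integers(0, 3, size=(n, m)).astype(float)   # genotypes coded 0/1/2
y = X[:, :50] @ rng.normal(size=50) + rng.normal(scale=5.0, size=n)  # e.g. TCH

Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.2, random_state=0)

models = {
    "ridge (additive)": Ridge(alpha=1000.0),        # heavy shrinkage across markers
    "random forest": RandomForestRegressor(n_estimators=100, n_jobs=-1, random_state=0),
}
for name, model in models.items():
    model.fit(Xtr, ytr)
    acc = np.corrcoef(model.predict(Xte), yte)[0, 1]
    print(f"{name}: prediction accuracy = {acc:.2f}")
```
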
18.
JACC CardioOncol ; 5(4): 504-517, 2023 Aug.
Article in English | MEDLINE | ID: mdl-37614590

ABSTRACT

Background: The long-term risk of coronary heart disease (CHD) and clinical models that predict this risk remain understudied in blood or marrow transplantation (BMT) recipients. Objectives: This study sought to examine the risk of CHD after BMT and identify the associated risk factors. Methods: Participants included patients transplanted between 1974 and 2014 at City of Hope, the University of Minnesota, or the University of Alabama at Birmingham who had survived ≥2 years after BMT. Multivariable logistic regression models assessed CHD risk in BMT survivors compared with a sibling cohort. A self-reported questionnaire and medical records provided information regarding sociodemographics, comorbidities, and therapeutic exposures, which were used to develop a CHD risk prediction nomogram. Results: Overall, 6,677 BMT recipients participated; the mean age at BMT was 43.9 ± 17.7 years, 58.3% were male, and 73.3% were non-Hispanic White. The median length of follow-up was 6.9 years (range: 2-46.2 years) from BMT. CHD was reported in 249 participants, with a 20-year cumulative incidence of 5.45% ± 0.39%. BMT survivors had 1.6-fold greater odds of CHD compared with the sibling cohort (95% CI: 1.09-2.40). A nomogram was then developed to predict the risk of CHD at 10 and 20 years after BMT, incorporating age at BMT (HR: 1.06 per year; 95% CI: 1.04-1.08), male sex (HR: 1.89; 95% CI: 1.15-3.11), a history of smoking (HR: 1.61; 95% CI: 1.01-2.58), diabetes (HR: 2.45; 95% CI: 1.23-4.89), hypertension (HR: 2.02; 95% CI: 1.15-3.54), arrhythmia (HR: 1.90; 95% CI: 0.89-4.06), and pre-BMT chest radiation (yes vs no: HR: 2.83; 95% CI: 1.20-6.67; unknown vs no: HR: 0.88; 95% CI: 0.34-2.28). The C-statistic was 0.77 in the test set (95% CI: 0.70-0.83). Conclusions: This study identified BMT recipients at high risk for CHD, informing targeted screening for early detection and aggressive control of risk factors.

19.
BMC Cancer ; 23(1): 390, 2023 May 01.
Article in English | MEDLINE | ID: mdl-37127595

ABSTRACT

BACKGROUND: Patients undergoing hematopoietic cell transplantation (HCT) are at high risk of chronic health complications, including frailty and physical dysfunction. Conventional exercise programs have been shown to improve frailty in other cancer populations, but these have largely been based out of rehabilitation facilities that may act as geographic and logistical barriers. There is a paucity of information on the feasibility of implementing telehealth exercise interventions in long-term HCT survivors. METHODS: We conducted a pilot randomized trial to assess the feasibility of an 8-week telehealth exercise intervention in 20 pre-frail or frail HCT survivors. Participants were randomized to either telehealth exercise (N = 10) or a delayed control (N = 10). We administered a remote physical function assessment at baseline, followed by the 8-week telehealth exercise intervention (30-60 min/session, 3 sessions/week), and again post-intervention. The primary endpoint was feasibility, determined by 1) >70% of participants completing all remote physical function assessments, and 2) >70% of participants in the exercise group completing >70% (17/24) of the prescribed exercise sessions. Exploratory outcomes included changes in gait speed, handgrip strength, and the short physical performance battery. RESULTS: The mean [standard deviation] age at study enrollment was 64.7 [9.1] years. Twelve participants had undergone allogeneic HCT and 8 autologous HCT, at an average of 17 years before study enrollment. Both feasibility criteria were achieved. Nineteen patients (95%) completed all remote study outcome assessments at baseline and post-intervention, and nine participants in the exercise group completed >70% of prescribed exercise sessions. Overall, no significant group × time interaction was observed at P < 0.05 for handgrip strength, fatigue, body mass index, or the short physical performance battery test. However, there were significant within-group improvements in four-meter gait speed (+13.9%; P = 0.004) and 5-minute gait speed (+25.4%; P = 0.04) in the exercise group, whereas non-significant changes in four-meter gait speed (-3.8%) and 5-minute gait speed (-5.8%) were observed in the delayed control group after 8 weeks. CONCLUSION: Implementing an 8-week telehealth exercise intervention for long-term HCT survivors was feasible. Our findings set the stage for innovative delivery of supervised exercise interventions that reduce the burden of frailty in HCT survivors as well as other at-risk cancer survivors. TRIAL REGISTRATION: The protocol and informed consent were approved by the institutional IRB (IRB#20731) and registered (ClinicalTrials.gov NCT04968119; date of registration: 20/07/2021).


Subject(s)
Frailty , Hematopoietic Stem Cell Transplantation , Telemedicine , Humans , Aged , Child , Frail Elderly , Hand Strength , Feasibility Studies , Pilot Projects , Exercise Therapy/methods , Survivors
20.
J Exp Psychol Gen ; 152(10): 2882-2896, 2023 Oct.
Article in English | MEDLINE | ID: mdl-37155284

ABSTRACT

[Correction Notice: An Erratum for this article was reported online in Journal of Experimental Psychology: General on Aug 10 2023 (see record 2023-96713-001). In the original article, there were affiliation errors for the first and 14th authors. The affiliations for Dorsa Amir are Department of Psychology, University of California, Berkeley; and Department of Psychology, Boston College. The affiliation for Katherine McAuliffe is Department of Psychology, Boston College. All versions of this article have been corrected.] Inequity aversion is an important factor in fairness behavior. Previous work suggests that children show more cross-cultural variation in their willingness to reject allocations that would give them more rewards than their partner (advantageous inequity), as opposed to allocations that would give them less than their partner (disadvantageous inequity). However, as past work has relied solely on children's decisions to accept or reject these offers, the algorithms underlying this pattern of variation remain unclear. Here, we explore the computational signatures of inequity aversion by applying a computational model of decision-making to data from children (N = 807) who played the Inequity Game across seven societies. Specifically, we used drift-diffusion models to formally distinguish evaluative processing (i.e., the computation of the subjective value of accepting or rejecting inequity) from alternative factors such as decision speed and response strategies. Our results suggest that variation in the development of inequity aversion across societies is best accounted for by variation in the drift rate, that is, the direction and strength of the evaluative preference. Our findings underscore the utility of looking beyond decision data to better understand behavioral diversity. (PsycInfo Database Record (c) 2023 APA, all rights reserved).


Subject(s)
Cooperative Behavior , Social Behavior , Humans , Child , Choice Behavior , Child Behavior/psychology , Universities
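
For readers unfamiliar with drift-diffusion models, the sketch below simulates the basic mechanism fitted in the study above: a noisy decision variable drifts toward an "accept" or "reject" boundary, and the drift rate captures the direction and strength of the evaluative preference. The parameters are illustrative and are not estimates from the children's data.

```python
# Minimal drift-diffusion simulation: evidence accumulates with a given drift
# rate until it reaches the accept (upper) or reject (lower) boundary.
# Illustrative parameters only, not fitted values from the study.
import numpy as np

def simulate_ddm(drift, boundary=1.0, noise=1.0, dt=0.002, max_t=5.0, rng=None):
    """Return (choice, reaction_time); timeouts are counted as rejections."""
    if rng is None:
        rng = np.random.default_rng()
    x, t = 0.0, 0.0
    while abs(x) < boundary and t < max_t:
        x += drift * dt + noise * np.sqrt(dt) * rng.normal()
        t += dt
    return ("accept" if x >= boundary else "reject"), t

rng = np.random.default_rng(5)
# Positive drift = preference for accepting; a child averse to advantageous
# inequity would show a weaker (or negative) drift on advantageous trials.
for drift in (1.5, 0.0, -1.5):
    sims = [simulate_ddm(drift, rng=rng) for _ in range(500)]
    p_accept = np.mean([choice == "accept" for choice, _ in sims])
    mean_rt = np.mean([rt for _, rt in sims])
    print(f"drift {drift:+.1f}: P(accept) = {p_accept:.2f}, mean RT = {mean_rt:.2f}s")
```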