Results 1 - 20 of 348
2.
BJOG ; 2024 Apr 01.
Article in English | MEDLINE | ID: mdl-38561325
4.
Cancer Res ; 84(12): 1996-2008, 2024 Jun 14.
Article in English | MEDLINE | ID: mdl-38635885

ABSTRACT

Metabolic subtypes of glioblastoma (GBM) have different prognoses and responses to treatment. Deuterium metabolic imaging with 2H-labeled substrates is a potential approach to stratify patients into metabolic subtypes for targeted treatment. In this study, we used 2H magnetic resonance spectroscopy and magnetic resonance spectroscopic imaging (MRSI) measurements of [6,6'-2H2]glucose metabolism to identify metabolic subtypes and their responses to chemoradiotherapy in patient-derived GBM xenografts in vivo. The metabolism of patient-derived cells was first characterized in vitro by measuring the oxygen consumption rate, a marker of mitochondrial tricarboxylic acid cycle activity, as well as the extracellular acidification rate and 2H-labeled lactate production from [6,6'-2H2]glucose, which are markers of glycolytic activity. Two cell lines representative of a glycolytic subtype and two representative of a mitochondrial subtype were identified. 2H magnetic resonance spectroscopy and MRSI measurements showed similar concentrations of 2H-labeled glucose from [6,6'-2H2]glucose in all four tumor models when implanted orthotopically in mice. The glycolytic subtypes showed higher concentrations of 2H-labeled lactate than the mitochondrial subtypes and normal-appearing brain tissue, whereas the mitochondrial subtypes showed more glutamate/glutamine labeling, a surrogate for tricarboxylic acid cycle activity, than the glycolytic subtypes and normal-appearing brain tissue. The response of the tumors to chemoradiation could be detected within 24 hours of treatment completion, with the mitochondrial subtypes showing a decrease in both 2H-labeled glutamate/glutamine and lactate concentrations and glycolytic tumors showing a decrease in 2H-labeled lactate concentration. This technique has the potential to be used clinically for treatment selection and early detection of treatment response. SIGNIFICANCE: Deuterium magnetic resonance spectroscopic imaging of glucose metabolism has the potential to differentiate between glycolytic and mitochondrial metabolic subtypes in glioblastoma and to evaluate early treatment responses, which could guide patient treatment.


Subject(s)
Brain Neoplasms , Chemoradiotherapy , Deuterium , Glioblastoma , Glucose , Glioblastoma/metabolism , Glioblastoma/diagnostic imaging , Glioblastoma/therapy , Glioblastoma/pathology , Glioblastoma/drug therapy , Humans , Animals , Mice , Brain Neoplasms/metabolism , Brain Neoplasms/diagnostic imaging , Brain Neoplasms/therapy , Brain Neoplasms/drug therapy , Brain Neoplasms/pathology , Glucose/metabolism , Chemoradiotherapy/methods , Cell Line, Tumor , Glycolysis , Xenograft Model Antitumor Assays , Mitochondria/metabolism , Magnetic Resonance Spectroscopy/methods , Magnetic Resonance Imaging/methods , Female
5.
BJOG ; 2024 Feb 01.
Article in English | MEDLINE | ID: mdl-38302677

ABSTRACT

OBJECTIVE: To investigate the validity of the conclusion from Cochrane reviews and meta-analyses that treatment with calcium supplementation during pregnancy reduces the risk for pre-eclampsia by 55%, which has been influential in international guidelines and future research. DESIGN: Sensitivity analysis of data from Cochrane reviews of trials evaluating high-dose calcium supplementation (of at least 1 g/day) for reduction of pre-eclampsia risk. SETTING: Systematic review and meta-analysis. POPULATION: The Cochrane reviews and meta-analyses included 13 trials enrolling a total of 15 730 women. Random-effects meta-analysis of these studies resulted in a mean risk ratio (RR, calcium/placebo) of 0.45 (95% confidence interval [CI] 0.31-0.65; p < 0.0001). METHODS: We carried out a sensitivity analysis of evidence from the relevant Cochrane review to examine the impact of study size. MAIN OUTCOME MEASURES: Pre-eclampsia. RESULTS: In the three largest studies, accounting for 13 815 (88%) of total recruitment, mean RR was 0.92 (95% CI 0.80-1.06) and there was no evidence of heterogeneity between studies (I2 = 0). With inclusion of the smaller studies, mean RR decreased to 0.45 and I2 increased to 70%. CONCLUSIONS: In assessment of the effect of calcium supplementation on pre-eclampsia risk, the naive focus on the mean of the random-effects meta-analysis in the presence of substantial heterogeneity is highly misleading.
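
The sensitivity analysis turns on how the DerSimonian-Laird random-effects estimator behaves when small, heterogeneous trials are pooled with large near-null ones. The sketch below (Python, using hypothetical effect sizes rather than the Cochrane trial data) shows how the pooled risk ratio and I2 can be recomputed with and without the smaller studies.

```python
# A minimal sketch (not the authors' code) of a DerSimonian-Laird random-effects
# meta-analysis on the log risk-ratio scale. Study inputs are hypothetical
# placeholders, not the Cochrane trial data.
import numpy as np

def random_effects_rr(log_rr, se):
    """Pooled RR, 95% CI, and I^2 via the DerSimonian-Laird estimator."""
    log_rr, se = np.asarray(log_rr, float), np.asarray(se, float)
    w = 1.0 / se**2                                    # fixed-effect weights
    mu_fe = np.sum(w * log_rr) / np.sum(w)
    q = np.sum(w * (log_rr - mu_fe) ** 2)              # Cochran's Q
    k = len(log_rr)
    tau2 = max(0.0, (q - (k - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
    w_re = 1.0 / (se**2 + tau2)                        # random-effects weights
    mu = np.sum(w_re * log_rr) / np.sum(w_re)
    se_mu = np.sqrt(1.0 / np.sum(w_re))
    i2 = max(0.0, (q - (k - 1)) / q) * 100 if q > 0 else 0.0
    return np.exp(mu), np.exp(mu - 1.96 * se_mu), np.exp(mu + 1.96 * se_mu), i2

# Hypothetical log-RRs and standard errors: three large near-null trials
# followed by several small trials with a large apparent benefit.
log_rr = np.log([0.95, 0.90, 0.93, 0.25, 0.30, 0.20, 0.35])
se     = np.array([0.07, 0.08, 0.09, 0.45, 0.50, 0.55, 0.40])

print("all trials       :", random_effects_rr(log_rr, se))
print("large trials only:", random_effects_rr(log_rr[:3], se[:3]))
```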

6.
Hypertension ; 81(2): 311-318, 2024 Feb.
Article in English | MEDLINE | ID: mdl-38232144

ABSTRACT

BACKGROUND: Cardiovascular disease is the leading cause of mortality in women. Pregnancy is an ideal period to implement cardiovascular prevention strategies as women seek medical help. We aimed to develop a predictive model to identify women at increased risk for chronic hypertension (CH) based on information collected in the index pregnancy. METHODS: Cohort of 26 511 women seen in 2 consecutive pregnancies. Included were women without CH, with information on maternal characteristics and blood pressure at 11 to 13 weeks' gestation, and the development of preeclampsia or gestational hypertension (GH) in the index pregnancy. Logistic regression models were fitted for the prediction of the development of future CH by the 20th week of the subsequent pregnancy. The performance of screening and risk calibration of the model were assessed. RESULTS: In this study, 1560 (5.9%) women developed preeclampsia or GH in the index pregnancy, and 215 (0.8%) developed future CH, at a median of 3.0 years later. Predictors of development of future CH were maternal age, weight, and blood pressure; Black and South Asian ethnicity; family history of preeclampsia; parity; and development of preeclampsia or GH. Preeclampsia or GH detected 52.1% (45.2%-58.9%) of future CH. At a screen-positive rate of 10%, a model including maternal characteristics, early pregnancy blood pressure, and development of preeclampsia or GH detected 73.5% (67.1%-79.3%) of future CH. CONCLUSIONS: Early pregnancy maternal characteristics, blood pressure, and development of preeclampsia or GH identify three-fourths of women at risk for future CH. Our results offer an important preventative strategy for identifying women at increased risk of future CH, which is applicable worldwide.
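
As a rough illustration of the screening analysis described above, the sketch below fits a logistic regression on synthetic data and reads off the detection rate at a fixed 10% screen-positive rate; the features, coefficients, and cohort are placeholders, not the study's variables or data.

```python
# A minimal sketch, on synthetic data, of logistic-regression screening with the
# detection rate reported at a fixed 10% screen-positive rate. All inputs are
# illustrative placeholders, and evaluation is in-sample for brevity.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 20_000
X = np.column_stack([
    rng.normal(30, 5, n),       # maternal age (years), hypothetical
    rng.normal(70, 12, n),      # weight (kg), hypothetical
    rng.normal(115, 12, n),     # blood pressure at 11-13 weeks (mmHg), hypothetical
    rng.binomial(1, 0.06, n),   # preeclampsia/GH in the index pregnancy, hypothetical
])
# Synthetic outcome: future chronic hypertension driven by arbitrary coefficients.
logit = -9.0 + 0.02 * X[:, 0] + 0.01 * X[:, 1] + 0.03 * X[:, 2] + 1.5 * X[:, 3]
y = rng.binomial(1, 1 / (1 + np.exp(-logit)))

model = LogisticRegression(max_iter=1000).fit(X, y)
risk = model.predict_proba(X)[:, 1]

threshold = np.quantile(risk, 0.90)          # top 10% classified screen positive
detection_rate = (risk[y == 1] >= threshold).mean()
print(f"detection rate at 10% screen-positive rate: {detection_rate:.1%}")
```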


Subject(s)
Cardiovascular Diseases , Hypertension, Pregnancy-Induced , Pre-Eclampsia , Pregnancy , Female , Humans , Male , Pre-Eclampsia/diagnosis , Pre-Eclampsia/epidemiology , Pre-Eclampsia/prevention & control , Hypertension, Pregnancy-Induced/diagnosis , Hypertension, Pregnancy-Induced/epidemiology , Blood Pressure , Maternal Age , Cardiovascular Diseases/complications , Risk Factors
7.
BJOG ; 131(4): 483-492, 2024 Mar.
Article in English | MEDLINE | ID: mdl-37749709

ABSTRACT

OBJECTIVE: To report the predictive performance for preterm birth (PTB) of the Fetal Medicine Foundation (FMF) triple test and National Institute for Health and Care Excellence (NICE) guidelines used to screen for pre-eclampsia and examine the impact of aspirin in the prevention of PTB. DESIGN: Secondary analysis of data from the SPREE study and the ASPRE trial. SETTING: Multicentre studies. POPULATION: In SPREE, women with singleton pregnancies had screening for preterm pre-eclampsia at 11-13 weeks of gestation by the FMF method and NICE guidelines. There were 16 451 pregnancies that resulted in delivery at ≥24 weeks of gestation and these data were used to derive the predictive performance for PTB of the two methods of screening. The results from the ASPRE trial were used to examine the effect of aspirin in the prevention of PTB in the population from SPREE. METHODS: Comparison of performance of FMF method and NICE guidelines for pre-eclampsia in the prediction of PTB and use of aspirin in prevention of PTB. MAIN OUTCOME MEASURES: Spontaneous PTB (sPTB), iatrogenic PTB for pre-eclampsia (iPTB-PE) and iatrogenic PTB for reasons other than pre-eclampsia (iPTB-noPE). RESULTS: Estimated incidence rates of sPTB, iPTB-PE and iPTB-noPE were 3.4%, 0.8% and 1.6%, respectively. The corresponding detection rates were 17%, 82% and 25% for the triple test and 12%, 39% and 19% for NICE guidelines, using the same overall screen positive rate of 10.2%. The estimated proportions prevented by aspirin were 14%, 65% and 0%, respectively. CONCLUSION: Prediction of sPTB and iPTB-noPE by the triple test was poor, and poorer still with the NICE guidelines. Neither sPTB nor iPTB-noPE was reduced substantially by aspirin.


Subject(s)
Pre-Eclampsia , Premature Birth , Female , Humans , Infant, Newborn , Pregnancy , Aspirin/therapeutic use , Biomarkers , Iatrogenic Disease , Placenta Growth Factor , Pre-Eclampsia/diagnosis , Pre-Eclampsia/prevention & control , Pre-Eclampsia/epidemiology , Pregnancy Trimester, First , Premature Birth/epidemiology , Uterine Artery , Clinical Trials as Topic
8.
Am J Obstet Gynecol ; 230(4): 448.e1-448.e15, 2024 Apr.
Article in English | MEDLINE | ID: mdl-37778678

ABSTRACT

BACKGROUND: Epidemiological studies have shown that women with preeclampsia (PE) are at increased long-term cardiovascular risk. This risk might be associated with an accelerated vascular ageing process but data on vascular abnormalities in women with PE are scarce. OBJECTIVE: This study aimed to identify the most discriminatory maternal vascular index in the prediction of PE at 35 to 37 weeks' gestation and to examine the performance of screening for PE by combinations of maternal risk factors and biophysical and biochemical markers at 35 to 37 weeks' gestation. STUDY DESIGN: This was a prospective observational nonintervention study in women attending a routine hospital visit at 35 0/7 to 36 6/7 weeks' gestation. The visit included recording of maternal demographic characteristics and medical history, vascular indices, and hemodynamic parameters obtained by a noninvasive operator-independent device (pulse wave velocity, augmentation index, cardiac output, stroke volume, central systolic and diastolic blood pressures, total peripheral resistance, and fetal heart rate), mean arterial pressure, uterine artery pulsatility index, and serum concentration of placental growth factor and soluble fms-like tyrosine kinase-1. The performance of screening for delivery with PE at any time and at <3 weeks from assessment using a combination of maternal risk factors and various combinations of biomarkers was determined. RESULTS: The study population consisted of 6746 women with singleton pregnancies, including 176 women (2.6%) who subsequently developed PE. There were 3 main findings. First, in women who developed PE, compared with those who did not, there were higher central systolic and diastolic blood pressures, pulse wave velocity, peripheral vascular resistance, and augmentation index. Second, the most discriminatory indices were systolic and diastolic blood pressures and pulse wave velocity, with poor prediction from the other indices. However, the performance of screening by a combination of maternal risk factors plus mean arterial pressure was at least as high as that of a combination of maternal risk factors plus central systolic and diastolic blood pressures; consequently, in screening for PE, pulse wave velocity, mean arterial pressure, uterine artery pulsatility index, placental growth factor, and soluble fms-like tyrosine kinase-1 were used. Third, in screening for both PE within 3 weeks and PE at any time from assessment, the detection rate at a false-positive rate of 10% of a biophysical test consisting of maternal risk factors plus mean arterial pressure, uterine artery pulsatility index, and pulse wave velocity (PE within 3 weeks: 85.2%; 95% confidence interval, 75.6%-92.1%; PE at any time: 69.9%; 95% confidence interval, 62.5%-76.6%) was not significantly different from a biochemical test using the competing risks model to combine maternal risk factors with placental growth factor and soluble fms-like tyrosine kinase-1 (PE within 3 weeks: 80.2%; 95% confidence interval, 69.9%-88.3%; PE at any time: 64.2%; 95% confidence interval, 56.6%-71.3%), and they were both superior to screening by low placental growth factor concentration (PE within 3 weeks: 53.1%; 95% confidence interval, 41.7%-64.3%; PE at any time: 44.3%; 95% confidence interval, 36.8%-52.0%) or high soluble fms-like tyrosine kinase-1-to-placental growth factor concentration ratio (PE within 3 weeks: 65.4%; 95% confidence interval, 54.0%-75.7%; PE at any time: 53.4%; 95% confidence interval, 45.8%-60.9%).
CONCLUSION: First, increased maternal arterial stiffness preceded the clinical onset of PE. Second, maternal pulse wave velocity at 35 to 37 weeks' gestation in combination with mean arterial pressure and uterine artery pulsatility index provided effective prediction of subsequent development of preeclampsia.
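
A detection rate quoted at a fixed 10% false-positive rate can be computed by setting the risk cut-off on the unaffected pregnancies and then applying it to the women who developed PE. The sketch below illustrates this with placeholder risk scores; it is not the study's competing-risks model or data.

```python
# A minimal sketch of reading a detection rate at a fixed 10% false-positive rate
# off a set of patient-specific risks. `risk` and `developed_pe` are placeholders
# for model outputs, not study data.
import numpy as np

def detection_rate_at_fpr(risk, affected, fpr=0.10):
    """Detection rate when the risk cut-off fixes the false-positive rate."""
    risk, affected = np.asarray(risk, float), np.asarray(affected, bool)
    threshold = np.quantile(risk[~affected], 1.0 - fpr)   # top 10% of unaffected
    return (risk[affected] >= threshold).mean()

rng = np.random.default_rng(1)
developed_pe = rng.random(6746) < 0.026                   # ~2.6% prevalence, as in the cohort
risk = rng.beta(1, 40, size=developed_pe.size) + 0.05 * developed_pe  # toy risk scores
print(f"DR at 10% FPR: {detection_rate_at_fpr(risk, developed_pe):.1%}")
```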


Subject(s)
Pre-Eclampsia , Pregnancy , Female , Humans , Pre-Eclampsia/diagnosis , Pre-Eclampsia/epidemiology , Placenta Growth Factor , Vascular Endothelial Growth Factor Receptor-1 , Pulse Wave Analysis , Risk Assessment , Biomarkers , Uterine Artery/diagnostic imaging , Uterine Artery/physiology , Pulsatile Flow , Gestational Age
9.
Redox Biochem Chem ; 5-6: None, 2023 Dec.
Article in English | MEDLINE | ID: mdl-38046619

ABSTRACT

Retinitis pigmentosa (RP) is a disease characterised by photoreceptor cell death. It can be initiated by mutations in a number of different genes, primarily affecting rods, which die first, resulting in loss of night vision. The secondary death of cones then leads to loss of visual acuity and blindness. We set out to investigate whether increased mitochondrial reactive oxygen species (ROS) formation plays a role in this sequential photoreceptor degeneration. To do this we measured mitochondrial H2O2 production within mouse eyes in vivo using the mass spectrometric probe MitoB. We found higher levels of mitochondrial ROS that preceded photoreceptor loss in four mouse models of RP: Pde6brd1/rd1; Prhp2rds/rds; RPGR-/-; Cln6nclf. In contrast, there was no increase in mitochondrial ROS in loss-of-function models of vision loss (GNAT-/-, OGC), or where vision loss was not due to photoreceptor death (Cln3). Upregulation of Nrf2 transcriptional activity with dimethylfumarate (DMF) lowered mitochondrial ROS in RPGR-/- mice. These findings have important implications for the mechanism and treatment of RP.

10.
Invest Ophthalmol Vis Sci ; 64(15): 33, 2023 Dec 01.
Article in English | MEDLINE | ID: mdl-38133503

ABSTRACT

Purpose: Genome editing is an emerging group of technologies with the potential to ameliorate dominant, monogenic human diseases such as late-onset retinal degeneration (L-ORD). The goal of this study was to identify disease stages and retinal locations optimal for evaluating the efficacy of a future genome editing trial. Methods: Twenty-five L-ORD patients (age range, 33-77 years; median age, 59 years) harboring the founder variant S163R in C1QTNF5 were enrolled from three centers in the United Kingdom and United States. Patients were examined with widefield optical coherence tomography (OCT) and chromatic perimetry under dark-adapted and light-adapted conditions to derive phenomaps of retinal disease. Results were analyzed with a model of a shared natural history of a single delayed exponential across all subjects and all retinal locations. Results: Critical age for the initiation of photoreceptor loss ranged from 48 years at the temporal paramacular retina to 74 years at the inferior midperipheral retina. Subretinal deposits (sRET-Ds) became more prevalent as critical age was approached. Subretinal pigment epithelial deposits (sRPE-Ds) were detectable in the youngest patients showing no other structural or functional abnormalities at the retina. The sRPE-D thickness continuously increased, reaching 25 µm in the extrafoveal retina and 19 µm in the fovea at critical age. Loss of light sensitivity preceded shortening of outer segments and loss of photoreceptors by more than a decade. Conclusions: Retinal regions providing an ideal treatment window exist across all severity stages of L-ORD.


Subject(s)
Genetic Therapy , Retinal Degeneration , Humans , Adult , Middle Aged , Aged , Late Onset Disorders/genetics , Late Onset Disorders/pathology , Late Onset Disorders/therapy , Retinal Degeneration/genetics , Retinal Degeneration/pathology , Retinal Degeneration/therapy , Collagen/genetics , Male , Female , Fovea Centralis/pathology , Tomography, Optical Coherence , Genetic Therapy/methods , Gene Editing
11.
Environ Sci Pollut Res Int ; 30(59): 123427-123438, 2023 Dec.
Article in English | MEDLINE | ID: mdl-37982950

ABSTRACT

Water diversion projects have proven to be effective interventions to improve water quality in irrigation ditches. This study focused on quantifying the water quality improvement by utilizing a hydrodynamic water quality model in Funing County, Yancheng City. The model performed a spatial analysis of pollution concentrations across the study area. Various optimization scenarios were designed based on the diversion project and hydrological structure connectivity. The model was used to simulate changes in nutrient concentrations under different scenarios. The findings of this study were as follows: (1) Rural areas had lower nutrient concentrations and better hydrological connectivity than urban areas. (2) The effect of water quality improvement correlated positively with increased flow rates introduced by the diversion project. Specifically, when the flow rate increased by 50%, the average reductions were 20% for NH4+, 5.2% for TN, and 5.1% for TP. Furthermore, introduced clean water led to more pronounced improvements in the overall regional water quality. (3) Although increasing the number of ditches reduced pollutant concentrations, the impact was not significant. (4) Model simulation results showed that 18 to 45% water diversion intensity effectively improved water quality, and the optimal water diversion intensity was 27 to 30%. The optimal water diversion intensities offered valuable insights for managing this region. The study's methods contributed to the promotion of sustainable development in regional water resources and the integrated management of the water environment.


Subject(s)
Quality Improvement , Water Quality , Water Pollution/prevention & control , Water Pollution/analysis , Computer Simulation , Water Resources , China , Environmental Monitoring/methods
12.
NanoImpact ; 31: 100480, 2023 Jul.
Article in English | MEDLINE | ID: mdl-37625671

ABSTRACT

A significant bottleneck of current agricultural systems remains the very low agronomic efficiency of conventional agrochemicals, particularly in sandy soils. Carbon nanomaterials (CNMs) have been proposed to address this inefficiency in sandy soils, which could potentially improve soil fertility and enhance crop growth and physiological processes. However, the effects of different rates of CNMs on crop physiological and soil biochemical quality in sandy soils must be compared to other carbon sources (e.g., biochar) before CNMs can be broadly used. To address this, a 70-day pot experiment was set up, growing lettuce under ten treatments: a negative control with no CNMs, biochar or fertilizer; a fertilizer-only control; three CNMs-only unfertilized treatments (CNMs at 200, 400 and 800 mg kg-1 soil); two biochar treatments with fertilizer (biochar at 0.5% and 1% by soil mass + fertilizer); and three CNMs treatments with fertilizer (CNMs at 200, 400 and 800 mg kg-1 soil + fertilizer). A novel amorphous, water-dispersible, carboxyl-functionalized CNM with a pH of 5.5, a zeta potential of -40.6 mV, and a primary particle diameter of 30-60 nm was used for this experiment. Compared to the fertilizer-only control, CNMs applied at low to medium levels (200-400 mg kg-1) significantly increased lettuce shoot biomass (20-21%), total chlorophyll (23-27%), and fluorescence and photosynthetic activities (4-10%), which was associated with greater soil nutrient availability (N: 24-58%, K: 68-111%) and higher leaf tissue accumulation (N: 25-27%; K: 66%). Low to medium levels of CNMs also significantly increased soil biochemical properties, such as higher soil microbial biomass carbon (27-29%) and urease enzyme activity (34-44%) relative to fertilizer-only applications. In contrast, biochar (0.5%) increased lettuce biomass relative to fertilizer-only but had no significant effect on soil fertility and biological properties. These results suggest that CNMs at low to medium application rates are a superior carbon-based amendment relative to biochar in sandy soils.


Subject(s)
Carbon , Nanostructures , Soil , Sand , Lactuca , Fertilizers
13.
Proc Natl Acad Sci U S A ; 120(28): e2220111120, 2023 Jul 11.
Article in English | MEDLINE | ID: mdl-37399381

ABSTRACT

The seasonal availability of light and micronutrients strongly regulates productivity in the Southern Ocean, restricting biological utilization of macronutrients and CO2 drawdown. Mineral dust flux is a key conduit for micronutrients to the Southern Ocean and a critical mediator of multimillennial-scale atmospheric CO2 oscillations. While the role of dust-borne iron (Fe) in Southern Ocean biogeochemistry has been examined in detail, manganese (Mn) availability is also emerging as a potential driver of past, present, and future Southern Ocean biogeochemistry. Here, we present results from fifteen bioassay experiments along a north-south transect in the undersampled eastern Pacific sub-Antarctic zone. In addition to widespread Fe limitation of phytoplankton photochemical efficiency, we found further responses following the addition of Mn at our southerly stations, supporting the importance of Fe-Mn co-limitation in the Southern Ocean. Moreover, addition of different Patagonian dusts resulted in enhanced photochemical efficiency with differential responses linked to source region dust characteristics in terms of relative Fe/Mn solubility. Changes in the relative magnitude of dust deposition, combined with source region mineralogy, could hence determine whether Fe or Mn limitation control Southern Ocean productivity under future as well as past climate states.

14.
Am J Obstet Gynecol ; 229(5): 555.e1-555.e14, 2023 11.
Article in English | MEDLINE | ID: mdl-37263399

ABSTRACT

BACKGROUND: Triplet pregnancies are high risk for both the mother and the infants. The risks for infants include premature birth, low birthweight, and neonatal complications. Therefore, the management of triplet pregnancies involves close monitoring and may include interventions, such as fetal reduction, to prolong the pregnancy and improve outcomes. However, the evidence of benefits and risks associated with fetal reduction is inconsistent. OBJECTIVE: This study aimed to compare the outcomes of trichorionic triplet pregnancies with and without fetal reduction and with nonreduced dichorionic twin pregnancies and primary singleton pregnancies. STUDY DESIGN: All trichorionic triplet pregnancies in Denmark, including those with fetal reduction, were identified between 2008 and 2018. In Denmark, all couples expecting triplets are informed about and offered fetal reduction. Pregnancies with viable fetuses at the first-trimester ultrasound scan and pregnancies not terminated were included. Adverse pregnancy outcome was defined as a composite of miscarriage before 24 weeks of gestation, stillbirth at or after 24 weeks of gestation, or intrauterine fetal death of 1 or 2 fetuses. RESULTS: The study cohort was composed of 317 trichorionic triplet pregnancies, of which 70.0% were reduced to twin pregnancies, 2.2% were reduced to singleton pregnancies, and 27.8% were not reduced. Nonreduced triplet pregnancies had a high risk of adverse pregnancy outcomes (28.4%), which was significantly lower in triplets reduced to twins (9.0%; difference, 19.4%; 95% confidence interval, 8.5%-30.3%). Severe preterm deliveries were significantly more frequent in nonreduced triplet pregnancies (27.9%) than in triplet pregnancies reduced to twin pregnancies (13.1%; difference, 14.9%; 95% confidence interval, 7.9%-21.9%). However, triplet pregnancies reduced to twin pregnancies had an insignificantly higher risk of miscarriage (6.8%) than nonreduced twin pregnancies (1.1%; difference, 5.6%; 95% confidence interval, 0.9%-10.4%). CONCLUSION: Triplet pregnancies reduced to twin pregnancies had significantly lower risks of adverse pregnancy outcomes, severe preterm deliveries, and low birthweight than nonreduced triplet pregnancies. However, triplet pregnancies reduced to twin pregnancies were potentially associated with a 5.6% increased risk of miscarriage.
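
The group comparisons above are risk differences with 95% confidence intervals. The sketch below shows one standard way to compute such an interval (Wald method) using counts back-calculated approximately from the percentages reported above; the published analysis may have used different exact counts or a different interval method.

```python
# A minimal sketch of a risk difference between two groups with a Wald 95% CI.
# Counts are approximate back-calculations from the abstract's percentages,
# not the exact Danish cohort data.
import math

def risk_difference(events_a, n_a, events_b, n_b, z=1.96):
    """Risk difference (group A minus group B) with a Wald confidence interval."""
    p_a, p_b = events_a / n_a, events_b / n_b
    diff = p_a - p_b
    se = math.sqrt(p_a * (1 - p_a) / n_a + p_b * (1 - p_b) / n_b)
    return diff, diff - z * se, diff + z * se

# e.g. ~25 adverse outcomes in ~88 nonreduced triplet pregnancies
# vs ~20 in ~222 pregnancies reduced to twins
print(risk_difference(25, 88, 20, 222))
```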


Subject(s)
Abortion, Spontaneous , Pregnancy Reduction, Multifetal , Infant, Newborn , Female , Pregnancy , Humans , Pregnancy Reduction, Multifetal/adverse effects , Abortion, Spontaneous/epidemiology , Abortion, Spontaneous/etiology , Cohort Studies , Birth Weight , Pregnancy Outcome , Pregnancy, Twin , Stillbirth/epidemiology , Risk Assessment , Denmark/epidemiology , Retrospective Studies , Gestational Age , Triplets
15.
Prog Nucl Magn Reson Spectrosc ; 134-135: 39-51, 2023.
Article in English | MEDLINE | ID: mdl-37321757

ABSTRACT

Deuterium metabolic imaging (DMI) is an emerging clinically applicable technique for the non-invasive investigation of tissue metabolism. The generally short T1 values of 2H-labeled metabolites in vivo can compensate for the relatively low sensitivity of detection by allowing rapid signal acquisition in the absence of significant signal saturation. Studies with deuterated substrates, including [6,6'-2H2]glucose, [2H3]acetate, [2H9]choline and [2,3-2H2]fumarate, have demonstrated the considerable potential of DMI for imaging tissue metabolism and cell death in vivo. The technique is evaluated here in comparison with established metabolic imaging techniques, including PET measurements of 2-deoxy-2-[18F]fluoro-d-glucose (FDG) uptake and 13C MR imaging of the metabolism of hyperpolarized 13C-labeled substrates.


Subject(s)
Magnetic Resonance Imaging , Deuterium , Magnetic Resonance Imaging/methods , Cell Death
16.
J Am Soc Echocardiogr ; 36(10): 1110-1115, 2023 10.
Article in English | MEDLINE | ID: mdl-37230422

ABSTRACT

OBJECTIVE: To assess differences in cardiac morphology and function at midgestation in fetuses from pregnancies that subsequently developed preeclampsia (PE) or gestational hypertension (GH). METHODS: This was a prospective study in 5,801 women with singleton pregnancies attending for a routine ultrasound examination at midgestation, including 179 (3.1%) who subsequently developed PE and 149 (2.6%) who developed GH. Conventional and more advanced echocardiographic modalities, such as speckle-tracking, were used to assess fetal cardiac function in the right and left ventricle. The morphology of the fetal heart was assessed by calculating the right and left sphericity index. RESULTS: In fetuses from the PE group (vs the no PE or GH group) there was a significantly higher left ventricular global longitudinal strain and a lower left ventricular ejection fraction that could not be accounted for by fetal size. All other indices of fetal cardiac morphology and function were comparable between groups. There was no significant correlation between fetal cardiac indices and uterine artery pulsatility index multiple of the median or placental growth factor multiple of the median. CONCLUSION: At midgestation, fetuses of mothers at risk of developing PE, but not those at risk of GH, have a mild reduction in left ventricular myocardial function. Although the absolute differences were minimal and most likely not clinically relevant, they may suggest an early programming effect on left ventricular contractility in fetuses of mothers who develop PE.


Subject(s)
Pre-Eclampsia , Pregnancy , Female , Humans , Pre-Eclampsia/diagnosis , Stroke Volume , Prospective Studies , Ventricular Function, Left , Placenta Growth Factor , Heart Ventricles , Fetal Heart/diagnostic imaging , Ultrasonography, Prenatal
17.
JAMA Pediatr ; 177(7): 718-725, 2023 07 01.
Article in English | MEDLINE | ID: mdl-37184868

ABSTRACT

Importance: Fetuses in women with gestational diabetes (GD), compared with those without GD, show evidence of subclinical cardiac functional and morphological changes. However, it is uncertain whether glycemia or the adverse maternal underlying risk factor profile is the main driver for fetal cardiac remodeling. Objective: To assess cardiac morphology and function at midgestation in fetuses of mothers prior to development of GD and compare them with those of unaffected controls. Design, Setting, and Participants: During this prospective nonintervention screening study at 19 to 23 weeks' gestation, fetal cardiac morphology and function were assessed in all participants. Pregnancy complications were obtained from the medical records of the women. Fetal cardiac morphology and function were assessed in all participants at Harris Birthright Research Institute at King's College Hospital, London, United Kingdom. Participants included pregnant women with singleton pregnancy who attended their routine fetal ultrasound examination at midgestation and agreed to participate in the Advanced Cardiovascular Imaging Study in pregnancy. Main Outcomes and Measures: Comparison of fetal cardiac morphology and function between mothers who subsequently developed GD and those who did not develop GD. Methods: This was a prospective nonintervention screening study of 5620 women with singleton pregnancies at 19 to 23 weeks' gestation. Conventional and more advanced echocardiographic modalities, such as speckle tracking, were used to assess fetal cardiac function in the right and left ventricle. The morphology of the fetal heart was assessed by calculating the right and left sphericity index. Results: The 5620 included patients had a mean age of 33.6 years. In 470 cases (8.4%), the women were diagnosed with GD after the midgestation echocardiographic assessment. Women with GD, compared with those without GD, were older, had a higher BMI, and had a higher prevalence of family history of diabetes, non-White ethnicity, chronic hypertension, and GD in a previous pregnancy. In fetuses of the GD group compared with the non-GD group, there was a mild increase in interventricular septal thickness (0.04 mm; 95% CI, 0.03-0.06 mm) and left atrial area (0.04; 95% CI, 0.04-0.05), whereas left and right functional indices were comparable between groups, with the exception of left ventricular ejection fraction, which was marginally improved in the GD group (0.02; 95% CI, 0.03-0.03). Conclusions and Relevance: This study demonstrates that prior to development of GD, there was mild alteration in fetal cardiac morphology without affecting cardiac function. This suggests that the adverse maternal risk factor profile, and not only the glycemia, might contribute to the cardiac remodeling noted in fetuses of women with GD.


Subject(s)
Diabetes, Gestational , Pregnancy , Female , Humans , Adult , Diabetes, Gestational/epidemiology , Stroke Volume , Ventricular Remodeling , Prospective Studies , Ventricular Function, Left , Fetal Heart/diagnostic imaging , Gestational Age
18.
NMR Biomed ; 36(10): e4965, 2023 10.
Article in English | MEDLINE | ID: mdl-37148156

ABSTRACT

Imaging the metabolism of [2,3-2H2]fumarate to produce malate can be used to detect tumor cell death post-treatment. Here, we assess the sensitivity of the technique for detecting cell death by lowering the concentration of injected [2,3-2H2]fumarate and by varying the extent of tumor cell death through changes in drug concentration. Mice were implanted subcutaneously with human triple negative breast cancer cells (MDA-MB-231) and injected with 0.1, 0.3, and 0.5 g/kg [2,3-2H2]fumarate before and after treatment with a multivalent TRAIL-R2 agonist (MEDI3039) at 0.1, 0.4, and 0.8 mg/kg. Tumor conversion of [2,3-2H2]fumarate to [2,3-2H2]malate was assessed from a series of 13 spatially localized 2H MR spectra acquired over 65 min using a pulse-acquire sequence with a 2-ms BIR4 adiabatic excitation pulse. Tumors were then excised and stained for histopathological markers of cell death: cleaved caspase 3 (CC3) and DNA damage (terminal deoxynucleotidyl transferase dUTP nick end labeling [TUNEL]). The rate of malate production and the malate/fumarate ratio plateaued at tumor fumarate concentrations of 2 mM, which were obtained with injected [2,3-2H2]fumarate concentrations of 0.3 g/kg and above. Tumor malate concentration and the malate/fumarate ratio increased linearly with the extent of cell death determined histologically. At an injected [2,3-2H2]fumarate concentration of 0.3 g/kg, 20% CC3 staining corresponded to a malate concentration of 0.62 mM and a malate/fumarate ratio of 0.21. Extrapolation indicated that there would be no detectable malate at 0% CC3 staining. The use of low and nontoxic fumarate concentrations and the production of [2,3-2H2]malate at concentrations that are within the range that can be detected clinically suggest that this technique could translate to the clinic.
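
The extrapolation step amounts to a linear fit of tumor malate concentration against the fraction of CC3-positive cells, evaluated at 0% staining. The sketch below illustrates the calculation with hypothetical data points, not the measured values.

```python
# A minimal sketch of fitting malate concentration against %CC3 staining and
# extrapolating to 0% CC3. Data points are hypothetical placeholders.
import numpy as np

cc3_percent = np.array([5.0, 10.0, 20.0, 30.0, 40.0])        # % cleaved caspase 3 staining
malate_mm   = np.array([0.16, 0.30, 0.62, 0.95, 1.25])       # tumor [malate] (mM)

slope, intercept = np.polyfit(cc3_percent, malate_mm, 1)      # least-squares line
print(f"predicted malate at 0% CC3: {intercept:.2f} mM")      # ~0, i.e. no detectable malate
print(f"predicted malate at 20% CC3: {slope * 20 + intercept:.2f} mM")
```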


Subject(s)
Malates , Neoplasms , Humans , Animals , Mice , Malates/metabolism , Cell Death , Magnetic Resonance Spectroscopy , Fumarates/metabolism
19.
Front Plant Sci ; 14: 1151786, 2023.
Article in English | MEDLINE | ID: mdl-37063213

ABSTRACT

Introduction: The increasing use of cerium nanoparticles (CeO2-NPs) has made their influx in agroecosystems imminent through air and soil deposition or untreated wastewater irrigation. Another major pollutant associated with anthropogenic activities is Cd, which has adverse effects on plants, animals, and humans. The major source of the influx of Cd and Ce metals into the human food chain is contaminated food, making it an alarming issue; thus, there is a need to understand the factors that can reduce the potential damage of these heavy metals. Methods: The present investigation was conducted to evaluate the effect of CeO2-10-nm-NPs and Cd (alone and in combination) on Zea mays growth. A pot experiment (in sand) was conducted to test the effect of 0, 200, 400, 600, 1,000, and 2,000 mg of CeO2-10-nm-NPs kg-1 dry sand alone and in combination with 0 and 0.5 mg Cd kg-1 dry sand on maize seedlings grown in a partially controlled greenhouse environment, making a total of 12 treatments applied in four replicates under a factorial design. Maize seedling biomass, shoot and root growth, nutrient content, and root anatomy were measured. Results and discussion: The NPs were toxic to plant biomass (shoot and root dry weight) and growth, with 2,000 ppm being the most toxic in the Cd-0 sets. For the Cd-0.5 sets, NPs applied at 1,000 ppm partially reversed Cd toxicity compared with the contaminated control (CC). Additionally, CeO2-NPs affected Cd translocation, and variable Ce uptake was observed in the presence of Cd compared with the non-Cd applied sets. Furthermore, CeO2-NPs partially controlled the elemental content of roots and shoots (micronutrients such as B, Mn, Ni, Cu, Zn, Mo, and Fe and the elements Co and Si) and affected root anatomy.

20.
Hypertension ; 80(5): 969-978, 2023 05.
Article in English | MEDLINE | ID: mdl-37035913

ABSTRACT

BACKGROUND: Most preeclampsia occurs at term. There are no effective preventative strategies. We aimed to identify the optimal preeclampsia screening and timing of birth strategy for the prevention of term preeclampsia. METHODS: This was a secondary analysis of data from a prospective nonintervention cohort study of singleton pregnancies delivering at ≥24 weeks, without major anomalies, at 2 United Kingdom maternity hospitals. Screening took place at routine visits at 11 to 13 weeks' gestation (57 131 pregnancies screened, of which 1138 developed term preeclampsia) or 35 to 36 weeks' gestation (29 035 pregnancies screened, of which 619 developed term preeclampsia), with patient-specific preeclampsia risks determined by United Kingdom National Institute for Health and Care Excellence guidance and by the Fetal Medicine Foundation competing-risks model. For each screening strategy, timing of birth for term preeclampsia prevention was evaluated at gestational time points that were fixed (37, 38, 39, 40 weeks) or dependent on preeclampsia risk by the competing-risks model at 35 to 36 weeks. Main outcomes were the proportion of term preeclampsia prevented and the number-needed-to-deliver to prevent one case of term preeclampsia. RESULTS: The proportion of term preeclampsia prevented was highest, and the number-needed-to-deliver lowest, for preeclampsia screening at 35 to 36 (rather than 11-13) weeks. For delivery at 37 weeks, fewer cases of preeclampsia were prevented with National Institute for Health and Care Excellence guidance (28.8%) than with the competing-risks model (59.8%), and the number-needed-to-deliver was higher (16.4 versus 6.9, respectively). The risk-stratified approach (at 35-36 weeks) had similar preeclampsia prevention (57.2%) and number-needed-to-deliver (8.4), but fewer women would be induced at 37 weeks (1.2% versus 8.8%). CONCLUSIONS: Risk-stratified timing of birth at term may more than halve the risk of term preeclampsia.
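
The number-needed-to-deliver is the number of women delivered early under a given policy divided by the number of term preeclampsia cases that policy prevents. The sketch below shows the arithmetic with illustrative numbers chosen to give a ratio near the 6.9 quoted above; they are not the actual cohort counts.

```python
# A minimal sketch of the number-needed-to-deliver arithmetic. Inputs are
# hypothetical placeholders, not the cohort figures.
def number_needed_to_deliver(n_delivered_early, n_cases_prevented):
    """Women delivered early per term preeclampsia case prevented."""
    return n_delivered_early / n_cases_prevented

# e.g. a policy that delivers 2,500 screen-positive women at 37 weeks and
# prevents 360 term preeclampsia cases
print(number_needed_to_deliver(2_500, 360))   # ~6.9
```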


Subject(s)
Pre-Eclampsia , Pregnancy , Female , Humans , Pre-Eclampsia/diagnosis , Cohort Studies , Prospective Studies , Pregnancy Trimester, First , Pregnancy Trimester, Third , Gestational Age