1.
Front Cell Infect Microbiol ; 13: 1137067, 2023.
Article in English | MEDLINE | ID: mdl-36875522

ABSTRACT

The present study aimed to identify risk factors associated with periodontitis development and periodontal health disparities, with an emphasis on differential oral microbiota. The prevalence of periodontitis has recently been rising among dentate adults in the US, which presents a challenge to oral health and overall health. The risk of developing periodontitis is higher in African Americans (AAs) and Hispanic Americans (HAs) than in Caucasian Americans (CAs). To identify potential microbiological determinants of periodontal health disparities, we examined the distribution of several potentially beneficial and pathogenic bacteria in the oral cavities of AA, CA, and HA study participants. Dental plaque samples from 340 individuals with intact periodontium were collected prior to any dental treatment, levels of several key oral bacteria were quantitated using qPCR, and the medical and dental histories of participants were obtained retrospectively from axiUm. Data were analyzed statistically using SAS 9.4, IBM SPSS version 28, and R/RStudio version 4.1.2. Among racial/ethnic groups: 1) neighborhood median incomes were significantly higher for the CA participants than for the AA and HA participants; 2) levels of bleeding on probing (BOP) were higher in the AAs than in the CAs and HAs; 3) Porphyromonas gingivalis levels were higher in the HAs than in the CAs; 4) most P. gingivalis detected in the AAs was of the fimA genotype II strain, which, along with the fimA type IV strain, was significantly associated with higher BOP indexes. Our results suggest that socioeconomic disadvantages, higher levels of P. gingivalis, and specific types of P. gingivalis fimbriae, particularly type II FimA, contribute to the risk of developing periodontitis and to periodontal health disparities.


Subject(s)
Fimbriae, Bacterial , Microbiota , Adult , Humans , Retrospective Studies , Genotype , Mouth
2.
Article in English | MEDLINE | ID: mdl-36981686

ABSTRACT

As data grows exponentially across diverse fields, the ability to effectively leverage big data has become increasingly crucial. In the field of data science, however, minority groups, including African Americans, are significantly underrepresented. Given the strategic role of minority-serving institutions in enhancing diversity in the data science workforce and applying data science to health disparities, the National Institute on Minority Health and Health Disparities (NIMHD) provided funding in September 2021 to six Research Centers in Minority Institutions (RCMI) to improve their data science capacity and foster collaborations with data scientists. Meharry Medical College (MMC), a historically Black College/University (HBCU), was among the six awardees. This paper summarizes the NIMHD-funded efforts at MMC, which include offering mini-grants to collaborative research groups, surveying the community to understand its needs and guide project implementation, and providing data science training to enhance the data analytics skills of RCMI investigators, staff, medical residents, and graduate students. This study is innovative in that it addresses the urgent need to enhance the data science capacity of the RCMI program at MMC, build a diverse data science workforce, and develop collaborations between the RCMI and MMC's newly established School of Applied Computational Science. This paper presents the progress of this NIMHD-funded project, which clearly shows its positive impact on the local community.


Subject(s)
Data Science , Minority Groups , Humans , Minority Groups/education , Universities , Students , Black or African American
3.
Pain Res Manag ; 2020: 5165682, 2020.
Article in English | MEDLINE | ID: mdl-32318129

ABSTRACT

Objectives: This research describes the prevalence and covariates associated with opioid-induced constipation (OIC) in an observational cohort study utilizing a national veteran cohort and integrated data from the Centers for Medicare and Medicaid Services (CMS). Methods: A cohort of 152,904 veterans with encounters between 1 January 2008 and 30 November 2010, an exposure to opioids of 30 days or more, and no exposure in the prior year was developed to establish existing conditions and medications at the start of the opioid exposure and to determine outcomes through the end of exposure. OIC was identified through additions or changes in laxative prescriptions, all-cause constipation identified through diagnosis, or constipation-related procedures in the presence of opioid exposure. The association of time to constipation with opioid use was analyzed using Cox proportional hazards regression adjusted for patient characteristics, concomitant medications, laboratory tests, and comorbidities. Results: The prevalence of OIC was 12.6%. Twelve positively associated covariates were identified, with the largest associations being prior constipation and prevalent laxative use (any laxative that continued into the first day of opioid exposure). Among the 17 negatively associated covariates, the largest associations were for erythromycins, androgens/anabolics, and unknown race. Conclusions: Several novel covariates were found that appear in the all-cause chronic constipation literature but have not been reported for opioid-induced constipation. Some are modifiable, particularly medication coadministration, which may assist clinicians and researchers in risk stratification efforts when initiating opioid medications. The integration of CMS data supports the robustness of the analysis and may be of interest for the elderly population, warranting future examination.
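The outcome definition above amounts to a simple rule: a patient is flagged for OIC when a laxative addition/change, a constipation diagnosis, or a constipation-related procedure falls within the opioid exposure window. A minimal sketch of that rule is below; the function name and event labels are hypothetical illustrations, not the study's actual phenotyping code.

```python
from datetime import date

def flag_oic(opioid_start, opioid_end, events):
    """Flag opioid-induced constipation (OIC) for one patient.

    events: list of (event_date, kind) tuples, where kind is one of
    'laxative_change', 'constipation_dx', or 'constipation_procedure'
    (labels assumed for illustration). A patient is flagged when any
    such event falls within [opioid_start, opioid_end].
    """
    oic_kinds = {'laxative_change', 'constipation_dx', 'constipation_procedure'}
    return any(
        kind in oic_kinds and opioid_start <= d <= opioid_end
        for d, kind in events
    )
```

In the study, the date of the first such event would feed the time-to-constipation outcome of the Cox model.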


Subject(s)
Analgesics, Opioid/adverse effects , Opioid-Induced Constipation/epidemiology , Aged , Cohort Studies , Female , Humans , Laxatives/therapeutic use , Male , Prevalence , Retrospective Studies , Risk Factors , United States , Veterans
4.
Dig Dis Sci ; 65(4): 1003-1031, 2020 04.
Article in English | MEDLINE | ID: mdl-31531817

ABSTRACT

BACKGROUND: Early hospital readmission for patients with cirrhosis continues to challenge the healthcare system. Risk stratification may help tailor resources, but existing models were designed using small, single-institution cohorts or had modest performance. AIMS: We leveraged a large clinical database from the Department of Veterans Affairs (VA) to design a readmission risk model for patients hospitalized with cirrhosis. Additionally, we analyzed potentially modifiable or unexplored readmission risk factors. METHODS: A national VA retrospective cohort of patients with a history of cirrhosis hospitalized for any reason from January 1, 2006, to November 30, 2013, was developed from 123 centers. Using 174 candidate variables spanning demographics, laboratory results, vital signs, medications, diagnoses and procedures, and healthcare utilization, we built a 47-variable penalized logistic regression model with the outcome of all-cause 30-day readmission. We excluded patients who left against medical advice, were transferred to a non-VA facility, or had a hospital length of stay greater than 30 days. We evaluated calibration and discrimination across variable volume and compared the performance to recalibrated preexisting readmission risk models. RESULTS: We analyzed 67,749 patients and 179,298 index hospitalizations. The 30-day readmission rate was 23%. Ascites was the most common cirrhosis-related cause of index hospitalization and readmission. The AUC of the model was 0.670, compared to 0.649, 0.566, and 0.577 for the existing models. The Brier score of 0.165 showed good calibration. CONCLUSION: Our model achieved better discrimination and calibration than existing models, even after local recalibration. Assessing calibration across levels of variable parsimony revealed that performance continued to improve with additional variables well beyond the point at which discrimination gains were detectable.
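A penalized logistic regression of the kind described can be sketched in a few lines. The study's 47-variable model and its specific penalty are not reproduced here; this is a generic ridge-penalized (L2) sketch fit by plain gradient descent, with all parameter values chosen purely for illustration.

```python
import numpy as np

def fit_penalized_logistic(X, y, lam=1.0, lr=0.1, n_iter=2000):
    """Fit an L2-penalized logistic regression by gradient descent.

    X: (n, p) feature matrix (assumed standardized); y: 0/1 outcomes
    (e.g., 30-day readmission). lam is the ridge penalty strength.
    Returns (intercept, coefficient vector).
    """
    n, p = X.shape
    w = np.zeros(p)
    b = 0.0
    for _ in range(n_iter):
        prob = 1.0 / (1.0 + np.exp(-(X @ w + b)))   # predicted risks
        grad_w = X.T @ (prob - y) / n + lam * w / n  # penalized gradient
        grad_b = np.sum(prob - y) / n                # intercept unpenalized
        w -= lr * grad_w
        b -= lr * grad_b
    return b, w

def predict_risk(b, w, X):
    """Predicted readmission probability for each row of X."""
    return 1.0 / (1.0 + np.exp(-(X @ w + b)))
```

In practice a lasso-type penalty (which zeroes out weak coefficients) is what reduces a large candidate set to a compact model; the ridge version here keeps the sketch short.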


Subject(s)
Liver Cirrhosis/diagnosis , Liver Cirrhosis/epidemiology , Patient Readmission/trends , Aged , Cohort Studies , Female , Forecasting , Humans , Liver Cirrhosis/therapy , Male , Middle Aged , Retrospective Studies , Risk Factors , United States/epidemiology
5.
BMJ Open Gastroenterol ; 6(1): e000342, 2019.
Article in English | MEDLINE | ID: mdl-31875140

ABSTRACT

OBJECTIVE: Cirrhotic patients are at high hospitalisation risk with subsequent high mortality. Current risk prediction models have varied performances with methodological room for improvement. We used current analytical techniques using automatically extractable variables from the electronic health record (EHR) to develop and validate a posthospitalisation mortality risk score for cirrhotic patients and compared performance with the model for end-stage liver disease (MELD), model for end-stage liver disease with sodium (MELD-Na), and the CLIF Consortium Acute Decompensation (CLIF-C AD) models. DESIGN: We analysed a retrospective cohort of 73 976 patients comprising 247 650 hospitalisations between 2006 and 2013 at any of 123 Department of Veterans Affairs hospitals. Using 45 predictor variables, we built a time-dependent Cox proportional hazards model with all-cause mortality as the outcome. We compared performance to the three extant models and reported discrimination and calibration using bootstrapping. Furthermore, we analysed differential utility using the net reclassification index (NRI). RESULTS: The C-statistic for the final model was 0.863, representing a significant improvement over the MELD, MELD-Na, and the CLIF-C AD, which had C-statistics of 0.655, 0.675, and 0.679, respectively. Multiple risk factors were significant in our model, including variables reflecting disease severity and haemodynamic compromise. The NRI showed a 24% improvement in predicting survival of low-risk patients and a 30% improvement in predicting death of high-risk patients. CONCLUSION: We developed a more accurate mortality risk prediction score using variables automatically extractable from an EHR that may be used to risk stratify patients with cirrhosis for targeted postdischarge management.

6.
Appl Clin Inform ; 10(5): 794-803, 2019 10.
Article in English | MEDLINE | ID: mdl-31645076

ABSTRACT

BACKGROUND: The development and adoption of health care common data models (CDMs) has addressed some of the logistical challenges of performing research on data generated from disparate health care systems by standardizing data representations and leveraging standardized terminology to express clinical information consistently. However, transforming a data system into a CDM is not a trivial task, and maintaining an operational, enterprise-capable CDM that is incrementally updated within a data warehouse is challenging. OBJECTIVES: To develop a quality assurance (QA) process and code base to accompany our incremental transformation of the Department of Veterans Affairs Corporate Data Warehouse (CDW) health care database into the Observational Medical Outcomes Partnership (OMOP) CDM to prevent incremental load errors. METHODS: We designed and implemented a multistage QA approach centered on completeness, value conformance, and relational conformance data-quality elements. For each element we describe key incremental load challenges, our extract, transform, and load (ETL) solution to overcome those challenges, and potential impacts of incremental load failure. RESULTS: Completeness and value conformance data-quality elements are most affected by incremental changes to the CDW, while updates to source identifiers impact relational conformance. ETL failures surrounding these elements lead to incomplete and inaccurate capture of clinical concepts as well as data fragmentation across patients, providers, and locations. CONCLUSION: Development of robust QA processes supporting accurate transformation of source data into OMOP and other CDMs is still in evolution, and opportunities exist to extend the existing QA framework and tools used for incremental ETL QA processes.
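The completeness and value-conformance elements described above reduce to simple row-level checks run against a transformed batch before an incremental load. The sketch below shows the shape of such checks; the field names, rule format, and finding tuples are hypothetical, not the VA's actual QA code base.

```python
def qa_checks(rows, required_fields, allowed_values):
    """Run completeness and value-conformance checks on a batch of
    transformed rows prior to an incremental load.

    rows: list of dicts (one per transformed record).
    required_fields: field names that must be populated (completeness).
    allowed_values: {field: set of permitted codes} (value conformance).
    Returns a list of (row_index, field, problem) findings.
    """
    findings = []
    for i, row in enumerate(rows):
        for f in required_fields:
            if row.get(f) in (None, ''):
                findings.append((i, f, 'missing'))        # completeness
        for f, allowed in allowed_values.items():
            v = row.get(f)
            if v is not None and v not in allowed:
                findings.append((i, f, 'nonconformant'))  # value conformance
    return findings
```

Relational conformance (stable joins across source identifiers) would require cross-table checks and is omitted from this single-batch sketch.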


Subject(s)
Electronic Health Records/statistics & numerical data , Models, Statistical , Quality Assurance, Health Care , Delivery of Health Care
7.
Contemp Clin Trials ; 81: 55-61, 2019 06.
Article in English | MEDLINE | ID: mdl-31029692

ABSTRACT

BACKGROUND: The optimal structure and intensity of interventions to reduce hospital readmission remain uncertain, due in part to a lack of head-to-head comparisons. To address this gap, we evaluated two forms of an evidence-based, multi-component transitional care intervention. METHODS: A quasi-experimental evaluation design compared outcomes of Transition Care Coordinator (TCC) Care to Usual Care, while controlling for sociodemographic characteristics, comorbidities, readmission risk, and administrative factors. The study was conducted between January 1, 2013 and April 30, 2015 as a quality improvement initiative. Eligible adults (N = 7038) hospitalized with pneumonia, congestive heart failure, or chronic obstructive pulmonary disease were identified for program evaluation via an electronic health record algorithm. Nurse TCCs provided either a full intervention (delivered in-hospital and by post-discharge phone call) or a partial intervention (phone call only). RESULTS: A total of 762 hospitalizations with TCC Care (460 full intervention and 302 partial intervention) and 6276 with Usual Care were examined. In multivariable models, hospitalizations with TCC Care had significantly lower odds of readmission at 30 days (OR = 0.512, 95% CI 0.392 to 0.668) and 90 days (OR = 0.591, 95% CI 0.483 to 0.723). Adjusted costs were significantly lower at 30 days (difference = $3969, 95% CI $2691 to $5099) and 90 days (difference = $5684, 95% CI $3627 to $7602). The effect was similar whether patients received the full or partial intervention. CONCLUSION: An evidence-based multi-component intervention delivered by nurse TCCs reduced 30- and 90-day readmissions and associated health care costs. Lower-intensity interventions delivered by telephone after discharge may have similar effectiveness to in-hospital programs.


Subject(s)
Nursing Staff, Hospital/organization & administration , Patient Readmission/statistics & numerical data , Quality Improvement/organization & administration , Transitional Care/organization & administration , Adult , Aged , Aged, 80 and over , Comorbidity , Evidence-Based Practice , Female , Heart Failure/therapy , Humans , Male , Middle Aged , Pneumonia/therapy , Pulmonary Disease, Chronic Obstructive/therapy , Retrospective Studies , Socioeconomic Factors
8.
Med Care ; 56(10): 890-897, 2018 10.
Article in English | MEDLINE | ID: mdl-30179988

ABSTRACT

RATIONALE: Intensive care unit (ICU) delirium is highly prevalent and a potentially avoidable hospital complication. The current cost of ICU delirium is unknown. OBJECTIVES: To specify the association between the daily occurrence of delirium in the ICU and costs of ICU care, accounting for time-varying illness severity and death. RESEARCH DESIGN: We performed a prospective cohort study within medical and surgical ICUs in a large academic medical center. SUBJECTS: We analyzed critically ill patients (N=479) with respiratory failure and/or shock. MEASURES: Covariates included baseline factors (age, insurance, cognitive impairment, comorbidities, Acute Physiology and Chronic Health Evaluation II score) and time-varying factors (Sequential Organ Failure Assessment score, mechanical ventilation, and severe sepsis). The primary analysis used a novel 3-stage regression method: the cumulative cost of delirium over 30 ICU days was first estimated, and costs were then separated into those attributable to increased resource utilization among survivors and those avoided on account of delirium's association with early mortality in the ICU. RESULTS: The patient-level 30-day cumulative cost of ICU delirium attributable to increased resource utilization was $17,838 (95% confidence interval, $11,132-$23,497). A combination of professional, dialysis, and bed costs accounted for the largest percentage of the incremental costs associated with ICU delirium. The 30-day cumulative incremental cost of ICU delirium that was avoided due to delirium-associated early mortality was $4654 (95% confidence interval, $2056-$7869). CONCLUSIONS: Delirium is associated with substantial costs after accounting for time-varying illness severity; these costs could be 20% higher (∼$22,500) if not for delirium's association with early ICU mortality.


Subject(s)
Coma/economics , Delirium/economics , Intensive Care Units/economics , Adult , Aged , Coma/complications , Comorbidity , Costs and Cost Analysis , Critical Illness/economics , Delirium/complications , Dialysis/economics , Female , Humans , Intensive Care Units/organization & administration , Intensive Care Units/statistics & numerical data , Male , Middle Aged , Prospective Studies , Respiration, Artificial/economics , Risk Factors
9.
J Hosp Med ; 12(11): 918-924, 2017 11.
Article in English | MEDLINE | ID: mdl-29091980

ABSTRACT

OBJECTIVE: To examine the association of health literacy with the number and type of transitional care needs (TCN) among patients being discharged to home. DESIGN, SETTING, PARTICIPANTS: A cross-sectional analysis of patients admitted to an academic medical center. MEASUREMENTS: Nurses administered the Brief Health Literacy Screen and documented TCNs along 10 domains: caregiver support, transportation, healthcare utilization, high-risk medical comorbidities, medication management, medical devices, functional status, mental health comorbidities, communication, and financial resources. RESULTS: Among the 384 patients analyzed, 113 (29%) had inadequate health literacy. Patients with inadequate health literacy had needs in more TCN domains (mean = 5.29 vs 4.36; P < 0.001). In unadjusted analysis, patients with inadequate health literacy were significantly more likely to have TCNs in 7 out of the 10 domains. In multivariate analyses, inadequate health literacy remained significantly associated with inadequate caregiver support (odds ratio [OR], 2.61; 95% confidence interval [CI], 1.37-4.99) and transportation barriers (OR, 1.69; 95% CI, 1.04-2.76). CONCLUSIONS: Among hospitalized patients, inadequate health literacy is prevalent and independently associated with other needs that place patients at a higher risk of adverse outcomes, such as hospital readmission. Screening for inadequate health literacy and associated needs may enable hospitals to address these barriers and improve postdischarge outcomes.


Subject(s)
Health Literacy/statistics & numerical data , Hospitalization , Patient Acceptance of Health Care , Transitional Care/statistics & numerical data , Aged , Cross-Sectional Studies , Female , Humans , Male , Nursing Assessment , Patient Discharge , Patient Readmission , Surveys and Questionnaires
10.
Comput Math Methods Med ; 2015: 891692, 2015.
Article in English | MEDLINE | ID: mdl-25737739

ABSTRACT

A novel clustering method is proposed for mammographic mass segmentation on extracted regions of interest (ROIs) using deterministic annealing incorporating a circular shape function (DACF). The objective function reported in this study uses both intensity and spatial shape information, and the dominant dissimilarity measure is controlled by two weighting parameters. As a result, pixels having similar intensity information but located in different regions can be differentiated. Experimental results show that, by using DACF, mass segmentation results in digitized mammograms are improved, with optimal mass boundaries, fewer noisy patches, and greater computational efficiency. An average probability of segmentation error of 7.18% for well-defined masses (or 8.06% for ill-defined masses) was obtained by using DACF on the MiniMIAS database, with 5.86% (or 5.55%) and 6.14% (or 5.27%) improvements as compared to the standard DA and fuzzy c-means methods.
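The core of deterministic-annealing clustering — soft Gibbs memberships that harden as a temperature parameter is lowered — can be sketched as follows. This intensity-only sketch omits the paper's circular shape function and its two weighting parameters; `init_centers` and all defaults are illustrative assumptions, not the published DACF algorithm.

```python
import numpy as np

def da_cluster(points, init_centers, t_start=2.0, t_min=0.05,
               cooling=0.5, n_inner=20):
    """Minimal deterministic-annealing (DA) clustering sketch.

    At temperature t, each point gets a soft (Gibbs) membership in
    every cluster; centers are re-estimated from the memberships, and
    t is gradually lowered so assignments harden toward a partition.
    """
    centers = init_centers.astype(float).copy()
    t = t_start
    while t > t_min:
        for _ in range(n_inner):
            # squared distances from every point to every center
            d2 = ((points[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
            logits = -d2 / t
            logits -= logits.max(axis=1, keepdims=True)  # numerical stability
            p = np.exp(logits)
            p /= p.sum(axis=1, keepdims=True)            # soft memberships
            # membership-weighted center update
            centers = (p.T @ points) / p.sum(axis=0)[:, None]
        t *= cooling
    return centers, p.argmax(axis=1)
```

In DACF the distance term would be augmented with a shape penalty favoring roughly circular masses, which is what lets it separate equal-intensity pixels in different regions.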


Subject(s)
Breast Neoplasms/pathology , Mammography/methods , Radiographic Image Interpretation, Computer-Assisted/methods , Algorithms , Breast Neoplasms/diagnostic imaging , Cluster Analysis , Databases, Factual , Female , Fuzzy Logic , Humans , Models, Statistical , Normal Distribution , Pattern Recognition, Automated/methods , Probability , Reproducibility of Results
11.
Artif Intell Med ; 60(3): 189-96, 2014 Mar.
Article in English | MEDLINE | ID: mdl-24637294

ABSTRACT

OBJECTIVE: Support vector machines (SVMs) have drawn considerable attention due to their high generalisation ability and superior classification performance compared to other pattern recognition algorithms. However, the assumption that the learning data are identically generated from a single unknown probability distribution may limit the application of SVMs to real problems. In this paper, we propose a vicinal support vector classifier (VSVC) which is shown to be able to effectively handle practical applications where the learning data may originate from different probability distributions. METHODS: The proposed VSVC method utilises a set of new vicinal kernel functions which are constructed based on supervised clustering in the kernel-induced feature space. Our proposed approach comprises two steps. In the clustering step, a supervised kernel-based deterministic annealing (SKDA) clustering algorithm is employed to partition the training data into different soft vicinal areas of the feature space in order to construct the vicinal kernel functions. In the training step, the SVM technique is used to minimise the vicinal risk function under the constraints of the vicinal areas defined in the SKDA clustering step. RESULTS: Experimental results on both artificial and real medical datasets show that our proposed VSVC achieves better classification accuracy and lower computational time compared to a standard SVM. For an artificial dataset constructed from non-separated data, the classification accuracy of VSVC is between 95.5% and 96.25% (using different cluster numbers), which compares favourably to the 94.5% achieved by SVM. The VSVC training time is between 8.75s and 17.83s (for 2-8 clusters), considerably less than the 65.0s required by SVM. On a real mammography dataset, the best classification accuracy of VSVC is 85.7% and thus clearly outperforms a standard SVM, which obtains an accuracy of only 82.1%.
A similar performance improvement is confirmed on two further real datasets, a breast cancer dataset (74.01% vs. 72.52%) and a heart dataset (84.77% vs. 83.81%), coupled with a reduction in terms of learning time (32.07s vs. 92.08s and 25.00s vs. 53.31s, respectively). Furthermore, the VSVC results in the number of support vectors being equal to the specified cluster number, and hence in a much sparser solution compared to a standard SVM. CONCLUSION: Incorporating a supervised clustering algorithm into the SVM technique leads to a sparse but effective solution, while making the proposed VSVC adaptive to different probability distributions of the training data.
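The idea of learning a classifier in a kernel-induced feature space can be illustrated with a much simpler dual-form learner than the paper's VSVC. The RBF-kernel perceptron below is a generic stand-in, not the proposed method: like an SVM it scores points by kernel similarity to weighted training examples, but it has no vicinal kernels, margin, or annealing step, and all parameters are illustrative.

```python
import numpy as np

def rbf(x, z, gamma=1.0):
    """Gaussian (RBF) kernel between x and one or more points z."""
    return np.exp(-gamma * np.sum((x - z) ** 2, axis=-1))

def train_kernel_perceptron(X, y, gamma=1.0, epochs=20):
    """Train an RBF-kernel perceptron (labels in {-1, +1}).

    Returns one dual coefficient per training point; a point's
    coefficient is incremented whenever it is misclassified, so the
    decision function lives entirely in the kernel-induced space.
    """
    n = len(X)
    K = np.array([[rbf(X[i], X[j], gamma) for j in range(n)]
                  for i in range(n)])
    alpha = np.zeros(n)
    for _ in range(epochs):
        for i in range(n):
            if np.sign(np.sum(alpha * y * K[:, i])) != y[i]:
                alpha[i] += 1.0
    return alpha

def predict(X_train, y, alpha, x, gamma=1.0):
    """Classify a single point x against the trained dual coefficients."""
    score = np.sum(alpha * y * rbf(X_train, x, gamma))
    return 1 if score >= 0 else -1
```

The sparsity claim in the abstract corresponds, in this framing, to how many dual coefficients end up nonzero: VSVC fixes that count to the chosen cluster number, whereas an SVM's support-vector count is data driven.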


Subject(s)
Cluster Analysis , Support Vector Machine , Artificial Intelligence , Breast Neoplasms/diagnosis , Female , Humans , Mammography/methods , Sensitivity and Specificity
12.
Soc Cogn Affect Neurosci ; 9(12): 2049-58, 2014 Dec.
Article in English | MEDLINE | ID: mdl-24493850

ABSTRACT

Children born with an inhibited temperament are at heightened risk for developing anxiety, depression and substance use. Inhibited temperament is believed to have a biological basis; however, little is known about the structural brain basis of this vulnerability trait. Structural MRI scans were obtained from 84 (44 inhibited, 40 uninhibited) young adults. Given previous findings of amygdala hyperactivity in inhibited individuals, groups were compared on three measures of amygdala structure. To identify novel substrates of inhibited temperament, a whole brain analysis was performed. Functional activation and connectivity were examined across both groups. Inhibited adults had larger amygdala and caudate volume and larger volume predicted greater activation to neutral faces. In addition, larger amygdala volume predicted greater connectivity with subcortical and higher order visual structures. Larger caudate volume predicted greater connectivity with the basal ganglia, and less connectivity with primary visual and auditory cortex. We propose that larger volume in these salience detection regions may result in increased activation and enhanced connectivity in response to social stimuli. Given the strong link between inhibited temperament and risk for psychiatric illness, novel therapeutics that target these brain regions and related neural circuits have the potential to reduce rates of illness in vulnerable individuals.


Subject(s)
Brain/anatomy & histology , Brain/physiology , Inhibition, Psychological , Temperament/physiology , Adult , Analysis of Variance , Brain Mapping , Female , Humans , Image Processing, Computer-Assisted , Magnetic Resonance Imaging , Male , Neural Pathways/blood supply , Neural Pathways/physiology , Oxygen/blood , Statistics as Topic , Young Adult
13.
J Child Psychol Psychiatry ; 55(2): 162-71, 2014.
Article in English | MEDLINE | ID: mdl-24117668

ABSTRACT

BACKGROUND: Restricted interests are a class of repetitive behavior in autism spectrum disorders (ASD) whose intensity and narrow focus often contribute to significant interference with daily functioning. While numerous neuroimaging studies have investigated executive circuits as putative neural substrates of repetitive behavior, recent work implicates affective neural circuits in restricted interests. We sought to explore the role of affective neural circuits and determine how restricted interests are distinguished from hobbies or interests in typical development. METHODS: We compared a group of children with ASD to a typically developing (TD) group of children with strong interests or hobbies, employing parent report, an operant behavioral task, and functional imaging with personalized stimuli based on individual interests. RESULTS: While performance on the operant task was similar between the two groups, parent report of intensity and interference of interests was significantly higher in the ASD group. Both the ASD and TD groups showed increased BOLD response in widespread affective neural regions to the pictures of their own interest. When viewing pictures of other children's interests, the TD group showed a similar pattern, whereas BOLD response in the ASD group was much more limited. Increased BOLD response in the insula and anterior cingulate cortex distinguished the ASD from the TD group, and parent report of the intensity and interference with daily life of the child's restricted interest predicted insula response. CONCLUSIONS: While affective neural network response and operant behavior are comparable in typical and restricted interests, the narrowness of focus that clinically distinguishes restricted interests in ASD is reflected in more interference in daily life and aberrantly enhanced insula and anterior cingulate response to individuals' own interests in the ASD group. 
These results further support the involvement of affective neural networks in repetitive behaviors in ASD.


Subject(s)
Affect/physiology , Cerebral Cortex/physiopathology , Child Development Disorders, Pervasive/physiopathology , Hobbies/psychology , Adolescent , Child , Gyrus Cinguli/physiopathology , Humans , Male , Nerve Net/physiopathology
14.
Psychiatry Res ; 214(2): 122-31, 2013 Nov 30.
Article in English | MEDLINE | ID: mdl-24035535

ABSTRACT

Craving is a major motivator underlying drug use and relapse but the neural correlates of cannabis craving are not well understood. This study sought to determine whether visual cannabis cues increase cannabis craving and whether cue-induced craving is associated with regional brain activation in cannabis-dependent individuals. Cannabis craving was assessed in 16 cannabis-dependent adult volunteers while they viewed cannabis cues during a functional MRI (fMRI) scan. The Marijuana Craving Questionnaire was administered immediately before and after each of three cannabis cue-exposure fMRI runs. FMRI blood-oxygenation-level-dependent (BOLD) signal intensity was determined in regions activated by cannabis cues to examine the relationship of regional brain activation to cannabis craving. Craving scores increased significantly following exposure to visual cannabis cues. Visual cues activated multiple brain regions, including inferior orbital frontal cortex, posterior cingulate gyrus, parahippocampal gyrus, hippocampus, amygdala, superior temporal pole, and occipital cortex. Craving scores at baseline and at the end of all three runs were significantly correlated with brain activation during the first fMRI run only, in the limbic system (including amygdala and hippocampus) and paralimbic system (superior temporal pole), and visual regions (occipital cortex). Cannabis cues increased craving in cannabis-dependent individuals and this increase was associated with activation in the limbic, paralimbic, and visual systems during the first fMRI run, but not subsequent fMRI runs. These results suggest that these regions may mediate visually cued aspects of drug craving. This study provides preliminary evidence for the neural basis of cue-induced cannabis craving and suggests possible neural targets for interventions targeted at treating cannabis dependence.


Subject(s)
Brain Mapping , Drug-Seeking Behavior/physiology , Limbic System/pathology , Marijuana Abuse/pathology , Marijuana Abuse/psychology , Visual Pathways/pathology , Adult , Cues , Female , Humans , Image Processing, Computer-Assisted , Limbic System/blood supply , Magnetic Resonance Imaging , Male , Oxygen/blood , Surveys and Questionnaires , Visual Pathways/blood supply , Young Adult
15.
Brain Connect ; 3(2): 199-211, 2013.
Article in English | MEDLINE | ID: mdl-23273430

ABSTRACT

Although an extensive literature exists on the neurobiological correlates of dyslexia (DYS), to date, no studies have examined the neurobiological profile of those who exhibit poor reading comprehension despite intact word-level abilities (specific reading comprehension deficits [S-RCD]). Here we investigated the word-level abilities of S-RCD as compared to typically developing readers (TD) and those with DYS by examining the blood oxygenation-level dependent response to words varying on frequency. Understanding whether S-RCD process words in the same manner as TD, or show alternate pathways to achieve normal word-reading abilities, may provide insights into the origin of this disorder. Results showed that as compared to TD, DYS showed abnormal covariance during word processing with right-hemisphere homologs of the left-hemisphere reading network in conjunction with left occipitotemporal underactivation. In contrast, S-RCD showed an intact neurobiological response to word stimuli in occipitotemporal regions (associated with fast and efficient word processing); however, inferior frontal gyrus (IFG) abnormalities were observed. Specifically, TD showed a higher-percent signal change within right IFG for low-versus-high frequency words as compared to both S-RCD and DYS. Using psychophysiological interaction analyses, a coupling-by-reading group interaction was found in right IFG for DYS, as indicated by a widespread greater covariance between right IFG and right occipitotemporal cortex/visual word-form areas, as well as bilateral medial frontal gyrus, as compared to TD. For S-RCD, the context-dependent functional interaction anomaly was most prominently seen in left IFG, which covaried to a greater extent with hippocampal, parahippocampal, and prefrontal areas than for TD for low- as compared to high-frequency words. 
Given the greater lexical access demands of low frequency as compared to high-frequency words, these results may suggest specific weaknesses in accessing lexical-semantic representations during word recognition. These novel findings provide foundational insights into the nature of S-RCD, and set the stage for future investigations of this common, but understudied, reading disorder.


Subject(s)
Brain/pathology , Comprehension/physiology , Dyslexia/diagnosis , Adolescent , Brain/blood supply , Child , Female , Functional Laterality , Humans , Image Processing, Computer-Assisted , Magnetic Resonance Imaging , Male , Neuropsychological Tests , Oxygen/blood , Psychophysics , Vocabulary
16.
Psychopharmacology (Berl) ; 227(1): 41-54, 2013 May.
Article in English | MEDLINE | ID: mdl-23241648

ABSTRACT

RATIONALE: Ecstasy (3,4-methylenedioxymethamphetamine [MDMA]) polydrug users have verbal memory performance that is statistically significantly lower than that of control subjects. Studies have correlated long-term MDMA use with altered brain activation in regions that play a role in verbal memory. OBJECTIVES: The aim of our study was to examine the association of lifetime ecstasy use with semantic memory performance and brain activation in ecstasy polydrug users. METHODS: A total of 23 abstinent ecstasy polydrug users (mean age = 24.57 years) and 11 controls (mean age = 22.36 years) performed a two-part functional magnetic resonance imaging (fMRI) semantic encoding and recognition task. To isolate brain regions activated during each semantic task, we created statistical activation maps in which brain activation was greater for word stimuli than for non-word stimuli (corrected p < 0.05). RESULTS: During the encoding phase, ecstasy polydrug users had greater activation during semantic encoding bilaterally in language processing regions, including Brodmann areas 7, 39, and 40. Of this bilateral activation, signal intensity with a peak T in the right superior parietal lobe was correlated with lifetime ecstasy use (r(s) = 0.43, p = 0.042). Behavioral performance did not differ between groups. CONCLUSIONS: These findings demonstrate that ecstasy polydrug users have increased brain activation during semantic processing. This increase in brain activation in the absence of behavioral deficits suggests that ecstasy polydrug users have reduced cortical efficiency during semantic encoding, possibly secondary to MDMA-induced 5-HT neurotoxicity. Although pre-existing differences cannot be ruled out, this suggests the possibility of a compensatory mechanism allowing ecstasy polydrug users to perform equivalently to controls, providing additional support for an association of altered cerebral neurophysiology with MDMA exposure.
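The "word > non-word" contrast described in the methods can be sketched schematically. The snippet below is a toy illustration only, with entirely hypothetical voxel data; the actual analysis fits a general linear model and applies corrected p < 0.05 thresholds, which this deliberately does not attempt.

```python
# Toy sketch of a "word > non-word" contrast map: per-voxel difference of
# mean signal, keeping only voxels whose difference exceeds a threshold.
# Real fMRI analyses use a GLM with corrected statistical thresholds;
# this only illustrates the contrast idea with hypothetical data.

def contrast_map(word_signal, nonword_signal, threshold):
    """word_signal / nonword_signal: dicts mapping voxel id -> list of samples."""
    mean = lambda xs: sum(xs) / len(xs)
    diff = {v: mean(word_signal[v]) - mean(nonword_signal[v])
            for v in word_signal}
    return {v: d for v, d in diff.items() if d > threshold}

# Hypothetical voxels: v1 responds more to words, v2 does not.
words = {"v1": [1.2, 1.4, 1.3], "v2": [0.9, 1.0, 1.1]}
nonwords = {"v1": [0.8, 0.9, 1.0], "v2": [0.9, 1.1, 1.0]}
print(contrast_map(words, nonwords, threshold=0.2))  # only "v1" survives
```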


Subject(s)
Brain/drug effects , Illicit Drugs/adverse effects , Memory/drug effects , N-Methyl-3,4-methylenedioxyamphetamine/adverse effects , Psychomotor Performance/drug effects , Semantics , Adolescent , Adult , Brain/metabolism , Female , Humans , Magnetic Resonance Imaging/methods , Male , Memory/physiology , N-Methyl-3,4-methylenedioxyamphetamine/administration & dosage , Prospective Studies , Psychomotor Performance/physiology , Young Adult
17.
J Neurodev Disord ; 4(1): 9, 2012 May 17.
Article in English | MEDLINE | ID: mdl-22958533

ABSTRACT

BACKGROUND: One hypothesis for the social deficits that characterize autism spectrum disorders (ASD) is diminished neural reward response to social interaction and attachment. Prior research using established monetary reward paradigms as a test of non-social reward to compare with social reward may involve confounds in the ability of individuals with ASD to utilize symbolic representation of money and the abstraction required to interpret monetary gains. Thus, a useful addition to our understanding of neural reward circuitry in ASD includes a characterization of the neural response to primary rewards. METHOD: We asked 17 children with ASD and 18 children without ASD to abstain from eating for at least four hours before an MRI scan in which they viewed images of high-calorie foods. We assessed the neural reward network for increases in the blood oxygenation level dependent (BOLD) signal in response to the food images. RESULTS: We found very similar patterns of increased BOLD signal to these images in the two groups; both groups showed increased BOLD signal in the bilateral amygdala, as well as in the nucleus accumbens, orbitofrontal cortex, and insula. Direct group comparisons revealed that the ASD group showed a stronger response to food cues in bilateral insula along the anterior-posterior gradient and in the anterior cingulate cortex than the control group, whereas there were no neural reward regions that showed higher activation for controls than for ASD. CONCLUSION: These results suggest that neural response to primary rewards is not diminished but in fact shows an aberrant enhancement in children with ASD.

18.
Arch Gen Psychiatry ; 69(4): 399-409, 2012 Apr.
Article in English | MEDLINE | ID: mdl-22147810

ABSTRACT

CONTEXT: MDMA (3,4-methylenedioxymethamphetamine, also popularly known as "ecstasy") is a popular recreational drug that produces loss of serotonin axons in animal models. Whether MDMA produces chronic reductions in serotonin signaling in humans remains controversial. OBJECTIVE: To determine whether MDMA use is associated with chronic reductions in serotonin signaling in the cerebral cortex of women as reflected by increased serotonin(2A) receptor levels. DESIGN: Cross-sectional case-control study comparing serotonin(2A) receptor levels in abstinent female MDMA polydrug users with those in women who did not use MDMA (within-group design assessing the association of lifetime MDMA use and serotonin(2A) receptors). Case participants were abstinent from MDMA use for at least 90 days as verified by analysis of hair samples. The serotonin(2A) receptor levels in the cerebral cortex were determined using serotonin(2A)-specific positron emission tomography with radioligand fluorine 18-labeled setoperone as the tracer. SETTING: Academic medical center research laboratory. PARTICIPANTS: A total of 14 female MDMA users and 10 women who did not use MDMA (controls). The main exclusion criteria were nondrug-related DSM-IV Axis I psychiatric disorders and general medical illness. MAIN OUTCOME MEASURES: Cortical serotonin(2A) receptor nondisplaceable binding potential (serotonin(2A)BP(ND)). RESULTS: MDMA users had increased serotonin(2A)BP(ND) in occipital-parietal (19.7%), temporal (20.5%), occipitotemporal-parietal (18.3%), frontal (16.6%), and frontoparietal (18.5%) regions (corrected P < .05). Lifetime MDMA use was positively associated with serotonin(2A)BP(ND) in frontoparietal (β = 0.665; P = .007), occipitotemporal (β = 0.798; P = .002), frontolimbic (β = 0.634; P = .02), and frontal (β = 0.691; P = .008) regions. In contrast, there were no regions in which MDMA use was inversely associated with receptor levels.
There were no statistically significant effects of the duration of MDMA abstinence on serotonin(2A)BP(ND). CONCLUSIONS: The recreational use of MDMA is associated with long-lasting increases in serotonin(2A) receptor density. Serotonin(2A) receptor levels correlate positively with lifetime MDMA use and do not decrease with abstinence. These results suggest that MDMA use produces chronic serotonin neurotoxicity in humans. Given the broad role of serotonin in human brain function, the possibility for therapeutic MDMA use, and the widespread recreational popularity of this drug, these results have critical public health implications.


Subject(s)
Amphetamine-Related Disorders/metabolism , Cerebral Cortex/metabolism , Functional Neuroimaging/psychology , N-Methyl-3,4-methylenedioxyamphetamine/adverse effects , Receptor, Serotonin, 5-HT2A/metabolism , Adolescent , Adult , Amphetamine-Related Disorders/diagnostic imaging , Cerebral Cortex/diagnostic imaging , Female , Fluorine Radioisotopes , Functional Neuroimaging/methods , Humans , Positron-Emission Tomography/methods , Positron-Emission Tomography/psychology , Pyrimidinones , Radioligand Assay/methods , Radioligand Assay/psychology , Serotonin Antagonists , Substance-Related Disorders/diagnostic imaging , Time Factors
19.
Neuropsychopharmacology ; 36(6): 1127-41, 2011 May.
Article in English | MEDLINE | ID: mdl-21326196

ABSTRACT

The serotonergic neurotoxin, 3,4-methylenedioxymethamphetamine (MDMA/Ecstasy), is a highly popular recreational drug. Human recreational MDMA users have neurocognitive and neuropsychiatric impairments, and human neuroimaging data are consistent with animal reports of serotonin neurotoxicity. However, functional neuroimaging studies have not found consistent effects of MDMA on brain neurophysiology in human users. Several lines of evidence suggest that studying MDMA effects in the visual system might reveal the general cortical and subcortical neurophysiological consequences of MDMA use. We used 3 T functional magnetic resonance imaging during visual stimulation to compare visual system lateral geniculate nucleus (LGN) and Brodmann Area (BA) 17 and BA 18 activation in 20 long-abstinent (479.95±580.65 days) MDMA users and 20 non-MDMA user controls. Lifetime quantity of MDMA use was strongly positively correlated with blood oxygenation level-dependent (BOLD) signal intensity in bilateral LGN (r(s)=0.59; p=0.007), BA 17 (r(s)=0.50; p=0.027), and BA 18 (r(s)=0.48; p=0.031), and with the spatial extent of activation in BA 17 (r(s)=0.59; p=0.007) and BA 18 (r(s)=0.55; p=0.013). There were no between-group differences in brain activation in any region, but the heaviest MDMA users showed a significantly greater spatial extent of activation than controls in BA 17 (p=0.031) and BA 18 (p=0.049). These results suggest that human recreational MDMA use may be associated with a long-lasting increase in cortical excitability, possibly through loss of serotonin input to cortical and subcortical regions. When considered in the context of previous results, cortical hyper-excitability may be a biomarker for MDMA-induced serotonin neurotoxicity.
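The lifetime-use correlations above are Spearman rank correlations (the r(s) values). As a minimal sketch of the statistic itself, the snippet below computes Spearman's rho as the Pearson correlation of ranks; the lifetime-use and BOLD figures in it are hypothetical, standing in for the study's actual per-subject measures.

```python
# Minimal Spearman rank-correlation sketch: rank both variables,
# then take the Pearson correlation of the ranks.

def rank(values):
    """Return 1-based ranks, averaging ranks over tied values."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average 1-based rank of the tied run
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman_rho(x, y):
    """Spearman's rho: Pearson correlation computed on the ranks."""
    rx, ry = rank(x), rank(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Hypothetical data: lifetime use (tablets) vs. BOLD signal intensity.
use = [10, 50, 120, 300, 800]
bold = [0.2, 0.5, 0.4, 0.9, 1.1]
print(round(spearman_rho(use, bold), 3))  # 0.9
```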


Subject(s)
Brain/drug effects , Brain/physiopathology , N-Methyl-3,4-methylenedioxyamphetamine/adverse effects , Serotonin Agents/adverse effects , Visual Pathways/drug effects , Visual Pathways/physiopathology , Adolescent , Adult , Brain Mapping/methods , Cerebrovascular Circulation/drug effects , Cerebrovascular Circulation/physiology , Female , Geniculate Bodies/drug effects , Geniculate Bodies/physiopathology , Humans , Magnetic Resonance Imaging/methods , Male , Visual Cortex/drug effects , Visual Cortex/physiopathology , Young Adult
20.
IEEE Trans Biomed Eng ; 57(6): 1285-96, 2010 Jun.
Article in English | MEDLINE | ID: mdl-20172796

ABSTRACT

In this paper, an efficient paradigm is presented to correct for brain shift during tumor resection therapies. For this study, high-resolution preoperative (pre-op) and postoperative (post-op) MR images were acquired for eight in vivo patients, and surface/subsurface shift was identified by manual identification of homologous points between the pre-op and immediate post-op tomograms. Cortical surface deformation data were then used to drive an inverse problem framework. The manually identified subsurface deformations served as a comparison toward validation. The proposed framework recaptured 85% of the mean subsurface shift. This translated to a subsurface shift error of 0.4 +/- 0.4 mm for a measured shift of 3.1 +/- 0.6 mm. The patient's pre-op tomograms were also deformed volumetrically using displacements predicted by the model. The results allow a preliminary evaluation of the correction, both quantitatively and visually. While intraoperative (intra-op) MR imaging data would be optimal, the extent of shift measured from pre- to post-op MR was comparable to clinical conditions. This study demonstrates the accuracy of the proposed framework in predicting full-volume displacements from sparse shift measurements. It also shows that the proposed framework can be extended and used to update pre-op images on a time scale that is compatible with surgery.
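The "percent of shift recaptured" figure is, in essence, the fraction of the measured shift accounted for by the model's predictions at the homologous points. A minimal sketch of that arithmetic follows, using hypothetical scalar shift magnitudes; the actual study worked with 3-D displacement vectors, so this is an illustration of the bookkeeping only, not the paper's method.

```python
# Hypothetical sketch of the shift-recapture arithmetic: compare measured
# shift magnitudes at homologous points against model-predicted magnitudes.
# The real data are 3-D displacement vectors; scalars keep this simple.

def percent_recaptured(measured_mm, predicted_mm):
    """Fraction of total measured shift accounted for by the model."""
    residual = [abs(m - p) for m, p in zip(measured_mm, predicted_mm)]
    return 1.0 - sum(residual) / sum(measured_mm)

measured = [3.0, 3.4, 2.9, 3.1]   # hypothetical measured shifts (mm)
predicted = [2.7, 3.0, 2.6, 2.8]  # hypothetical model predictions (mm)
print(round(percent_recaptured(measured, predicted), 2))  # 0.9
```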


Subject(s)
Artifacts , Brain Neoplasms/pathology , Brain Neoplasms/surgery , Image Interpretation, Computer-Assisted/methods , Magnetic Resonance Imaging/methods , Subtraction Technique , Surgery, Computer-Assisted/methods , Algorithms , Female , Humans , Image Enhancement/methods , Male , Middle Aged , Neurosurgical Procedures/methods , Pattern Recognition, Automated/methods , Postoperative Care/methods , Preoperative Care/methods , Reproducibility of Results , Sensitivity and Specificity