Results 1 - 11 of 11
1.
Proc Natl Acad Sci U S A; 121(12): e2317078121, 2024 Mar 19.
Article in English | MEDLINE | ID: mdl-38466848

ABSTRACT

Covalent bonding interactions determine the energy-momentum (E-k) dispersion (band structure) of solid-state materials. Here, we show that noncovalent interactions can modulate the E-k dispersion near the Fermi level of a low-dimensional nanoscale conductor. We demonstrate that low energy band gaps may be opened in metallic carbon nanotubes through polymer wrapping of the nanotube surface at fixed helical periodicity. Electronic spectral, chiroptical, potentiometric, electronic device, and work function data corroborate that the magnitude of band gap opening depends on the nature of the polymer electronic structure. Polymer dewrapping reverses the conducting-to-semiconducting phase transition, restoring the native metallic carbon nanotube electronic structure. These results address a long-standing challenge to develop carbon nanotube electronic structures that are not realized through disruption of π conjugation, and establish a roadmap for designing and tuning specialized semiconductors that feature band gaps on the order of a few hundred meV.

2.
ACS Appl Mater Interfaces; 15(1): 984-996, 2023 Jan 11.
Article in English | MEDLINE | ID: mdl-36548441

ABSTRACT

A sonochemical-based hydrosilylation method was employed to covalently attach a rhenium tricarbonyl phenanthroline complex to silicon(111). fac-Re(5-(p-styrene)-phen)(CO)3Cl (5-(p-styrene)-phen = 5-(4-vinylphenyl)-1,10-phenanthroline) was reacted with hydrogen-terminated silicon(111) in an ultrasonic bath to generate a hybrid photoelectrode. Subsequent reaction with 1-hexene enabled functionalization of the remaining atop Si sites. Attenuated total reflectance-Fourier transform infrared spectroscopy confirms attachment of the organometallic complex to silicon without degradation of the organometallic core, supporting hydrosilylation as a strategy for installing coordination complexes that retain their molecular integrity. Detection of Re(I) and nitrogen by X-ray photoelectron spectroscopy (XPS) further supports immobilization of fac-Re(5-(p-styrene)-phen)(CO)3Cl. Cyclic voltammetry and electrochemical impedance spectroscopy under white light illumination indicate that fac-Re(5-(p-styrene)-phen)(CO)3Cl undergoes two electron reductions. Mott-Schottky analysis indicates that the flat band potential is 239 mV more positive for p-Si(111) co-functionalized with both fac-Re(5-(p-styrene)-phen)(CO)3Cl and 1-hexene than when functionalized with 1-hexene alone. XPS, ultraviolet photoelectron spectroscopy, and Mott-Schottky analysis show that functionalization with fac-Re(5-(p-styrene)-phen)(CO)3Cl and 1-hexene introduces a negative interfacial dipole, facilitating reductive photoelectrochemistry.

3.
Am J Surg; 219(3): 400-403, 2020 Mar.
Article in English | MEDLINE | ID: mdl-31910990

ABSTRACT

BACKGROUND: Geriatric patients, age ≥65, frequently require no operation and only short observation after injury; yet many are prescribed opioids. We reviewed geriatric opioid prescriptions following a statewide outpatient prescribing limit. METHODS: Discharge and 30-day pain prescriptions were collected for geriatric patients managed without operation and with stays less than two midnights from May and June of 2015 through 2018. Patients were compared pre- and post-limit and with a non-geriatric cohort aged 18-64. Fall risk was also assessed. RESULTS: We included 218 geriatric patients, 57 post-limit. Patients received fewer discharge prescriptions and lower doses following the limit. However, this trend preceded the limit. Geriatric patients received fewer opioid prescriptions but higher doses than non-geriatric patients. Fall risk was not associated with reduced prescription frequency or doses. CONCLUSIONS: Opioid prescribing has decreased for geriatric patients with minor injuries. However, surgeons have not reduced dosage based on age or fall risk.


Subject(s)
Analgesics, Opioid/therapeutic use; Drug Prescriptions/statistics & numerical data; Pain Management; Practice Patterns, Physicians'/statistics & numerical data; Wounds and Injuries/drug therapy; Aged; Aged, 80 and over; Female; Humans; Male; Ohio; Retrospective Studies
4.
Surgery; 166(4): 593-600, 2019 Oct.
Article in English | MEDLINE | ID: mdl-31326187

ABSTRACT

BACKGROUND: Opioid-prescribing practices for minimally injured trauma patients are unknown. We hypothesized that opioid-prescribing frequency and morphine-equivalent doses prescribed have decreased in recent years, specifically surrounding an acute prescribing limit implemented in August 2017 mandating opioid prescriptions not exceed 210 morphine-equivalent doses. METHODS: A single-center retrospective study was performed in the month of May during the years 2015 to 2018 on minimally injured trauma patients in a level I trauma center. Minimally injured trauma patients included patients discharged within 2 midnights of trauma evaluation without surgical intervention. Primary outcomes were discharge opioid-prescribing frequency and dosing in morphine-equivalent doses. Secondary outcomes were occurrence and timing of postdischarge follow-up. RESULTS: For 673 minimally injured trauma patients, opioid-prescribing frequency and morphine-equivalent doses prescribed decreased between 2015 and 2017 (49.3% to 31.5%, P = .006, mean 229 to 146 morphine-equivalent doses, P = .007). Decreases between 2017 and 2018 were not statistically significant. Acute prescribing limit compliance was 97% in 2018. After the acute prescribing limit was implemented, outpatient opioid prescribing did not increase and time to earliest follow-up did not decrease. CONCLUSION: Opioid-prescribing frequency and morphine-equivalent doses prescribed to minimally injured trauma patients decreased dramatically between 2015 and 2018. These changes occurred primarily before the implementation of an acute prescribing limit; however, incremental improvement and high compliance since implementation are demonstrated. Patients did not have significantly earlier follow-up encounters for pain or additional opioid prescriptions. Prospective research on pain control for minimally injured trauma patients is needed.


Subject(s)
Analgesics, Opioid/administration & dosage; Drug Utilization/legislation & jurisprudence; Opioid-Related Disorders/prevention & control; Practice Patterns, Physicians'/legislation & jurisprudence; Wounds and Injuries/drug therapy; Cohort Studies; Continuity of Patient Care; Dose-Response Relationship, Drug; Drug Administration Schedule; Drug Prescriptions/statistics & numerical data; Female; Humans; Injury Severity Score; Male; Needs Assessment; Pain Management; Patient Discharge; Retrospective Studies; Trauma Centers; United States; Wounds and Injuries/diagnosis
5.
Dis Colon Rectum; 61(1): 115-123, 2018 Jan.
Article in English | MEDLINE | ID: mdl-29219921

ABSTRACT

BACKGROUND: Disparities in access to colorectal cancer care are multifactorial and are affected by socioeconomic elements. Uninsured and Medicaid patients present with advanced stage disease and have worse outcomes compared with similar privately insured patients. Safety net hospitals are a major care provider to this vulnerable population. Few studies have evaluated outcomes for safety net hospitals compared with private institutions in colorectal cancer. OBJECTIVE: The purpose of this study was to compare demographics, screening rates, presentation stage, and survival rates between a safety net hospital and a tertiary care center. DESIGN: A comparative review of patients at 2 institutions in the same metropolitan area was conducted. SETTINGS: The study included colorectal cancer care delivered at either 1 safety net hospital or 1 private tertiary care center in the same city from 2010 to 2016. PATIENTS: A total of 350 patients with colorectal cancer from each hospital were evaluated. MAIN OUTCOME MEASURES: Overall survival across hospital systems was measured. RESULTS: The safety net hospital had significantly more uninsured and Medicaid patients (46% vs 13%; p < 0.001) and a significantly lower median household income than the tertiary care center ($39,299 vs $49,741; p < 0.0001). At initial presentation, a similar percentage of patients at each hospital presented with stage IV disease (26% vs 20%; p = 0.06). For those undergoing resection, final pathologic stage distribution was similar across groups (p = 0.10). After a comparable median follow-up period (26.6 mo for the safety net hospital vs 29.2 mo for the tertiary care center), the log-rank test for overall survival favored the safety net hospital (p = 0.05); disease-free survival was similar between hospitals (p = 0.40). LIMITATIONS: This was a retrospective review, with data drawn from medical charts.
CONCLUSIONS: Our results support the value of safety net hospitals for providing quality colorectal cancer care, with survival and recurrence outcomes equivalent or improved compared with a local tertiary care center. Because safety net hospitals can provide equivalent outcomes despite socioeconomic inequalities and financial constraints, emphasis should be focused on ensuring that adequate funding for these institutions continues. See Video Abstract at http://links.lww.com/DCR/A454.


Subject(s)
Colorectal Neoplasms/diagnosis; Colorectal Neoplasms/therapy; Healthcare Disparities/statistics & numerical data; Safety-net Providers/standards; Tertiary Care Centers/standards; Colorectal Neoplasms/mortality; Health Services Accessibility/statistics & numerical data; Humans; Medicaid/statistics & numerical data; Medically Uninsured/statistics & numerical data; Quality of Health Care; Retrospective Studies; Safety-net Providers/statistics & numerical data; Survival Analysis; Tertiary Care Centers/statistics & numerical data; United States/epidemiology
6.
Transfusion; 55(4): 703-7, 2015 Apr.
Article in English | MEDLINE | ID: mdl-25565577

ABSTRACT

A national recipient hemovigilance system was introduced in the United States in 2010, when voluntary enrollment began as part of the National Healthcare Safety Network (NHSN) Hemovigilance Module. NHSN is a secure, Web-based surveillance system operated by the Centers for Disease Control and Prevention and used by US health care facilities to report a variety of patient safety information. The Hemovigilance Module is used for comprehensive monitoring of transfusion-related adverse events. Participating facilities can use the module's analytic tools to identify opportunities for enhancing transfusion safety, evaluate the effectiveness of interventions, and compare facility-specific transfusion-related data with aggregate national estimates. Facilities may voluntarily share data with external partners for patient safety improvement initiatives and to fulfill reporting mandates. We describe the key characteristics of the Hemovigilance Module, highlight the benefits for participating facilities, and discuss the use of reported data for establishing national estimates of transfusion-associated adverse events. National hemovigilance systems are essential for recognizing gaps in transfusion safety and identifying opportunities for interventions that improve patient safety and outcomes.


Subject(s)
Blood Safety; Centers for Disease Control and Prevention, U.S./organization & administration; Blood Banks/standards; Blood Transfusion/statistics & numerical data; Health Care Surveys; Humans; Internet; Quality Improvement; Research Design; Transfusion Reaction; United States; Vocabulary, Controlled
7.
Transfusion; 55(4): 709-18, 2015 Apr.
Article in English | MEDLINE | ID: mdl-25371300

ABSTRACT

BACKGROUND: In 2010, health care facilities in the United States began voluntary enrollment in the National Healthcare Safety Network (NHSN) Hemovigilance Module. Participants report transfusion practices; red blood cell, platelet (PLT), plasma, and cryoprecipitate units transfused; and transfusion-related adverse reactions and process errors to the Centers for Disease Control and Prevention through a secure, Internet-accessible surveillance application available to transfusing facilities. STUDY DESIGN AND METHODS: Facilities submitting at least 1 month of transfused components data and adverse reactions from January 1, 2010, to December 31, 2012, were included in this analysis. Adverse reaction rates for transfused components, stratified by component type and collection and modification methods, were calculated. RESULTS: In 2010 to 2012, a total of 77 facilities reported 5136 adverse reactions among 2,144,723 components transfused (239.5/100,000). Allergic (46.8%) and febrile nonhemolytic (36.1%) reactions were most frequent; 7.2% of all reactions were severe or life-threatening and 0.1% were fatal. PLT transfusions (421.7/100,000) had the highest adverse reaction rate. CONCLUSION: Adverse transfusion reaction rates from the NHSN Hemovigilance Module in the United States are comparable to early hemovigilance reporting from other countries. Although severe reactions are infrequent, the numbers of transfusion reactions in US hospitals suggest that interventions to prevent these reactions are important for patient safety. Further investigation is needed to understand the apparent increased risk of reactions from apheresis-derived blood components. Comprehensive evaluation, including data validation, is important to continued refinement of the module.


Subject(s)
Blood Safety; Transfusion Reaction; Acute Lung Injury/epidemiology; Acute Lung Injury/etiology; Blood Component Transfusion/adverse effects; Blood Component Transfusion/methods; Blood Component Transfusion/statistics & numerical data; Blood Transfusion/methods; Blood Transfusion/statistics & numerical data; Centers for Disease Control and Prevention, U.S.; Communicable Diseases/epidemiology; Communicable Diseases/transmission; Data Collection; Humans; Leukocyte Reduction Procedures; Retrospective Studies; Transfusion Reaction/epidemiology; United States
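As a quick arithmetic check, the overall adverse-reaction rate quoted in this abstract (239.5 per 100,000) follows directly from the counts it reports:

```python
# Recomputing the overall adverse-reaction rate from the abstract's counts:
# 5136 reactions among 2,144,723 transfused components, expressed per 100,000.
reactions = 5136
components = 2_144_723
rate_per_100k = reactions / components * 100_000
print(round(rate_per_100k, 1))  # 239.5, matching the reported rate
```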
8.
Environ Toxicol Chem; 31(2): 402-7, 2012 Feb.
Article in English | MEDLINE | ID: mdl-22102175

ABSTRACT

Potential microbial activities are commonly used to assess soil toxicity of petroleum hydrocarbons (PHC) and are assumed to be a surrogate for microbial activity within the soil ecosystem. However, this assumption needs to be evaluated for frozen soil, in which microbial activity is limited by liquid water (θ(liquid)). Influence of θ(liquid) on in situ toxicity was evaluated and compared to the toxicity endpoints of potential microbial activities using soil from an aged diesel fuel spill at Casey Station, East Antarctica. To determine in situ toxicity, gross mineralization and nitrification rates were determined by the stable isotope dilution technique. Petroleum hydrocarbon-contaminated soil (0-8,000 mg kg(-1)), packed at bulk densities of 1.4, 1.7, and 2.0 g cm(-3) to manipulate liquid water content, was incubated at -5°C for one, two, and three months. Although θ(liquid) did not have a significant effect on gross mineralization or nitrification, gross nitrification was sensitive to PHC contamination, with toxicity decreasing over time. In contrast, gross mineralization was not sensitive to PHC contamination. Toxic response of gross nitrification was comparable to potential nitrification activity (PNA) with similar EC25 (effective concentration causing a 25% effect in the test population) values determined by both measurement endpoints (400 mg kg(-1) for gross nitrification compared to 200 mg kg(-1) for PNA), indicating that potential microbial activity assays are good surrogates for in situ toxicity of PHC contamination in polar regions.


Subject(s)
Hydrocarbons/toxicity; Petroleum/toxicity; Soil Pollutants/toxicity; Soil/chemistry; Toxicity Tests/methods; Antarctic Regions; Ecosystem; Gasoline/analysis; Gasoline/toxicity; Humans; Hydrocarbons/analysis; Nitrification; Petroleum/analysis; Soil Microbiology; Soil Pollutants/analysis
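The EC25 endpoint defined in this abstract (the effective concentration causing a 25% effect in the test population) can be sketched numerically. The function and data below are purely illustrative, assuming a hypothetical dose-response series rather than the study's actual measurements:

```python
def ec25(concs, responses):
    """Concentration at which the response falls to 75% of the
    control (zero-concentration) response, by linear interpolation."""
    control = responses[0]
    target = 0.75 * control
    for i in range(len(concs) - 1):
        c0, c1 = concs[i], concs[i + 1]
        r0, r1 = responses[i], responses[i + 1]
        if r0 >= target >= r1:  # the 25%-effect level is crossed here
            return c0 + (r0 - target) * (c1 - c0) / (r0 - r1)
    return None  # a 25% effect was never reached in the tested range

# Hypothetical gross nitrification rates (arbitrary units) vs. diesel (mg/kg)
concs = [0, 100, 200, 400, 800, 8000]
rates = [10.0, 9.5, 8.6, 7.0, 4.0, 0.5]
print(ec25(concs, rates))
```

Real toxicity studies typically fit a dose-response model rather than interpolating linearly, but the interpolation conveys the definition of the endpoint.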
9.
Environ Toxicol Chem; 31(2): 395-401, 2012 Feb.
Article in English | MEDLINE | ID: mdl-22102214

ABSTRACT

Bioremediation has been used to remediate petroleum hydrocarbon (PHC)-contaminated sites in polar regions; however, limited knowledge exists about how frozen conditions influence the factors that regulate microbial activity. We hypothesized that increased liquid water (θ(liquid)) would affect nutrient supply rates (NSR) and gas diffusion under frozen conditions. If true, management practices that increase θ(liquid) should also increase bioremediation in polar soils by reducing nutrient and oxygen limitations. Influence of θ(liquid) on NSR was determined using diesel-contaminated soil (0-8,000 mg kg(-1)) from Casey Station, Antarctica. The θ(liquid) was altered between 0.007 and 0.035 cm(3) cm(-3) by packing soil cores at different bulk densities. The nutrient supply rates of NH4+ and NO3-, as well as the gas diffusion coefficient, D(s), were measured at two temperatures, 21°C and -5°C, to correct for bulk density effects. Freezing decreased the NSR of both NH4+ and NO3-, with θ(liquid) linked to nitrate and ammonium NSR in frozen soil. Similarly for D(s), decreases due to freezing were much more pronounced in soils with low θ(liquid) than in soils with higher θ(liquid) contents. Additional studies are needed to determine the relationship between degradation rates and θ(liquid) under frozen conditions.


Subject(s)
Hydrocarbons/analysis; Petroleum/analysis; Soil Pollutants/analysis; Soil/chemistry; Antarctic Regions; Biodegradation, Environmental; Diffusion; Environmental Monitoring; Freezing; Hydrocarbons/chemistry; Hydrocarbons/metabolism; Nitrates/analysis; Nitrates/chemistry; Petroleum/metabolism; Soil Pollutants/chemistry; Soil Pollutants/metabolism; Temperature
10.
J Am Diet Assoc; 111(6): 858-63, 2011 Jun.
Article in English | MEDLINE | ID: mdl-21616198

ABSTRACT

The transmission of bovine spongiform encephalopathy (BSE) to human beings and the spread of chronic wasting disease (CWD) among cervids have prompted concerns about zoonotic transmission of prion diseases. Travel to the United Kingdom and other European countries, hunting for deer or elk, and venison consumption could result in the exposure of US residents to the agents that cause BSE and CWD. The Foodborne Diseases Active Surveillance Network 2006-2007 population survey was used to assess the prevalence of these behaviors among residents of 10 catchment areas across the United States. Of 17,372 survey respondents, 19.4% reported travel to the United Kingdom since 1980, and 29.5% reported travel to any of the nine European countries considered to be BSE-endemic since 1980. The proportion of respondents who had ever hunted deer or elk was 18.5%, and 1.2% had hunted deer or elk in a CWD-endemic area. More than two thirds (67.4%) reported having ever eaten deer or elk meat. Respondents who traveled spent more time in the United Kingdom (median 14 days) than in any other BSE-endemic country. Of the 11,635 respondents who had consumed venison, 59.8% ate venison at most one to two times during their year of highest consumption, and 88.6% had obtained all of their meat from the wild. The survey results were useful in determining the prevalence and frequency of behaviors that could be important factors for foodborne prion transmission.


Subject(s)
Food Contamination; Population Surveillance; Prion Diseases/transmission; Prion Diseases/veterinary; Zoonoses; Adolescent; Adult; Aged; Animals; Animals, Domestic; Animals, Wild; Catchment Area, Health; Cattle; Child; Child, Preschool; Deer; Encephalopathy, Bovine Spongiform/epidemiology; Encephalopathy, Bovine Spongiform/transmission; Europe; Female; Humans; Infant; Infant, Newborn; Male; Meat; Middle Aged; Prevalence; Prion Diseases/epidemiology; Risk Factors; Travel; United States; Wasting Disease, Chronic/epidemiology; Wasting Disease, Chronic/transmission; Young Adult
11.
Am J Trop Med Hyg; 83(1): 174-82, 2010 Jul.
Article in English | MEDLINE | ID: mdl-20595498

ABSTRACT

Rocky Mountain spotted fever (RMSF), a potentially fatal tick-borne infection caused by Rickettsia rickettsii, is considered a notifiable condition in the United States. During 2000 to 2007, the annual reported incidence of RMSF increased from 1.7 to 7 cases per million persons, the highest rate ever recorded. American Indians had a significantly higher incidence than other race groups. Children 5-9 years of age appeared at highest risk for fatal outcome. Enzyme-linked immunosorbent assays became more widely available beginning in 2004 and were used to diagnose 38% of cases during 2005-2007. The proportion of cases classified as confirmed RMSF decreased from 15% in 2000 to 4% in 2007. Concomitantly, case fatality decreased from 2.2% to 0.3%. The decreasing proportion of confirmed cases and cases with fatal outcome suggests that changes in diagnostic and surveillance practices may be influencing the observed increase in reported incidence rates.


Subject(s)
Incidence; Rickettsia rickettsii; Rocky Mountain Spotted Fever/epidemiology; Animals; Child; Enzyme-Linked Immunosorbent Assay/methods; Enzyme-Linked Immunosorbent Assay/trends; Humans; Surveys and Questionnaires; United States/epidemiology