1.
J Emerg Med ; 61(6): 711-719, 2021 Dec.
Article in English | MEDLINE | ID: mdl-34654586

ABSTRACT

BACKGROUND: Although there is some support for visual estimation (VE) as an accurate method of estimating left ventricular ejection fraction (LVEF), it is also scrutinized for its subjectivity. Therefore, more objective assessments, such as fractional shortening (FS) or E-point septal separation (EPSS), may be useful in estimating LVEF among patients in the emergency department (ED). OBJECTIVE: Our aim was to compare the real-world accuracy of VE, FS, and EPSS, using a sample of point-of-care transthoracic echocardiography (POC-TTE) images acquired by emergency physicians (EPs), against the gold standard of Simpson's method of discs as measured by comprehensive cardiology-performed echocardiography. METHODS: We conducted a single-site prospective observational study comparing VE, FS, and EPSS for assessing LVEF. Adult patients in the ED receiving both POC-TTE and comprehensive cardiology TTE were included. EPs acquired POC-TTE images and videos that were then interpreted by 2 blinded EPs fellowship-trained in emergency ultrasound, who estimated LVEF using VE, FS, and EPSS. The primary outcome was accuracy. RESULTS: Between April and May 2018, 125 patients were enrolled and 113 were included in the final analysis. Compared with the gold standard, EP1 and EP2 had a κ of 0.94 (95% confidence interval [CI] 0.87-1.00) and 0.97 (95% CI 0.91-1.00), respectively, for VE; a κ of 0.40 (95% CI 0.23-0.57) and 0.38 (95% CI 0.18-0.57), respectively, for EPSS; and a κ of 0.70 (95% CI 0.54-0.85) and 0.66 (95% CI 0.50-0.81), respectively, for FS. Sensitivity for severe dysfunction was moderate to high with VE (EP1 85% and EP2 93%), poor to moderate with FS (EP1 73% and EP2 50%), and poor with EPSS (EP1 11% and EP2 18%). CONCLUSIONS: Using a real-world sample of POC-TTE images, the quantitative measurements of EPSS and FS demonstrated poor accuracy in estimating LVEF, even among experienced sonographers, and these methods should not be used to determine cardiac function in the ED. VE by experienced physicians demonstrated reliable accuracy for estimating LVEF compared with the gold standard of cardiology-performed TTE.
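As a rough illustration of the agreement statistic reported above, the sketch below computes Cohen's κ with a percentile-bootstrap 95% CI for a physician's categorical LVEF estimate against a gold-standard category. The category coding and arrays are hypothetical placeholders, not study data; NumPy and scikit-learn are assumed to be available.

```python
# Sketch only: Cohen's kappa for agreement between an EP's categorical LVEF
# estimate and the cardiology gold standard, with a bootstrap 95% CI.
# All values below are hypothetical placeholders, not data from the study.
import numpy as np
from sklearn.metrics import cohen_kappa_score

rng = np.random.default_rng(0)

# Hypothetical LVEF category per patient:
# 0 = normal, 1 = moderately reduced, 2 = severely reduced
gold  = np.array([0, 0, 1, 2, 1, 0, 2, 2, 1, 0, 2, 1, 0, 0, 2])
ep_ve = np.array([0, 0, 1, 2, 1, 0, 2, 1, 1, 0, 2, 1, 0, 1, 2])

kappa = cohen_kappa_score(gold, ep_ve)

# Percentile bootstrap for the 95% confidence interval
boot = []
n = len(gold)
for _ in range(2000):
    idx = rng.integers(0, n, n)
    k = cohen_kappa_score(gold[idx], ep_ve[idx])
    if not np.isnan(k):          # guard against degenerate resamples
        boot.append(k)
ci_low, ci_high = np.percentile(boot, [2.5, 97.5])

print(f"kappa = {kappa:.2f} (95% CI {ci_low:.2f}-{ci_high:.2f})")
```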


Subject(s)
Physicians; Ventricular Function, Left; Adult; Echocardiography; Humans; Prospective Studies; Stroke Volume
2.
Proc Natl Acad Sci U S A ; 118(37), 2021 Sep 14.
Article in English | MEDLINE | ID: mdl-34497124

ABSTRACT

While model order reduction is a promising approach in dealing with multiscale time-dependent systems that are too large or too expensive to simulate for long times, the resulting reduced order models can suffer from instabilities. We have recently developed a time-dependent renormalization approach to stabilize such reduced models. In the current work, we extend this framework by introducing a parameter that controls the time decay of the memory of such models and optimally select this parameter based on limited fully resolved simulations. First, we demonstrate our framework on the inviscid Burgers equation whose solution develops a finite-time singularity. Our renormalized reduced order models are stable and accurate for long times while using for their calibration only data from a full order simulation before the occurrence of the singularity. Furthermore, we apply this framework to the three-dimensional (3D) Euler equations of incompressible fluid flow, where the problem of finite-time singularity formation is still open and where brute force simulation is only feasible for short times. Our approach allows us to obtain a perturbatively renormalizable model which is stable for long times and includes all the complex effects present in the 3D Euler dynamics. We find that, in each application, the renormalization coefficients display algebraic decay with increasing resolution and that the parameter which controls the time decay of the memory is problem-dependent.
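As a small illustration of the closing observation about algebraic decay, the sketch below fits a power law alpha(N) ≈ C·N^(−γ) to a set of coefficients versus resolution by least squares in log-log space. The coefficient values are made-up placeholders, not numbers from the paper; only NumPy is assumed.

```python
# Sketch only: checking for algebraic (power-law) decay of coefficients
# alpha(N) with resolution N by fitting log(alpha) against log(N).
# The coefficient values below are illustrative placeholders.
import numpy as np

N = np.array([16, 32, 64, 128, 256], dtype=float)       # resolutions
alpha = np.array([0.80, 0.42, 0.22, 0.115, 0.060])      # hypothetical coefficients

# Fit alpha(N) ~ C * N**(-gamma) in log-log space
slope, intercept = np.polyfit(np.log(N), np.log(alpha), 1)
gamma, C = -slope, np.exp(intercept)
print(f"alpha(N) ≈ {C:.3f} * N^(-{gamma:.3f})")
```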

3.
Cureus ; 12(2): e6917, 2020 Feb 07.
Article in English | MEDLINE | ID: mdl-32190472

ABSTRACT

Thrombophlebitis of a subcutaneous vein, known as Mondor's disease, is a rare cause of chest pain and can mimic several more life-threatening diseases. Mondor's disease can be caused by trauma or hypercoagulable states; however, in many cases the etiology is unknown. It is usually self-limited and can be managed conservatively. In this case report, we highlight a 52-year-old male patient who presented to our emergency department with chest pain caused by Mondor's disease mimicking a pulmonary embolism. Although a rare and benign diagnosis, Mondor's disease should be part of the differential diagnosis of chest pain, and the diagnosis can be made on the basis of a thorough history and physical examination alone. Recognition of Mondor's disease could reduce the costs and risks of further testing for patients presenting with chest pain.

4.
Ann Emerg Med ; 76(2): 134-142, 2020 Aug.
Article in English | MEDLINE | ID: mdl-31955940

ABSTRACT

STUDY OBJECTIVE: Ultrasonographically guided intravenous peripheral catheters have dismal dwell time, with most intravenous lines failing before completion of therapy. Catheter length in the vein is directly related to catheter longevity. We investigate the survival of an ultralong ultrasonographically guided intravenous peripheral catheter compared with a standard long one. METHODS: We conducted a single-site, nonblinded, randomized trial of catheter survival. Adult patients presenting to the emergency department with difficult vascular access were recruited and randomized to receive either standard long, 4.78-cm, 20-gauge ultrasonographically guided intravenous peripheral catheters or ultralong, 6.35-cm, 20-gauge ultrasonographically guided intravenous peripheral catheters. The primary outcome was duration of catheter survival. The secondary outcome was the optimal length of the catheter in the vein to maximize survival. Additional intravenous-related endpoints included first-stick success, time to insertion, number of attempts, thrombosis, and infection. RESULTS: Between October 2018 and March 2019, 257 patients were randomized, with 126 in the standard long ultrasonographically guided intravenous peripheral catheter group and 131 in the ultralong group. Kaplan-Meier estimate of catheter median survival time in the ultralong group was 136 hours (95% confidence interval [CI] 116 to 311 hours) compared with 92 hours (95% CI 71 to 120 hours) in the standard long group, for a difference of 44 hours (95% CI 2 to 218 hours). The optimal catheter length in the vein was 2.75 cm, and intravenous lines with greater than 2.75 cm inserted had a median survival of 129 hours (95% CI 102 to 202 hours) compared with 75 hours (95% CI 52 to 116 hours) for intravenous lines with less than or equal to 2.75 cm, for a difference of 54 hours (95% CI 10 to 134 hours). Insertion characteristics were similar between the groups: 74.1% versus 79.4% first-stick success (95% CI for the difference -2% to 5%), 1.4 versus 1.3 for number of attempts (95% CI for the difference -0.1 to 0.3), and 6.9 versus 5.9 minutes to completion (95% CI for the difference -1.3 to 3.4) with ultralong versus standard long, respectively. There were no cases of infection or thrombosis. CONCLUSION: This study demonstrated increased catheter survival when the ultralong compared with the standard long ultrasonographically guided intravenous peripheral catheter was used, whereas insertion characteristics and safety appeared similar.
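A minimal sketch, assuming the lifelines package, of how Kaplan-Meier median survival times with confidence intervals and a log-rank comparison between two catheter groups could be computed; the duration and censoring arrays below are illustrative placeholders, not trial data.

```python
# Sketch only: Kaplan-Meier median catheter survival per arm and a log-rank
# comparison. Values are illustrative placeholders, not data from the trial.
import numpy as np
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test
from lifelines.utils import median_survival_times

# Hours until catheter failure; event = 1 if the line failed, 0 if censored
# (e.g., therapy completed or patient discharged with the line intact).
hours_ultralong = np.array([150, 136, 90, 200, 310, 60, 170, 140], dtype=float)
event_ultralong = np.array([1, 1, 0, 1, 1, 0, 1, 1])
hours_standard  = np.array([80, 92, 40, 120, 70, 100, 95, 60], dtype=float)
event_standard  = np.array([1, 1, 1, 1, 0, 1, 1, 1])

kmf_u = KaplanMeierFitter().fit(hours_ultralong, event_ultralong, label="ultralong")
kmf_s = KaplanMeierFitter().fit(hours_standard, event_standard, label="standard long")

print("median survival (h):", kmf_u.median_survival_time_, kmf_s.median_survival_time_)
print("95% CI on the ultralong median:\n", median_survival_times(kmf_u.confidence_interval_))

result = logrank_test(hours_ultralong, hours_standard,
                      event_observed_A=event_ultralong,
                      event_observed_B=event_standard)
print("log-rank p =", result.p_value)
```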


Subject(s)
Catheterization, Peripheral/instrumentation; Equipment Design; Vascular Access Devices; Adult; Aged; Catheterization, Peripheral/methods; Female; Humans; Kaplan-Meier Estimate; Male; Middle Aged; Proportional Hazards Models; Surgery, Computer-Assisted; Time Factors; Ultrasonography
5.
West J Emerg Med ; 20(5): 719-725, 2019 Aug 06.
Article in English | MEDLINE | ID: mdl-31539328

ABSTRACT

INTRODUCTION: Peripheral, ultrasound-guided intravenous (IV) access occurs frequently in the emergency department, but certain populations present unique challenges for successfully completing this procedure. Prior research has demonstrated decreased vein compressibility under the double tourniquet technique (DT) compared with a single tourniquet (ST). We hypothesized that catheters inserted under the DT method would have a higher first-stick success rate than those inserted under the ST method. METHODS: We randomized 100 patients with a history of difficult IV access, defined as a prior ultrasound-guided IV, a prior emergency visit requiring two or more attempts at vascular access, a history of IV drug abuse, end-stage renal disease on hemodialysis, or obesity, to ultrasound-guided IV placement under either the DT or ST method. We measured vein characteristics under ultrasound and recorded the number and location of attempts at vascular access. RESULTS: Of an initial 100 patients enrolled, we analyzed a total of 99, with 48 catheters placed under ST and 51 under DT. Attending physicians inserted 41.7% of ST and 41.2% of DT catheters, with non-attending inserters (including residents, nurses, and technicians) placing the remainder. First-stick success was 64.3% in ST and 66.7% in DT (p=0.93). Attendings had a higher overall first-stick success rate (95.1%) than non-attending inserters (65.5%) (p<0.001). The average vein depth was 0.73 centimeters (cm) in ST compared with 0.87 cm in DT (p=0.02). CONCLUSION: The DT technique did not produce a measurable increase in first-stick success rate compared with ST, even after adjusting for the inserter's level of training. However, a significant difference in average vein depth between the study arms may have limited the reliability of our overall results. Future studies controlling for this variable may be needed to compare these two techniques more accurately.
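The first-stick comparison above is the kind of analysis a chi-square test on a 2x2 table supports. The sketch below uses counts reconstructed only approximately from the reported percentages and arm sizes, so they are illustrative rather than exact; SciPy is assumed.

```python
# Sketch only: chi-square test comparing first-stick success between the
# single-tourniquet (ST) and double-tourniquet (DT) arms. Counts are rough
# reconstructions from the reported percentages, not exact study data.
from scipy.stats import chi2_contingency

# rows: ST, DT; columns: first-stick success, failure
table = [[31, 17],   # ~64-65% of 48 ST insertions
         [34, 17]]   # ~66-67% of 51 DT insertions

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.2f}")
```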


Subject(s)
Catheterization, Peripheral/methods; Critical Illness/therapy; Emergency Service, Hospital; Tourniquets; Ultrasonography/methods; Veins/diagnostic imaging; Adult; Aged; Female; Humans; Infusions, Intravenous/methods; Male; Middle Aged; Prospective Studies; Reproducibility of Results
6.
Sci Total Environ ; 613-614: 1104-1116, 2018 Feb 01.
Article in English | MEDLINE | ID: mdl-28954372

ABSTRACT

Through a combined approach using analytical chemistry, real-time quantitative polymerase chain reaction (qPCR), and targeted amplicon sequencing, we studied, at six sites on two sampling dates, the impact of wastewater treatment plant effluent sources on the chemical and microbial population regimes within Wissahickon Creek and its tributary, Sandy Run, in Montgomery County, Pennsylvania, USA. These water bodies contribute flow to the Schuylkill River, one of the major drinking water sources for Philadelphia, Pennsylvania. Effluent was observed to be a significant source of nutrients and of human-associated and non-specific fecal-associated taxa. Alpha diversity increased at locations immediately below effluent outflows, which contributed many taxa involved in wastewater treatment processes and nutrient cycling to the stream's microbial community. Unexpectedly, modeling indicated that microbial community shifts along the stream were not controlled by the concentrations of measured nutrients. Furthermore, partial recovery was observed between effluent sources, in the form of decreasing abundances of nutrients, bacteria associated with wastewater treatment plant processes, nutrient-cycling bacteria, and taxa associated with fecal and sewage sources; we hypothesize that this recovery is controlled by distance from the effluent source. Antecedent moisture conditions affected overall microbial community diversity, with higher diversity occurring after rainfall. Finally, the efficacy of using a subset of the microbial community, including the orders Bifidobacteriales, Bacteroidales, and Clostridiales, to estimate the degree of influence from sewage and fecal sources was explored and verified.
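The alpha-diversity comparison mentioned above is typically based on an index such as Shannon's H'. The sketch below computes it from per-site taxon count vectors; the counts are invented placeholders rather than sequencing data, and only NumPy is assumed.

```python
# Sketch only: Shannon alpha diversity from a vector of taxon counts, the
# kind of per-site summary used when comparing communities above and below
# an effluent outfall. Counts are illustrative placeholders.
import numpy as np

def shannon_diversity(counts):
    """Shannon index H' = -sum(p_i * ln p_i) over taxa with nonzero counts."""
    counts = np.asarray(counts, dtype=float)
    p = counts[counts > 0] / counts.sum()
    return float(-(p * np.log(p)).sum())

upstream = [120, 80, 40, 10, 5]           # hypothetical taxon counts
below_outfall = [60, 55, 50, 45, 40, 30]  # more even community -> higher H'
print(shannon_diversity(upstream), shannon_diversity(below_outfall))
```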

7.
Environ Sci Technol ; 51(22): 13344-13352, 2017 Nov 21.
Article in English | MEDLINE | ID: mdl-29053261

ABSTRACT

Data collected from experiments conducted at flask scale are regularly used as input data for life cycle assessments and techno-economic analyses that predict the potential productivities of large-scale commercial facilities. This study measures and compares nitrogen removal and biomass growth rates in treatment systems that use an algae-bacteria consortium to remediate landfill leachate at three scales: small (0.25 L), medium (100 L), and large (1000 L). The medium- and large-scale vessels were run for 52 consecutive weeks as semibatch reactors under variable environmental conditions. The small-scale experiments were conducted in flasks as batch experiments under controlled environmental conditions. Kolmogorov-Smirnov statistical tests, which compare the distributions of entire data sets, were used to determine whether the ammonia removal, total nitrogen removal, and biomass growth rates at each scale were statistically different. Results from the Kolmogorov-Smirnov comparison indicate that there is a significant difference between all rates determined in the large-scale vessels and those in the small-scale vessels. These results suggest that small-scale experiments may not be appropriate as input data in predictive analyses of full-scale algal processes. The accumulation of nitrite and nitrate within the reactor, observed midway through the experimental process, is attributed to high relative abundances of ammonia- and nitrite-oxidizing bacteria, identified via metagenomic analysis.
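The scale comparison described above rests on the two-sample Kolmogorov-Smirnov test; a minimal sketch with SciPy follows, using synthetic rate values in place of the study's measurements.

```python
# Sketch only: two-sample Kolmogorov-Smirnov test asking whether rate
# distributions measured at two scales differ. Rates are synthetic
# placeholders, not data from the study.
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(1)
rates_small = rng.normal(loc=12.0, scale=2.0, size=30)  # e.g., mg N / L / day
rates_large = rng.normal(loc=9.0, scale=3.0, size=30)

stat, p = ks_2samp(rates_small, rates_large)
print(f"KS statistic = {stat:.3f}, p = {p:.4f}")
```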


Subject(s)
Bioreactors; Nitrogen; Water Pollutants, Chemical; Ammonia; Bacteria; Biomass
8.
Phytopathology ; 105(5): 621-7, 2015 May.
Article in English | MEDLINE | ID: mdl-25901871

ABSTRACT

Wheat streak mosaic virus (WSMV) causes significant yield loss in hard red winter wheat in the U.S. Southern High Plains. Despite the prevalence of this pathogen, little is known about the physiological response of wheat to WSMV infection. A 2-year study was initiated to (i) investigate the effect of WSMV, inoculated at different development stages, on shoot and root growth, water use, water use efficiency (WUE), and photosynthesis and (ii) understand the relationships between yield and photosynthetic parameters during WSMV infection. Two greenhouse experiments were conducted with two wheat cultivars mechanically inoculated with WSMV at developmental stages ranging from three-leaf to booting. Early inoculation, at the three- to five-leaf stage, resulted in a significant reduction in shoot biomass, root dry weight, and yield compared with wheat infected at the jointing and booting stages. However, even when inoculated as late as jointing, WSMV still reduced grain yield by at least 53%. Reduced tillers, shoot biomass, root dry weight, water use, and WUE contributed to yield loss under WSMV infection. However, infection by WSMV did not affect rooting depth or the number of seminal roots, though it reduced the number of nodal roots. Leaf photosynthetic parameters (chlorophyll [SPAD], net photosynthetic rate [Pn], stomatal conductance [Gs], intercellular CO2 concentration [Ci], and transpiration rate [Tr]) were reduced by WSMV infection, and early infection reduced these parameters more than late infection. Photosynthetic parameters had a linear relationship with grain yield and shoot biomass. The reduced Pn under WSMV infection was mainly a response to decreased Gs, Ci, and SPAD. The results of this study indicate that leaf chlorophyll and gas exchange parameters can be used to quantify WSMV effects on biomass and grain yield in wheat.
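The reported linear relationship between photosynthetic parameters and yield can be summarized with an ordinary least-squares fit. The sketch below regresses grain yield on net photosynthetic rate (Pn) using SciPy; the values are placeholders, not the study's measurements.

```python
# Sketch only: linear regression of grain yield against net photosynthetic
# rate (Pn). Values are illustrative placeholders, not study data.
from scipy.stats import linregress

pn = [22.0, 18.5, 15.0, 11.2, 8.4, 5.1]        # µmol CO2 m^-2 s^-1
grain_yield = [5.8, 5.0, 4.1, 3.0, 2.2, 1.1]   # g per plant (hypothetical)

fit = linregress(pn, grain_yield)
print(f"yield ≈ {fit.slope:.2f} * Pn + {fit.intercept:.2f}, r^2 = {fit.rvalue**2:.2f}")
```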


Subject(s)
Plant Diseases/virology; Potyviridae/physiology; Triticum/physiology; Biomass; Chlorophyll/metabolism; Edible Grain/growth & development; Edible Grain/physiology; Edible Grain/virology; Photosynthesis/physiology; Plant Leaves/growth & development; Plant Leaves/physiology; Plant Leaves/virology; Plant Roots/growth & development; Plant Roots/physiology; Plant Roots/virology; Plant Transpiration/physiology; Seasons; Triticum/growth & development; Triticum/virology; Water/physiology
9.
J Vis Exp ; (106): e53443, 2015 Dec 25.
Article in English | MEDLINE | ID: mdl-26780544

ABSTRACT

A novel reactor design, termed a high density bioreactor (HDBR), is presented for the cultivation and study of high density microbial communities. Past studies have evaluated the performance of the reactor for the removal of COD [1] and nitrogen species [2-4] by heterotrophic and chemoautotrophic bacteria, respectively. The HDBR design eliminates the requirement for external flocculation/sedimentation processes while still yielding effluent containing low suspended solids. In this study, the HDBR is applied as a photobioreactor (PBR) in order to characterize the nitrogen removal characteristics of an algae-based photosynthetic microbial community. As previously reported for this HDBR design, a stable biomass zone was established with a clear delineation between the biologically active portion of the reactor and the recycling reactor fluid, which resulted in an effluent low in suspended solids. The algal community in the HDBR was observed to remove 18.4% of the total nitrogen species in the influent. Varying NH4(+) and NO3(-) concentrations in the feed had no effect on NH4(+) removal (n=44, p=0.993 and p=0.610, respectively), whereas NH4(+) feed concentration was negatively correlated with NO3(-) removal (n=44, p<0.001) and NO3(-) feed concentration was positively correlated with NO3(-) removal (n=44, p<0.001). Consistent removal of NH4(+), combined with the accumulation of oxidized nitrogen species at high NH4(+) fluxes, indicates the presence of ammonia- and nitrite-oxidizing bacteria within the microbial community.
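A minimal sketch of the kind of calculations behind the figures above: percent total-nitrogen removal and a linear regression of NO3- removal against NH4+ feed concentration, using SciPy. All values are invented placeholders, not reactor data.

```python
# Sketch only: percent total-nitrogen (TN) removal and a regression of NO3-
# removal on NH4+ feed concentration. Values are illustrative placeholders.
from scipy.stats import linregress

tn_in = [42.0, 38.5, 40.2, 45.1]    # mg N/L entering the reactor
tn_out = [34.0, 31.8, 32.5, 36.9]   # mg N/L leaving the reactor
removal = [(i - o) / i * 100 for i, o in zip(tn_in, tn_out)]
print("TN removal (%):", [round(r, 1) for r in removal])

nh4_feed = [5.0, 10.0, 20.0, 40.0, 60.0]  # mg N/L in the feed
no3_removal = [3.2, 2.8, 2.1, 1.0, 0.3]   # mg N/L removed (hypothetical)
fit = linregress(nh4_feed, no3_removal)
print(f"slope = {fit.slope:.3f} (negative -> NO3- removal falls as NH4+ feed rises), "
      f"p = {fit.pvalue:.4f}")
```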


Subject(s)
Bioreactors/microbiology; Waste Disposal, Fluid/methods; Ammonia/metabolism; Bacteria/metabolism; Biomass; Microalgae/metabolism; Nitrites/metabolism; Nitrogen/metabolism
10.
Plant Dis ; 98(4): 525-531, 2014 Apr.
Article in English | MEDLINE | ID: mdl-30708730

ABSTRACT

Wheat streak mosaic virus (WSMV), Triticum mosaic virus, and Wheat mosaic virus, all vectored by the wheat curl mite Aceria tosichella Keifer, frequently cause devastating losses to winter wheat production throughout the central and western Great Plains. The resistant cultivars 'Mace' and 'RonL' are commercially available and contain the Wsm1 and Wsm2 genes, respectively, for resistance to WSMV. However, the resistance in these cultivars is temperature sensitive, ineffective above 27°C, and does not protect against the other common wheat viruses. The majority of winter wheat in the Southern Great Plains is planted in early fall as a dual-purpose crop for both grazing and grain production. Early planting exposes wheat plants to temperatures above the threshold for effective resistance. Studies were conducted to determine whether the resistance in these cultivars would give infected plants the ability to recover as temperatures cooled to a range conducive to effective genetic resistance. 'RonL', 'Mace', 'TAM 111', 'TAM 112', and 'Karl 92' wheat were infested with WSMV-viruliferous mites at temperatures above the resistance threshold. After the initial 4-week infection period, plants were subjected to progressively cooler temperatures during the winter months, well below the resistance threshold. Throughout the study, plant samples were taken to quantify virus titer and mite populations. Resistant 'RonL' and 'Mace', which became severely infected during the initial infection period, were not able to recover even when temperatures dropped below the resistance threshold. However, 'TAM 112' showed resistance to WSMV and, more importantly, also showed resistance to the wheat curl mite: the mite population on this cultivar was significantly lower than on all other cultivars. The results of this study are significant in that they represent the first evidence of quantitative resistance to both WSMV and the wheat curl mite in a single wheat cultivar. Resistance to the wheat curl mite has the potential to reduce losses to all mite-vectored virus diseases of wheat, not just WSMV.

11.
Plant Dis ; 95(12): 1516-1519, 2011 Dec.
Article in English | MEDLINE | ID: mdl-30732011

ABSTRACT

In 2006, a previously unknown wheat (Triticum aestivum) virus was discovered in western Kansas and given the name Triticum mosaic virus (TriMV). TriMV has since been found in wheat samples collected all across the Great Plains. Although it can infect wheat on its own, TriMV is most often found in co-infection with Wheat streak mosaic virus (WSMV). The potential for TriMV to cause economic loss is significant, but very little is known about the virus. The objective of this study was to survey the TriMV population for genetic variation by nucleotide sequencing of isolates across a geographical region. A secondary objective was to characterize the WSMV isolates being co-transmitted with TriMV. Fourteen TriMV isolates were collected from locations in Texas, Oklahoma, and Kansas, and the coat protein cDNA of each was sequenced. Thirteen nucleotide differences were found among the TriMV isolates, three of which resulted in amino acid changes. WSMV isolates had 65 nucleotide changes compared with WSMV Sydney81. Our results indicate that the TriMV population has minimal sequence variation and that no single WSMV genotype is specifically associated with TriMV co-infection. Based on the isolates analyzed, the field population of TriMV appears to be very homogeneous.
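Counting nucleotide differences between aligned coat-protein sequences, and flagging which of them change an amino acid, can be sketched as below. The two short sequences are invented placeholders, not TriMV isolates, and Biopython is assumed to be available.

```python
# Sketch only: count nucleotide differences between two aligned coding
# sequences and flag codons whose translation changes. The sequences are
# short invented placeholders, not TriMV coat-protein cDNA.
from Bio.Seq import Seq  # Biopython assumed available

ref = Seq("ATGGCTGCTAAAGGTTCA")
iso = Seq("ATGGCTGCCAAAGGTGCA")

nt_diffs = [i for i, (a, b) in enumerate(zip(ref, iso)) if a != b]
aa_changes = [i // 3 for i in range(0, len(ref) - 2, 3)
              if ref[i:i + 3].translate() != iso[i:i + 3].translate()]

print(f"{len(nt_diffs)} nucleotide differences; "
      f"{len(aa_changes)} codon(s) with an amino acid change")
```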
