Results 1 - 20 of 20
2.
J Hosp Med ; 18(7): 568-575, 2023 07.
Article in English | MEDLINE | ID: mdl-36788630

ABSTRACT

BACKGROUND: Increased hospital admissions due to COVID-19 place a disproportionate strain on inpatient general medicine service (GMS) capacity compared to other services. OBJECTIVE: To study the impact on capacity and safety of a hospital-wide policy to redistribute admissions from GMS to non-GMS based on admitting diagnosis during surge periods. DESIGN, SETTING, AND PARTICIPANTS: Retrospective case-control study at a large teaching hospital. The intervention included adult patients admitted to general care wards during two surge periods (January-February 2021 and 2022) whose admission diagnosis was impacted by the policy. The control cohort included admissions during a matched number of days preceding the intervention. MAIN OUTCOMES AND MEASURES: Capacity measures included average daily admissions and hospital census occupied on GMS. Safety measures included length of stay (LOS) and adverse outcomes (death, rapid response, floor-to-intensive care unit transfer, and 30-day readmission). RESULTS: In the control cohort, there were 365 encounters with 299 (81.9%) GMS admissions and 66 (18.1%) non-GMS versus the intervention with 384 encounters, including 94 (24.5%) GMS admissions and 290 (75.5%) non-GMS (p < .001). The average GMS census decreased from 17.9 and 21.5 during control periods to 5.5 and 8.5 during intervention periods. An interrupted time series analysis confirmed a decrease in GMS daily admissions (p < .001) and average daily hospital census (p = .014; p < .001). There were no significant differences in LOS (5.9 vs. 5.9 days, p = .059) or adverse outcomes (53, 14.5% vs. 63, 16.4%; p = .482). CONCLUSION: Admission redistribution based on diagnosis is a safe lever to reduce capacity strain on GMS during COVID-19 surges.
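The interrupted time series analysis described in this abstract is, in its simplest form, a segmented regression with a level-shift and a slope-change term at the intervention date. The sketch below is purely illustrative (standard library only, synthetic data; the function names and design are my own, not the study's model):

```python
def ols(X, y):
    """Ordinary least squares via the normal equations, pure stdlib.
    Solves (X'X) b = X'y with Gauss-Jordan elimination and partial pivoting."""
    n, k = len(X), len(X[0])
    XtX = [[sum(X[i][a] * X[i][b] for i in range(n)) for b in range(k)]
           for a in range(k)]
    Xty = [sum(X[i][a] * y[i] for i in range(n)) for a in range(k)]
    M = [XtX[a][:] + [Xty[a]] for a in range(k)]  # augmented matrix
    for col in range(k):
        piv = max(range(col, k), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(k):
            if r != col:
                f = M[r][col] / M[col][col]
                for c in range(col, k + 1):
                    M[r][c] -= f * M[col][c]
    return [M[a][k] / M[a][a] for a in range(k)]

def its_design(n_days, t0):
    """Segmented-regression design matrix: intercept, time trend,
    post-intervention level shift, and post-intervention slope change."""
    return [[1.0, float(t), 1.0 if t >= t0 else 0.0,
             float(t - t0) if t >= t0 else 0.0]
            for t in range(n_days)]
```

The coefficient on the third column estimates the immediate level shift at the intervention (for example, the drop in daily GMS admissions); real ITS analyses would additionally model seasonality and autocorrelated errors.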


Subject(s)
COVID-19 , Patient Admission , Adult , Humans , Retrospective Studies , COVID-19/epidemiology , COVID-19/therapy , Hospitalization , Length of Stay , Hospitals, Teaching
3.
Jt Comm J Qual Patient Saf ; 49(4): 181-188, 2023 04.
Article in English | MEDLINE | ID: mdl-36476954

ABSTRACT

BACKGROUND: Hospitals have sought to increase pre-noon discharges to improve capacity, although evidence is mixed on the impact of these initiatives. Past interventions have not quantified the daily gap between morning bed supply and demand. The authors quantified this gap and applied the pre-noon data to target a pre-noon discharge initiative. METHODS: The study was conducted at a large hospital and included adult and pediatric medical/surgical wards. The researchers calculated the difference between the average cumulative bed requests and transfers in for each hour of the day in 2018, the year prior to the intervention. In 2019, an intervention on six adult general medical and two surgical wards was implemented. Eight intervention and 14 nonintervention wards were compared to determine the change in average cumulative pre-noon discharges. The change in average hospital length of stay (LOS) and 30-day readmissions was also calculated. RESULTS: The average daily cumulative gap by noon between bed supply and demand across all general care wards was 32.1 beds (per ward average, 1.3 beds). On intervention wards, mean pre-noon discharges increased from 4.7 to 6.7 (p < 0.0001), compared with the nonintervention wards (14.0 vs. 14.6, p = 0.19877). On intervention wards, average LOS decreased from 6.9 to 6.4 days (p < 0.001) and readmission rates were 14.3% vs. 13.9% (p = 0.3490). CONCLUSION: The gap between daily hospital bed supply and demand can be quantified and applied to create pre-noon discharge targets. In an intervention using these targets, researchers observed an increase in morning discharges, a decrease in LOS, and no significant change in readmissions.


Subject(s)
Patient Discharge , Patient Readmission , Adult , Humans , Child , Length of Stay , Equipment and Supplies, Hospital , Hospitals
4.
Am J Health Syst Pharm ; 79(19): 1652-1662, 2022 09 22.
Article in English | MEDLINE | ID: mdl-35596269

ABSTRACT

PURPOSE: Obtaining an accurate medication history is a vital component of medication reconciliation upon admission to the hospital. Despite the importance of this task, medication histories are often inaccurate and/or incomplete. We evaluated the association of a pharmacy-driven medication history initiative with clinical outcomes of patients admitted to the general medicine service of an academic medical center. METHODS: A retrospective stabilized inverse probability of treatment weighting propensity score analysis was used to estimate the average treatment effect of the intervention on general medical patients, comparing patients who received a pharmacy-driven medication history with those who did not. Fifty-two patient baseline characteristics including demographic, operational, and clinical variables were controlled for in the propensity score model. Hospital length of stay, 7-day and 30-day unplanned readmissions, and in-hospital mortality were evaluated. RESULTS: Among 11,576 eligible general medical patients, 2,234 (19.30%) received a pharmacy-driven medication history and 9,342 (80.70%) patients did not. The estimated average treatment effect of receiving a pharmacy-driven medication history was a shorter length of stay (mean, 5.88 days vs 6.53 days; P = 0.0002) and a lower in-hospital mortality rate (2.34% vs 3.72%, P = 0.001), after adjustment for differences in patient baseline characteristics. No significant difference was found for 7-day or 30-day all-cause readmission rates. CONCLUSION: Pharmacy-driven medication histories reduced length of stay and in-hospital mortality in patients admitted to the general medical service at an academic medical center but did not change 7-day and 30-day all-cause readmission rates. Further research via a large, multisite randomized controlled trial is needed to confirm our findings.
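The stabilized inverse-probability weights used in analyses like this one are computed as P(treated)/e(x) for treated patients and P(untreated)/(1 - e(x)) for untreated patients, where e(x) is the propensity score. A minimal sketch (illustrative only; the study's 52-covariate propensity model is not reproduced here):

```python
def stabilized_iptw(treated, propensity):
    """Stabilized inverse-probability-of-treatment weights.

    treated    -- 0/1 indicators of receiving the intervention
    propensity -- estimated P(treated | covariates) for each patient
    Multiplying by the marginal treatment probability ("stabilization")
    keeps the weights centered near 1, reducing variance.
    """
    p = sum(treated) / len(treated)  # marginal P(treated)
    return [p / ps if t == 1 else (1 - p) / (1 - ps)
            for t, ps in zip(treated, propensity)]
```

With these weights, a weighted contrast of outcomes between groups estimates the average treatment effect; weights far from 1 flag patients with poor covariate overlap.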


Subject(s)
Pharmacy Service, Hospital , Pharmacy , Humans , Medication Reconciliation , Patient Readmission , Retrospective Studies
5.
Clin Transl Gastroenterol ; 13(7): e00482, 2022 07 01.
Article in English | MEDLINE | ID: mdl-35347098

ABSTRACT

INTRODUCTION: Delays in inpatient colonoscopy are commonly caused by inadequate bowel preparation and result in increased hospital length of stay (LOS) and healthcare costs. Low-volume bowel preparation (LV-BP; sodium sulfate, potassium sulfate, and magnesium sulfate) has been shown to improve outpatient bowel preparation quality compared with standard high-volume bowel preparations (HV-BP; polyethylene glycol). However, its efficacy in hospitalized patients has not been well-studied. We assessed the impact of LV-BP on time to colonoscopy, hospital LOS, and bowel preparation quality among inpatients. METHODS: We performed a propensity score-matched analysis of adult inpatients undergoing colonoscopy who received either LV-BP or HV-BP before colonoscopy at a quaternary academic medical center. Multivariate regression models with feature selection were developed to assess the association between LV-BP and study outcomes. RESULTS: Among 1,807 inpatients included in this study, 293 and 1,514 patients received LV-BP and HV-BP, respectively. Among the propensity score-matched population, LV-BP was associated with a shorter time to colonoscopy (β: -0.43 [95% confidence interval: -0.56 to -0.30]) while having similar odds of adequate preparation (odds ratio: 1.02 [95% confidence interval: 0.71-1.46]; P = 0.92). LV-BP was also significantly associated with decreased hospital LOS among older patients (age ≥ 75 years), patients with chronic kidney disease, and patients who were hospitalized with gastrointestinal bleeding. DISCUSSION: LV-BP is associated with decreased time to colonoscopy in hospitalized patients. Older inpatients, inpatients with chronic kidney disease, and inpatients with gastrointestinal bleeding may particularly benefit from LV-BP. Prospective studies are needed to further establish the role of LV-BP for inpatient colonoscopies.


Subject(s)
Cathartics , Renal Insufficiency, Chronic , Adult , Aged , Colonoscopy/adverse effects , Gastrointestinal Hemorrhage/diagnosis , Gastrointestinal Hemorrhage/etiology , Humans , Inpatients
6.
J Gen Intern Med ; 37(15): 3789-3796, 2022 11.
Article in English | MEDLINE | ID: mdl-35091916

ABSTRACT

BACKGROUND: Understanding the association between factors related to the clinical work environment and well-being can inform strategies to improve physicians' work experience. OBJECTIVE: To model and quantify which drivers of work composition, team structure, and dynamics are associated with well-being. DESIGN: Utilizing social network modeling, this cohort study of physicians in an academic health center examined in-basket messaging data from 2018 to 2019 to identify work composition, team structure, and dynamics features. Indicators from a survey in 2019 were used as dependent variables to identify factors predictive of well-being. PARTICIPANTS: EHR data were available for 188 physicians and their care teams from 18 primary care practices; survey data were available for 163/188 physicians. MAIN MEASURES: Area under the receiver operating characteristic curve (AUC) of logistic regression models to predict well-being dependent variables was assessed out-of-sample. KEY RESULTS: The mean AUC of the model for the dependent variables of emotional exhaustion, vigor, and professional fulfillment was, respectively, 0.665 (SD 0.085), 0.700 (SD 0.082), and 0.669 (SD 0.082). Predictors associated with decreased well-being included physician centrality within support team (OR 3.90, 95% CI 1.28-11.97, P=0.01) and share of messages related to scheduling (OR 1.10, 95% CI 1.03-1.17, P=0.003). Predictors associated with increased well-being included higher number of medical assistants within close support team (OR 0.91, 95% CI 0.83-0.99, P=0.05), nurse-centered message writing practices (OR 0.89, 95% CI 0.83-0.95, P=0.001), and share of messages related to ambiguous diagnosis (OR 0.92, 95% CI 0.87-0.98, P=0.01). CONCLUSIONS: Through integration of EHR data with social network modeling, the analysis highlights new characteristics of care team structure and dynamics that are associated with physician well-being. This quantitative methodology can be utilized to assess, in a refined, data-driven way, the impact of organizational changes to improve well-being through optimizing team dynamics and work composition.
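Out-of-sample AUCs like those reported above can be computed directly from predicted scores via the Mann-Whitney rank identity: the AUC is the probability that a randomly chosen positive case outranks a randomly chosen negative one. A minimal sketch, not the authors' actual pipeline:

```python
def auc(labels, scores):
    """AUC via the Mann-Whitney identity: the fraction of
    (positive, negative) pairs in which the positive case scores
    higher, counting tied scores as half a win."""
    pos = [s for l, s in zip(labels, scores) if l == 1]
    neg = [s for l, s in zip(labels, scores) if l == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))
```

For a proper out-of-sample estimate, `labels` and `scores` would come from held-out data the logistic regression never saw during fitting.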


Subject(s)
Burnout, Professional , Physicians , Humans , Electronic Health Records , Cohort Studies , Physicians/psychology , Surveys and Questionnaires , Social Networking , Burnout, Professional/epidemiology
8.
J Hosp Med ; 15(3): 147-153, 2020 Mar.
Article in English | MEDLINE | ID: mdl-31891558

ABSTRACT

BACKGROUND: It is not known whether delivering inpatient care earlier to patients boarding in the emergency department (ED) by a hospitalist-led team can decrease length of stay (LOS). OBJECTIVE: To study the association between care provided by a hospital medicine ED Boarder (EDB) service and LOS. DESIGN, SETTING, AND PARTICIPANTS: Retrospective cross-sectional study (July 1, 2016 to June 30, 2018) conducted at a single, large, urban academic medical center. Patients admitted to general medicine services from the ED were included. EDB patients were defined as those waiting for more than two hours for an inpatient bed. Patients were categorized as covered EDB, noncovered EDB, or nonboarder. INTERVENTION: The hospital medicine team provided continuous care to covered EDB patients waiting for an inpatient bed. PRIMARY OUTCOME AND MEASURES: The primary outcome was median hospital LOS defined as the time period from ED arrival to hospital departure. Secondary outcomes included ED LOS and 30-day ED readmission rate. RESULTS: There were 8,776 covered EDB, 5,866 noncovered EDB, and 2,026 nonboarder patients. The EDB service covered 59.9% of eligible patients and 62.9% of total boarding hours. Median hospital LOS was 4.76 (interquartile range [IQR] 2.90-7.22) days for nonboarders, 4.92 (IQR 3.00-8.03) days for covered EDB patients, and 5.11 (IQR 3.16-8.34) days for noncovered EDB (P < .001). Median ED LOS for nonboarders was 5.6 (IQR 4.2-7.5) hours, 20.7 (IQR 15.8-24.9) hours for covered EDB, and 10.1 (IQR 7.9-13.8) hours for noncovered EDB (P < .001). There was no difference in 30-day ED readmission rates. CONCLUSION: Admitted patients who were not boarders had the shortest LOS. Among boarded patients, coverage by a hospital medicine-led EDB service was associated with a reduced hospital LOS.

10.
Am J Trop Med Hyg ; 98(6): 1637-1639, 2018 06.
Article in English | MEDLINE | ID: mdl-29714162

ABSTRACT

To reduce transmission of tuberculosis (TB) in resource-limited countries where TB remains a major cause of mortality, novel diagnostic tools are urgently needed. We evaluated the fractional concentration of exhaled nitric oxide (FeNO) as an easily measured, noninvasive potential biomarker for diagnosis and monitoring of treatment response in participants with pulmonary TB, including multidrug-resistant TB, in Lima, Peru. In a longitudinal study, however, we found no differences in baseline median FeNO levels between 38 TB participants and 93 age-matched controls (13 parts per billion [ppb] [interquartile range (IQR) = 8-26] versus 15 ppb [IQR = 12-24]), and there was no change over 60 days of treatment (15 ppb [IQR = 10-19] at day 60). Taking this and previous evidence together, we conclude FeNO is not of value in either the diagnosis of pulmonary TB or as a marker of treatment response.


Subject(s)
Nitric Oxide/analysis , Tuberculosis, Pulmonary/diagnosis , Adult , Biomarkers/analysis , Biomarkers/metabolism , Case-Control Studies , Female , Humans , Longitudinal Studies , Male , Nitric Oxide/metabolism , Peru , Surveys and Questionnaires , Treatment Outcome , Tuberculin Test
12.
Chest ; 153(6): 1358-1367, 2018 06.
Article in English | MEDLINE | ID: mdl-29559307

ABSTRACT

BACKGROUND: Cough frequency, and its duration, is a biomarker that can be used in low-resource settings without the need for laboratory culture and has been associated with transmission and treatment response. Radiologic characteristics associated with increased cough frequency may be important in understanding transmission. The relationship between cough frequency and cavitary lung disease has not been studied. METHODS: We analyzed data from 41 adults who were HIV negative and had culture-confirmed, drug-susceptible pulmonary TB throughout treatment. Cough recordings were based on the Cayetano Cough Monitor, and sputum samples were evaluated using microscopic observation drug susceptibility broth culture; among culture-positive samples, bacillary burden was assessed by means of time to positivity. CT scans were analyzed by a US-board-certified radiologist and a computer-automated algorithm. The algorithm evaluated cavity volume and cavitary proximity to the airway. CT scans were obtained within 1 month of treatment initiation. We compared small cavities (≤ 7 mL) and large cavities (> 7 mL) and cavities located closer to (≤ 10 mm) and farther from (> 10 mm) the airway to cough frequency and cough cessation until treatment day 60. RESULTS: Cough frequency during treatment was twofold higher in participants with large cavity volumes (rate ratio [RR], 1.98; P = .01) and cavities located closer to the airway (RR, 2.44; P = .001). Correspondingly, cough ceased three times faster in participants with smaller cavities (adjusted hazard ratio [HR], 2.89; P = .06) and those farther from the airway (adjusted HR, 3.61; P = .02). Similar results were found for bacillary burden and culture conversion during treatment. CONCLUSIONS: Cough frequency during treatment is greater and lasts longer in patients with larger cavities, especially those closer to the airway.


Subject(s)
Antitubercular Agents/therapeutic use , Cough/epidemiology , Tuberculosis, Pulmonary/complications , Adult , Cough/etiology , Female , Follow-Up Studies , Humans , Incidence , Male , Middle Aged , Peru/epidemiology , Prospective Studies , Tomography, X-Ray Computed , Tuberculosis, Pulmonary/diagnosis , Tuberculosis, Pulmonary/drug therapy , Young Adult
13.
J Infect Dis ; 216(5): 514-524, 2017 09 01.
Article in English | MEDLINE | ID: mdl-28510693

ABSTRACT

Background: Sputum from patients with tuberculosis contains subpopulations of metabolically active and inactive Mycobacterium tuberculosis with unknown implications for infectiousness. Methods: We assessed sputum microscopy with fluorescein diacetate (FDA, evaluating M. tuberculosis metabolic activity) for predicting infectiousness. Mycobacterium tuberculosis was quantified in pretreatment sputum of patients with pulmonary tuberculosis using FDA microscopy, culture, and acid-fast microscopy. These 35 patients' 209 household contacts were followed with prevalence surveys for tuberculosis disease for 6 years. Results: FDA microscopy was positive for a median of 119 (interquartile range [IQR], 47-386) bacteria/µL sputum, which was 5.1% (IQR, 2.4%-11%) of the concentration of acid-fast microscopy-positive bacteria (2069 [IQR, 1358-3734] bacteria/µL). Tuberculosis was diagnosed during follow-up in 6.4% (13/209) of contacts. For patients with lower than median concentration of FDA microscopy-positive M. tuberculosis, 10% of their contacts developed tuberculosis. This was significantly higher than the 2.7% observed among contacts of patients with higher than median FDA microscopy results (crude hazard ratio [HR], 3.8; P = .03). This association maintained statistical significance after adjusting for disease severity, chemoprophylaxis, drug resistance, and social determinants (adjusted HR, 3.9; P = .02). Conclusions: Mycobacterium tuberculosis that was FDA microscopy negative was paradoxically associated with greater infectiousness. FDA microscopy-negative bacteria in these pretreatment samples may be a nonstaining, slowly metabolizing phenotype better adapted to airborne transmission.


Subject(s)
Fluoresceins/chemistry , Microscopy , Sputum/microbiology , Tuberculosis, Pulmonary/diagnosis , Adult , Female , Humans , Linear Models , Male , Multivariate Analysis , Mycobacterium tuberculosis/isolation & purification , Prevalence , Surveys and Questionnaires , Tuberculin Test , Young Adult
14.
Clin Infect Dis ; 64(9): 1174-1181, 2017 05 01.
Article in English | MEDLINE | ID: mdl-28329268

ABSTRACT

Background: Cough is the major determinant of tuberculosis transmission. Despite this, there is a paucity of information regarding characteristics of cough frequency throughout the day and in response to tuberculosis therapy. Here we evaluate the circadian cycle of cough, cough frequency risk factors, and the impact of appropriate treatment on cough and bacillary load. Methods: We prospectively evaluated human immunodeficiency virus-negative adults (n = 64) with a new diagnosis of culture-proven, drug-susceptible pulmonary tuberculosis immediately prior to treatment and repeatedly until treatment day 62. At each time point, participant cough was recorded (n = 670) and analyzed using the Cayetano Cough Monitor. Consecutive coughs at least 2 seconds apart were counted as separate cough episodes. Sputum samples (n = 426) were tested with microscopic-observation drug susceptibility broth culture, and in culture-positive samples (n = 252), the time to culture positivity was used to estimate bacillary load. Results: The highest cough frequency occurred from 1 pm to 2 pm, and the lowest from 1 am to 2 am (2.4 vs 1.1 cough episodes/hour, respectively). Cough frequency was higher among participants who had higher sputum bacillary load (P < .01). Pretreatment median cough episodes/hour was 2.3 (interquartile range [IQR], 1.2-4.1), which at 14 treatment days decreased to 0.48 (IQR, 0.0-1.4) and at the end of the study decreased to 0.18 (IQR, 0.0-0.59) (both reductions P < .001). By 14 treatment days, the probability of culture conversion was 29% (95% confidence interval, 19%-41%). Conclusions: Coughs were most frequent during daytime. Two weeks of appropriate treatment significantly reduced cough frequency and resulted in one-third of participants achieving culture conversion. Thus, treatment by 2 weeks considerably diminishes, but does not eliminate, the potential for airborne tuberculosis transmission.
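The episode definition used in this abstract (consecutive coughs at least 2 seconds apart count as separate episodes) maps onto a simple grouping rule over cough timestamps. A sketch with hypothetical names, not the Cayetano Cough Monitor's actual code:

```python
def count_cough_episodes(cough_times, min_gap=2.0):
    """Group individual cough sounds into episodes: a cough starts a new
    episode when it occurs at least `min_gap` seconds after the previous
    cough (the 2-second rule described in the abstract)."""
    episodes = 0
    last = None
    for t in sorted(cough_times):
        if last is None or t - last >= min_gap:
            episodes += 1
        last = t
    return episodes
```

Dividing the episode count by the recording length in hours then yields the episodes/hour figures reported in the Results.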


Subject(s)
Antitubercular Agents/therapeutic use , Cough/pathology , Tuberculosis, Pulmonary/drug therapy , Tuberculosis, Pulmonary/pathology , Adolescent , Adult , Aged , Aged, 80 and over , Circadian Rhythm , Female , Humans , Longitudinal Studies , Male , Middle Aged , Prospective Studies , Young Adult
15.
BMJ Open ; 6(4): e010365, 2016 04 22.
Article in English | MEDLINE | ID: mdl-27105713

ABSTRACT

INTRODUCTION: Cough is a key symptom of tuberculosis (TB) as well as the main cause of transmission. However, a recent literature review found that cough frequency (number of coughs per hour) in patients with TB has only been studied once, in 1969. The main aim of this study is to describe cough frequency patterns before and after the start of TB treatment and to determine baseline factors that affect cough frequency in these patients. Secondarily, we will evaluate the correlation between cough frequency and TB microbiological resolution. METHODS: This study will select participants with culture confirmed TB from 2 tertiary hospitals in Lima, Peru. We estimated that a sample size of 107 patients was sufficient to detect clinically significant changes in cough frequency. Participants will initially be evaluated through questionnaires, radiology, microscopic observation drug susceptibility broth TB-culture, auramine smear microscopy and cough recordings. This cohort will be followed for the initial 60 days of anti-TB treatment, and throughout the study several microbiological samples as well as 24 h recordings will be collected. We will describe the variability of cough episodes and determine its association with baseline laboratory parameters of pulmonary TB. In addition, we will analyse the reduction of cough frequency in predicting TB cure, adjusted for potential confounders. ETHICS AND DISSEMINATION: Ethical approval has been obtained from the ethics committees at each participating hospital in Lima, Peru, Asociación Benéfica PRISMA in Lima, Peru, the Universidad Peruana Cayetano Heredia in Lima, Peru and Johns Hopkins University in Baltimore, USA. We aim to publish and disseminate our findings in peer-reviewed journals. We also expect to create and maintain an online repository for TB cough sounds as well as the statistical analysis employed.


Subject(s)
Cough/physiopathology , Tuberculosis, Pulmonary/physiopathology , Adult , Antitubercular Agents/therapeutic use , Clinical Protocols , Cough/microbiology , Female , Humans , Male , Middle Aged , Mycobacterium tuberculosis , Peru , Predictive Value of Tests , Prospective Studies , Tuberculosis, Pulmonary/complications , Tuberculosis, Pulmonary/drug therapy
16.
Clin Infect Dis ; 60(8): 1186-95, 2015 Apr 15.
Article in English | MEDLINE | ID: mdl-25537870

ABSTRACT

BACKGROUND: It is difficult to determine whether early tuberculosis treatment is effective in reducing the infectiousness of patients' sputum, because culture takes weeks and conventional acid-fast sputum microscopy and molecular tests cannot differentiate live from dead tuberculosis. METHODS: To assess treatment response, sputum samples (n=124) from unselected patients (n=35) with sputum microscopy-positive tuberculosis were tested pretreatment and after 3, 6, and 9 days of empiric first-line therapy. Tuberculosis quantitative viability microscopy with fluorescein diacetate, quantitative culture, and acid-fast auramine microscopy were all performed in triplicate. RESULTS: Tuberculosis quantitative viability microscopy predicted quantitative culture results such that 76% of results agreed within ±1 logarithm (rS=0.85; P<.0001). In 31 patients with non-multidrug-resistant (MDR) tuberculosis, viability and quantitative culture results approximately halved (both 0.27 log reduction, P<.001) daily. For patients with non-MDR tuberculosis and available data, by treatment day 9 there was a >10-fold reduction in viability in 100% (24/24) of cases and quantitative culture in 95% (19/20) of cases. Four other patients subsequently found to have MDR tuberculosis had no significant changes in viability (P=.4) or quantitative culture (P=.6) results during early treatment. The change in viability and quantitative culture results during early treatment differed significantly between patients with non-MDR tuberculosis and those with MDR tuberculosis (both P<.001). Acid-fast microscopy results changed little during early treatment, and this change was similar for non-MDR tuberculosis vs MDR tuberculosis (P=.6). CONCLUSIONS: Tuberculosis quantitative viability microscopy is a simple test that within 1 hour predicted quantitative culture results that became available weeks later, rapidly indicating whether patients were responding to tuberculosis therapy.


Subject(s)
Antitubercular Agents/therapeutic use , Bacteriological Techniques/methods , Drug Monitoring/methods , Microbial Viability/drug effects , Microscopy/methods , Tuberculosis/drug therapy , Adult , Female , Humans , Male , Sputum/microbiology , Time Factors , Young Adult
17.
PLoS One ; 7(10): e46229, 2012.
Article in English | MEDLINE | ID: mdl-23071550

ABSTRACT

BACKGROUND: A laboratory-free test for assessing recovery from pulmonary tuberculosis (TB) would be extremely beneficial in regions of the world where laboratory facilities are lacking. Our hypothesis is that analysis of cough sound recordings may provide such a test. In the current paper, we present validation of a cough analysis tool. METHODOLOGY/PRINCIPAL FINDINGS: Cough data was collected from a cohort of TB patients in Lima, Peru and 25.5 hours of recordings were manually annotated by clinical staff. Analysis software was developed and validated by comparison to manual scoring. Because many patients cough in bursts, coughing was characterized in terms of cough epochs. Our software correctly detects 75.5% of cough episodes with a specificity of 99.6% (comparable to past results using the same definition) and a median false positive rate of 4 false positives/hour, due to the noisy, real-world nature of our dataset. We then manually review detected coughs to eliminate false positives, in effect using the algorithm as a pre-screening tool that reduces reviewing time to roughly 5% of the recording length. This cough analysis approach provides a foundation to support larger-scale studies of coughing rates over time for TB patients undergoing treatment.


Subject(s)
Algorithms , Automation , Cough/physiopathology , Tuberculosis, Pulmonary/physiopathology , Cohort Studies , Humans , Peru , Tuberculosis, Pulmonary/drug therapy
18.
Article in English | MEDLINE | ID: mdl-22255711

ABSTRACT

In regions of the world where tuberculosis (TB) poses the greatest disease burden, the lack of access to skilled laboratories is a significant problem. A lab-free method for assessing patient recovery during treatment would be of great benefit, particularly for identifying patients who may have drug-resistant tuberculosis. We hypothesize that cough analysis may provide such a test. In this paper we describe algorithm development in support of a pilot study of TB patient coughing. We describe several approaches to event detection and classification, and show preliminary data which suggest that cough count decreases after the start of treatment in drug-responsive patients. Our eventual goal is development of a low-cost ambulatory cough analysis system that will help identify patients with drug-resistant tuberculosis.


Subject(s)
Algorithms , Auscultation/methods , Cough/diagnosis , Cough/prevention & control , Respiratory Sounds , Tuberculosis, Pulmonary/diagnosis , Tuberculosis, Pulmonary/drug therapy , Antitubercular Agents/therapeutic use , Cough/etiology , Diagnosis, Computer-Assisted/methods , Humans , Recovery of Function , Reproducibility of Results , Sensitivity and Specificity , Sound Spectrography/methods , Tuberculosis, Pulmonary/complications
19.
Transplantation ; 84(11): 1467-73, 2007 Dec 15.
Article in English | MEDLINE | ID: mdl-18091523

ABSTRACT

BACKGROUND: Using a class I-disparate swine lung transplant model, we examined whether an intensive course of tacrolimus could induce operational tolerance and whether preoperative allopeptide immunization would prevent the development of tolerance. METHODS: Left lung grafts were performed using class I-disparate (class II-matched) donors. Recipients were treated with 12 days of postoperative tacrolimus. Three recipients were immunized prior to transplantation with class I allopeptides. Three other recipients were not immunized. RESULTS: The nonimmunized recipients maintained their grafts long term (>497, >451, and >432 days), without developing chronic rejection. The immunized swine also maintained their grafts long term (>417, >402, >401 days), despite developing a variety of in vitro and in vivo responses to the immunizing peptides, as well as having strong mixed lymphocyte reactions to donor cells prior to transplantation. CONCLUSIONS: Using only a brief course of tacrolimus, we have been able to induce a state of operational tolerance in a class I-disparate preclinical lung transplant model. Moreover, preoperative alloimmunization did not block tolerance induction or induce chronic rejection. These data show that it is possible to create a state of operational tolerance to lung allografts even in the presence of donor-sensitized cells.


Subject(s)
Histocompatibility Antigens Class I/immunology , Immune Tolerance/immunology , Lung Transplantation/immunology , Peptides/immunology , Animals , Graft Survival/immunology , Hypersensitivity/immunology , Immune Tolerance/drug effects , Skin Transplantation/immunology , Swine , Swine, Miniature , Tacrolimus/pharmacology , Time Factors , Transplantation, Homologous/immunology
20.
Am J Transplant ; 5(7): 1626-34, 2005 Jul.
Article in English | MEDLINE | ID: mdl-15943620

ABSTRACT

The role of indirect allorecognition in graft rejection is examined in two experiments using a swine lung transplantation model. First, two swine received class I mismatched grafts without immunosuppression; another two recipients were treated postoperatively with cyclosporine (CsA). These swine exhibited acute and chronic rejection, respectively. All four recipients developed T-cell reactivity to donor-derived class I major histocompatibility complex (MHC) peptides. Second, six swine were immunized with synthetic donor-derived class I allopeptides prior to transplantation. Control groups consisted of nonimmunized recipients (n = 6) and recipients immunized with an irrelevant peptide (n = 3). These recipients all received a 12-day course of post-operative CsA. Swine immunized with allopeptides exhibited accelerated graft rejection, as compared to both control groups (p < 0.01 and p = 0.03, respectively). Within the experimental group, the dominant histologic finding was acute rejection (AR). Obliterative bronchiolitis (OB) was seen in the graft with the longest survival. Both control groups showed a lesser degree of AR, with four out of six nonimmunized swine ultimately developing OB. These studies suggest that indirect allorecognition is operative during lung allograft rejection, and that pre-transplant sensitization to donor-derived MHC allopeptides can accelerate graft rejection.


Subject(s)
Graft Rejection/immunology , Histocompatibility Antigens Class II/immunology , Isoantigens/immunology , Lung Transplantation/immunology , Acute Disease , Animals , Cell Proliferation , Chronic Disease , Graft Rejection/pathology , Histocompatibility Antigens Class I/immunology , Hypersensitivity, Delayed/immunology , Immunization , Isoantibodies/biosynthesis , Lung/pathology , Swine , Swine, Miniature , T-Lymphocytes/immunology , T-Lymphocytes/pathology , Time Factors , Tissue Donors , Transplantation, Homologous