Results 1 - 20 of 20
1.
Subst Abus ; 41(4): 463-467, 2020.
Article in English | MEDLINE | ID: mdl-32031914

ABSTRACT

BACKGROUND: In the midst of the national opioid crisis, it has become apparent that there is a large shortage of treatment providers equipped to deliver evidence-based care for opioid use disorder (OUD). Medications for opioid use disorder (MOUD), such as buprenorphine, are crucial in reducing mortality in those with OUD, yet prescribers must meet federal waiver requirements under the Drug Addiction Treatment Act of 2000 (DATA 2000). There are now several pathways for medical schools to satisfy these waiver requirements for all graduates, but this practice has not yet become widespread. We propose that including a DATA 2000 waiver training within the medical school curriculum is a feasible and effective way to meet the eligibility requirements to prescribe buprenorphine. METHODS: As part of a longitudinal opioid curriculum requirement, we implemented a DATA 2000 waiver training for all rising Year 4 medical students. One hundred sixty-nine students completed a hybrid (online and in-person) waiver training; the majority completed pre- and post-training surveys. RESULTS: In the pre-training survey, 93% of rising Year 4 medical students (112/120) reported participating in the care of patients with OUD. Six months post-training, students reported increased confidence (1.94 to 2.45; p < 0.01) and knowledge (2.27 to 2.76; p < 0.01) regarding MOUD. They also reported plans to apply for the buprenorphine waiver once licensed and being more likely to prescribe buprenorphine for OUD as a result of the training (mean = 3.35; SD = 1.36; 0 = extremely unlikely to 5 = extremely likely). CONCLUSIONS: We successfully implemented a DATA 2000 waiver training as a mandatory requirement of the medical school curriculum. Further studies are needed to determine the optimal timing, format, and frequency of reinforcement of MOUD educational content across the undergraduate and graduate medical education continuum.


Subject(s)
Buprenorphine , Opioid-Related Disorders , Students, Medical , Buprenorphine/therapeutic use , Humans , Opiate Substitution Treatment , Opioid-Related Disorders/drug therapy , Schools, Medical
2.
Anat Sci Educ ; 11(3): 254-261, 2018 May 06.
Article in English | MEDLINE | ID: mdl-28941215

ABSTRACT

The pedagogical approach to both didactic and laboratory teaching of anatomy has changed over the last 25 years and continues to evolve; however, assessment of student anatomical knowledge has not changed despite awareness of Bloom's taxonomy. For economic reasons, most schools rely on multiple choice questions (MCQs), which test knowledge mastery, while competences such as critical thinking and skill development are not typically assessed. In contrast, open-ended question (OEQ) examinations demand knowledge construction and higher-order thinking, but scoring the constructed responses requires more faculty time. This study compares performance on MCQ and OEQ examinations administered to a small group of incoming first-year medical students in a preparatory (enrichment) anatomy course covering the thorax and abdomen. In the thorax module, the OEQ examination score was lower than the MCQ examination score; in the abdomen module, however, the OEQ examination score improved compared with the thorax OEQ score. Many students attributed their improved performance to a shift from simple memorization (superficial learning) for cued responses to conceptual understanding (deeper learning) for constructed responses. The results support the view that assessment with OEQs, which requires in-depth knowledge, leads to better student performance on examinations. Anat Sci Educ 11: 254-261. © 2017 American Association of Anatomists.


Subject(s)
Anatomy/education , Education, Medical, Undergraduate/organization & administration , Educational Measurement/methods , Schools, Medical/organization & administration , Students, Medical/psychology , Abdomen/anatomy & histology , Adult , Choice Behavior , Curriculum , Education, Medical, Undergraduate/economics , Education, Medical, Undergraduate/methods , Education, Medical, Undergraduate/trends , Educational Measurement/economics , Female , Humans , Learning , Male , Schools, Medical/economics , Schools, Medical/trends , Thinking , Thorax/anatomy & histology , Young Adult
3.
Prog Transplant ; 27(4): 377-385, 2017 12.
Article in English | MEDLINE | ID: mdl-29187135

ABSTRACT

INTRODUCTION: Maximizing education about living donor kidney transplant (LDKT) during the in-person evaluation at the transplant center may increase the number of kidney patients pursuing LDKT. RESEARCH QUESTIONS AND DESIGN: To test the effectiveness of a one-time LDKT educational intervention, we performed a cluster-randomized trial among 499 patients who presented for kidney transplant evaluation. We compared usual-care education (n = 250) with intensive LDKT education (n = 249), which was delivered only on the evaluation day and consisted of a 25-minute video of information and stories about LDKT plus discussion of LDKT possibilities with an educator. Our primary outcome was knowledge of LDKT 1 week after the transplant evaluation. RESULTS: One week after evaluation, patients who received intensive education had higher knowledge scores than patients who received usual care (12.7 vs. 11.7; P = .0008), but there were no differences in postevaluation readiness for LDKT. Among patients who had not previously identified a potential living donor, receiving intensive education was associated with increased willingness to take steps toward LDKT. DISCUSSION: In conclusion, expanding LDKT education within the evaluation day may be helpful, but interventions implemented at multiple time points and for greater duration may be necessary to ensure larger, long-term behavioral changes in pursuit of LDKT.


Subject(s)
Health Education/organization & administration , Health Knowledge, Attitudes, Practice , Kidney Transplantation/education , Living Donors , Transplant Recipients/education , Decision Making , Female , Humans , Male , Middle Aged , Surveys and Questionnaires
4.
Health Soc Work ; 42(1): 7-14, 2017 02 01.
Article in English | MEDLINE | ID: mdl-28395067

ABSTRACT

Authors comparatively analyzed health and social isolation between U.S. military veterans denied Veterans Affairs (VA) disability compensation and veterans awarded VA disability compensation. The 2001 National Survey of Veterans was used to create a sample of 4,522 veterans denied or awarded VA disability compensation. Using the Andersen health services utilization model as a conceptual framework, multivariate logistic regression was applied to assess relationships between VA disability compensation award status, three separate domains of health, and correlates of social isolation. Results indicate that denied applicants were more likely than awarded applicants to have poor overall health (odds ratio [OR] = 1.45, 95% confidence interval [CI]: 1.23, 1.70) and limitations in activities of daily living (OR = 1.12, 95% CI: 1.03, 1.21). Denied applicants' physical functioning (40.3) and mental functioning (41.2) composite summary scores were not clinically different from those of awarded applicants (39.0 and 40.1, respectively), indicating that both groups were comparably impaired. Veterans denied VA disability compensation had poor health and functional impairments; they also experienced poverty and isolation, suggesting that they may need additional supportive services. Connecting veterans to community resources could be a vital service to offer all veterans applying for disability compensation.
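As an editorial aside on the statistics in the record above: the standard error behind a reported odds ratio's 95% confidence interval can be recovered on the log scale, which is useful when re-analyzing published estimates. A minimal sketch in Python, using the OR of 1.45 (95% CI: 1.23, 1.70) reported above; the function name is ours, not the study's:

```python
import math

def se_from_ci(lower, upper, z=1.96):
    """Recover the standard error of log(OR) from a reported 95% CI:
    SE = (ln(upper) - ln(lower)) / (2 * z)."""
    return (math.log(upper) - math.log(lower)) / (2 * z)

# Poor overall health among denied applicants: OR 1.45 (95% CI: 1.23, 1.70)
se = se_from_ci(1.23, 1.70)
lower = math.exp(math.log(1.45) - 1.96 * se)  # rebuilds the lower CI bound
print(round(se, 3), round(lower, 2))          # → 0.083 1.23
```

Rebuilding the lower bound from the point estimate and recovered SE is a quick consistency check that the three published numbers agree.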


Subject(s)
Social Isolation , United States Department of Veterans Affairs , Veterans/psychology , Activities of Daily Living , Humans , Patient Acceptance of Health Care , United States
5.
J Glaucoma ; 25(10): e855-e860, 2016 10.
Article in English | MEDLINE | ID: mdl-27367136

ABSTRACT

PURPOSE: To perform a longitudinal analysis of the association of corneal haze with intraocular pressure (IOP) in eyes with primary congenital glaucoma (PCG) over 3 years. PATIENTS AND METHODS: Charts of all patients diagnosed with glaucoma of childhood from 2002 to 2012 at our institution were retrospectively reviewed. Inclusion criteria were age 18 years or younger plus elevated IOP or characteristic clinical signs. Exclusion criteria were secondary glaucoma, corneal haze not attributable to PCG, prior ocular surgery, and incomplete follow-up. RESULTS: Of 79 eyes with childhood glaucoma during this period, 36 eyes had PCG [25 patients; 15 male (60.0%); 14 bilateral cases (56.0%)]. Eighteen eyes (13 patients) presented with corneal haze, whereas 18 eyes (12 patients) did not. Eyes with haze were diagnosed at a younger age than eyes without haze (0.79 vs. 5.2 y, P<0.02). During year 1, eyes with haze underwent significantly more IOP-lowering procedures and used significantly fewer IOP-lowering medications. Multivariate analysis revealed that corneal haze increased IOP by 4.63 mm Hg when controlling for treatment over time (P<0.01). Eyes with haze had lower survival curves, with a failure hazard 1.3 times that of eyes without haze. These eyes had a lower proportion of qualified successes than eyes without haze at year 1 (P<0.05), but this was reversed at year 3 (P<0.02). CONCLUSIONS: Eyes with PCG-related corneal haze generally presented with more severe disease than those without haze, but post-management outcomes may be similar to those in eyes without haze.


Subject(s)
Corneal Opacity/physiopathology , Glaucoma/congenital , Intraocular Pressure/physiology , Adolescent , Analysis of Variance , Child , Child, Preschool , Female , Glaucoma/physiopathology , Glaucoma/surgery , Humans , Infant , Infant, Newborn , Longitudinal Studies , Male , Optic Nerve Diseases/complications , Prognosis , Retrospective Studies , Tonometry, Ocular
6.
Ann Epidemiol ; 26(4): 261-6, 2016 04.
Article in English | MEDLINE | ID: mdl-27016951

ABSTRACT

PURPOSE: A well-established literature has shown that social integration strongly patterns health, including mortality risk. However, the extent to which living in high-poverty neighborhoods and having few social ties jointly pattern survival in the United States has not been examined. METHODS: We analyzed data from the Third National Health and Nutrition Examination Survey (1988-1994) linked to mortality follow-up through 2006 and census-based neighborhood poverty. We fit Cox proportional hazards models to estimate associations of social integration and neighborhood poverty with all-cause mortality, both as independent predictors and in joint-effects models, using the relative excess risk due to interaction to test for interaction on an additive scale. RESULTS: In the joint-effects model adjusting for age, gender, race/ethnicity, and individual-level socioeconomic status, exposure to low social integration alone was associated with increased mortality risk (hazard ratio [HR]: 1.42, 95% confidence interval [CI]: 1.28-1.59), while living in an area of high poverty alone did not have a significant effect (HR: 1.10; 95% CI: 0.95-1.28) compared with being jointly unexposed. Individuals simultaneously living in high-poverty neighborhoods and having low levels of social integration had an increased risk of mortality (HR: 1.63; 95% CI: 1.35-1.96). However, the relative excess risk due to interaction was not statistically significant. CONCLUSIONS: Social integration remains an important determinant of mortality risk in the United States independent of neighborhood poverty.
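The additive-scale interaction measure used above, the relative excess risk due to interaction (RERI), can be reproduced arithmetically from the three reported hazard ratios. A minimal sketch; in practice the RERI and its confidence interval come from the fitted Cox model, not from rounded published estimates:

```python
def reri(hr_both, hr_a, hr_b):
    """Relative excess risk due to interaction: RERI = HR_ab - HR_a - HR_b + 1.
    Values above 0 suggest positive interaction on the additive scale."""
    return hr_both - hr_a - hr_b + 1.0

# Hazard ratios reported above (jointly unexposed group is the referent)
hr_joint = 1.63           # low integration AND high neighborhood poverty
hr_low_integration = 1.42
hr_high_poverty = 1.10

print(round(reri(hr_joint, hr_low_integration, hr_high_poverty), 2))  # → 0.11
```

The small positive value is consistent with the abstract's report that the additive interaction did not reach statistical significance.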


Subject(s)
Mortality , Poverty , Residence Characteristics , Social Determinants of Health , Social Participation , Adolescent , Adult , Aged , Aged, 80 and over , Female , Humans , Interpersonal Relations , Male , Middle Aged , Nutrition Surveys , Population Surveillance , Poverty Areas , Social Support , Socioeconomic Factors , United States/epidemiology , Young Adult
7.
Mil Med ; 180(10): 1034-40, 2015 Oct.
Article in English | MEDLINE | ID: mdl-26444465

ABSTRACT

UNLABELLED: The general consensus in studies of individuals seeking federal disability compensation is that individuals denied compensation are healthier than those awarded it. In contrast, studies of military veterans seeking U.S. Department of Veterans Affairs (VA) disability compensation suggest that denied applicants may be as impaired as awarded applicants, and likely have critical, albeit unmet, health care needs. Moreover, although social isolation among U.S. veterans has received some attention, its broad influence on health and health care consumption among veterans denied VA disability compensation is not well understood. OBJECTIVES: To provide a more thorough understanding of denied applicants' health, health care utilization, and social conditions. METHODS: We reviewed published reports of health, health care utilization, and social isolation relevant to U.S. veterans denied VA disability compensation. Of 122 research items initially reviewed, 47 met our inclusion criteria and are summarized herein. RESULTS: Compared with veterans awarded VA disability compensation, those denied have poorer health, use less VA health care, and may experience social isolation. CONCLUSIONS: Veterans denied VA disability compensation may comprise a vulnerable subgroup of veterans in need of supportive services. Such needs may be addressed through evidence-based, targeted outreach programs.


Subject(s)
Health Services Accessibility/trends , Patient Acceptance of Health Care/statistics & numerical data , United States Department of Veterans Affairs/statistics & numerical data , Veterans Disability Claims/statistics & numerical data , Veterans , Humans , United States
8.
Am J Community Psychol ; 56(1-2): 134-44, 2015 Sep.
Article in English | MEDLINE | ID: mdl-26076667

ABSTRACT

Social integration is fundamental to health and well-being. However, few studies have explored how neighborhood contexts pattern the types and levels of social integration that individuals experience. We examined how neighborhood poverty structures two dimensions of social integration: integration with neighbors and social integration more generally. Using data from the United States Third National Health and Nutrition Examination Survey, we linked study participants to the percent poverty in their neighborhood of residence (N = 16,040). Social integration was assessed using a modified Social Network Index, and neighborhood integration was based on yearly visits with neighbors. We fit multivariate logistic regression models that accounted for the complex survey design. Living in high-poverty neighborhoods was associated with lower general social integration but more frequent visits with neighbors. Neighborhood poverty distinctly patterns social integration, demonstrating that contexts shape the extent and quality of social relationships.


Subject(s)
Interpersonal Relations , Poverty Areas , Residence Characteristics , Social Determinants of Health , Social Participation , Adolescent , Adult , Aged , Aged, 80 and over , Female , Humans , Male , Middle Aged , Nutrition Surveys , Poverty , United States , Young Adult
9.
J Glaucoma ; 24(2): 122-6, 2015 Feb.
Article in English | MEDLINE | ID: mdl-23807353

ABSTRACT

PURPOSE: To determine long-term intraocular pressure (IOP) outcomes and risk factors for failure of IOP control in patients whose previous glaucoma surgery was complicated by infectious endophthalmitis. PATIENTS AND METHODS: Retrospective case series of 12 patients with previous glaucoma surgery who presented with infectious endophthalmitis to University Hospital, Newark, NJ, between 1995 and 2006. Failure of IOP control was stratified into two groups: IOP ≥22 mm Hg and IOP ≥16 mm Hg at 3 consecutive follow-up visits. Kaplan-Meier survival analysis was used to determine the failure rate, and a Cox proportional hazards model was used to analyze the effects of pertinent variables on survival. P values <0.05 were considered statistically significant. RESULTS: Twelve patients who had previously undergone glaucoma surgery (8 trabeculectomies and 4 bleb revisions) complicated by infectious endophthalmitis were identified. Mean follow-up time was 43.7 months (range, 10 to 98 months). Of the 12 patients, 9 (75%) failed, 2 (17%) consistently maintained IOP <22 mm Hg, and 1 (8%) maintained IOP <16 mm Hg during the follow-up period. Median survival time was 9.25 months. Age 65 years or older (P=0.0002) was associated with increased risk of IOP failure, whereas initial treatment with vitrectomy was not. Six patients required additional glaucoma surgery during the follow-up period. CONCLUSIONS: After resolution of endophthalmitis in patients with previous glaucoma surgery, IOP control was maintained in only 25% of cases. Half the patients required additional glaucoma surgery.
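For readers unfamiliar with the survival method used above, the Kaplan-Meier product-limit estimator can be sketched in a few lines of Python. The data below are hypothetical, not the study's (follow-up in months; event = 1 means IOP control failed, 0 means censored):

```python
from collections import Counter

def kaplan_meier(times, events):
    """Product-limit survival estimate.
    times: follow-up durations; events: 1 = failure observed, 0 = censored.
    Returns a list of (time, S(time)) pairs at each failure time."""
    fails = Counter(t for t, e in zip(times, events) if e)
    censored = Counter(t for t, e in zip(times, events) if not e)
    at_risk, s, curve = len(times), 1.0, []
    for t in sorted(set(times)):
        d = fails.get(t, 0)
        if d:
            s *= (at_risk - d) / at_risk   # survival drops only at failure times
            curve.append((t, s))
        at_risk -= d + censored.get(t, 0)  # censored subjects leave the risk set
    return curve

# Hypothetical cohort: failures at 3, 5, and 8 months; one subject censored at 5
print(kaplan_meier([3, 5, 5, 8], [1, 1, 0, 1]))
```

The median survival time the study reports (9.25 months) is read off such a curve as the first time at which S(t) falls to 0.5 or below.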


Subject(s)
Endophthalmitis/physiopathology , Eye Infections, Bacterial/physiopathology , Glaucoma/surgery , Intraocular Pressure/physiology , Trabeculectomy/adverse effects , Adult , Aged , Aged, 80 and over , Anti-Bacterial Agents/therapeutic use , Bacteria/isolation & purification , Combined Modality Therapy , Endophthalmitis/microbiology , Endophthalmitis/therapy , Eye Infections, Bacterial/microbiology , Eye Infections, Bacterial/therapy , Female , Glaucoma/physiopathology , Humans , Male , Middle Aged , Proportional Hazards Models , Reoperation , Retrospective Studies , Risk Factors , Survival Analysis , Tonometry, Ocular , Vitrectomy , Young Adult
10.
Transplantation ; 97(3): 337-43, 2014 Feb 15.
Article in English | MEDLINE | ID: mdl-24169340

ABSTRACT

BACKGROUND: It is unclear whether ischemic preconditioning (IPC) of solid organs induces remote IPC (RIPC) in donors after brain death (DBD). METHODS: Outcomes in kidney recipients from 163 DBD enrolled in two randomized trials of liver IPC (5 min, n=62; 10 min, n=101) were obtained retrospectively from the Scientific Registry of Transplant Recipients. Controls were kidney recipients from donors without IPC. Mean cold ischemia times were less than 20 hours. Primary outcomes were delayed graft function, defined as dialysis during the first posttransplantation week, and death-censored graft survival. Secondary outcomes were duration of initial hospital stay, patient survival, and estimated glomerular filtration rate at 6, 12, 36, and 60 months after transplantation. RESULTS: After exclusions (40 kidneys not recovered, 21 not transplanted, 8 transplanted en bloc, 23 with extrarenal organs, and 6 with missing records), 228 recipients were included. Delayed graft function occurred in 23% of No-RIPC and 28% of RIPC kidneys (P=0.54). One- and 3-year graft survival rates were 92% and 90%, respectively, in the No-RIPC group and 90% and 81%, respectively, in the RIPC group (P=0.12), and mean hospital stays were 9.3±13.9 and 9.7±8.2 days, respectively (P=0.15). There were no significant between-group differences in patient survival or estimated glomerular filtration rate at any time point. CONCLUSIONS: Despite design and power limitations, our results suggest that liver IPC in DBD offers no clinical benefit to kidney recipients. Inconsistent efficacy and impracticality severely limit the usefulness of IPC in DBD; other modalities of preconditioning should be tested.


Subject(s)
Delayed Graft Function/prevention & control , Ischemic Preconditioning/methods , Kidney Transplantation/methods , Liver/pathology , Renal Insufficiency/therapy , Adult , Brain Death , Delayed Graft Function/etiology , Female , Glomerular Filtration Rate , Graft Survival , Humans , Length of Stay , Liver Transplantation/methods , Male , Middle Aged , Organ Preservation/methods , Randomized Controlled Trials as Topic , Registries , Renal Insufficiency/mortality , Retrospective Studies , Time Factors , Tissue Donors , Tissue and Organ Harvesting/methods , Treatment Outcome
11.
PLoS One ; 8(9): e66117, 2013.
Article in English | MEDLINE | ID: mdl-24039694

ABSTRACT

The cause of multiple sclerosis (MS), its driving pathogenesis at the earliest stages, and the factors that allow the first clinical attack to manifest remain unknown. Some imaging studies suggest that gray rather than white matter may be involved early, and some postulate that this involvement may be predictive of developing MS; other imaging studies conflict with these findings. To determine whether there is objective molecular evidence of gray matter involvement in early MS, we used high-resolution mass spectrometry to identify proteins in the cerebrospinal fluid (CSF) of first-attack MS patients (two independent groups) compared with established relapsing-remitting (RR) MS patients and controls. We found that the CSF proteins in first-attack patients were differentially enriched for gray matter components (axon, neuron, synapse), whereas myelin components did not distinguish these groups. The results support the view that gray matter dysfunction is involved early in MS and may be integral to the initial clinical presentation.


Subject(s)
Central Nervous System/pathology , Multiple Sclerosis, Relapsing-Remitting/cerebrospinal fluid , Nerve Tissue Proteins/cerebrospinal fluid , Adolescent , Adult , Case-Control Studies , Central Nervous System/metabolism , Female , Humans , Male , Middle Aged , Multiple Sclerosis, Relapsing-Remitting/pathology , Proteomics , Tandem Mass Spectrometry , Young Adult
12.
Graefes Arch Clin Exp Ophthalmol ; 251(3): 653-9, 2013 Mar.
Article in English | MEDLINE | ID: mdl-22910792

ABSTRACT

BACKGROUND: To describe the demographics and outcomes of assault-related open-globe injuries (OGI) at University Hospital (UH), Newark, New Jersey, over a ten-year period. METHODS: The medical records of all subjects presenting to a single university referral center with an OGI were retrospectively analyzed to identify prognostic factors for enucleation and for a final visual acuity (VA) of no light perception (NLP). RESULTS: One hundred forty-eight eyes of 147 patients presented to UH with assault-related OGI. Eighty-two percent of patients were male, and the mean age was 35.9 years. The anatomic site of the wound was zone 3 in the majority (54.1%) of eyes. The most common type of injury was rupture (57.4%), followed by penetrating injury (35.1%). Mean initial and final VA in logMAR were 2.38 ± 0.12 and 2.18 ± 0.16, respectively. Initial VA was NLP in 57 eyes (38.5%); four eyes (2.7%) had an initial VA of ≥20/40. Final VA was NLP in 68 eyes (45.9%), of which 46 (31.1%) were enucleated; 18 eyes (12.2%) had a final VA of ≥20/40. Fifty eyes (33.8%) underwent pars plana vitrectomy (PPV). Significant risk factors for a final VA of NLP or enucleation included initial VA of NLP, perforating or rupture-type OGI, and zone 3 injury. Eyes that sustained penetrating injuries were less likely to have a final VA of NLP or to require enucleation. CONCLUSIONS: Assault-related OGIs carry an extremely poor visual prognosis and a high rate of enucleation; only 18 eyes (12.2%) recovered VA ≥20/40. Initial VA of NLP and zone 3 injury were significant predictors of a final VA of NLP or of undergoing enucleation.
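The logMAR values and Snellen thresholds quoted above are interconvertible for measurable acuities; a minimal sketch (NLP and other non-numeric acuities are assigned conventional logMAR values that vary by study, so only chart-measurable acuities convert cleanly):

```python
import math

def snellen_to_logmar(numerator, denominator):
    """logMAR = log10(denominator / numerator), e.g. 20/40 -> 0.30."""
    return math.log10(denominator / numerator)

def logmar_to_snellen_denominator(logmar, numerator=20):
    """Inverse mapping back to a 20-foot Snellen denominator."""
    return numerator * 10 ** logmar

print(round(snellen_to_logmar(20, 40), 2))        # → 0.3 (the 20/40 cutoff above)
print(round(logmar_to_snellen_denominator(0.3)))  # → 40
```

On this scale the cohort's mean initial acuity of 2.38 logMAR sits far below the worst line of a standard chart, which is why the study reports such poor visual prognoses.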


Subject(s)
Eye Injuries, Penetrating/epidemiology , Hospitals, University/statistics & numerical data , Hospitals, Urban/statistics & numerical data , Violence/statistics & numerical data , Adolescent , Adult , Age Distribution , Aged , Aged, 80 and over , Child , Eye Enucleation/statistics & numerical data , Eye Injuries, Penetrating/surgery , Female , Humans , Male , Middle Aged , New Jersey/epidemiology , Prognosis , Retrospective Studies , Risk Factors , Sex Distribution , Trauma Severity Indices , Visual Acuity , Vitrectomy , Young Adult
13.
PLoS One ; 6(2): e17287, 2011 Feb 23.
Article in English | MEDLINE | ID: mdl-21383843

ABSTRACT

BACKGROUND: Neurologic post-treatment Lyme disease (nPTLS) and chronic fatigue syndrome (CFS) are syndromes of unknown etiology. They share features of fatigue and cognitive dysfunction, making them difficult to differentiate; whether nPTLS is a subset of CFS remains unresolved. METHODS AND PRINCIPAL FINDINGS: Pooled cerebrospinal fluid (CSF) samples from nPTLS patients, CFS patients, and healthy volunteers were comprehensively analyzed using high-resolution mass spectrometry (MS), coupled with immunoaffinity depletion to reduce masking by abundant proteins. Individual patient and healthy-control CSF samples were then analyzed directly using an MS-based, label-free quantitative proteomics approach. We found that both groups, and individuals within the groups, could be distinguished from each other and from healthy controls based on their specific CSF proteins (p<0.01). CFS samples (n = 43) contained 2,783 non-redundant proteins, nPTLS samples (n = 25) contained 2,768, and healthy controls 2,630. Preliminary pathway analysis demonstrated that the data could be useful for generating hypotheses about the pathogenetic mechanisms underlying these two related syndromes. CONCLUSIONS: nPTLS and CFS have distinguishing CSF protein complements. Each condition has CSF proteins that provide candidates for future validation studies and insights into the respective mechanisms of pathogenesis. Distinguishing nPTLS from CFS permits more focused study of each condition and can lead to novel diagnostics and therapeutic interventions.


Subject(s)
Cerebrospinal Fluid Proteins/analysis , Fatigue Syndrome, Chronic/cerebrospinal fluid , Fatigue Syndrome, Chronic/diagnosis , Lyme Disease/cerebrospinal fluid , Lyme Disease/diagnosis , Proteome/analysis , Adolescent , Adult , Case-Control Studies , Cerebrospinal Fluid Proteins/metabolism , Diagnosis, Differential , Fatigue Syndrome, Chronic/metabolism , Fatigue Syndrome, Chronic/therapy , Female , Humans , Lyme Disease/metabolism , Lyme Disease/therapy , Male , Middle Aged , Proteomics/methods , Treatment Outcome , Young Adult
14.
Anat Sci Educ ; 1(1): 3-9, 2008 Jan.
Article in English | MEDLINE | ID: mdl-19177372

ABSTRACT

Team-based learning (TBL) is an instructional strategy that combines independent out-of-class preparation with in-class discussion in small groups. This approach has been successfully adopted by a number of medical educators, and it allowed us to eliminate anatomy lectures and incorporate small-group active learning. Although our strategy is a modified form of classical TBL, adapted to our curricular needs and approach, we use the standard TBL terminology for simplicity. Anatomy lectures were replaced with TBL activities that required pre-class reading of assigned materials, an individual self-assessment quiz, discussion of learning issues derived from the reading assignments, and a group retake of the same quiz for discussion and deeper learning. Students' performance and educational experience in the TBL format were compared with those under the traditional lecture approach, using several in-house unit exams and a final comprehensive subject exam provided by the National Board of Medical Examiners. Students performed better on all exams following the TBL approach than with traditional lecture-based teaching. Students acknowledged that TBL encouraged them to study regularly and allowed them to actively teach and learn from peers, which served to improve their exam performance. We found that a TBL approach to teaching anatomy allowed us to create an active learning environment that helped improve students' performance. Based on our experience, other preclinical courses are now piloting TBL.


Subject(s)
Anatomy/education , Cooperative Behavior , Education, Medical, Undergraduate , Embryology/education , Group Processes , Problem-Based Learning , Comprehension , Curriculum , Educational Measurement , Humans , Models, Educational , Program Development , Program Evaluation , Schools, Medical , Time Factors
15.
Respir Care ; 49(11): 1320-5, 2004 Nov.
Article in English | MEDLINE | ID: mdl-15507166

ABSTRACT

BACKGROUND: A study was undertaken to determine factors present in adult patients, newly admitted to the hospital, that predict the inability of noninvasive positive-pressure ventilation (NPPV) to sustain the work of breathing and avoid endotracheal intubation. METHODS: Data were collected prospectively from patients with acute respiratory failure who were admitted to Hackensack University Medical Center from August 2001 to August 2002 and received NPPV. Physiologic characteristics of those patients on admission were compiled into a database, with the hypothesis that those with the worst initial physiologic characteristics would subsequently fail NPPV and require endotracheal intubation with mechanical ventilation. RESULTS: Seventy-five patients were included. Sixty-four patients (85%) successfully avoided endotracheal intubation and were discharged. Of the 11 patients who failed NPPV, 8 were intubated and 5 died. The groups were comparable in age, sex, arterial blood gas values, and Acute Physiology and Chronic Health Evaluation score (p > 0.05). The success group, however, had a significantly higher body mass index (29 kg/m(2) vs 23 kg/m(2), p = 0.0167). CONCLUSIONS: Our study suggests that the failure rate of NPPV is low (15%); that patients with a low body mass index are more likely to fail NPPV and require endotracheal intubation; and that patients who fail NPPV have a higher risk of mortality (p = 0.00016).
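The body mass index difference behind the study's main finding is a simple ratio of weight to height squared. The heights and weights below are hypothetical illustrations of the two reported group means, not study data:

```python
def bmi(weight_kg, height_m):
    """Body mass index: weight (kg) divided by height (m) squared."""
    return weight_kg / height_m ** 2

# Hypothetical 1.70 m patients at roughly the two group means reported above
print(round(bmi(84.0, 1.70), 1))  # → 29.1, near the NPPV-success mean of 29
print(round(bmi(66.5, 1.70), 1))  # → 23.0, near the NPPV-failure mean of 23
```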


Subject(s)
Body Mass Index , Positive-Pressure Respiration , Respiratory Distress Syndrome/physiopathology , Respiratory Distress Syndrome/therapy , Adult , Aged , Aged, 80 and over , Female , Humans , Intubation, Intratracheal , Logistic Models , Male , Middle Aged , Outcome Assessment, Health Care , Prospective Studies , Risk Factors , Statistics, Nonparametric , Work of Breathing
16.
Sleep Med ; 5(5): 501-6, 2004 Sep.
Article in English | MEDLINE | ID: mdl-15341897

ABSTRACT

BACKGROUND AND PURPOSE: To determine the benefit of repeat polysomnography, with or without esophageal pressure (PES) monitoring, for diagnosing sleep apnea syndrome (SAS) in patients with symptoms of sleep apnea who have had a 'negative' single-night polysomnogram (PSG). PATIENTS AND METHODS: This is a retrospective investigation of 1,187 patients seen in our sleep laboratory from January to December 2001, of whom 709 were adults suspected of having sleep apnea. Following a single PSG, 588 patients were diagnosed with sleep apnea and 121 had negative PSGs (apnea-hypopnea index <5 events per hour). Of the 121 patients, 92 continued to complain of unexplained sleepiness, loud snoring, or apnea, symptoms that had also been documented at their initial evaluation (PSG or multiple sleep latency testing). The remaining 29 patients had no further complaints, or another medical cause of their sleepiness (e.g., asthma) was established after the single-night PSG. Of the 92 patients, 28 underwent additional screening with both repeat PSG and PES monitoring within the following 6 months. RESULTS: With repeat PSG and PES monitoring, 18 of the 28 patients with previously negative PSGs were diagnosed with sleep apnea. The sensitivity of a single-night PSG was thus 97%, with a false-negative rate of 3%. Only 12 of the 28 would have been positive on polysomnographic criteria alone, without the additional PES monitoring. The other 10 of the 28 remained negative, and further evaluation revealed underlying medical problems (e.g., nocturnal asthma) that explained their symptoms. CONCLUSIONS: There is a clear benefit to repeat PSG, with or without PES monitoring, for patients suspected of having SAS who have a prior negative PSG and continued symptoms.
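The reported 97% sensitivity and 3% false-negative rate can be reproduced arithmetically, on the assumption that the 18 retest-diagnosed patients are the only false negatives of the single-night PSG:

```python
initial_positives = 588   # diagnosed by the first PSG
retest_positives = 18     # diagnosed only after repeat PSG with PES monitoring
total_cases = initial_positives + retest_positives

sensitivity = initial_positives / total_cases          # 588 / 606
false_negative_rate = retest_positives / total_cases   # 18 / 606

print(f"{sensitivity:.0%} sensitivity, {false_negative_rate:.0%} false negatives")
# → 97% sensitivity, 3% false negatives
```

Any additional false negatives among the patients never retested would lower the true sensitivity, which is why this figure is an upper bound for the single-night test.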


Subject(s)
Polysomnography , Sleep Apnea, Obstructive/diagnosis , Adult , Aged , Asthma/diagnosis , Diagnosis, Differential , Disorders of Excessive Somnolence/diagnosis , Esophagus/physiopathology , Female , Humans , Male , Manometry , Middle Aged , Retrospective Studies , Sensitivity and Specificity , Sleep Apnea, Obstructive/therapy , Snoring/diagnosis
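The 97% sensitivity figure in this abstract follows directly from the reported counts: 588 patients diagnosed on the first single-night PSG plus the 18 cases that were initially negative and later confirmed on repeat testing. A minimal sketch of the arithmetic (variable names are illustrative, and it treats only the 18 repeat-test diagnoses as false negatives, since patients who were not re-tested cannot be counted):

```python
# Counts reported in the abstract.
initial_positives = 588   # diagnosed with sleep apnea on the first single-night PSG
missed_cases = 18         # negative first PSGs later confirmed as sleep apnea

true_positives = initial_positives
false_negatives = missed_cases

# Sensitivity = TP / (TP + FN); the false-negative rate is its complement.
sensitivity = true_positives / (true_positives + false_negatives)
false_negative_rate = 1 - sensitivity

print(f"sensitivity: {sensitivity:.1%}")            # ~97%
print(f"false-negative rate: {false_negative_rate:.1%}")  # ~3%
```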
17.
J Am Diet Assoc ; 103(12): 1632-8, 2003 Dec.
Article in English | MEDLINE | ID: mdl-14647090

ABSTRACT

OBJECTIVE: To examine how registered dietitians who have completed one of two physical assessment programs use the knowledge and skills learned in practice, and whether the method of instruction had an effect on use of those skills. SUBJECTS/SETTING: Surveys were mailed to 891 persons, all of whom had completed a Dietitians in Nutrition Support dietetic practice group or University of Medicine and Dentistry of New Jersey continuing education program. Four hundred seventeen surveys were returned and 407 were usable. STATISTICAL ANALYSIS: chi(2) analysis and stepwise logistic regression were used to analyze the data. Statistical significance was set at P=.05. RESULTS: Sixty percent of respondents worked in a clinical setting. Four of the five most-used competencies were similar between the two programs. More registered dietitians are using physical assessment competency information in clinical assessment, but are not performing the competencies independently. Respondents with the Certified Diabetes Educator credential (P=.007) and the Certified Nutrition Support Dietitian credential (P=.215) were more likely to use select physical assessment competencies. Confidence was reported as enhancing use of physical assessment competencies (n=153, 45%), and time was a barrier to their use (n=159, 52%). APPLICATIONS/CONCLUSIONS: There were no significant differences in use of physical assessment competencies between the University of Medicine and Dentistry of New Jersey program and the Dietitians in Nutrition Support program. Although the differences were not statistically significant, there appeared to be more use of physical assessment competencies by those who received additional training and by those who completed the University of Medicine and Dentistry of New Jersey program. This study reveals that registered dietitians are using these skills in clinical assessment; however, they must move to actually performing physical assessment competencies in practice.


Subject(s)
Clinical Competence , Dietetics/standards , Education, Continuing , Physical Examination/standards , Adult , Aged , Certification , Data Collection , Dietetics/education , Dietetics/statistics & numerical data , Education, Continuing/methods , Female , Health Knowledge, Attitudes, Practice , Humans , Logistic Models , Male , Middle Aged , Physical Examination/methods , Video Recording
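The chi(2) analysis named in this abstract compares categorical outcomes, e.g. credential status against whether a competency is used. A minimal sketch of a Pearson chi-square statistic for a 2x2 table, with hypothetical counts that are not taken from the study:

```python
def chi_square_2x2(table):
    """Pearson chi-square statistic for a 2x2 contingency table
    (two [count, count] rows), without continuity correction."""
    row_totals = [sum(row) for row in table]
    col_totals = [table[0][j] + table[1][j] for j in range(2)]
    grand = sum(row_totals)
    stat = 0.0
    for i in range(2):
        for j in range(2):
            expected = row_totals[i] * col_totals[j] / grand
            stat += (table[i][j] - expected) ** 2 / expected
    return stat

# Hypothetical counts: credentialed vs. not, cross-tabulated
# against use vs. non-use of a given competency.
observed = [[60, 40],   # credentialed: users, non-users
            [45, 55]]   # not credentialed: users, non-users
stat = chi_square_2x2(observed)
print(f"chi-square = {stat:.3f}")  # compare against 3.841 (df=1, alpha=.05)
```

With these made-up counts the statistic exceeds the df=1 critical value, so the association would be judged significant at P=.05.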
18.
Transplantation ; 75(10): 1683-7, 2003 May 27.
Article in English | MEDLINE | ID: mdl-12777856

ABSTRACT

BACKGROUND: Historically, organ recovery rates in donors with cardiac arrest (CA) have been low, presumably from hemodynamic instability. We hypothesized that donor resuscitation has improved hemodynamic stability and organ recovery in CA donors, and that CA triggers ischemic preconditioning (IP) in liver grafts. METHODS: A total of 131 donor pairs with and without CA were matched in age, gender, and year of recovery. Hemodynamic stability was determined by vasopressor use. Abdominal and thoracic organs recovered and livers transplanted were compared between the groups. Liver graft function, injury, and IP benefit were examined by comparing liver chemistries after transplantation and postperfusion biopsies between recipients of grafts from both groups (n=40 each). RESULTS: Hemodynamic stability was similar in both groups, but recovery of thoracic organs was significantly lower in CA versus non-CA donors (35 vs. 53%, P<0.01). On the other hand, recovery rates of three or more abdominal organs from CA donors approached those of non-CA donors (77 vs. 87%, not significant). Although significantly fewer livers were transplanted from CA donors (69 vs. 85%, P<0.01), posttransplantation graft function and injury parameters were similar between the two groups, and CA did not appear to trigger IP. CONCLUSION: Compared with historical data, cardiovascular stability and abdominal organ recovery rates have improved considerably in CA donors. Liver grafts from CA donors function similarly to grafts from non-CA donors with no IP from CA. Our data support the increased use of livers and other organs from donors with CA.


Subject(s)
Heart Arrest , Heart/physiopathology , Ischemic Preconditioning , Liver Transplantation , Tissue Donors , Tissue and Organ Harvesting , Abdomen/surgery , Adolescent , Adult , Cohort Studies , Female , Hemodynamics , Humans , Liver/physiopathology , Liver Transplantation/statistics & numerical data , Male , Middle Aged , Transplantation, Homologous
19.
Med Sci Monit ; 9(5): SR23-8, 2003 May.
Article in English | MEDLINE | ID: mdl-12761469

ABSTRACT

The purpose of this study is to determine the effectiveness of increased clinical correlation teaching in a first-year anatomy program and its effect on subsequent student performance on departmental and nationally standardized examinations. Basic science curricula in medical schools are increasingly being taught with clinical correlation in order to provide a 'seamless transition' from the preclinical to the clinical years. The National Board examination for basic science is also increasingly clinically oriented and designed to test students' problem-solving and applied skills. In this study, students' course grades and standardized subject examination scores over a five-year period are compared statistically. The results indicate that incorporation of clinical correlation teaching and more problem-focused assessment of student learning resulted in better performance on the standardized National Board anatomy examination. We suggest that such an approach can be easily adopted in other preclinical courses.


Subject(s)
Anatomy/education , Certification , Education, Medical, Undergraduate/standards , Education, Medical, Undergraduate/trends , Humans , United States
20.
J Rheumatol ; 29(2): 331-4, 2002 Feb.
Article in English | MEDLINE | ID: mdl-11838852

ABSTRACT

OBJECTIVE: To evaluate the effect of local application of ice on the duration and severity of acute gouty arthritis. METHODS: Nineteen patients with acute gout were enrolled and randomized into 2 groups. Group A (n = 10) received topical ice therapy, prednisone 30 mg PO tapered to 0 over 6 days, and colchicine 0.6 mg/day. Group B (n = 9) was the control group, given the same regimen but without the ice therapy. The patients were followed for one week. RESULTS: The mean reduction in pain for patients treated with ice therapy was 7.75 cm (SD +/- 2.58) on a 10 cm visual analog scale, compared with 4.42 cm (SD +/- 2.96) for the control group. By the Wilcoxon rank-sum test, the difference in pain reduction between the ice therapy and control groups was significant (p = 0.021). Joint circumference and synovial fluid volume also tended to be reduced more after one week of therapy in the ice group than in controls, but these differences did not reach statistical significance. CONCLUSION: The group treated with ice had a significantly greater reduction in pain compared with the control group. Although the clinical improvement was impressive, the small sample size prevented us from showing statistically significant improvement in all the variables, which tended to suggest that the effect was more than simply analgesic. Cold applications may be a useful adjunct to the treatment of acute gouty arthritis.


Subject(s)
Arthritis, Gouty/therapy , Cryotherapy , Ice , Arthritis, Gouty/physiopathology , Humans , Joints/physiopathology , Pain/physiopathology , Pain Management , Pain Measurement , Prospective Studies , Synovial Fluid/cytology , Treatment Outcome
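The Wilcoxon rank-sum test used in this abstract compares two independent samples by pooling and ranking all observations. A minimal stdlib-only sketch using the normal approximation, with hypothetical pain-reduction scores that are illustrative only, not the study's raw data (it also assumes no tied values, so average-rank tie handling is omitted):

```python
import math

def rank_sum_z(a, b):
    """Wilcoxon rank-sum z-score via the normal approximation.
    Assumes no tied values (average-rank tie handling omitted)."""
    n_a, n_b = len(a), len(b)
    pooled = sorted(a + b)
    rank = {v: i + 1 for i, v in enumerate(pooled)}  # 1-based ranks
    w = sum(rank[v] for v in a)                      # rank sum of group a
    mean_w = n_a * (n_a + n_b + 1) / 2               # E[W] under H0
    sd_w = math.sqrt(n_a * n_b * (n_a + n_b + 1) / 12)
    return (w - mean_w) / sd_w

# Hypothetical pain-reduction scores (cm on a 10 cm VAS).
ice = [9.1, 8.4, 7.9, 8.8, 6.5, 7.2, 9.5, 5.8, 8.0, 6.3]
control = [4.1, 5.5, 3.2, 6.1, 2.8, 4.9, 5.0, 3.7, 4.4]
z = rank_sum_z(ice, control)
print(f"z = {z:.2f}")  # |z| > 1.96 -> p < 0.05 (two-sided)
```

With small samples like the study's (n = 10 and n = 9), exact rank-sum tables are preferable to the normal approximation; the sketch only shows the mechanics of the statistic.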