Results 1 - 4 of 4
1.
Hosp Pediatr; 1(1): 46-50, 2011 Jul 01.
Article in English | MEDLINE | ID: mdl-24510929

ABSTRACT

CONTEXT: Fever without source (FWS) in children 3 to 36 months of age is a common presenting complaint. Because of changes in immunization practices and their effects on rates of bacteremia, older guidelines may no longer be applicable. We reviewed the literature regarding the necessity of obtaining a blood culture in non-toxic children in this age group with FWS. DATA SOURCES: We conducted a MEDLINE search on the topic of bacteremia in febrile children 3 to 36 months of age, from 2004 to the present. RESULTS: Eight studies were included. Although the studies varied in approach and analysis, all suggested that the rate of bacteremia in a non-toxic, febrile child 3 to 36 months of age is less than 1%. CONCLUSIONS: Strong consideration should be given to forgoing blood culture in a non-toxic child 3 to 36 months of age with FWS.
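The sub-1% bacteremia rate cited above is a pooled impression across studies. For a sense of scale, the sketch below shows how a single study's rate and its 95% confidence interval could be computed from raw counts using a Wilson score interval; the counts are hypothetical and not drawn from any of the eight reviewed studies.

```python
# Illustrative only: a study's bacteremia rate and its 95% confidence
# interval computed from raw counts (Wilson score interval). The counts
# are hypothetical, not taken from the reviewed studies.
from math import sqrt

def wilson_ci(events: int, n: int, z: float = 1.96) -> tuple[float, float]:
    """95% Wilson score interval for the proportion events / n."""
    p = events / n
    denom = 1 + z**2 / n
    center = (p + z**2 / (2 * n)) / denom
    half = z * sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return center - half, center + half

# e.g. 6 positive blood cultures among 1000 non-toxic febrile children
events, n = 6, 1000
low, high = wilson_ci(events, n)
print(f"rate = {events / n:.2%}, 95% CI {low:.2%} to {high:.2%}")
```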

2.
Ambul Pediatr; 8(3): 195-9, 2008.
Article in English | MEDLINE | ID: mdl-18501867

ABSTRACT

OBJECTIVE: The aim of this study was to evaluate a culturally effective health care (CEHC) curriculum integrated into the real-time clinical experience of a third-year medical student pediatric clerkship. METHODS: The intervention group (n = 22) and the nonintervention group (n = 69) consisted of students who were assigned to one of two sites for their clerkship; students did not volunteer for the curriculum. The curriculum was developed in 2002 based on a local needs assessment of students and parents, key CEHC concepts, and expert input. Learning strategies included incorporation of CEHC issues into clinic precepting, attending rounds, and written histories. Evaluation methods were preintervention and postintervention knowledge tests and Likert-type attitudinal surveys, and a final objective structured clinical examination (OSCE; nonintervention group, n = 22; intervention group, n = 22). RESULTS: Pretest knowledge scores were similar in both groups, but post-test scores differed significantly: the intervention group demonstrated a higher gain in knowledge scores (42% vs 5%; P < .001). On the attitudinal survey, the intervention group demonstrated significantly higher gains in observed role modeling (85% vs 31%; P = .01), self-perceived skill (82% vs 19%; P < .001), and attitude (21% vs 0%; P = .02), but not in self-perceived knowledge (65% vs 15%; P = .14). The intervention group also performed significantly better on the folk (83% vs 70%; P = .02) and language (75% vs 63%; P = .01) OSCE stations and had a significantly higher total OSCE score (79% vs 68%; P = .01). CONCLUSION: A CEHC curriculum stressing clinical relevance was successfully incorporated into the real-time experience of a third-year medical student pediatric clerkship. Students demonstrated significant gains in knowledge, attitudinal domains, and clinical skills.


Subject(s)
Arabs/ethnology; Clinical Clerkship/organization & administration; Cultural Competency/education; Curriculum; Islam; Pediatrics/education; Humans; Program Evaluation
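The knowledge-gain comparison reported above (42% vs 5%; P < .001) is a between-group comparison of pre-to-post change, and the abstract does not state which statistical test was used. The sketch below shows one common approach, an independent-samples t-test on per-student gain scores, using fabricated data sized to match the two groups (n = 22 and n = 69).

```python
# Illustrative only: one common way to compare knowledge-score gains
# between two groups. The abstract does not state which test was used,
# and the per-student gains below are fabricated for the example.
import numpy as np
from scipy import stats

rng = np.random.default_rng(seed=0)

# Per-student gain = post-test percentage score minus pretest score.
intervention_gain = rng.normal(loc=42, scale=12, size=22)     # n = 22
nonintervention_gain = rng.normal(loc=5, scale=12, size=69)   # n = 69

t_stat, p_value = stats.ttest_ind(intervention_gain, nonintervention_gain)
print(f"mean gain: {intervention_gain.mean():.0f}% vs {nonintervention_gain.mean():.0f}%")
print(f"t = {t_stat:.2f}, p = {p_value:.2g}")
```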
3.
Teach Learn Med; 18(3): 267-72, 2006.
Article in English | MEDLINE | ID: mdl-16776616

ABSTRACT

BACKGROUND: Instruction in evidence-based medicine (EBM) has been widely incorporated into medical school curricula with little evidence of its effectiveness. Our goal was to create, implement, and validate a computer-based assessment tool that measured medical students' EBM skills. DESCRIPTION: As part of a required objective structured clinical examination, we developed a specific case scenario in which students (a) asked a structured clinical question using a standard framework, (b) generated effective MEDLINE search terms to answer a specific question, and (c) selected the most appropriate of 3 abstracts generated from a search, justifying why it best applied to the patient scenario. EVALUATION: Among the 3 blinded raters, interrater reliability was very good, with 84%, 94%, and 96% agreement on the scoring for the three components (kappa = .64, .82, and .91, respectively). In addition, students found the station appropriately difficult for their level of training. CONCLUSIONS: This computer-based tool appears to measure several EBM skills independently and combines simple administration and scoring. Its generalizability to other cases and settings requires further study.


Subject(s)
Clinical Competence; Computers; Evidence-Based Medicine/education; Program Evaluation; Students, Medical; Curriculum; Educational Measurement/methods; Humans
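Both this study and the one that follows report rater agreement as a raw percentage alongside a kappa statistic. The sketch below shows how Cohen's kappa for a pair of raters relates observed agreement to chance-corrected agreement; the pass/fail ratings are hypothetical, and the abstracts do not specify how kappa was combined across the three raters (pairwise versus Fleiss' kappa are common choices).

```python
# Illustrative only: Cohen's kappa for two raters scoring the same
# items, on hypothetical pass/fail ratings (not the study's data).
from collections import Counter

def cohen_kappa(rater_a, rater_b):
    """kappa = (p_o - p_e) / (1 - p_e): observed agreement corrected
    for the agreement expected by chance from the raters' marginals."""
    n = len(rater_a)
    # Observed agreement: fraction of items scored identically.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement: product of each rater's marginal frequencies.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    p_e = sum((freq_a[c] / n) * (freq_b[c] / n) for c in freq_a | freq_b)
    return (p_o - p_e) / (1 - p_e)

# 25 hypothetical students: the raters agree on 21 of 25 (84%).
rater_1 = ["pass"] * 18 + ["fail"] * 7
rater_2 = ["pass"] * 16 + ["fail"] * 2 + ["pass"] * 2 + ["fail"] * 5
print(f"agreement = {sum(a == b for a, b in zip(rater_1, rater_2)) / 25:.0%}")
print(f"kappa = {cohen_kappa(rater_1, rater_2):.2f}")
```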
4.
Acad Med; 77(11): 1157-8, 2002 Nov.
Article in English | MEDLINE | ID: mdl-12431934

ABSTRACT

OBJECTIVE: To create a feasible, valid, and reliable tool to measure third-year medical students' skills in evidence-based medicine (EBM). DESCRIPTION: EBM skills (asking clinical questions, finding appropriate medical information resources, and appraising and applying them to patients) involve higher-order critical thinking abilities and are essential to being a competent physician. Students at our institution must pass a required objective structured clinical examination (OSCE) at the end of their third year. As part of this exam, we developed a new 20-minute computer-based station to assess students' EBM skills. Using a specific case scenario, we asked the students to (1) ask a question using the population/intervention/comparison/outcome (PICO) framework; (2) generate appropriate search terms, given a specific question; and (3) select an appropriate abstract to answer a given question and state why two other abstracts were not appropriate. Prior to the assessment, we determined grading and passing criteria for each of the three components and for the station overall. Of the 140 students who completed the station, the percentages that passed the components were 71%, 81%, and 49%, respectively, with only 29% passing all three parts. Preliminary analysis of the station's psychometric properties shows very good to excellent interrater reliability, with 65%, 67%, and 94% agreement on the scoring for the components, and kappas of .64, .82, and .94, respectively. DISCUSSION: Although there are many curricula for teaching EBM concepts, there are few tools to measure whether students are competent in applying their EBM skills. Our pilot station appears to be an innovative and promising tool for measuring several EBM skills independently. Because it is computer-based, it is relatively simple to administer, grade, and evaluate. Although preliminary data show good interrater reliability with our use of a single case, future work will include further testing of reliability and assessment of different types of cases. We will also use the results of this assessment to drive continuous improvement in our EBM curriculum. The students who completed this pilot station had not received an extensive formal EBM curriculum, whereas future groups will. We will also explore whether scores on our station correlate with those on other OSCE stations that assess critical thinking skills, or with a student's clinical grades or overall class standing. We hope to test these hypotheses: (1) skills used in EBM are useful and valid measures of critical thinking abilities in learners, and (2) tools such as ours will help to measure these essential competencies.


Subject(s)
Clinical Competence; Computers; Evidence-Based Medicine/education; Students, Medical; Curriculum; Humans; Psychometrics
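The station's first two tasks, framing a PICO question and generating MEDLINE search terms, were performed by hand by the students. As a rough illustration of the kind of Boolean query that step produces, the sketch below builds a query from hypothetical PICO components and submits it to NCBI's public E-utilities esearch endpoint; the endpoint and its db/term/retmode/retmax parameters exist, but the query content is an assumption, not the study's case scenario.

```python
# Rough illustration of the question-to-search-terms step: build a
# Boolean PubMed query from hypothetical PICO components and run it
# against NCBI's public E-utilities esearch endpoint. The PICO content
# below is an assumption, not the study's case scenario.
import requests

ESEARCH_URL = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"

pico = {
    "population": "febrile infants",
    "intervention": "blood culture",
    "comparison": "observation",          # often folded into the other terms
    "outcome": "occult bacteremia",
}

term = f'({pico["population"]}) AND ({pico["intervention"]}) AND ({pico["outcome"]})'

resp = requests.get(
    ESEARCH_URL,
    params={"db": "pubmed", "term": term, "retmode": "json", "retmax": 5},
    timeout=10,
)
resp.raise_for_status()
result = resp.json()["esearchresult"]
print("hits:", result["count"])
print("first PMIDs:", result["idlist"])
```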