1.
Acad Pediatr ; 21(1): 170-177, 2021.
Article in English | MEDLINE | ID: mdl-32619544

ABSTRACT

OBJECTIVE: Professional development programs (PDPs) within academic professional organizations rely on faculty volunteers, but little is known about the volunteering process and experience. Our aim was to gain insights into the initial decision to volunteer, the experience of volunteering, and the decision to re-volunteer or not (ie, to remain or leave as a volunteer). The study setting was a PDP of the Academic Pediatric Association, the Educational Scholars Program. METHODS: In 2014, 13 Educational Scholars Program faculty members participated in semistructured phone interviews. The authors performed a general inductive analysis of the data, creating codes and analyzing coded data for emergent themes, which led to the creation of a model for recruiting and sustaining volunteers. RESULTS: Four themes related to the initial decision to volunteer and the decision to re-volunteer or not emerged (self-interest and altruism, reputation of the program, relevant skill set, and doability), as did 4 themes related to the experience of volunteering (emotional impact, career advancement and professional recognition, professional growth, and doability). The relationships among the themes led to the creation of a model of volunteering, involving a metaphorical volunteerism "tank" that is full when faculty initially volunteer and subsequently fills or empties as a result of dynamic interplay between the themes for each individual. CONCLUSIONS: Leaders of PDPs may find our model of volunteering beneficial for enhancing the emotional and tangible benefits and minimizing the logistical issues of volunteering. This information should contribute to success in recruiting and retaining the volunteers who are essential for developing and sustaining PDPs.


Subject(s)
Faculty , Volunteers , Child , Humans
2.
Acad Pediatr ; 20(1): 97-103, 2020.
Article in English | MEDLINE | ID: mdl-31404708

ABSTRACT

OBJECTIVE: Research on how medical students choose a career in pediatrics is either dated or conflated with primary care career choice. Capitalizing on student participation in an innovative, time-variable, competency-based pathway program, Education in Pediatrics Across the Continuum (EPAC), the authors explored the process of career decision-making in students at 5 medical schools (including 4 EPAC sites) who began medical school with an interest in pediatrics. METHODS: Individual, semistructured interviews were conducted with students in 5 groups: Group 1: accepted into EPAC, n = 8; Group 2: accepted into EPAC, opted out, n = 4; Group 3: applied to EPAC, not accepted, pursued pediatrics, n = 4; Group 4: applied to EPAC, not accepted, did not pursue pediatrics, n = 3; Group 5: pursued pediatrics at a non-EPAC site, n = 6. Data collection and analysis occurred iteratively, with inductive coding revealing patterns in the data that were explored in subsequent interviews and refined in the final analysis. RESULTS: All students described intrinsic guiding principles, that is, "doing what you love," that attracted them to pediatrics. They described extrinsic, phase-specific experiences before medical school, before clerkship, and in clerkship that shaped their perceptions of a career in pediatrics and shed light on the collective values of different specialties. Students' assessment of how their guiding principles aligned with the collective values of pediatrics, which they encountered in the clerkship phase, was key to making career decisions. CONCLUSIONS: Intrinsic and extrinsic factors do not act alone but interact in clerkships and influence the career choice of students who enter medical school with an interest in pediatrics.


Subject(s)
Career Choice , Pediatrics/education , Schools, Medical , Adult , Female , Humans , Interviews as Topic , Male , Pilot Projects , Qualitative Research , United States
3.
J Grad Med Educ ; 11(6): 685-690, 2019 Dec.
Article in English | MEDLINE | ID: mdl-31871570

ABSTRACT

BACKGROUND: Primary care forms a critical part of pediatricians' practices, yet the most effective ways to teach primary care during residency are not known. OBJECTIVE: We established a new primary care curriculum based on Malcolm Knowles' theory of andragogy, with brief clinical content that is easily accessible and available in different formats. METHODS: We used Kern's model to create the curriculum. In 2013, we implemented weekly e-mails with links to materials on our learning management system, including moderators' curricular content, resident-developed quizzes, and podcasts. After 3 years, we evaluated the curriculum with resident focus groups, retrospective pre-/post-resident surveys, faculty feedback, a review of materials accessed, and resident attendance. RESULTS: From content analysis of the focus groups, we learned that residents found the curriculum beneficial but that it was not always possible to do the pre-work. The resident survey, with a response rate of 87% (71 of 82), showed that residents perceived improvement in 37 primary care clinical skills, with differences of 0.64 to 1.46 on 5-point scales (P < .001 for all). Faculty feedback was positive regarding curriculum organization and structure, but patient care often precluded devoting time to discussing the curriculum. In other ways, our results were disappointing: 51% of residents did not access the curriculum materials, 51% did not open their e-mails, only 37% completed any of the quizzes, and residents attended the weekly conference 46% of the time. CONCLUSIONS: Although residents accessed the curriculum less than expected, their self-assessments reflect perceptions of improvement in their clinical skills after implementation.


Subject(s)
Curriculum , Pediatricians/education , Primary Health Care , Clinical Competence , Colorado , Education, Medical, Graduate/methods , Focus Groups , Humans , Internship and Residency , Retrospective Studies , Surveys and Questionnaires
4.
Acad Med ; 94(3): 338-345, 2019 03.
Article in English | MEDLINE | ID: mdl-30475269

ABSTRACT

In 2011, the Education in Pediatrics Across the Continuum (EPAC) Study Group recruited four medical schools (University of California, San Francisco; University of Colorado; University of Minnesota; and University of Utah) and their associated pediatrics clerkship and residency program directors to be part of a consortium to pilot a model designed to advance learners from undergraduate medical education (UME) to graduate medical education (GME) and then to fellowship or practice based on competence rather than time spent in training. The central design features of this pilot included predetermined expectations of performance and transition criteria to ensure readiness to progress from UME to GME, using the Core Entrustable Professional Activities for Entering Residency (Core EPAs) as a common assessment framework. Using this framework, each site team (which included, but was not limited to, the EPAC course, pediatric clerkship, and pediatric residency program directors) monitored learners' progress, with the site's clinical competency committee marking the point of readiness to transition from UME to GME (i.e., the attainment of supervision level 3a). Two of the sites implemented time-variable transition from UME to GME, based on when a learner met the performance expectations and transition criteria. In this article, the authors describe each of the four sites' implementation of Core EPA assessment and their approach to gathering the data necessary to determine readiness for transition. They conclude by offering recommendations and lessons learned from the pilot's first seven years of development, adaptation, and implementation of assessment strategies across the sites, and discussing next steps.


Subject(s)
Competency-Based Education/statistics & numerical data , Educational Measurement/methods , Clinical Competence , Education, Medical, Graduate , Education, Medical, Undergraduate , Humans
5.
Acad Pediatr ; 18(3): 357-359, 2018 04.
Article in English | MEDLINE | ID: mdl-29408680

ABSTRACT

Management of referral and consultation is an entrustable professional activity for pediatric residents; however, few tools exist to teach these skills. We designed and implemented tools to prompt discussion, feedback, and reflection about the process of referral, notably including the family's perspective.


Subject(s)
Clinical Competence , Internship and Residency , Pediatrics/education , Referral and Consultation , Education, Medical, Graduate , Feedback , Humans , Learning
6.
Acad Pediatr ; 18(3): 354-356, 2018 04.
Article in English | MEDLINE | ID: mdl-29247792

ABSTRACT

A process and tool that prompts learners to think about and reflect on their clinical performance was implemented. Learner narrative reflections about their work and faculty feedback, both captured in the moment, provided data for decisions about level of performance in a competency-based assessment system.


Subject(s)
Clinical Competence , Education, Medical, Undergraduate/methods , Learning , Narration , Faculty, Medical , Humans , Students, Medical
7.
Teach Learn Med ; 29(3): 337-350, 2017.
Article in English | MEDLINE | ID: mdl-28632010

ABSTRACT

PROBLEM: Clinical educators at U.S. academic health centers are frequently disadvantaged in the academic promotion system, lacking needed faculty development, mentoring, and networking support. INTERVENTION: In 2006, we implemented the national Educational Scholars Program to offer faculty development in educational scholarship for early career educators in pediatrics. We aimed to provide them with skills, experience, and initial success in educational scholarship and dissemination. The 3-year curriculum is delivered in interactive sessions at the annual pediatric academic meetings and online intersession modules. Curriculum content progresses from educational scholarship and implementing scholarly projects to dissemination and professional networking. Intersession modules address project planning, building an educator portfolio, reviewing the literature, using technology, authorship, and peer review. Concurrently, all scholars must complete a mentored educational project and demonstrate national dissemination of a peer-reviewed product to obtain a Certificate of Excellence in Educational Scholarship. CONTEXT: The setting of this study was a national, longitudinal, cohort-based faculty development program built within the Academic Pediatric Association, a 2,000-member professional organization. OUTCOME: In 10 years, the Educational Scholars Program has enrolled 172 scholars in 8 cohorts; 94 have graduated so far. We describe how formative evaluation guided curriculum refinement and process improvement. Summative evaluations show that faculty and scholars were satisfied with the program. Participant outcomes from Cohort 1, assessed at Kirkpatrick's four levels of evaluation, demonstrate increases in scholarly productivity, leadership activities, and academic promotions. LESSONS LEARNED: Curriculum building is a dynamic process of ongoing evaluation and modification. Our program benefited from designing an integrated and focused curriculum, developing educational principles to guide program improvements, creating curricular tools to help learners organize and document their efforts, supporting project-based learning with expert mentoring, and facilitating peer and faculty networking and collaboration. A national, longitudinal faculty development program can support growth in academic knowledge and skills, promote professional networking, and thereby enrich educators' career opportunities.


Subject(s)
Curriculum , Faculty, Medical , Program Evaluation , Staff Development , Female , Humans , Male , Program Development , Surveys and Questionnaires
9.
MedEdPORTAL ; 12: 10526, 2016 Dec 31.
Article in English | MEDLINE | ID: mdl-30800729

ABSTRACT

INTRODUCTION: Advocacy and service-learning are increasingly being incorporated into medical education and residency training. The Jefferson Service Training in Advocacy for Residents and Students (JeffSTARS) curriculum is an educational program for Thomas Jefferson University and Nemours trainees. The JeffSTARS Advocacy and Community Partnership Elective is one of two core components of the larger curriculum. METHODS: The elective is a monthlong rotation that provides trainees in their senior year of medical school or residency training the opportunity to learn about health advocacy in depth. Trainees develop a basic understanding of social determinants of health, learn about health policy, participate in legislative office visits, and work directly with community agencies on a mutually agreeable project. The elective provides advocacy training to self-selected trainees from area medical schools and residency programs to develop a cadre of physicians empowered to advocate for child health. RESULTS: JeffSTARS has advanced the field of child health advocacy locally by forging new partnerships and building a network of experts, agencies, and academic institutions. After this experience, trainees recognize that their health expertise is valuable to health advocacy and policy development. JeffSTARS is recognized nationally as one of a growing number of advocacy training programs for students and residents, with trainees presenting selected projects at national meetings. DISCUSSION: Teaching advocacy has raised awareness about social determinants of health, community resources, and the medical home. One of the many benefits of the elective has been to strengthen the skills and expertise of trainees and faculty members alike.

11.
Hosp Pediatr ; 4(4): 239-46, 2014 Jul.
Article in English | MEDLINE | ID: mdl-24986994

ABSTRACT

OBJECTIVE: To explore medical students' experiences working with frequently rotating pediatric inpatient attending physicians. METHODS: We performed a qualitative study using focus groups and individual interviews of medical students who rotated on the general pediatric inpatient service at Children's Hospital Colorado. The majority of inpatient pediatric attending physicians worked 1-week blocks. We used a semistructured interview guide and analyzed data using the constant comparative method. In accordance with the grounded theory method, codes were developed using an iterative approach, and major themes were identified. Analysis indicated theoretical saturation was achieved. We created a theory that arose from analysis of the data. RESULTS: Twenty-seven medical students participated. Data analysis yielded 6 themes: learning climate, continuity, student resilience, opportunity to progress, growth into a physician, and evaluation. In the learning climate, the emotional environment was often stressful, although students valued exposure to different patient care and teaching styles. Senior resident continuity promoted student function; lack of continuity with attending physicians inhibited relationship development. Students were resilient in adjusting to changing faculty with different expectations. In the context of frequently rotating faculty, students had difficulty showing improvement to a single attending physician after feedback, which limited students' opportunities to progress. Students perceived summative evaluation as less meaningful in the absence of having a relationship with their attending physicians. CONCLUSIONS: Medical students valued exposure to different patient care and teaching styles. However, frequently changing attending physicians caused students stress and limited students' perceived ability to achieve and show professional growth.


Subject(s)
Attitude of Health Personnel , Pediatrics/education , Personnel Staffing and Scheduling , Students, Medical/psychology , Education, Medical, Undergraduate , Faculty, Medical , Female , Focus Groups , Humans , Male , Qualitative Research
12.
Front Psychol ; 4: 668, 2013.
Article in English | MEDLINE | ID: mdl-24348433

ABSTRACT

BACKGROUND: In medical education, evaluation of clinical performance is based almost universally on rating scales for defined aspects of performance and scores on examinations and checklists. Unfortunately, scores and grades do not capture progress and competence among learners in the complex tasks and roles required to practice medicine. While the literature suggests serious problems with the validity and reliability of ratings of clinical performance based on numerical scores, the critical issue is not that judgments about what is observed vary from rater to rater but that these judgments are lost when translated into numbers on a scale. As the Next Accreditation System of the Accreditation Council for Graduate Medical Education (ACGME) takes effect, medical educators have an opportunity to create new processes of evaluation to document and facilitate progress of medical learners in the required areas of competence. PROPOSAL AND INITIAL EXPERIENCE: Narrative descriptions of learner performance in the clinical environment, gathered using a framework for observation that builds a shared understanding of competence among the faculty, promise to provide meaningful qualitative data closely linked to the work of physicians. With descriptions grouped in categories and matched to milestones, core faculty can place each learner along the milestones' continua of progress. This provides the foundation for meaningful feedback to facilitate the progress of each learner as well as documentation of progress toward competence. IMPLICATIONS: This narrative evaluation system addresses educational needs as well as the goals of the Next Accreditation System for explicitly documented progress. Educators at other levels of education and in other professions experience similar needs for authentic assessment and, with meaningful frameworks that describe roles and tasks, may also find useful a system built on descriptions of learner performance in actual work settings. CONCLUSIONS: We must place medical learning and assessment in the contexts and domains in which learners do clinical work. The approach proposed here for gathering qualitative performance data in different contexts and domains is one step along the road to moving learners toward competence and mastery.

13.
Acad Med ; 88(10): 1558-63, 2013 Oct.
Article in English | MEDLINE | ID: mdl-23969364

ABSTRACT

PURPOSE: To provide validity evidence for use of the Learning Goal Scoring Rubric to assess the quality of written learning goals and residents' goal writing skills. METHOD: This two-part study used the rubric to assess University of Colorado third-year pediatric residents' written learning goals to obtain validity evidence. In study 1, five raters independently scored 48 goals written in 2010-2011 and 2011-2012 by 48 residents, who also responded to the Jefferson Scale of Physician Lifelong Learning (JeffSPLL). In study 2, two raters independently scored 48 goals written in 2011-2012 by 12 residents. Intraclass correlation coefficients (ICCs) assessed rater agreement to provide evidence for response process. Generalizability theory assessed internal structure. Independent-samples Mann-Whitney U tests and correlations assessed relationship to other variables. Content was matched to published literature and instructional methods. RESULTS: The ICC was 0.71 for the overall rubric. In study 1, where the generalizability study's (G study's) object of measurement was learning goals, the phi coefficient was 0.867. In study 2, where the G study's object of measurement was the resident (goal writing skill), the phi coefficient was 0.751. The total mean score of residents with goal writing training was significantly higher than that of those without (7.54 versus 4.98, P < .001). Correlation between goal quality and JeffSPLL score was not significant. Investigators agreed that the content matched the published literature and instructional methods. CONCLUSIONS: Preliminary validity evidence indicates that this scoring rubric can assess learning goal quality and goal writing skill.


Subject(s)
Education, Medical, Continuing , Education, Medical, Graduate , Goals , Internship and Residency , Learning , Pediatrics/education , Writing , Colorado , Humans
14.
J Grad Med Educ ; 5(4): 639-45, 2013 Dec.
Article in English | MEDLINE | ID: mdl-24455015

ABSTRACT

BACKGROUND: The traditional 1-month training blocks in pediatrics may fail to provide sufficient exposure to develop the knowledge, skills, and attitudes residents need for practice and may not be conducive to mentoring relationships with faculty and continuity with patients. INTERVENTION: We created a 4-month career-focused experience (CFE) for third-year residents. The CFE included block time and longitudinal experiences in different content areas related to residents' choice of urban and rural primary care, hospitalist medicine, or subspecialty care (prefellowship). Content was informed by graduate surveys, focus groups with primary care pediatricians and hospitalists, and interviews with fellowship directors. Outcomes were assessed via before and after surveys of residents' attitudes and skills, assessment of skills with an objective structured clinical examination (OSCE), and interviews with residents and mentors. RESULTS: Twenty-three of 49 third-year residents took part in the first 2 years of CFE. Two residents dropped out, leaving 21 who completed the 4-month experience (9 in primary care, 2 in hospitalist medicine, and 10 in a subspecialty). Residents reported improvement in their clinical skills, increased satisfaction with faculty mentoring and evaluation, and the ability to focus on what was important to their careers. OSCE performance did not differ between residents who completed the CFE and those who did not. Administrative burden was high. CONCLUSIONS: Four-month career-focused training for pediatrics residents is feasible and may be effective in meeting part of the new requirement for 6 months of career-focused training during pediatrics residency.

15.
Acad Pediatr ; 11(5): 394-402, 2011.
Article in English | MEDLINE | ID: mdl-21684232

ABSTRACT

OBJECTIVE: To assess the feasibility of a new multi-institutional program of direct observation and report what faculty observed and the feedback they provided. METHODS: A program of direct observation of real patient encounters was implemented in 3 pediatric residency programs using a structured clinical observation (SCO) form to document what was observed and the feedback given. Outcome variables included the number of observations made, the nature of the feedback provided, resident attitudes about direct observation before and after implementation, and the response of the faculty. RESULTS: Seventy-nine preceptors and 145 residents participated; 320 SCO forms were completed. Faculty provided feedback in 4 areas: content, process of the encounter, patient-centered attitudes and behaviors, and interpersonal skills. Feedback was 85% specific and 41% corrective. Corrective feedback was most frequent for physical examination skills. After program implementation, residents reported an increase in feedback and a decrease in discomfort with direct observation; in addition, they agreed that direct observation was a valuable component of their education. Participation rates among faculty were high. CONCLUSIONS: Direct observation using SCOs results in timely and specific feedback to residents about behaviors rarely observed in traditional precepting models. Resident competency in these clinical skill domains is critical for assessing, diagnosing, and managing patients. The SCO methodology is a feasible way to provide formative feedback to residents about their clinical skills.


Subject(s)
Ambulatory Care Facilities , Clinical Competence , Continuity of Patient Care , Internship and Residency , Knowledge of Results, Psychological , Pediatrics/education , Adolescent , Adult , Child , Child, Preschool , Cohort Studies , Feasibility Studies , Female , Humans , Infant , Infant, Newborn , Male , Program Evaluation
17.
Acad Med ; 84(1): 58-66, 2009 Jan.
Article in English | MEDLINE | ID: mdl-19116479

ABSTRACT

PURPOSE: Traditional promotion standards rely heavily on quantification of research grants and publications in the curriculum vitae. The promotion and retention of educators is challenged by the lack of accepted standards to evaluate the depth, breadth, quality, and impact of educational activities. The authors sought to develop a practical analysis tool for the evaluation of educator portfolios (EPs), based on measurable outcomes that allow reproducible analysis of the quality and impact of educational activities. METHOD: The authors, 10 veteran educators and an external expert evaluator, used a scholarly, iterative consensus-building process to develop the tool and test it using real EPs from educational scholars who followed an EP template. They revised the template in parallel with the analysis tool to ensure that EP data enabled valid and reliable evaluation. The authors created the EP template and analysis tool for scholar and program evaluation in the Educational Scholars Program, a three-year national certification program of the Academic Pediatric Association. RESULTS: The analysis tool combines 18 quantitative and 25 qualitative items, with specifications, for objective evaluation of educational activities and scholarship. CONCLUSIONS: The authors offer this comprehensive, yet practical tool as a method to enhance opportunities for faculty promotions and advancement, based on well-defined and documented educational outcome measures. It is relevant for clinical educators across disciplines and across institutions. Future studies will test the interrater reliability of the tool, using data from EPs written using the revised template.


Subject(s)
Education, Medical/standards , Educational Measurement/methods , Faculty, Medical/organization & administration , Program Evaluation , Staff Development/methods , Humans , United States
18.
Teach Learn Med ; 18(1): 18-21, 2006.
Article in English | MEDLINE | ID: mdl-16354135

ABSTRACT

BACKGROUND AND PURPOSE: We evaluated the physical-examination section of a multimedia program developed to teach infant history and physical-examination skills. METHODS: A total of 71 students participated: one group viewed only the physical-examination section (PX), one only the history section (HX), and one none of the program (CX). We assessed physical-examination skills by direct observation of medical students performing an abdominal examination, scored with a checklist, at baseline, immediately after the intervention, and at the end of the pediatric clerkship. We analyzed results using analysis of variance with repeated measures. RESULTS: Baseline scores were PX = 2.5 and HX = 2.8. Immediately postintervention, the PX group scored significantly higher (6.8) than the HX group (3.1). At the end of the clerkship, significant differences between the groups remained; final group mean scores were PX = 5.5, HX = 4.4, and CX = 2.7. CONCLUSION: The program improved examination skills, with attenuation over 6 weeks.


Subject(s)
Clinical Clerkship/methods , Curriculum , Faculty, Medical , Multimedia , Pediatrics/education , Students, Medical , Teaching/methods , Child , Child, Preschool , Humans , Infant , Infant, Newborn , Medical History Taking , Physical Examination , Program Development , Program Evaluation
19.
Teach Learn Med ; 17(3): 228-32, 2005.
Article in English | MEDLINE | ID: mdl-16042517

ABSTRACT

BACKGROUND: Patient education and information giving is a core skill that improves patient adherence and medical outcomes. PURPOSE: To evaluate the impact of a teaching intervention on 3rd-year students' competency in patient education and information giving about asthma medication delivery. METHODS: Students (n=81) completed a 1-hr teaching intervention of didactics followed by role-playing of asthma patient education scenarios. Postintervention, patient education and information-giving skills regarding spacer/metered-dose inhalers were assessed with a standardized patient, scored overall and on a 12-item checklist, and compared with a control group (n=70). Students' knowledge was evaluated using a short-answer test. RESULTS: Intervention students performed significantly better than controls on overall patient education, on 10 of the 12 checklist items, and on the test, but their performance did not approach competency. CONCLUSIONS: The 1-hr intervention improved clinical performance and knowledge, but students did not become competent. Future studies should investigate how competence in this and other core patient education skills can be successfully achieved.


Subject(s)
Clinical Competence , Educational Measurement/methods , Patient Education as Topic , Students, Medical , Teaching/methods , Analysis of Variance , Asthma/therapy , Education, Medical, Undergraduate , Female , Humans , Male , Patient Compliance , Self Administration
20.
Ambul Pediatr ; 4(3): 244-8, 2004.
Article in English | MEDLINE | ID: mdl-15153057

ABSTRACT

PURPOSE: To evaluate the effect of a videotaping program on third-year medical students' interviewing and self-assessment skills. METHODS: A self-assessment manual, listing and explaining 21 core elements of the medical interview, was developed. After reading the manual, students videotaped an interview and self-assessed their performance. Each student reviewed the videotape with a faculty member, who also rated the performance. This process was repeated 1 week later. Changes in group performance, core-element performance, and ability to self-assess after the intervention were evaluated using Cohen d values to measure effect size, the McNemar chi-square test for repeated measures, and concordance between faculty and student ratings. RESULTS: Sixty students participated in the videotaping study. Students self-assessed inaccurately on the first video 14% of the time. The 6 most poorly performed core elements were the least accurately self-assessed. Lack of concordance between the faculty member's and the student's global ratings identified all students with inflated self-assessments. One review session had a large effect on overall performance and interpersonal skills and a moderate effect on history-taking skills. A large effect on performance was seen for 3 core elements, a moderate effect for 12 elements, and a small effect for the remaining 6 elements. Performance of the core elements that needed improvement improved in 74% of the students (P < .001). Students' overall ability to self-assess improved significantly (P < .01). CONCLUSIONS: Our videotaping program improved students' interviewing and self-assessment skills and identified students with inflated views of their abilities. Medical educators should re-evaluate and readopt this excellent teaching and learning tool.


Subject(s)
Education, Medical/methods , Self-Assessment , Videotape Recording/methods , Adult , Education, Medical/standards , Humans , Students, Medical