Results 1 - 12 of 12
1.
Sci Rep ; 14(1): 10843, 2024 05 12.
Article in English | MEDLINE | ID: mdl-38735990

ABSTRACT

The Johns Hopkins Learning Environment Scale (JHLES) was developed by Robert B. Shochet, Jorie M. Colbert and Scott M. Wright of the Johns Hopkins University School of Medicine and consists of 28 items used to evaluate perceptions of the academic environment. The objective was to translate and adapt the JHLES to Polish cultural conditions and to validate the Polish version of the tool. The JHLES questionnaire was completed by students of all years (first to fifth) of the faculties of dental medicine at the Medical University of Lublin and the Medical University of Gdansk. The total surveyed population consisted of 597 students. The overall reliability of the tool was excellent. Confirmatory factor analysis was performed to confirm structural consistency with the original JHLES tool. All fit indices had acceptable values (close to 1 or 0, as appropriate for each index), and the results were consistent, indicating that the JHLES model is supported by the data. In the present study, the JHLES has been validated in a sample of dental students for the first time in Poland and Europe. Our study provided good evidence for the reliability and validity of the Polish version of the JHLES. In conclusion, the Polish-language version of the JHLES questionnaire is a reliable and valid instrument for analysing the learning environment for students, and its factor structure is supported by the data.
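The "excellent" overall reliability reported here would typically be quantified with Cronbach's alpha. As a rough, hypothetical sketch (simulated data, not the study's responses or the authors' code), the computation for a 28-item scale looks like this:

import numpy as np

# Minimal sketch: Cronbach's alpha for a multi-item Likert scale.
# The 597 x 28 response matrix below is simulated, not the study's data.
def cronbach_alpha(responses: np.ndarray) -> float:
    """responses: 2-D array with shape (n_respondents, n_items)."""
    n_items = responses.shape[1]
    item_variances = responses.var(axis=0, ddof=1)      # variance of each item
    total_variance = responses.sum(axis=1).var(ddof=1)  # variance of the summed score
    return (n_items / (n_items - 1)) * (1 - item_variances.sum() / total_variance)

rng = np.random.default_rng(0)
simulated = rng.integers(1, 6, size=(597, 28))  # 1-5 Likert answers, 597 students, 28 items
print(f"alpha = {cronbach_alpha(simulated):.3f}")
# Uncorrelated random answers give an alpha near zero; correlated real items push it toward 1.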


Subject(s)
Learning , Humans , Poland , Surveys and Questionnaires , Female , Male , Factor Analysis, Statistical , Reproducibility of Results , Students, Dental/psychology , Young Adult , Adult , Psychometrics/methods
2.
Med Sci Educ ; 31(1): 109-116, 2021 Feb.
Article in English | MEDLINE | ID: mdl-34457870

ABSTRACT

BACKGROUND: Reflective writing is used throughout medical education to help students navigate their transformation into medical professionals. Assessment of reflective writing, however, is challenging; each available assessment methodology has distinct advantages and disadvantages. We tested whether two independent assessment mechanisms, a faculty-designed rubric and Academic Writing Analytics (AWA), an automated technique, could be combined to form a more robust form of evaluation. METHODS: We obtained reflective essays written by first-year medical students as part of a clinical skills course. Faculty scored essays using a rubric designed to evaluate Integration, Depth, and Writing. The same essays were subjected to AWA analysis, which counted the number of reflective phrases indicative of Context, Challenge, or Change. RESULTS: Faculty scored the essays uniformly high, indicating that most students met the standard for reflection as described by the rubric. AWA identified over 1400 instances of reflective behavior within the essays, and there was significant variability in how often individual students used different types of reflective phrases. CONCLUSIONS: While data from either faculty assessment or AWA alone are sufficient to evaluate reflective essays, combining these methods offers a richer and more valuable understanding of the student's reflection.
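AWA is a purpose-built analytics tool, and its internal rules are not described in this abstract. As a loose illustration of the counting step only, the sketch below tallies matches against invented placeholder phrase lists for the Context, Challenge, and Change categories; it is not the AWA rule set.

import re
from collections import Counter

# Minimal sketch: count reflective phrases per category in each essay.
# The phrase lists are invented placeholders, not the actual AWA rules.
CATEGORY_PATTERNS = {
    "context":   [r"\bat the time\b", r"\bduring my\b"],
    "challenge": [r"\bi struggled\b", r"\bit was difficult\b"],
    "change":    [r"\bi now realize\b", r"\bin the future i will\b"],
}

def count_reflective_phrases(essay: str) -> Counter:
    """Return the number of pattern matches per reflective category in one essay."""
    counts = Counter()
    for category, patterns in CATEGORY_PATTERNS.items():
        for pattern in patterns:
            counts[category] += len(re.findall(pattern, essay, flags=re.IGNORECASE))
    return counts

essays = ["At the time I struggled with feedback, but I now realize how to change my approach."]
totals = sum((count_reflective_phrases(e) for e in essays), Counter())
print(totals)  # e.g. Counter({'context': 1, 'challenge': 1, 'change': 1})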

3.
J Med Educ Curric Dev ; 7: 2382120520902186, 2020.
Article in English | MEDLINE | ID: mdl-32047857

ABSTRACT

BACKGROUND: Although the learning environment (LE) is an important component of medical training, there are few instruments to investigate the LE in Latin American and Brazilian medical schools. Therefore, this study aims to translate, transculturally adapt, and validate the Medical School Learning Environment Scale (MSLES) and the Johns Hopkins Learning Environment Scale (JHLES) into Brazilian Portuguese. METHOD: This study was carried out between June 2016 and October 2017. Both scales were translated and cross-culturally adapted to Brazilian Portuguese and then back-translated and approved by the original authors. A principal components analysis (PCA) was performed for both the MSLES and the JHLES. Test-retest reliability was assessed by comparing the first administration of the MSLES and the JHLES with a second administration 45 days later. Validity was assessed by comparing the MSLES and the JHLES with 2 overall LE perception questions; a sociodemographic questionnaire; and the Depression, Anxiety, and Stress Scale (DASS-21). RESULTS: A total of 248 out of 334 (74.2%) first- to third-year medical students from a Brazilian public university were included. Principal components analysis generated 4 factors for the MSLES and 7 factors for the JHLES. Both showed good reliability for the total scale (MSLES α = .809; JHLES α = .901), as well as for each subdomain. Concurrent and convergent validity were supported by the strong correlation found between the two scale totals (r = 0.749), as well as by correlations with both general LE questions: recommending the school to a friend (MSLES: r = 0.321; JHLES: r = 0.457) and overall LE rating (MSLES: r = 0.505; JHLES: r = 0.579). The 45-day test-retest comparison resulted in a Pearson correlation coefficient of 0.697 for the JHLES and 0.757 for the MSLES. CONCLUSIONS: Reliability and validity have been demonstrated for both the MSLES and the JHLES. Thus, both represent feasible options for measuring the LE in Brazilian medical students.
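The test-retest figures quoted above (r = 0.697 for the JHLES, r = 0.757 for the MSLES) are Pearson correlations between total scores from the two administrations. A minimal sketch of that computation, using simulated scores rather than the study data:

import numpy as np
from scipy.stats import pearsonr

# Minimal sketch: test-retest reliability as a Pearson correlation between
# total scores from two administrations 45 days apart (simulated data).
rng = np.random.default_rng(1)
first_total = rng.normal(loc=100, scale=15, size=248)        # hypothetical first administration
second_total = first_total + rng.normal(scale=10, size=248)  # correlated retest scores

r, p_value = pearsonr(first_total, second_total)
print(f"test-retest r = {r:.3f} (p = {p_value:.3g})")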

6.
Acad Med ; 90(6): 810-8, 2015 Jun.
Article in English | MEDLINE | ID: mdl-25853689

ABSTRACT

PURPOSE: To construct a new measure to assess students' perceptions of the medical school learning environment (LE). METHOD: In 2012, students at Johns Hopkins University School of Medicine completed a survey containing 32 LE items. Additional questions asked about overall perception of the LE, personal growth, and recommending the school to a friend. Validity evidence for content, response process, internal structure, and relation to other variables was collected for interpretation of scores. RESULTS: Of 465 students surveyed, 377 (81%) completed all LE items. Exploratory factor analysis yielded the 28-item Johns Hopkins Learning Environment Scale (JHLES) with seven factors/subscales: community of peers, faculty relationships, academic climate, meaningful engagement, mentoring, inclusion and safety, and physical space. Students' overall JHLES scores ranged from 51 to 139, of a possible 28 to 140, with a mean (SD) of 107 (15). Overall scores and most subscale scores did not differ significantly by gender or racial/ethnic background, but did differ significantly by overall perception of the LE (P ≤ .001) and increased incrementally as overall perception improved. Overall JHLES scores were significantly higher for students with higher personal growth scores and students who would recommend the school (both P < .001). Subscale scores for all seven factors increased with improved overall perception of the LE (all P ≤ .005). CONCLUSIONS: The JHLES is a new measure to assess students' perceptions of the medical school LE, with supporting validity evidence and content describing the social, relational, and academic processes of medical school that support students' professional formation.
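The seven-factor structure described above came from an exploratory factor analysis of the 32 candidate items. The abstract does not specify the estimation method, so the sketch below stands in with scikit-learn's factor analysis and a varimax rotation on simulated Likert responses; it illustrates the kind of step involved rather than the authors' procedure.

import numpy as np
from sklearn.decomposition import FactorAnalysis

# Minimal sketch: exploratory factor analysis retaining seven factors.
# The 377 x 32 response matrix is simulated; real survey data would be used in practice.
rng = np.random.default_rng(2)
responses = rng.integers(1, 6, size=(377, 32)).astype(float)

efa = FactorAnalysis(n_components=7, rotation="varimax")
efa.fit(responses)

loadings = efa.components_.T  # shape (n_items, n_factors)
for item, row in enumerate(loadings, start=1):
    strongest = int(np.argmax(np.abs(row))) + 1
    print(f"item {item:2d} loads most strongly on factor {strongest}")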


Subject(s)
Education, Medical, Undergraduate , Perception , Schools, Medical , Social Environment , Students, Medical , Attitude , Faculty, Medical , Female , Humans , Interpersonal Relations , Male , Mentors , Peer Group , Professional Competence , Surveys and Questionnaires
7.
Acad Med ; 89(12): 1687-93, 2014 Dec.
Article in English | MEDLINE | ID: mdl-25054415

ABSTRACT

PURPOSE: Although most agree that supportive learning environments (LEs) are essential for effective medical education, an accurate assessment of LE quality has been challenging for educators and administrators. Two previous reviews assessed LE tools used in the health professions; however, both have shortcomings. The primary goal of this systematic review was to explore the validity evidence for the interpretation of scores from LE tools. METHOD: The authors searched ERIC, PsycINFO, and PubMed for peer-reviewed studies that provided quantitative data on medical students' and/or residents' perceptions of the LE published through 2012 in the United States and internationally. They also searched SCOPUS and the reference lists of included studies for subsequent publications that assessed the LE tools. From each study, the authors extracted descriptive, sample, and validity evidence (content, response process, internal structure, relationship to other variables) information. They calculated a total validity evidence score for each tool. RESULTS: The authors identified 15 tools that assessed the LE in medical school and 13 that did so in residency. The majority of studies (17; 61%) provided some form of content validity evidence. Studies were less likely to provide evidence of internal structure, response process, and relationship to other variables. CONCLUSIONS: Given the limited validity evidence for scores from existing LE tools, new tools may be needed to assess medical students' and residents' perceptions of the LE. Any new tools would need robust validity evidence testing and sampling across multiple institutions with trainees at multiple levels to establish their utility.


Subject(s)
Attitude , Education, Medical , Internship and Residency , Schools, Medical , Students, Medical , Environment , Humans , Learning , Perception , Psychometrics , Reproducibility of Results , Surveys and Questionnaires
8.
Acad Med ; 88(2): 246-52, 2013 Feb.
Article in English | MEDLINE | ID: mdl-23269291

ABSTRACT

PURPOSE: The medical school learning environment (LE), encompassing the physical, social, and psychological context for learning, exerts significant influence on students' professional development. Among the myriad experiences that make up the LE, the authors sought to gauge which ones students judge as most influencing their perceptions of it. METHOD: Fourth-year medical students at Johns Hopkins University participated in this cohort survey study before their 2010 graduation. A list of 55 events was iteratively revised and pilot-tested before being administered online. Responses assessed whether students had experienced each event and, if so, the degree of its impact on their perceptions of the LE. A calculated mean impact score (MIS) provided a means to compare the relative impact of events. RESULTS: Of 119 students, 84 (71%) completed the survey. Students rated the overall LE as exceptional (29/84; 35%), good (36/84; 43%), fair (17/84; 20%), or poor (2/84; 2%). Eighty percent of students experienced at least 41 of the 55 events. MIS values ranged from 2.00 to 3.76 (highest possible: 4.00). Students rated positive events as having the highest impact. Few significant differences were found across gender, age, or surgical/nonsurgical specialty choice. For 22 (40%) of the 55 events, MIS distributions differed between students who perceived the LE as exceptional and those who perceived it as fair to poor. CONCLUSIONS: This study attempted to identify the discrete events that medical students perceive as most affecting their sense of the LE. Knowing the phenomena that most strongly influence student perceptions can inform how settings, relationships, and interactions can be shaped for meaningful learning and professional formation.
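The mean impact score (MIS) is described only briefly above. Assuming it is simply the average impact rating (on the 1 to 4 scale) among students who reported experiencing a given event, a minimal sketch with hypothetical event names and data:

import pandas as pd

# Minimal sketch: mean impact score (MIS) per event, i.e. the average impact rating
# (1 = minimal ... 4 = major) among students who reported experiencing that event.
# Event names and ratings are hypothetical, not the study's survey export.
ratings = pd.DataFrame({
    "event":  ["white_coat_ceremony", "white_coat_ceremony", "anatomy_small_group"],
    "impact": [4, 3, 2],
})

mis = ratings.groupby("event")["impact"].mean()
print(mis.sort_values(ascending=False))  # e.g. white_coat_ceremony 3.5, anatomy_small_group 2.0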


Subject(s)
Environment , Learning , Perception , Schools, Medical , Social Environment , Students, Medical/psychology , Adult , Cohort Studies , Female , Humans , Male , Maryland , Surveys and Questionnaires
10.
Acad Med ; 85(4): 578-9, 2010 Apr.
Article in English | MEDLINE | ID: mdl-20354368

ABSTRACT

The case presentation is a time-honored tradition in clinical medicine, and medical journals and national conferences have provided a forum for this type of scholarship for more than a century. Case presentations can also be used by educators as a means to understand challenging learner experiences and, in doing so, to advance the practice of medical education. Medical school faculty are asked to serve in student advisor roles, yet best practices for student advising are not known. Unlike clinicians, who often discuss difficult patient cases, medical educators do not typically have opportunities to discuss challenging student cases in order to learn how best to support trainees. In this commentary, the authors, from the Johns Hopkins University School of Medicine Colleges Advisory Program (CAP), a longitudinal advising program that aims to promote the personal and professional development of students, describe the novel quarterly Advising Case Conference, in which medical student cases are confidentially presented and discussed by faculty advisors, together with reviews of the relevant literature, to enhance faculty advising skills. As medical students' advising needs often vary, CAP advisors employ adult learning principles and emphasize shared responsibility between advisor and advisee as keys to successful advising. Unlike traditional clinical case conferences, the Advising Case Conference format encourages advisors to share perspectives about the cases by working in small groups to exchange ideas and role-play solutions. This model may be applicable to other schools or training programs wishing to enhance faculty advising skills.


Subject(s)
Education, Medical/methods , Faculty, Medical/standards , Students, Medical , Teaching/methods , Adult , Clinical Competence , Humans , Mentors , United States
11.
Acad Med ; 85(4): 654-9, 2010 Apr.
Article in English | MEDLINE | ID: mdl-20354382

ABSTRACT

Advising medical students is a challenging task. Faculty who serve as advisors for students require specific skills and knowledge to do their jobs effectively. Career choice is one of the many complex issues about which medical students often seek assistance from a faculty advisor. The authors present a case of a third-year medical student with career indecision, with a focus on the various factors that may be influencing her thinking about career choice. Key advising principles are provided as a framework for the discussion of the case and include reflection, self-disclosure, active listening, support and advocacy, confidentiality, and problem solving. These principles were developed as part of the Advising Case Conference series of the Johns Hopkins University School of Medicine Colleges Advisory Program. Emergent themes from the case included a student's evolving professional identity, a student's distress and burnout, lifestyle considerations, and advisor bias and self-awareness. The authors propose reflective questions to enhance meaningful discussions between the advisor and student and assist in problem solving. Many of these questions, together with the key advising principles, are generalizable to a variety of advising scenarios between advisors and learners at all levels of training.


Subject(s)
Career Choice , Clinical Clerkship/methods , Decision Making , Mentors , Professional Role/psychology , Students, Medical/psychology , Teaching/standards , Educational Technology/methods , Female , Humans
12.
Med Teach ; 29(4): 353-7, 2007 May.
Article in English | MEDLINE | ID: mdl-17786750

ABSTRACT

BACKGROUND: In July 2005, a learning community, The Colleges, was created at the Johns Hopkins University School of Medicine (JHUSOM) to foster camaraderie, networking, advising, mentoring, professionalism, clinical skills, and scholarship. The cultural and structural changes that emerged with the creation of this program have resulted in JHUSOM bearing a resemblance to J. K. Rowling's fictional Hogwarts School of Witchcraft and Wizardry. AIMS: This manuscript describes the similarities between these two revered schools and highlights the innovations and improvements made to JHUSOM's learning environment. DESCRIPTION: The intense, stressful, and lengthy professional training required to achieve competency in the practice of medicine and in the (albeit fictional) practice of witchcraft shows meaningful parallels. CONCLUSION: The supportive learning environment at these two schools should afford the next generation of graduates an even more enriching experience than that of those who came before them.


Subject(s)
Community Participation , Learning , Literature , Metaphor , Schools, Medical , Witchcraft , Baltimore , Cultural Diversity , Educational Measurement , Faculty , Health Facility Environment , Humans , Interpersonal Relations , Role , Social Environment , Students, Medical