1.
PLoS One ; 7(10): e47564, 2012.
Article in English | MEDLINE | ID: mdl-23077638

ABSTRACT

Blueberry growers in Maine attend annual Cooperative Extension presentations given by university faculty members. These presentations cover topics such as how to prevent plant disease and how to monitor for insect pests. In 2012, to make the sessions more interactive and promote learning, clicker questions and peer discussion were incorporated into the presentations. Similar to what has been shown at the undergraduate level, more blueberry growers gave correct answers to multiple-choice questions after peer discussion than when answering independently. Furthermore, because blueberry growers vary widely in level of education, experience in the field, and other demographic characteristics, we were able to determine whether demographic factors were associated with changes in performance after peer discussion. Taken together, our results suggest that clicker questions and peer discussion work equally well with adults from a variety of demographic backgrounds, without disadvantaging any subset of the population, and provide an important learning opportunity for the least formally educated members. Our results also indicate that clicker questions with peer discussion were viewed as a positive addition to university-related informal science education sessions.


Subject(s)
Learning , Science/education , Humans , Peer Group
2.
CBE Life Sci Educ ; 10(2): 149-55, 2011.
Article in English | MEDLINE | ID: mdl-21633063

ABSTRACT

Concept inventories, consisting of multiple-choice questions built around common student misconceptions, are intended to reveal student thinking. However, students often have complex, heterogeneous ideas about scientific concepts. Constructed-response assessments, in which students must create their own answers, may better reveal students' thinking, but they are time- and resource-intensive to evaluate. This report describes the initial meeting of a National Science Foundation-funded cross-institutional collaboration of interdisciplinary science, technology, engineering, and mathematics (STEM) education researchers interested in exploring the use of automated text analysis to evaluate constructed-response assessments. Participants at the meeting shared existing work on lexical analysis and concept inventories, participated in technology demonstrations and workshops, and discussed research goals. We are seeking interested collaborators to join our research community.


Subject(s)
Engineering/education , Mathematics/education , Research , Science/education , Students/psychology , Technology/education , Artificial Intelligence , Educational Measurement , Software , Thinking