Results 1 - 3 of 3
1.
Eval Program Plann; 71: 68-82, 2018 Dec.
Article in English | MEDLINE | ID: mdl-30165260

ABSTRACT

At its core, evaluation involves the generation of value judgments. These evaluative judgments are based on comparing an evaluand's performance to what the evaluand is supposed to do (criteria) and how well it is supposed to do it (standards). The aim of this four-phase study was to test whether criteria and standards can be set via crowdsourcing, a potentially cost- and time-effective approach to collecting public opinion data. In the first three phases, participants were presented with a program description and then asked to complete a task to identify criteria (phase one), weight criteria (phase two), or set standards (phase three). Phase four found that the crowd-generated criteria were of high quality; more specifically, they were clear and concise, complete, non-overlapping, and realistic. Overall, the study concludes that crowdsourcing has the potential to be used in evaluation to set stable, high-quality criteria and standards.
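The abstract does not describe how crowd responses were aggregated, so the following is only a minimal Python sketch of one plausible aggregation step: normalizing each participant's criterion weights, averaging them within a sample, and comparing two independent samples to gauge the stability the study reports. All function names, criteria, and data are hypothetical.

# Hypothetical sketch: aggregate crowdsourced criterion weights and
# compare two independent crowd samples for stability. The study's
# actual procedure is not given in the abstract.

def normalize(weights):
    # Scale one participant's raw weights so they sum to 1.
    total = sum(weights.values())
    return {k: v / total for k, v in weights.items()}

def aggregate(sample):
    # Average the normalized weights over all participants in a sample.
    normed = [normalize(p) for p in sample]
    return {c: sum(p[c] for p in normed) / len(normed) for c in sample[0]}

# Two hypothetical crowd samples weighting three criteria (0-10 scale).
sample_a = [{"reach": 8, "cost": 5, "satisfaction": 9},
            {"reach": 7, "cost": 6, "satisfaction": 8}]
sample_b = [{"reach": 8, "cost": 4, "satisfaction": 9},
            {"reach": 6, "cost": 6, "satisfaction": 9}]

agg_a, agg_b = aggregate(sample_a), aggregate(sample_b)
# Stability check: largest absolute difference in aggregated weights.
drift = max(abs(agg_a[c] - agg_b[c]) for c in agg_a)
print(agg_a, agg_b, f"max drift = {drift:.3f}")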


Subjects
Crowdsourcing/methods, Crowdsourcing/standards, Program Evaluation/methods, Program Evaluation/standards, Public Opinion, Data Accuracy, Humans, Research Design
2.
Eval Program Plann; 66: 183-194, 2018 Feb.
Article in English | MEDLINE | ID: mdl-28919291

ABSTRACT

This exploratory study examines a novel tool for validating program theory through crowdsourced qualitative analysis. It combines a quantitative pattern matching framework traditionally used in theory-driven evaluation with crowdsourcing to analyze qualitative interview data. A sample of crowdsourced participants is asked to read an interview transcript, identify whether program theory components (Activities and Outcomes) are discussed, and highlight the most relevant passage about each component. The findings indicate that using crowdsourcing to analyze qualitative data can differentiate between program theory components that are supported by a participant's experience and those that are not. This approach expands the range of tools available for validating program theory with qualitative data, thus strengthening the theory-driven approach.
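The abstract likewise leaves the pattern matching computation unspecified; a minimal sketch of one simple reading, counting the share of crowd workers who judged each program theory component to be present in a transcript and comparing it to an agreement threshold, is given below. The component names, data, and 0.7 threshold are assumptions, not figures from the study.

# Hypothetical sketch: crowdsourced pattern matching on one transcript.
# Each worker reports, per program theory component, whether the
# interviewee's experience supports it (True/False).

from collections import defaultdict

judgments = [  # one dict per crowd worker
    {"tutoring_activity": True, "attendance_outcome": True},
    {"tutoring_activity": True, "attendance_outcome": False},
    {"tutoring_activity": True, "attendance_outcome": False},
]

THRESHOLD = 0.7  # assumed cutoff for calling a component "supported"

counts = defaultdict(int)
for worker in judgments:
    for component, present in worker.items():
        counts[component] += int(present)

for component, hits in counts.items():
    share = hits / len(judgments)
    verdict = "supported" if share >= THRESHOLD else "not supported"
    print(f"{component}: {share:.2f} agreement -> {verdict}")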


Subjects
Crowdsourcing/methods, Crowdsourcing/standards, Data Accuracy, Internet, Research Design/standards, Humans, Minority Groups/psychology, Qualitative Research, Reproducibility of Results, Student Dropouts/psychology
3.
Eval Program Plann; 54: 63-73, 2016 Feb.
Article in English | MEDLINE | ID: mdl-26519690

ABSTRACT

This exploratory study attempts to demonstrate the potential utility of crowdsourcing as a supplemental technique for quantifying transcribed interviews. Crowdsourcing is the harnessing of the abilities of many people to complete a specific task or set of tasks. In this study, multiple samples of crowdsourced individuals were asked to rate and select supporting quotes from two different transcripts. The findings indicate that the different crowdsourced samples produced nearly identical ratings of the transcripts and consistently selected the same supporting text from them. These findings suggest that crowdsourcing, with further development, could be used as a mixed-method tool to offer a supplemental perspective on transcribed interviews.
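How "nearly identical" was operationalized is not stated in the abstract; one simple consistency check, comparing per-passage mean ratings between two independent crowd samples, might look like the sketch below. The rating scale, data, and function names are assumptions.

# Hypothetical sketch: compare transcript ratings from two independent
# crowd samples by per-passage mean and the largest gap between means.

from statistics import mean

# Ratings of the same five transcript passages; one list per rater.
sample_a = [[6, 5, 7, 4, 6], [5, 5, 6, 4, 7]]
sample_b = [[6, 4, 7, 5, 6], [6, 5, 7, 4, 6]]

def passage_means(sample):
    # Mean rating per passage, averaged over the raters in the sample.
    return [mean(r[i] for r in sample) for i in range(len(sample[0]))]

means_a, means_b = passage_means(sample_a), passage_means(sample_b)
gaps = [abs(a - b) for a, b in zip(means_a, means_b)]
print("sample A means:", means_a)
print("sample B means:", means_b)
print("largest per-passage gap:", max(gaps))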


Subjects
Crowdsourcing/standards, Data Accuracy, Interviews as Topic/standards, Qualitative Research, Research Design, Adult, Female, Humans, Internet, Male, Program Evaluation, Socioeconomic Factors