1.
Dev Neurorehabil ; 21(4): 266-278, 2018 May.
Article in English | MEDLINE | ID: mdl-26809945

ABSTRACT

OBJECTIVE: This paper demonstrates how to conduct a meta-analysis that includes both between-group and single-case design (SCD) studies. The example studies whether choice-making interventions decrease challenging behaviors performed by people with disabilities.

METHODS: We used a between-case d-statistic to conduct a meta-analysis of 15 between-group and SCD studies of 70 people with a disability, who received a choice intervention or control. We used robust variance estimation to adjust for dependencies caused by multiple effect sizes per study, and conducted moderator, sensitivity, influence, and publication bias analyses.

RESULTS: The random-effects average was d = 1.02 (standard error of 0.168), so the 95% confidence interval (CI) suggests choice-making reduces challenging behaviors by 0.65 to 1.38 standard deviations. Studies that provided choice training produced a significantly larger intervention effect.

CONCLUSION: Choice-making reduces challenging behaviors performed by people with disabilities.
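The reported interval follows directly from the point estimate and its standard error. A minimal sketch of that arithmetic, assuming a t-based critical value with roughly 11 degrees of freedom (the abstract does not report the degrees of freedom; robust variance estimation typically yields small, Satterthwaite-type df, which is why the interval is wider than a normal-approximation CI would be):

```python
# Reconstructing the reported 95% CI from the abstract's summary statistics.
d, se = 1.02, 0.168   # random-effects average and its standard error (from the abstract)
t_crit = 2.201        # 97.5th t quantile for ~11 degrees of freedom (df is an assumption)
lo, hi = d - t_crit * se, d + t_crit * se
print(f"95% CI: {lo:.2f} to {hi:.2f}")  # close to the reported 0.65 to 1.38
```

With a standard normal critical value (1.96) the interval would be about 0.69 to 1.35, noticeably narrower than the reported one, which is consistent with a small-df t adjustment.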


Subject(s)
Meta-Analysis as Topic, Neurological Rehabilitation/methods, Disabled Persons/psychology, Humans, Neurological Rehabilitation/standards, Research Design/standards
2.
J Appl Behav Anal ; 49(3): 656-73, 2016 Sep.
Article in English | MEDLINE | ID: mdl-27174301

ABSTRACT

The published literature often underrepresents studies that do not find evidence for a treatment effect; this is often called publication bias. Literature reviews that fail to include such studies may overestimate the size of an effect. Only a few studies have examined publication bias in single-case design (SCD) research, but those studies suggest that publication bias may occur. This study surveyed SCD researchers about publication preferences in response to simulated SCD results that show a range of small to large effects. Results suggest that SCD researchers are more likely to submit manuscripts that show large effects for publication and are more likely to recommend acceptance of manuscripts that show large effects when they act as a reviewer. A nontrivial minority of SCD researchers (4% to 15%) would drop 1 or 2 cases from the study if the effect size is small and then submit for publication. This article ends with a discussion of implications for publication practices in SCD research.
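The overestimation mechanism described above — large effects are submitted and accepted more often, so the published literature's average effect drifts upward — can be illustrated with a small simulation. The true effect, sampling noise, and selection threshold below are illustrative assumptions, not values from the study:

```python
import random

random.seed(1)
true_d = 0.3   # assumed true effect size
# Observed effect sizes scatter around the true effect due to sampling error.
studies = [random.gauss(true_d, 0.4) for _ in range(10_000)]

# Assumed selection rule: only studies showing a "large" effect reach publication.
published = [d for d in studies if d > 0.5]

mean_all = sum(studies) / len(studies)
mean_pub = sum(published) / len(published)
print(f"mean of all studies:      {mean_all:.2f}")   # near the true 0.3
print(f"mean of published subset: {mean_pub:.2f}")   # noticeably inflated
```

The published subset overstates the true effect purely through selection, even though every individual study is unbiased, which is the core concern the survey responses raise.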


Subject(s)
Publication Bias, Research Design, Research Personnel/psychology, Humans, Research Personnel/statistics & numerical data, Surveys and Questionnaires