Assessing inter- and intra-rater reliability of movement scores and the effects of body-shape using a custom visualisation tool: an exploratory study.
Ross, Gwyneth B; Zhao, Xiong; Troje, Nikolaus F; Fischer, Steven L; Graham, Ryan B.
Affiliation
  • Ross GB; School of Human Kinetics, Faculty of Health Sciences, University of Ottawa, 200 Lees Avenue, Ottawa, ON, K1N 6N5, Canada.
  • Zhao X; School of Human Kinetics, Faculty of Health Sciences, University of Ottawa, 200 Lees Avenue, Ottawa, ON, K1N 6N5, Canada.
  • Troje NF; Centre of Vision Research & Department of Biology, York University, Toronto, ON, M3J 1P3, Canada.
  • Fischer SL; Department of Kinesiology, University of Waterloo, Waterloo, ON, N2L 3G1, Canada.
  • Graham RB; School of Human Kinetics, Faculty of Health Sciences, University of Ottawa, 200 Lees Avenue, Ottawa, ON, K1N 6N5, Canada. rgraham@uottawa.ca.
BMC Sports Sci Med Rehabil ; 16(1): 205, 2024 Sep 30.
Article in English | MEDLINE | ID: mdl-39350265
ABSTRACT

BACKGROUND:

The literature shows conflicting results regarding inter- and intra-rater reliability, even for the same movement screen. The purpose of this study was to assess the inter- and intra-rater reliability of expert assessors' movement scores, within and between sessions, and the effects of body-shape on reliability during a movement screen, using custom online visualisation software.

METHODS:

Kinematic data from 542 athletes performing seven movement tasks were used to create animations (i.e., avatar representations) using motion and shape capture from sparse markers (MoSh). For each task, assessors viewed a total of 90 animations. Using a custom-developed visualisation tool, expert assessors completed two identical sessions in which they rated each animation on a scale of 1-10. The arithmetic mean of weighted Cohen's kappa for each task and day was calculated to test reliability.
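The abstract does not include the authors' analysis code; as an illustrative sketch only, a linearly weighted Cohen's kappa for two raters' ordinal 1-10 scores could be computed as below. The function name, the choice of linear weights, and the category handling are assumptions, not details taken from the study.

```python
import numpy as np

def weighted_kappa(rater1, rater2, categories, weights="linear"):
    """Weighted Cohen's kappa for two raters over ordinal categories.

    rater1, rater2 : equal-length sequences of scores
    categories     : ordered list of all possible score values
    weights        : "linear" or "quadratic" disagreement weighting
    """
    k = len(categories)
    idx = {c: i for i, c in enumerate(categories)}

    # Observed joint distribution of the two raters' scores.
    obs = np.zeros((k, k))
    for a, b in zip(rater1, rater2):
        obs[idx[a], idx[b]] += 1
    obs /= obs.sum()

    # Expected joint distribution under independence (outer product
    # of the two raters' marginal score distributions).
    exp = np.outer(obs.sum(axis=1), obs.sum(axis=0))

    # Disagreement weight matrix: 0 on the diagonal, growing with
    # the distance between the two assigned categories.
    dist = np.abs(np.arange(k)[:, None] - np.arange(k)[None, :])
    w = dist / (k - 1) if weights == "linear" else (dist / (k - 1)) ** 2

    return 1.0 - (w * obs).sum() / (w * exp).sum()
```

For a study design like the one described, a kappa would be computed per task (and per day), then the arithmetic mean of those kappas reported.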

RESULTS:

Across tasks, inter-rater reliability ranged from slight to fair agreement, and intra-rater reliability was slightly better, ranging from slight to moderate agreement. Looking at the average kappa values, intra-rater reliability within session with body manipulation, within session without body manipulation, and between sessions was 0.45, 0.37, and 0.35, respectively.
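The verbal labels above (slight, fair, moderate) correspond to the commonly used Landis and Koch benchmarks for interpreting kappa; the abstract does not state which scale was applied, so the mapping below is an assumption for illustration.

```python
def agreement_label(kappa: float) -> str:
    """Map a kappa value to the Landis & Koch agreement bands
    (assumed interpretation scale; not stated in the abstract)."""
    bands = [
        (0.00, "poor"),            # kappa below 0
        (0.20, "slight"),          # 0.00-0.20
        (0.40, "fair"),            # 0.21-0.40
        (0.60, "moderate"),        # 0.41-0.60
        (0.80, "substantial"),     # 0.61-0.80
        (1.00, "almost perfect"),  # 0.81-1.00
    ]
    for upper, label in bands:
        if kappa <= upper:
            return label
    return "almost perfect"
```

Under this mapping, the reported average kappas of 0.45, 0.37, and 0.35 fall in the moderate, fair, and fair bands, consistent with the "slight to moderate" range reported.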

CONCLUSIONS:

Based on these results, supplementary or alternative methods should be explored, and are likely required, to increase scoring objectivity and reliability even among expert assessors. To help future researchers and practitioners, the custom visualisation software has been made publicly available.

Full text: 1 Collections: 01-international Database: MEDLINE Language: English Journal: BMC Sports Sci Med Rehabil Publication year: 2024 Document type: Article Country of affiliation: Canada Country of publication: United Kingdom
