Study Design: Observational cross-sectional study.

Objectives: To examine the inter-rater reliability of novice raters using the Movement System Impairment (MSI) classification approach and to explore patterns of disagreement in classification errors.

Background: The inter-rater reliability of the individual test items used in the MSI approach is moderate to good; however, the reliability of the classification algorithm itself has been tested only preliminarily.

Methods: Using previously recorded patient data (n = 21), 13 novice raters classified patients according to the MSI schema. Overall percent agreement, the kappa statistic, and agreement/disagreement among pair-wise comparisons of classification assignments were examined.

Results: There was 87.4% overall agreement across pairs of classification judgments, with a kappa coefficient of 0.81 (95% CI: 0.79, 0.83). Raters were most likely to agree on the Flexion classification (100%) and least likely to agree on the Rotation classification (84%).

Conclusions: The MSI classification algorithm can be learned by novice users; with training, their inter-rater reliability in applying the algorithm for classification judgments is good and similar to that reported in other studies. However, some degree of error persists in the classification decision-making associated with the MSI system, particularly for the Rotation category.
- Classification of low back pain
- Subgrouping of low back pain
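The agreement statistic reported above can be illustrated with a minimal sketch of Cohen's kappa for a pair of raters. This is not the study's actual analysis code, and the category labels below are hypothetical examples, not the study's data; the sketch only shows how observed agreement is corrected for chance agreement derived from each rater's marginal category frequencies.

```python
from collections import Counter

def cohens_kappa(ratings_a, ratings_b):
    """Cohen's kappa for two raters' categorical classifications.

    kappa = (p_o - p_e) / (1 - p_e), where p_o is observed agreement
    and p_e is the chance agreement expected from the raters' marginals.
    """
    assert len(ratings_a) == len(ratings_b)
    n = len(ratings_a)
    # Observed proportion of agreements
    p_o = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    # Chance agreement from each rater's category frequencies
    counts_a, counts_b = Counter(ratings_a), Counter(ratings_b)
    categories = set(counts_a) | set(counts_b)
    p_e = sum(counts_a[c] * counts_b[c] for c in categories) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical MSI-style classification judgments (illustrative only)
rater_a = ["Flexion", "Rotation", "Extension", "Flexion", "Rotation"]
rater_b = ["Flexion", "Extension", "Extension", "Flexion", "Rotation"]
print(round(cohens_kappa(rater_a, rater_b), 3))
```

In a multi-rater design such as the one described (13 raters), kappa is typically computed over all pair-wise rater combinations; the two-rater function above is the building block for that comparison.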