Inter-rater agreement in a peer performance evaluation system was analyzed using a sample of 44 individuals who rated focal persons in seven teams. Objective information about individual performance on multiple-choice tests, together with information gleaned from individual contributions to team testing and team-graded exercises, resulted in high inter-rater reliabilities (assessed via intraclass correlation coefficients, ICCs) and strong criterion-related validity for the performance evaluation instrument. The discussion centers on the effect of providing objective job performance information to evaluation participants.
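The abstract reports inter-rater reliability via ICCs. As an illustrative aside (not the authors' actual analysis), a one-way random-effects ICC(1) for a targets-by-raters matrix can be sketched as follows; the function name and example data are hypothetical:

```python
import numpy as np

def icc1(ratings):
    """One-way random-effects ICC(1) for an n-targets x k-raters matrix."""
    n, k = ratings.shape
    grand_mean = ratings.mean()
    row_means = ratings.mean(axis=1)
    # Between-targets mean square
    ms_between = k * ((row_means - grand_mean) ** 2).sum() / (n - 1)
    # Within-target (rater) mean square
    ms_within = ((ratings - row_means[:, None]) ** 2).sum() / (n * (k - 1))
    return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

# Hypothetical data: 3 focal persons rated by 2 raters with perfect agreement
ratings = np.array([[1.0, 1.0], [2.0, 2.0], [3.0, 3.0]])
print(icc1(ratings))  # prints 1.0 (perfect inter-rater agreement)
```

Values near 1 indicate that most rating variance is attributable to real differences among ratees rather than rater disagreement; two-way ICC variants (e.g. ICC(2,1)) would additionally model rater effects.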
Valle, M. and Davis, K. (1999), "Teams and performance appraisal: Using metrics to increase reliability and validity", Team Performance Management, Vol. 5 No. 8, pp. 238-244. https://doi.org/10.1108/13527599910304912
Copyright © 1999, MCB UP Limited