TY  - JOUR
TI  - Teams and performance appraisal: Using metrics to increase reliability and validity
AU  - Valle, Matthew
AU  - Davis, Kirk
T2  - Team Performance Management: An International Journal
AB  - Inter-rater agreement in a peer performance evaluation system was analyzed using a sample of 44 individuals who rated focal persons in seven teams. Objective information concerning individual performance on multiple-choice tests, together with information gleaned from individual contributions to team testing and team graded exercises, resulted in high inter-rater reliabilities (assessed via ICCs) and strong criterion-related validity for the performance evaluation instrument. The discussion centers on the effect of providing objective job performance information to evaluation participants.
PY  - 1999
Y1  - 1999/01/01
VL  - 5
IS  - 8
SP  - 238
EP  - 244
SN  - 1352-7592
DO  - 10.1108/13527599910304912
UR  - https://doi.org/10.1108/13527599910304912
PB  - MCB UP Ltd
Y2  - 2024/04/25
ER  - 