"The question facing every Local Systemic Change project staff intent on effecting the changes described by NRC is how do we know that such changes have occurred or are moving in the right direction? Should we assume that exposure to 130 or more hours of inservice is enough to ensure that project goals will be achieved--that the desired changes in classroom practices and teacher attitudes and perspectives will take place? Or is 'time on task' not the powerful factor in teacher education that it is in student learning? Is it enough to get self-reports from teachers on how they feel about respecting and utilizing students' ideas about science and the importance of evidence in science? Or should we require observable evidence of performance and practice? Can we use student perceptions of the classroom environment as indicators of change or is there simply too much 'noise' in such measures? Then there is the whole area of student performance -- is the only real true test of a systemic reform effort's success the extent to which students perform better on measures of achievement and attitude? Or is the examination of such measures unfair and the results of such measures uninterpretable?"
In this paper, the authors present information "on three different instruments developed to provide insight into the degree to which the Science PALs and Science Co-op projects can be claimed to be effective as implied by the NRC changes in emphasis." They argue for a multiple-measure documentation strategy involving multiple constituencies, including students, teachers, external observers, and supervisors.