
A Summary of Project Efforts to Examine the Impact of the LSC on Student Achievement

author: Eric Banilower
published in: Horizon Research
published: 02/14/2001
posted to site: 02/14/2001

by Eric Banilower
October 2000

Prepared for:
The National Science Foundation
4201 Wilson Boulevard
Arlington, VA 22230

Prepared by: Horizon Research, Inc.
326 Cloister Court
Chapel Hill, NC 27514-2296

Introduction

In February 2000, HRI surveyed the PIs of the LSC projects to ascertain whether they had undertaken any studies examining the impact of the LSC on student achievement. Forty-seven of the 68 projects responded; of these, 12 indicated that they had no student achievement studies. HRI then contacted the remaining 35 projects for information about their study designs, instrumentation, and results. Twenty-seven of the projects participated in interviews between April and June 2000. The interviews revealed that twelve projects had completed, or nearly completed, studies; four had begun studies; and six were still planning their studies. The other five projects interviewed did not have student achievement studies. A summary of the data collection process is shown in Figure 1.

[Figure 1. History of Data Collection]

This report analyzes individually the nine completed, or nearly completed, studies that HRI was able to obtain, and then attempts to draw some conclusions across all of them. It is important to note that many of the studies reported only group means and did not statistically test group differences. Without information about the variance of group scores (i.e., standard errors or standard deviations), it was impossible for HRI to test these differences statistically or to estimate the magnitude of any differences for these projects. Where possible, information on effect sizes and the results of statistical tests is included.
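To illustrate what such a test requires, the sketch below computes Cohen's d and a two-sample t-test directly from group means, standard deviations, and sample sizes. It is a minimal Python example using hypothetical summary statistics (not data from any LSC project; the standard deviations in particular are assumed), with the test carried out by SciPy's ttest_ind_from_stats.

    from math import sqrt
    from scipy.stats import ttest_ind_from_stats

    # Hypothetical summary statistics for two cohorts of students.
    m1, sd1, n1 = 6.05, 4.0, 1000   # e.g., a 1999 cohort (SD assumed)
    m2, sd2, n2 = 4.82, 4.0, 1000   # e.g., a 1998 cohort (SD assumed)

    # Pooled standard deviation and Cohen's d (standardized mean difference).
    sd_pooled = sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    d = (m1 - m2) / sd_pooled

    # Welch's t-test computed from the same summary statistics.
    t, p = ttest_ind_from_stats(m1, sd1, n1, m2, sd2, n2, equal_var=False)
    print(f"Cohen's d = {d:.2f}, t = {t:.2f}, p = {p:.4f}")

Without the standard deviations, neither d nor the t statistic can be computed, which is exactly the limitation noted above.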

One of the key issues to consider when analyzing these studies is their internal validity. In other words, how strong a case do the authors build that any impacts (i.e., increases in student achievement) are attributable to the treatment variable (i.e., participation in the LSC)? Two of the main factors affecting a study's validity are how well its methodology controls for extraneous variables (e.g., initial ability level or school tracking policies) and the level of bias in sample selection (e.g., only teachers of advanced students are in the experimental group). A solid study should be designed to rule out plausible rival hypotheses that would explain any observed differences as well as the study's research hypothesis does. To help the reader weigh the results, this report points out any major threats to internal validity in each of the studies.

Mathematics Studies

HRI was able to obtain reports or summaries of results from five mathematics projects, including one project that sent results from studies done independently by five participating schools. Overall, the quality of the studies is mixed; while most of the mathematics studies had notable threats to internal validity, a couple took steps to reduce these threats and thereby strengthened the case that gains in student achievement are attributable to the LSC. One of the most common weaknesses was the failure to control for initial differences between treatment and control groups; the exceptions were the Project 4 study and school #5 in Project 3.

In general, the studies appear to show positive impacts of the LSC on students' mathematics achievement. However, the results need to be interpreted with caution, since in most cases it is difficult to make the case that the impact is due primarily to the LSC and not to other, unmeasured interventions or policies.

Project 1 (K-12 Mathematics)

At this time, the project has compared project-wide results for 4th, 8th, and 10th graders in 1999 to those in 1998, using performance assessment items developed by the Balanced Assessment Project. Over 1,000 students per grade level were tested each year. Sixteen items were repeated on both years' assessments: six 4th grade items, six 8th grade items, and four 10th grade items. The study found significant differences on nine of the sixteen items. As can be seen in Table 1, students in 1999 scored higher than students in 1998 on six of the items (three at 4th grade, two at 8th grade, and one at 10th grade) and scored lower on three items (all at the 8th grade level). The author points out that one should not read too much into these results: the changes are small, different students were tested each year, and there was no control for initial differences in students' ability levels.


Table 1
1998-1999 Comparison of Student Performance Scores

Item                 Grade   Max Score   1998 Mean   1999 Mean   Difference   Effect Size
Halve It               4        15          4.82        6.05        1.23*         .31
Block Towers           4         5          1.79        2.34        0.55*         .26
Toothpick Squares      4         5          2.51        2.92        0.41*         .24
Tim's Number           4        15          4.08        4.05       -0.03
Favorite Sports        4         5          2.00        1.97       -0.03
Pears and Bananas      4         5          2.22        2.08       -0.14
Toothpick Squares      8         5          1.65        2.94        1.29*         .87
Leisure Center         8        15          5.26        5.72        0.46*         .18
Take a Cube            8         5          2.81        2.33       -0.48*        -.25
Pam's Number           8        15          7.93        5.91       -2.02*        -.44
Building Units         8         5          3.37        2.75       -0.62*        -.52
Metro                  8         5          2.12        2.18        0.06
Calendar Patterns     10        15          3.81        4.42        0.61*         .20
Number Grids          10        15          3.78        4.05        0.27
Swimming Race         10         5          1.82        1.75       -0.07
Bottle                10         5          1.14        0.89       -0.25
* Difference is statistically significant.
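Assuming the effect sizes in Table 1 are standardized mean differences (Cohen's d; the report does not name the measure, so this is our interpretation), the pooled standard deviation implied for an item can be recovered as SD = difference / effect size. For Halve It, that gives 1.23 / .31, or roughly 4 score points on a 15-point item.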

Project 2 (K-8 Mathematics)

The project looked at the percent of students at or above the national norm on the ITBS, comparing 1999 data to each school's baseline year (the year before it became involved in the LSC). Results show increases for most of the schools involved with the LSC (see Table 2). However, no comparable data are shown for non-LSC schools, making it difficult to attribute these increases to the LSC. It is possible that ITBS scores rose over this period across the entire city due to factors unrelated to the LSC (e.g., familiarity with the assessment or district retention policies).

Table 2
Percent of Students at or above National Norm on the ITBS

School   Baseline    1999   Difference   Effect Size
  21       18.6      52.0     33.4*         0.72
  29       31.9      65.1     33.2*         0.68
  24        9.1      27.7     18.6*         0.50
  26       51.9      73.4     21.5*         0.45
  23       21.8      42.4     20.6*         0.45
   9       41.2      62.7     21.5*         0.43
  20       18.3      37.1     18.8*         0.43
  15       17.9      37.0     19.1*         0.43
   6       48.6      68.9     20.3*         0.42
  30       18.7      36.1     17.4*         0.39
  22       18.4      34.8     16.4*         0.38
   5       56.0      73.5     17.5*         0.37
  12       14.7      30.0     15.3*         0.37
   3       12.0      25.1     13.1*         0.34
   1       43.2      59.7     16.5*         0.33
   2       51.4      66.5     15.1*         0.31
  10       32.1      47.2     15.1*         0.31
  17       29.9      44.9     15.0*         0.31
   8       15.1      27.8     12.7*         0.31
  18       14.9      27.4     12.5*         0.31
  16       38.8      53.4     14.6*         0.29
   4       38.0      51.8     13.8*         0.28
  25       21.1      32.1     11.0*         0.25
  13       15.3      22.7      7.4*         0.19
  14       29.2      37.8      8.6*         0.18
  27       16.0      23.3      7.3
  28       34.1      40.0      5.9
  19       96.6     100.0      3.4
  11       13.3      16.2      2.9
   7       65.6      67.0      1.4
* Difference is statistically significant.
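The effect sizes in Table 2 appear consistent with Cohen's h, a standard effect-size measure for a difference between two proportions (the report does not name the measure, so this interpretation is an assumption). A minimal Python sketch reproduces a tabled value:

    from math import asin, sqrt

    def cohens_h(p1: float, p2: float) -> float:
        # Cohen's h: effect size for the difference between two proportions,
        # computed on the arcsine-square-root (variance-stabilizing) scale.
        return 2 * asin(sqrt(p1)) - 2 * asin(sqrt(p2))

    # School 21 from Table 2: 52.0% in 1999 vs. 18.6% at baseline.
    print(round(cohens_h(0.520, 0.186), 2))  # prints 0.72, matching the table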
