Proposal for Measuring the LSC Program's Effects on Student Achievement in Science

author: Horizon Research, Inc.
presented at: Studying the Effects of the LSC on Students Conference
published: 02/02/2001
posted to site: 02/02/2001

Presentation given by Dan Heck of Horizon Research (dheck@horizon-research.com), distributed in the conference materials binder.
Use this link to see the slides associated with this talk.

Pressure is growing on high-profile, government-funded education initiatives to demonstrate their effects, particularly on student achievement. In response, beginning in 1998, NSF has incorporated into the RFP for Local Systemic Change Initiatives (LSC) a requirement that each project examine its effects on student achievement and other student outcomes. Sensitive to differences in local contexts, NSF is allowing projects flexibility in how they choose to show evidence of effects on student outcomes. Differences in the nature of the relevant student achievement data available to each project, as well as differences in other data about students, teachers, and schools, make this flexibility a necessity. The range of studies that projects will produce will provide a wealth of evidence about the variety of effects the LSCs are having on student outcomes. Systematic study of program-wide effects, however, will be more difficult because of the varying instrumentation and designs of the studies.

HRI has proposed a study to meet this challenge of investigating program-wide effects on student achievement in science without imposing undue burden on projects. The study is limited to projects that include a late elementary (grades 4-6) science component. This choice was made for three reasons: the majority of LSC science projects are included in this group, few projects already have student achievement data in science available for the late elementary grades, and items for measuring science achievement (although limited) are available for these grade levels.

Instrumentation

The study will employ an achievement test in science made up of multiple-choice items taken from the National Assessment of Educational Progress (NAEP) and the Third International Mathematics and Science Study (TIMSS). Items from these sources have been through extensive validity, reliability, scaling, and item functioning analyses as measures of science achievement.

The instrument will be administered to 4th, 5th, or 6th grade students in different projects. For this reason, the items selected for the study include some taken from both the 4th grade and 8th grade NAEP and TIMSS item pools. The items represent a range of difficulty, allowing appropriate testing of students’ science achievement across a broad range of achievement levels.

The items have been selected, with the assistance of an expert in science assessment and the Principal Investigators of the K-8 science LSCs, to represent the science content areas central to the units of LSC-designated instructional materials most frequently taught in the 4th, 5th, and 6th grades. The instrument will yield scores for student achievement in: overall science, earth science, life science, physical science, and nature of science.
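
To make the scoring concrete, the sketch below rolls scored item responses up into an overall science score and the four content-area sub-scale scores, here computed simply as proportion correct. The file name, item identifiers, and item-to-sub-scale assignments are illustrative assumptions, not HRI's actual item map or scaling procedure.

    import pandas as pd

    # Hypothetical item-to-sub-scale map; the real assignments come from
    # the NAEP/TIMSS items HRI selected.
    SUBSCALES = {
        "earth_science": ["item_01", "item_02", "item_03"],
        "life_science": ["item_04", "item_05", "item_06"],
        "physical_science": ["item_07", "item_08", "item_09"],
        "nature_of_science": ["item_10", "item_11"],
    }

    # One row per student; item columns scored 1 (correct) or 0 (incorrect).
    responses = pd.read_csv("scored_item_responses.csv")

    scores = pd.DataFrame({"student_id": responses["student_id"]})
    for subscale, items in SUBSCALES.items():
        # Proportion correct on the items assigned to this sub-scale.
        scores[subscale] = responses[items].mean(axis=1)

    # Overall science score: proportion correct across all items.
    all_items = [item for items in SUBSCALES.values() for item in items]
    scores["overall_science"] = responses[all_items].mean(axis=1)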

A questionnaire will be used to gather information from teachers regarding which science units they have taught during the study year and the extent of their professional development as a part of the LSC project. Teachers will also provide some demographic information about the students in their classes, including eligibility for the free/reduced lunch program. A student questionnaire will be used to gather additional demographic data.

Projects may want to augment the HRI assessment instrument to measure additional outcomes of interest. In particular, projects are encouraged to add open-ended items or performance tasks. These additions will enable projects to include in their studies types of knowledge that are not measured by the HRI instrument, and to model more appropriately the kinds of assessment the project advocates for classroom use.

Sample

HRI will randomly sample four to six classes of students (approximately 120 students) per project at the 4th, 5th, or 6th grade level for the program study. This limited sampling requirement will keep the level of burden on any given project to a minimum and still provide enough data across the entire LSC for the study to have considerable statistical power for all planned analyses. The demographic characteristics of the sampled students will be compared to the student demographics of the districts to examine the sample’s representativeness of the population.
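
As a rough illustration of this sampling step, the sketch below draws classes at random from a single project's roster of eligible grade 4-6 classes until it has four to six classes covering roughly 120 students. The roster file, its columns, and the stopping rule are assumptions made for illustration, not HRI's sampling protocol.

    import pandas as pd

    # Hypothetical roster: one row per eligible class (grade 4, 5, or 6)
    # in a single project.
    roster = pd.read_csv("project_class_roster.csv")  # class_id, grade, enrollment

    shuffled = roster.sample(frac=1, random_state=42)  # put classes in random order

    selected = []
    total_students = 0
    for _, cls in shuffled.iterrows():
        # Stop at six classes, or once at least four classes cover ~120 students.
        if len(selected) >= 6 or (len(selected) >= 4 and total_students >= 120):
            break
        selected.append(cls["class_id"])
        total_students += cls["enrollment"]

    print(f"Sampled {len(selected)} classes, about {total_students} students")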

(HRI will accommodate individual project requests to include a larger sample if a project opts to use this study for its own information needs or to fulfill its NSF requirement for a project-level study of the LSC’s effects on student achievement.)

Design

This study will use longitudinal (pre-post) data to control for students’ prior knowledge of the science content being tested. Several demographic factors will also be controlled. The pre-test will be administered near the beginning of the school year, and the post-test at the end. In order to match pre- and post-test data, HRI will assign ID numbers to each test, and teachers and/or project staff will be responsible for making sure that the same student receives the test with the same ID for the pre- and post-tests. These ID numbers will also be used to link student demographic data provided by teachers.
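
A minimal sketch of that matching step follows, assuming the pre-test file, post-test file, and teacher-provided demographic file each carry the HRI-assigned student ID. File and column names are illustrative.

    import pandas as pd

    pre = pd.read_csv("pretest_scores.csv")          # student_id, pre_score, ...
    post = pd.read_csv("posttest_scores.csv")        # student_id, post_score, ...
    demo = pd.read_csv("teacher_reported_demo.csv")  # student_id, teacher_id, ses, ell, ...

    # Keep only students with both a pre-test and a post-test record, then
    # attach the teacher-reported demographic data by the same ID.
    matched = (
        pre.merge(post, on="student_id", how="inner", suffixes=("_pre", "_post"))
           .merge(demo, on="student_id", how="left")
    )

    print(f"{len(matched)} students matched across pre- and post-test")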

This study provides built-in comparison groups: not all students tested will have been exposed to each content area covered on the instrument during the period between pre- and post-tests (see instrumentation section), not all students will have received instruction from teachers at equivalent levels of training by the LSCs, and not all students will have been taught using LSC-designated instructional materials. Via the teacher questionnaire, HRI will be able to determine the extent to which classes were taught a given content area and the extent to which it was taught using the LSC-designated instructional materials. At the pre-test, HRI will ask participating teachers how much LSC professional development they have received, both overall and in each content area covered by the sub-scales of the instrument. At the post-test, HRI will ask teachers about the science content areas covered that year, and the extent to which they used the LSC-designated instructional materials to teach each content area. Teachers will also report the amount of LSC professional development they have received overall and in each content area during the study year.

HRI will also have information from teachers about students’ English-language proficiency and socioeconomic status (SES), and information about gender and race/ethnicity directly from the students. These data will allow HRI to examine the extent of any "achievement gaps" by gender, race/ethnicity, English-language proficiency, and SES. The effects of the LSC with respect to these achievement gaps will also be studied.

Internal Validity

The assessment instrument will be piloted to confirm the psychometric properties of the items and scales, and to gauge the instructional sensitivity of the items.

Demographic, background, and treatment characteristics of the samples of students and teachers will be compared to information from other research and evaluation studies on the LSC program to examine the sample for potential biases.

Information collected for the science achievement program study about participation in LSC professional development, coverage of content, and use of the LSC-designated instructional materials will be compared to results from the Teacher Questionnaire and Classroom Observation Protocol used for the LSC Core Evaluation. This comparison will help ensure the credibility of the data and its consistency with other research and evaluation on the LSC Program.

Analysis

Descriptive statistics on all variables will be presented to provide information about the characteristics of the sample of students and teachers. Descriptive statistics presented for the variables measured on the assessment instrument, including sub-scales, will provide basic information on the status of science achievement across the represented LSC projects. The sample for the program study will not be large enough to provide representative data at the project level (although projects electing to administer the science achievement instrument to a larger sample will have this capability).
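
The descriptive step might look something like the sketch below, run on a merged analysis file like the one built in the Design section; the variable names are again illustrative.

    import pandas as pd

    analysis = pd.read_csv("lsc_analysis_file.csv")

    # Basic descriptive statistics for the achievement scores and sub-scales.
    score_cols = ["pre_score", "post_score", "earth_science", "life_science",
                  "physical_science", "nature_of_science"]
    print(analysis[score_cols].describe())

    # Sample composition on key demographic characteristics.
    for col in ["gender", "race_ethnicity", "ell", "free_reduced_lunch"]:
        print(analysis[col].value_counts(normalize=True))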

A series of hierarchical linear models (HLM) will be used to test relationships among the independent variables measured at the student and teacher levels and the outcomes measured on the assessment instrument. Models will be included for overall science achievement gains, and science achievement gains on each of the sub-scales.

The models will include three levels of equations: student, teacher, and project. Independent predictor variables will be included at the student-level and teacher-level. The project-level equations, with no independent predictors, will serve only as a means to account for variance in outcomes that lies across projects. No specific analyses will be performed at the project level.

The main outcome of interest in the study is science achievement, including achievement on the sub-scales, on the post-test administration of the assessment instrument. Individual scores on the pre-test will be used to adjust the post-test outcome scores for initial achievement levels (prior knowledge), yielding estimates of the "gain scores" in achievement for the time period between the pre-test and post-test administrations of the assessment instrument. Gain scores of students receiving instruction in a content area will be compared to gain scores of students not receiving instruction in that area. The extent to which instruction delivered using the LSC-designated instructional materials and the extent of teachers’ LSC professional development will also be considered in the analysis.

Student-level variables will be entered both as controls for demographic differences and as factors for examining important achievement gaps. These variables include:

    • socioeconomic status,
    • English-language proficiency,
    • race/ethnicity, and
    • gender.

Teacher-level variables (which represent important treatment factors both related and unrelated to the LSC) will be tested as predictors of science achievement gains and of any achievement gaps that exist in the outcomes. These variables include:

    • teaching of tested content areas between the pre-test and post-test,
    • extent of teaching of tested content areas using LSC-designated instructional materials,
    • overall amount of LSC professional development, and
    • extent of LSC professional development on teaching using the LSC-designated instructional materials for the tested content areas.
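
To show roughly how such a model could be specified, the sketch below fits a mixed model with random intercepts for project and for teacher within project, uses the pre-test score to adjust the post-test outcome, and enters the student-level and teacher-level predictors listed above as fixed effects. It approximates the planned three-level HLM using one common software option (statsmodels); the variable names are illustrative, and this is not HRI's actual analysis code.

    import pandas as pd
    import statsmodels.formula.api as smf

    # One row per student, with pre/post scores, student demographics, and
    # teacher-level treatment variables merged in (names illustrative).
    df = pd.read_csv("lsc_analysis_file.csv")

    model = smf.mixedlm(
        # Post-test score adjusted for the pre-test (prior knowledge), with
        # student-level controls and teacher-level treatment predictors.
        "post_score ~ pre_score + ses + ell + C(race_ethnicity) + C(gender)"
        " + taught_content + lsc_materials_use + pd_hours_total + pd_hours_content",
        data=df,
        groups="project_id",                          # project-level random intercepts
        vc_formula={"teacher": "0 + C(teacher_id)"},  # teacher nested within project
    )
    result = model.fit()
    print(result.summary())

Achievement-gap questions could be explored in the same framework by adding interaction terms between the demographic variables and the teacher-level treatment variables, and each sub-scale outcome would be modeled with the same structure.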