LSC-Net: Local Systemic Change Network

Slides and Notes from "Everything You Wanted to Know About the Guidelines for Studying the Effects of the LSC on Students, But Were Afraid to Ask!" presentation.

author: Eric Banilower, Horizon Research, Inc.
presented at: Studying the Effects of the LSC on Students
published: 02/02/2001
posted to site: 02/02/2001

Everything You Wanted to Know About the Guidelines for Studying the Effects of the LSC on Students,
But Were Afraid to Ask!

Eric Banilower of Horizon Research (erb@horizon-research.com)

These slides were presented during a talk on interpreting the guidelines for assessing the effects of the LSC on students. Draft guidelines were distributed by Horizon Research in the conference materials binder.

The purpose of this presentation was to provide an overview of the Guidelines for Studying the Effects of the LSC on Students. The purpose of the guidelines is to help LSC projects assess the effects of their activities on students and student learning. Recognizing the difficulty of measuring student impacts, these guidelines have been developed to help projects design studies that will meet both their own information needs and those of NSF. The guidelines address a number of important issues for research and evaluation studies, including deciding on appropriate measures, study design, data analysis, and reporting, with a particular emphasis on being able to make the case that any gains you may detect are attributable to the LSC.


A variety of stakeholders will be interested in the results of your project, and how you report your results will often vary depending on the audience. NSF and the research community will be interested in the technical details, while a highlights report will generally be more appropriate for the school board and parents. Thus, you should be prepared to develop more than one report, each appropriately presented for its audience.

Types of Reports

  • Technical Report

    • NSF

  • Highlights

    • School Board
    • Superintendent
    • Parents

  • Press Release


The guidelines address many of the key issues in research and evaluation, including instrumentation, sampling, design, analysis, and reporting. However, even the strongest study will be of little value if it is not reported appropriately and well. A high-quality technical report will therefore provide the context for the research and build a case that any observed impacts are attributable to the LSC. To do this, the report will need to address each of the key areas covered in the guidelines.

Technical Report

  • Instrumentation

    • What instrument was used
    • Why that instrument was selected for the study
    • Psychometric information
    • What data are returned


The technical report should also describe how your samples were chosen and provide some descriptive information about your subjects. In the educational context, where students are nested within classrooms, it is important not only to describe your sample of students, but also the teachers of the students and the schools they attend. Because of the threat of sampling bias, it is important to provide the following information:

Sampling

  • How the sample was selected

  • The size of the sample

  • The composition of the sample

  • The representativeness of the sample
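As an illustrative sketch (not part of the guidelines themselves), the sampling information above can be generated and documented in a few lines of standard-library Python. The sampling frame, school names, and stratum sizes below are invented for illustration.

```python
import random
from collections import Counter

random.seed(42)  # record the seed so the selection is reproducible and reportable

# Hypothetical sampling frame: one (school, grade band) tuple per classroom
frame = [("School A", "K-2")] * 10 + [("School A", "3-5")] * 8 + \
        [("School B", "K-2")] * 6 + [("School B", "3-5")] * 12

def stratified_sample(frame, per_stratum):
    """Draw the same number of classrooms from each (school, grade band) stratum."""
    strata = {}
    for unit in frame:
        strata.setdefault(unit, []).append(unit)
    sample = []
    for units in strata.values():
        sample.extend(random.sample(units, per_stratum))
    return sample

sample = stratified_sample(frame, per_stratum=3)
composition = Counter(sample)  # size and composition of the sample, for the report
```

Recording how the sample was drawn (the stratification, the per-stratum counts, even the random seed) makes it straightforward to report selection, size, composition, and representativeness later.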


A technical report should also include a description of the research design and the rationale for why that design was appropriate for your context. While there are numerous research designs, the guidelines describe the two key features necessary for making the case for attribution:

Design

  • Description of the research design

  • Rationale for using that design

  • Comparison of "treated" students to another group

  • Examination of initial equivalency of the groups


The credibility of a study can be undermined if alternative explanations for the results, such as selection biases, are ignored. Thus, a sound study (and technical report) will:

Internal Validity

  • Identify plausible alternative explanations

  • Use analytic techniques or argument to rule them out

  • Acknowledge remaining shortcomings


Analysis methods and tools should be consistent with the study design and the type and level of outcome data being investigated. An appropriate analysis in a quantitative study includes both descriptive and inferential statistics. For continuous variables (e.g., student test scores, teacher experience), the technical report should include N's, means, and standard deviations; for categorical variables (e.g., gender, race/ethnicity), it should include N's and frequency distributions. These statistics should be reported overall and for each subgroup your study investigates.

Quantitative Analysis: Descriptive Statistics

  • Continuous Variables

    • N's, means, and standard deviations

  • Categorical Variables

    • N's and frequency distributions

  • Overall and for each subgroup
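As a sketch of these descriptive statistics, the following uses only the Python standard library; the records, group labels, and scores are made up for illustration.

```python
from statistics import mean, stdev
from collections import Counter, defaultdict

# Hypothetical records: (group, gender, test_score)
records = [
    ("LSC", "F", 78.0), ("LSC", "M", 82.5), ("LSC", "F", 91.0),
    ("Comparison", "M", 74.5), ("Comparison", "F", 80.0), ("Comparison", "M", 69.5),
]

def describe(scores):
    """N, mean, and standard deviation for a continuous variable."""
    return {"n": len(scores), "mean": mean(scores), "sd": stdev(scores)}

# Continuous variable (test score): overall and by subgroup
overall = describe([score for _, _, score in records])
by_group = defaultdict(list)
for group, _, score in records:
    by_group[group].append(score)
subgroups = {g: describe(s) for g, s in by_group.items()}

# Categorical variable (gender): N and frequency distribution
gender_counts = Counter(gender for _, gender, _ in records)
gender_freq = {k: v / len(records) for k, v in gender_counts.items()}
```

The same `describe` helper is applied overall and per subgroup, mirroring the guideline that every statistic be reported for each group the study investigates.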

Quantitative Analysis: Inferential Statistics

  • The appropriateness of the statistical procedure

  • Statistical test data:

    • Test statistic
    • Degrees of freedom
    • p-value
    • Effect size


Frequently it is useful to combine quantitative analysis with qualitative analysis. The former provides an overview of success as determined by outcomes that lend themselves to direct measurement and numerical summarization; the latter provides information on outcomes that are best addressed through rich description.

For qualitative analyses, it is important to provide full descriptions of how the data were collected, how the data were analyzed, and how conclusions were drawn from the analysis. Depending on the number of cases you include, either individual case studies or an integrative analysis across data sources may be reported. If the latter approach is chosen, examples from the data should be provided to support your methods and your conclusions. Such examples will enable your audiences both to judge the credibility of your conclusions and to gain a deeper understanding of the context of the effects observed.

Qualitative Analysis

  • How the data were collected

  • How the data were analyzed

  • How the conclusions were drawn

  • Rich examples from the data