WCER - Wisconsin Center for Education Research
School of Education at the University of Wisconsin-Madison
Surveys of the Enacted Curriculum

Standards-based reform may not yet have brought instruction into alignment with state tests, according to findings by WCER’s Andrew Porter and John Smithson. Their study, presented at this spring’s AERA convention, also found that mathematics instruction was slightly more aligned with the standards of the National Assessment of Educational Progress (NAEP) than with state tests. In science, the opposite was true.

Porter and Smithson reached their conclusions using an instrument they devised for states and districts to use in conducting formative and summative evaluations of standards-based reforms in math and science. Eleven states participated in the study, which was funded by the National Science Foundation through a subcontract with the Council of Chief State School Officers.

Teachers were asked to describe the degree of emphasis their instruction placed on each of many topics in mathematics and science over the past year. For each topic they taught, teachers also indicated the degree to which they emphasized one of several cognitive demands (for example, memorization, solving novel problems, formulating hypotheses). Porter and Smithson also analyzed NAEP and state test content, item by item.

The resulting data allowed comparison of alignment between assessments (including state assessments with NAEP), between instruction and assessment, and between instruction in one state and instruction in another, at each grade level for each subject. Alignment is described through an index that ranges from 0.0 (no alignment) to 1.0 (perfect alignment). The index measures the extent to which the relative emphasis of each topic and cognitive demand on, say, one test matches the relative emphasis of the same content on another test (or in instruction).
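The article does not give the index’s formula, but in Porter’s published work an alignment index of this kind treats each test (or body of instruction) as a distribution of emphasis proportions over topic-by-cognitive-demand cells and computes one minus half the total absolute difference between the two distributions. A minimal sketch under that assumption (the emphasis profiles below are hypothetical, not data from the study):

```python
# Sketch of an alignment index of the kind described: each document
# (a test or a body of instruction) is summarized as the proportion of
# emphasis falling in each (topic, cognitive demand) cell; the index is
# 1 minus half the summed absolute differences between the two
# distributions. The formula is an assumption based on Porter's
# published alignment work, not taken from this article.

def alignment_index(emphasis_a, emphasis_b):
    """Return alignment in [0.0, 1.0]; 1.0 means identical emphasis."""
    # Normalize each emphasis profile so its cells sum to 1.
    total_a = sum(emphasis_a.values())
    total_b = sum(emphasis_b.values())
    cells = set(emphasis_a) | set(emphasis_b)
    diff = sum(abs(emphasis_a.get(c, 0) / total_a -
                   emphasis_b.get(c, 0) / total_b)
               for c in cells)
    return 1.0 - diff / 2.0

# Hypothetical emphasis counts over (topic, cognitive demand) cells.
state_test = {("fractions", "memorize"): 4, ("fractions", "solve"): 6,
              ("geometry", "solve"): 10}
instruction = {("fractions", "memorize"): 5, ("fractions", "solve"): 5,
               ("geometry", "solve"): 5, ("algebra", "hypothesize"): 5}

print(alignment_index(state_test, state_test))   # identical -> 1.0
print(round(alignment_index(state_test, instruction), 2))
```

Because the profiles are normalized to proportions before comparison, the index depends only on relative emphasis, matching the article’s description.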

Two caveats are necessary before considering these results, Smithson says. First, to the extent that a state test is not aligned to a state’s content standards, one might not want instruction to be aligned to the state’s test. Nevertheless, to the extent a state test is used in an accountability program, it may have an influence over instructional practice.

Second, these data are illustrative only. The samples of instruction in each state cannot be taken as representative of that state. The samples are neither random nor sufficient in size. The data are, however, a proof of concept of the approach to testing effects of standards-based reform on instruction. And the results of the content analyses of the tests are definitive.

For each subject and grade level, state tests are more aligned with each other than they are with NAEP, although the differences are not large (e.g., .36 versus .31 for eighth-grade mathematics and .45 versus .35 for eighth-grade science).

Porter and Smithson found that instruction in one state is quite similar to instruction in another state. State average instruction-to-instruction alignment indicators ranged from .63 to .80. These high degrees of alignment suggest that better sampling might have produced similar results. “However,” said Porter, “one should not interpret this to indicate lack of variation in practice across teachers within a state. When individual teacher reports of content are compared within a state, or even within a school, the degree of alignment drops considerably.”

In general, instruction in a state was no more aligned to that state’s test than it was aligned to the tests of other states. Porter says this suggests that standards-based reform has not yet brought instruction into alignment with state tests.

Smithson and Porter used the data to construct “maps” of the content coverage of state assessments, NAEP assessments, and instructional practice as reported by teachers (see Figure XX). Practicing educators have found these topographical representations useful for seeing content emphasis in instruction and on assessments. Content maps can be compared to show where alignment does and does not exist between, say, a state assessment and NAEP, or a state assessment and instruction in that state.

“While these maps are powerful tools for helping practitioners understand their own instruction and their state assessment, they are not exactly correct in one respect,” Smithson says. “For the purposes of map construction, content emphasis is calculated as though the distinctions between topics and between cognitive demands lie on an ordered scale, but they do not. Still, if one compares the topographical maps to a more correct bar graph, the topographical maps are not misleading and tend to be easier to interpret.”

The results provide an indication of the power of the approach described here for assessing alignment. Though Porter and Smithson do not present analyses of state content standards and frameworks, such analyses could be done using procedures similar to those used for analyzing state and NAEP assessments. Analyses such as those reported here provide an objective and replicable way of testing the effects of standards-based reform on instructional practices.


The Surveys of Enacted Curriculum (SEC) collaborative works with states in developing a systematic, efficient method of collecting, analyzing, and reporting data on curriculum content and instructional practices. State participants and invited experts work together to develop the survey materials and to improve their knowledge and skills in survey design, data analysis, reporting, and strategies for using data in professional development.

The SEC collaborative develops strategies for using enacted curriculum surveys and data with local districts and schools. The tools and materials, including surveys, data analyses, report formats, and methods of analyzing curriculum and assessment alignment, are designed for broad use and are disseminated via the Internet or on CD-ROM.

For more information contact John Smithson at 608-263-4354 or johns@mail.wcer.wisc.edu.