Case Study: Comparing the Predictive Validity of High-Stakes Standardized Tests

Assessing Whether PARCC and MCAS Predict College Outcomes in Massachusetts
Although the study . . . found the tests were about equally good at predicting student grades in their first year of college, it said the PARCC standards for college readiness were a better predictor of whether students could make A’s or B’s in math than the MCAS proficiency standard.
- Boston Globe
Project Facts
  • This is the first study to measure the predictive validity of the PARCC exam and compare it to the state assessment that it was intended to replace.
  • Overall, MCAS and PARCC scores predict college grades about equally well. The two assessments differ, however, in the degree to which their designated performance standards predict college grades and the need for remedial math, with PARCC outperforming MCAS.
  • The findings helped inform state policymakers considering a switch to the PARCC assessment.
  • By examining rigorous evidence about the validity of these two exams, Massachusetts provides a model for other states weighing difficult choices about their educational assessments.
The Issue

Massachusetts was debating whether to continue using the Massachusetts Comprehensive Assessment System (MCAS) or to adopt the new Partnership for Assessment of Readiness for College and Careers (PARCC) exams for testing the achievement of public school students. The two tests define performance differently: the MCAS measures proficiency relative to statewide standards, while the PARCC measures whether students are on track to succeed in college. Because no prior research had analyzed PARCC scores as a predictor of college outcomes, it was an open question whether the PARCC measures college readiness better than the MCAS.

The Approach

The Massachusetts Executive Office of Education was responsible for recruiting the study sample and administering the tests. The study sample consisted of 847 first-year college students who had graduated from Massachusetts high schools and then enrolled at one of 11 public in-state campuses that participated in the study. Students who volunteered to take part were randomly assigned to complete one component of either the MCAS or PARCC exams, ensuring that students taking the PARCC assessment were not systematically different from those taking the MCAS. Mathematica Policy Research analyzed the resulting data and explained the results of the study to state policymakers.
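
As a concrete illustration of this design, the minimal sketch below shows the kind of random assignment the paragraph describes. The student identifiers, seed, and group labels are hypothetical; the study's actual randomization procedure is not documented in this case study.

    import random

    def assign_tests(student_ids, seed=42):
        """Randomly assign each volunteer to take one exam or the other.

        Illustration only: each student completes one component of either
        the MCAS or the PARCC, so on average the two groups should not
        differ systematically.
        """
        rng = random.Random(seed)  # fixed seed so the assignment is reproducible
        return {sid: rng.choice(["MCAS", "PARCC"]) for sid in student_ids}

    # Hypothetical sample of 847 students, matching the study's sample size
    students = [f"S{i:04d}" for i in range(847)]
    groups = assign_tests(students)
    print(sum(1 for g in groups.values() if g == "PARCC"), "students assigned to PARCC")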

The Impacts

The results of this study fueled an intense debate. Some stakeholders believed the PARCC would be a more effective tool than the MCAS for measuring students’ college readiness, but the study showed that scores on the new test were no more predictive than scores on the long-standing assessment: both exams succeeded in predicting college readiness.

Key findings include the following:

  • MCAS and PARCC scores are comparable to SAT scores in terms of predicting college outcomes.
  • The overall validity of scores on PARCC assessments in predicting college grades is similar to the validity of scores on the MCAS. In both math and English language arts, the correlation between PARCC scores and college outcomes was statistically indistinguishable from the correlation between MCAS scores and college outcomes (see the sketch after this list).
  • Meeting the PARCC standard for college readiness in math predicted a higher level of college performance than meeting the MCAS standard for math proficiency. Students meeting the PARCC math standard were also less likely to need remedial math than those meeting the MCAS proficiency standard.
  • In English language arts, however, there were no statistically significant differences between the level of college performance predicted by the PARCC standard for college readiness and the MCAS proficiency standard.
  • Because the underlying scores on the MCAS and PARCC assessments are equally predictive of college outcomes, the study revealed that Massachusetts policymakers had more than one way to align high school mathematics test standards with college readiness: either adopt the PARCC exam or continue using MCAS while simply setting a higher score threshold for college readiness. Either of these options would ensure that the state’s high school assessments provide better information about college readiness to students, parents, educators, and policymakers.
  • Even though the study cannot directly compare other states’ assessments with the PARCC, it provides useful evidence for any state considering adopting PARCC assessments. Indeed, in many other states, the gap between existing proficiency standards and those of PARCC is likely to be substantially larger than in Massachusetts, where proficiency standards were already well above those of most states.
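
The claim above that the two correlations are statistically indistinguishable can be illustrated with a standard Fisher z-test for comparing correlations estimated on independent groups, which is appropriate here because students were randomly assigned to one exam or the other. The sketch below uses made-up correlation values and group sizes, not the study's estimates.

    import math

    def fisher_z_test(r1, n1, r2, n2):
        """Two-sided test of whether two independent Pearson correlations differ.

        Applies Fisher's z-transformation to each correlation, then compares
        the difference to its standard error under the null of equality.
        """
        z1, z2 = math.atanh(r1), math.atanh(r2)          # Fisher transformation
        se = math.sqrt(1.0 / (n1 - 3) + 1.0 / (n2 - 3))  # SE of the difference
        z = (z1 - z2) / se
        p = 2.0 * (1.0 - 0.5 * (1.0 + math.erf(abs(z) / math.sqrt(2.0))))
        return z, p

    # Hypothetical values: PARCC and MCAS groups drawn from the 847-student sample
    z, p = fisher_z_test(r1=0.45, n1=420, r2=0.42, n2=427)
    print(f"z = {z:.2f}, p = {p:.3f}")  # a large p-value means the two correlations
                                        # cannot be distinguished statistically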
To learn more about how Mathematica Policy Research can provide solutions to your challenges, contact: info@mathematica-mpr.com.

This case study is for informational purposes only. Mathematica Policy Research, a nonpartisan research firm, provides a full range of research and data collection services, including program evaluation and policy research, survey design and data collection, research assessment and interpretation, and program performance/data management, to improve public well-being. Its clients include federal and state governments, foundations, and private-sector and international organizations. The employee-owned company, with offices in Princeton, N.J.; Ann Arbor, Mich.; Cambridge, Mass.; Chicago, Ill.; Oakland, Calif.; and Washington, D.C., has conducted some of the most important studies of education, disability, health care, family support, employment, nutrition, and early childhood policies and programs.

About the Project

Download the Mathematica PARCC study podcast

This study, conducted for the state of Massachusetts, examined the predictive validity of the Partnership for Assessment of Readiness for College and Careers (PARCC) exam compared with a specific state assessment, the Massachusetts Comprehensive Assessment System (MCAS), that had been used in prior years. The state was debating whether to continue using the MCAS or adopt the new PARCC exams for testing the achievement of public school students. The tests define student performance in different ways: the MCAS measures student proficiency relative to statewide standards, and the PARCC measures whether students are on track to succeed in college.

This study sought to answer the open question of whether the PARCC exam measures college readiness better than the MCAS; at the time, no prior research had analyzed PARCC test scores as predictors of college outcomes.

The 10th-grade MCAS and corresponding PARCC tests were administered to a sample of first-year college students at 11 public higher-education institutions across the state. For each test, Mathematica examined whether high-scoring students performed better in college than low-scoring students.
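
As a rough sketch of that kind of predictive-validity check, the snippet below computes the Pearson correlation between test scores and first-year grades. The data are invented for illustration, and the study's actual models, which may have adjusted for other student characteristics, are not reproduced here.

    import math
    import statistics

    def pearson_r(scores, grades):
        """Pearson correlation: do higher test scores go with higher college grades?"""
        ms, mg = statistics.fmean(scores), statistics.fmean(grades)
        cov = sum((s - ms) * (g - mg) for s, g in zip(scores, grades))
        var_s = sum((s - ms) ** 2 for s in scores)
        var_g = sum((g - mg) ** 2 for g in grades)
        return cov / math.sqrt(var_s * var_g)

    # Invented example: scaled test scores paired with first-year GPAs
    scores = [220, 228, 236, 245, 252, 261, 270, 284]
    gpas = [2.0, 2.3, 2.2, 2.7, 2.9, 3.1, 3.2, 3.6]
    print(f"r = {pearson_r(scores, gpas):.2f}")  # positive r: high scorers earn higher grades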