ACT College Readiness Scores: Asking the Wrong Questions
An article in today’s newspaper reports a decline in overall college readiness, as measured by ACT scores, among current high school students in Michigan. The authors point out that nearly 200 schools in Michigan do not have a single student considered “college ready” by ACT. Expect that figure, along with the number 17.8% (the percentage of Michigan students who meet the ACT “college ready” standard), to be repeated by politicians statewide in the coming weeks.
ACT has a long track record of developing, scoring, and reporting valid test scores. Students at our school take the EXPLORE test (a version of the ACT for middle schoolers, with important grade-level benchmarks) in seventh and eighth grade. The research, parent tools, and materials provided for schools to analyze results are first rate. Simply put, ACT is an exceptional non-profit organization that produces a valid, high-quality test.
To be considered “college ready” by ACT standards, high school juniors need to score 18 in English, 22 in math, 22 in reading, and 23 in science. Considerable time and research have gone into developing these standards, and they are a good predictor of a student’s ability to handle college-level coursework. ACT’s research indicates that students who score at or above a benchmark are likely to earn a “B” in the corresponding freshman-level course.
While the metric is valid, the way the data are used is flawed. A high school junior who scored a 27 in science, a 25 in math, but a 21 in reading and a 16 in English would not be considered “college ready” by the benchmark. Many high school students fall into this trap: they meet or exceed the college readiness benchmarks in one, two, or even three subjects, but not all four.
Students who meet or exceed the ACT benchmark score on all four tests can certainly be seen as well rounded, and extremely college ready. However, I’m not interested in how well rounded my future doctor, lawyer, or engineer is. I’m more interested in knowing how my future cardiologist scored on the science test, or how the person designing the car I will drive someday fared on the math test. The future lawyer who will draw up important documents for my family’s estate? I want someone who is college ready in reading and English; that person’s science aptitude is secondary to me. The person who will prepare my income tax forms in the year 2030? He can score less than a “B” in his freshman biology course and still be considered suitable for the job.
The 17.8% figure is one number that a politician or reporter can point to for a quick statement about how schools are performing, but it’s not the right number to use. I’m far more interested in knowing how students who declared an interest in medicine scored on their science test. What about students who took the test but did not express an interest in any future coursework? Their results can have a huge impact on the conversation around education policy in Michigan.
It’s not sufficient to simply throw out a number here or there. Standardized test data can benefit everyone, but politicians and reporters must dig a little deeper when sharing information with citizens. It’s a strong metric, undermined by weak analysis.