Hmm. Looking at the CogAT information, I also see they refer to it as a measure of "learned reasoning." So perhaps my concerns should really be aimed at misuse of the test by schools? I doubt most schools spend time making sure that all students are learning to do the kinds of problems that appear on the test, so I see two problems: first, the equal-access concerns I raised previously, and second, the interpretation (misinterpretation?) of results when prepped and unprepped students are considered and compared within the same group. I see it a bit the way I see the grade-level question that comes up periodically with EXPLORE. Grade-accelerated students are supposed to be considered alongside students of the grade they've accelerated to, because they are then being compared to kids who have been exposed to the same content. Yet there is certainly a difference between a third grader who is instructed at a fifth-grade level and a fifth grader who is instructed at a fifth-grade level, even if they have identical EXPLORE scores. And there is a significant difference between two third graders who score at the same level but have been exposed to different levels of curriculum.

All of which raises the question: is there any good screener out there that schools can use to accurately identify students who are slipping under the radar due to learning style, lack of opportunity/exposure, language proficiency, etc.? And, without giving an individual IQ test, how do parents get an accurate picture of their children's typicality or atypicality? Maybe that's not possible? And is there any role for context in helping schools accurately identify needs? Or will that just lead to inaccurate parent reporting, for fear that the information they share will be used against them?