Posted By: Appleton SAT scores compared to state testing data - 04/14/19 08:15 AM
My 7th grader took the SAT through Duke TIP and did much better than expected. He's always a rockstar in math, but he did well enough on the reading/writing section to qualify for state recognition (>50th percentile for high school juniors/seniors who take the test). His reading subtest was slightly higher than writing.

The reading/writing SAT score really surprised me because his reading scores on our yearly state exam (STAAR) are mediocre. He tends to score around the 60th percentile, which in Texas just barely puts him in the "meets grade level" category. I think our state exams are quite challenging (~25% of students fail), but it still seems like a pretty big discrepancy. Anyone have similar experiences with school testing data showing lower performance than the SAT/ACT or other above level tests?
Posted By: aeh Re: SAT scores compared to state testing data - 04/14/19 06:04 PM
The biggest difference between your state testing results and the SAT/ACT for this age group is that state testing includes open-response questions, including expository writing. SAT/ACT do not. What is described as "writing" is still multiple choice, and requires no actual writing (unless you do the optional essay, which typically is not expected for talent search testing, such as TIP). I would expect that, when you look at your DC's score breakdown for state testing, the multiple choice sections likely are stronger than the OR or expository sections.

For example, on the grade four test in your state (beginning with last year's test), 25% of the score is based on revision (multiple choice) and 50% on editing (multiple choice), while the remaining 25% is based on a single writing sample (composition). A student could meet criteria without doing well on the writing sample, but they would have to get every single multiple-choice item correct. On the grade 7 test, 35% of the score is based on the writing sample (with the composition raw score doubled in the calculation of the scaled score), and it is impossible to meet criteria without obtaining at least a few points on the composition.
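To make that arithmetic concrete, here's a rough sketch in Python. The 25/50/25 weighting and the 35% composition share are the ones described above; the "meets" cut score and the grade 7 split between the two multiple-choice strands are hypothetical stand-ins, since the actual STAAR raw-to-scale conversion tables determine those.

Code
# Hypothetical illustration of the weighting described above; the real STAAR
# conversion tables differ, but the proportions are the ones from the post.

def weighted_percent(revision_pct, editing_pct, composition_pct, weights):
    """Combine section percentages (each 0.0-1.0) using weights that sum to 1."""
    w_rev, w_edit, w_comp = weights
    return w_rev * revision_pct + w_edit * editing_pct + w_comp * composition_pct

# Grade 4: revision 25%, editing 50%, composition 25%.
g4_weights = (0.25, 0.50, 0.25)
cut = 0.75  # hypothetical "meets criteria" threshold as a share of the weighted maximum

# Perfect multiple choice with a zero composition lands exactly at the cut...
print(weighted_percent(1.0, 1.0, 0.0, g4_weights) >= cut)   # True
# ...but missing even a little on the multiple choice drops below it.
print(weighted_percent(1.0, 0.96, 0.0, g4_weights) >= cut)  # False

# Grade 7: the doubled composition carries about 35% of the weight
# (the 30/35 split between the two multiple-choice strands is made up).
g7_weights = (0.30, 0.35, 0.35)
# Even perfect multiple choice tops out at 65% here, so with this cut a
# student cannot meet criteria without earning some composition points.
print(weighted_percent(1.0, 1.0, 0.0, g7_weights) >= cut)   # False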

I would guess that your child's skills for generating accurate writing samples are not at the same level as for identifying errors in mechanics, grammar, and syntax in other people's brief writing samples.
Posted By: Appleton Re: SAT scores compared to state testing data
Actually, that's not the case in 7th grade. Reading has its own separate test until high school. Writing is only tested in 4th and 7th grade, and that test is, as you described, part grammar/editing and part expository writing. He actually scored "advanced/commended" on the writing STAAR in 4th grade, and we don't have the 7th grade results back yet. The reading STAAR is the one he doesn't do all that well on. He takes the reading test about three weeks from now, and I'm really curious to see how he does.

The school gives another reading test, I think it's called STAR360, and he tells me that he always scores at a 12th grade+ reading level on it. I don't really know what it is about the STAAR reading test that is different. What is it testing that is difficult for my child compared to other tests?
Posted By: aeh Re: SAT scores compared to state testing data - 04/15/19 07:38 PM
Interesting. So there's a linkage study on the predictive value of STAR360 for STAAR, which suggests a correlation of about .75, with roughly 80+% accuracy for predicting classifications (performance levels).

http://doc.renlearn.com/KMNet/R004573515GK64DE.pdf

I'll have to do some more digging, but at a guess, there may be differences in the proportion of literary to informational text on the various tests, or in whether reading comprehension is assessed by responding to questions or by cloze (fill-in-the-blank) items, or whether the text in question is sentences vs. paragraphs. The SAT probably also involves more inferential comprehension than an elementary/middle school level text. Also, there are often differences between tests with a single grade-level item set (as many state tests have) and those that are adaptive (as many progress-monitoring tests such as STAR360 are). (I haven't checked to see whether either of those tests is adaptive.) Those are many of the questions I'd ask, anyway.
Posted By: Kai Re: SAT scores compared to state testing data - 04/16/19 01:13 AM
Which scores align best with what you know to be true about his reading?

My son, who began reading at age 2 and who could read essentially anything you put in front of him by age 7, had his reading comprehension scores on the ITBS (a paper-and-pencil achievement test) get progressively worse for several years in a row in middle childhood. I finally discovered that he was skipping the passages and just answering the questions. When I forced him to read the test and questions aloud (as a homeschooler I administered the test myself at home), his score went right back up to the 90-something percentile (on a test that was three years above age-grade level).

So there's that possibility.
Posted By: aeh Re: SAT scores compared to state testing data - 04/16/19 01:35 AM
You can probably get a sense of what he got wrong on the state test. Typically, schools have item-by-item breakdowns showing which of his responses were correct, which you can compare to the complete test items released from last spring.
https://tea.texas.gov/student.assessment/STAAR_Released_Test_Questions/
It appears to have both literary and informational text, with at least one set of compare/contrast questions using multiple reading selections. Reading passages are fairly lengthy (some multiple pages), with a lot of looking back required for answering questions.

The SAT does not have multiple-page readings, if I recall correctly. Nor, I believe, does STAR360. FYI, here's a linkage study for STAR360 and the SAT:
https://doc.renlearn.com/KMNet/R61746.pdf

Or, as Kai suggested, you can ask him what his test-taking approach is, perhaps using the actual test items as examples.
Posted By: Appleton Re: SAT scores compared to state testing data
I don't really know. I would assume that he's somewhere in between. I definitely don't think he's just barely meeting grade level. In general I'm inclined to trust national tests like the SAT over something made in the state of Texas. There is a lot of controversy over the STAAR test here. Still, it must be measuring something, because there are kids who do very well on it.
I think I'll have him complete one of the released tests for 7th graders to see how he scores and which types of questions give him trouble. It's also possible that something "clicked" this year. His math ability has grown so much in the past year; perhaps puberty is having some kind of impact on his brain.

Thank you for linking the article; I'll have to get his numerical score on STAR360 and see how well it aligns with his SAT.
Posted By: Appleton Re: SAT scores compared to state testing data
I received my child's STAAR scores, and he did better this year on the reading exam. He scored in the 81st percentile for reading, which qualifies as "masters grade level," and in the 86th percentile for writing, which falls in the lower "meets grade level" category. The standards must just be higher for the writing exam.

These scores are still lower than what his SAT reading/writing score would predict, but the discrepancy isn't large enough to make me question it too much. It is interesting to me that he never would have qualified to take the SAT through Duke TIP based on his reading STAAR scores, yet he ended up performing better on reading/writing than most kids in that group.

I wonder what the people who designed the STAAR tests were thinking when they developed the score classification categories. For 7th grade reading, "meets grade level" encompasses only 13% of students (59th-72nd percentile) and a range of four raw score points (31-34 out of 42). I would have guessed that at least 30-40% of kids would have fallen somewhere in the on-grade-level category. This narrow classification range means that a lot of kids change categories from year to year despite performing only slightly better or worse.
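For a concrete sense of how narrow that band is, here's a quick sketch using the raw-score range quoted above. The assumption that 35 and up maps to "masters" and 30 or below to the categories underneath is mine, inferred from the 31-34 band, not something taken from the score report.

Code
# Hypothetical mapping based on the quoted 31-34 (out of 42) "meets" band
# for 7th grade reading; the adjacent category boundaries are assumed.
def staar_reading_category(raw_score):
    if raw_score >= 35:
        return "masters grade level"       # assumed to start just above the band
    if raw_score >= 31:
        return "meets grade level"         # the quoted four-point band
    return "approaches grade level or below"

for raw in (30, 31, 34, 35):
    print(raw, "/ 42 ->", staar_reading_category(raw))

# 30 / 42 -> approaches grade level or below
# 31 / 42 -> meets grade level
# 34 / 42 -> meets grade level
# 35 / 42 -> masters grade level
# A one-point swing at either edge of the band changes the label.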
Posted By: Saritz Re: SAT scores compared to state testing data - 06/17/19 09:24 PM
We live in Texas, and I have a low opinion of the STAAR test in general.

STAAR is not necessarily accurate as a measure of how your individual child is doing in school and I'm not sure it's even intended to be so. It is high-stakes testing designed to evaluate teachers, schools and districts more than individual students. It's poorly done. The 5th grade version had a graphic that included the F-bomb this year.

Bottom line, I'd take almost any other instrument more seriously than STAAR. The TEA posts graphs of the score distributions for various tests on its website in the assessment section. Those curves do not always, or even frequently, resemble an actual bell curve, which leads me to wonder whether the state manipulates the data as needed.

The Star360 tests for Reading and Math are put out by Renaissance, are adaptive, and therefore may be more useful if you want to know how your student is doing.


Quote
I finally discovered that he was skipping the passages and just answering the questions. When I forced him to read the test and questions aloud (as a homeschooler I administered the test myself at home), his score went right back up to the 90-something percentile (on a test that was three years above age-grade level).

Coming late to this discussion, but this made me laugh out loud. I have a child who would SO do this. I feel like I should thank you for the heads-up.
Posted By: Appleton Re: SAT scores compared to state testing data
I heard about the F-bomb. The whole test feels like a failed experiment; I'm surprised the state hasn't moved on to a new instrument, since everyone seems to hate it.

I looked up our school's test results: almost half of the 7th graders failed the STAAR writing exam, including more than half of the boys. Statewide, 30% failed writing (39% of boys). I know that kids at my son's school who fail reading or math lose an elective and have to take a remedial class; I'm not sure whether the same applies to failing the writing test.