Are they close to the real questions? And do you score one point for each question or how do they convert the raw score to the final 1-25 points?

Hi
I can't tell you for sure, but according to DS the practice questions are easier than the actual test. Of course, that's the perception of a 12 yr old who took the test 2 years ago.

As far as scoring goes, it is not one point per question, as the sections have anywhere from 28 to 40 questions each.

Jtooit
The conversion is done differently for each version of the test.

My DD did far better than I'd anticipated, based on the sample questions. She had no interest in even looking at the sample questions, though, and is generally diligent in an actual test environment.

I will say that her kid-perception did not map to actual test results. One section, she answered all 30 questions, said she thought she got them all right, and scored a 19. One section, she answered 22 of 28, said she guessed on almost all of them, and scored a 19.
Thank you. I also thought the practice questions were easier than the actual test, as my DD10 breezed through with 2 wrong out of 45. Note that she is basically an athlete, spending far less time on math than on juggling a soccer ball or gym mats.
The book you get when you register for the test also seems (by DS's report) to be somewhat easier than the actual test, but it's nice to look at for the style of the questions and the kinds of things that might be asked. He didn't do the online ones, ran through the paper practice test once (in four separate short sittings, one sitting for each topic) and then took the EXPLORE. He's not a big prep kind of guy.

DeeDee
My dd thought the online questions were the same as the mailed sample test. Also according to dd, we were mailed the same sample test two years in a row. So of course she had all the sample test answers memorized. Snort.
According to DS, the real Explore test is much more difficult than the sample test.
The official SAT guide from the College Board has questions from past exams, with the difficulty stated on a scale from 1 to 5 IIRC. Does the official ACT guide do something similar? The ACT and Explore are produced by the same organization, and I wonder if the easier questions on the ACT are at the same level as Explore questions.
I will say that her kid-perception did not map to actual test results. One section, she answered all 30 questions, said she thought she got them all right, and scored a 19. One section, she answered 22 of 28, said she guessed on almost all of them, and scored a 19.

My only caveat to your thoughts above would be to not assume that she was very far off. If, as I suspect, she thought that the reading was easy and she scored a 19, she only missed two questions out of the 30. My son had the same reading score and at first I assumed it was a bit of a weakness, but our school district had posted some answers about the EXPLORE for 8th grade parents, and this was the answer re: "my son only missed two questions, how did he end up with a 19?"

ACT uses an equipercentile equating methodology and smoothing procedure that results in Explore subscale scores that have the same meaning across multiple years and forms. In other words, a student that obtains a subscale score of 18 one year is performing at the same academic level as a student that obtains a subscale score of 18 on a different year using a different form. You are correct, this year on the Reading subscale obtaining 28 out of 30 items correct resulted in a Reading subscale score of 19; earning 29 out of 30 items correct resulted in a Reading subscale score of 22; and earning 30 out of 30 items correct yielded a Reading subscale score of 25. In the case of this specific subscale with 30 total items, missing one or two questions has an impact on the subscale score that is remarkable. This is not an error, nor is it necessarily a weakness of this test. As a predictor of future success on the PLAN and ACT, the Explore is highly reliable.
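For anyone trying to picture what that equating means in practice, here is a minimal sketch of a raw-to-scale lookup for that Reading form. Only the three points quoted from the district (28, 29, and 30 correct) are real; the rest of the table was not published and would differ by form and year, so nothing below 28 appears here.

```python
from typing import Optional

# Only these three Reading points come from the district's answer above;
# conversions for raw scores below 28 were not published and vary by form/year.
READING_RAW_TO_SCALE = {
    30: 25,  # perfect raw score -> scale score 25
    29: 22,  # one question missed -> 22
    28: 19,  # two questions missed -> 19
}

def scale_score(raw_correct: int) -> Optional[int]:
    """Return the published scale score for a raw score, or None if unknown."""
    return READING_RAW_TO_SCALE.get(raw_correct)

if __name__ == "__main__":
    for raw in (30, 29, 28, 27):
        print(f"{raw}/30 correct -> scale score {scale_score(raw)}")
```

The point is simply how steep the top of the scale is: near the ceiling, each missed question costs roughly three scale points.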
I would just add to the chorus that the EXPLORE sample questions are easier than the actual test. My ds10 did very well (made the awards ceremony) but even so, he thought that it was more challenging than the practice questions.
We didn't do any prep, but if I were going to prep him I'd probably concentrate on ACT prep or maybe the SAT questions of the day; it depends on how old she is.
Originally Posted by momtofour
this year on the Reading subscale obtaining 28 out of 30 items correct resulted in a Reading subscale score of 19; earning 29 out of 30 items correct resulted in Reading subscale score of 22 and earning 30 out of 30 items correct yielded a Reading subscale score of 25.

Wow. Yes, it was the reading. Did your district post scaled-to-raw-score conversions, or something else that allowed people to know how many answers they got right?

(NUMATS reported that no kids scored 24 on Reading, which I assume indicates that no raw score mapped to a 24, but there were kids who scored 20, 21, and 23, so presumably there were some versions of the test that scaled up differently.)
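A quick way to see how a score like 24 could simply be unreachable is to collect the scale values a conversion table can actually produce. The table below is hypothetical except for the three published points (28-30 correct); the lower rows are invented purely to illustrate the idea.

```python
# Hypothetical conversion table for one Reading form; only the 28-30 rows are
# from the published district answer, the lower rows are invented for illustration.
hypothetical_form = {30: 25, 29: 22, 28: 19, 27: 18, 26: 17, 25: 16}

reachable = sorted(set(hypothetical_form.values()))
print(reachable)        # [16, 17, 18, 19, 22, 25] -- no 20, 21, 23, or 24 on this form
print(24 in reachable)  # False: no raw score on this form maps to a 24
```

Since each form has its own table, a value missing from one form (like 20 or 21 here) can still show up from another form, which would fit the guess above that 24 simply didn't map from any raw score on any form that year.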
Both DS and DD thought the sample questions were easier than the actual test questions.

If you are really wondering whether sample questions are predictive of the actual scores, then the answer depends on your child.

For my DS, who is more advanced, you can say that performance on the samples predicted performance on the actual test. For my DD, who is not as advanced, doing really well on the sample questions did not necessarily translate to top scores on the Explore. That seems to hold true for them on other out-of-level tests like the SCAT.
Wow. Yes, it was the reading. Did your district post scaled-to-raw-score conversions, or something else that allowed people to know how many answers they got right?

(NUMATS reported that no kids scored 24 on Reading, which I assume indicates that no raw score mapped to a 24, but there were kids who scored 20, 21, and 23, so presumably there were some versions of the test that scaled up differently.)

Since we took the test through NUMATS, I didn't get a lot of what my district sent home. However, the information above was posted on our district gifted group's website. Apparently, some parents of 8th grade gifted/advanced kids were having a heart attack worrying that little Susie or Johnny wouldn't get recommended for honors English/Reading in 9th grade because the "recommended" EXPLORE score was >22 (meaning, you needed a perfect score). There was another FAQ explaining that those were guidelines and yes, if Susie or Johnny were excellent students with lots of other good data, they'd still be recommended for the honors track in HS.
I didn't save my daughters' EXPLORE tests (they're 21 and 19 now, so it was a long time ago), but I do think that when you take it through the district you get a very detailed report. When another ds took it through a local university (not a talent search, but one offering gifted programming), we not only got his exact answers/scores, but we got a copy of his exam as well (the actual test booklet).