Member | Joined: Sep 2011 | Posts: 3,363
Kelli, I pulled out one of my ds' full computer reports from one of his WJ-III Achievement Tests... the scores you have posted above are standard scores, which are scores that have been referenced to a normed group of students... so they show your ds' performance relative to same-age or same-grade peers. I suspect that the standard score for the "broad" tests is computed from the subtest raw scores, rather than the subtest standard scores, and that's why they don't seem to make sense.
polarbear
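A rough numeric sketch of the idea polarbear describes (not the actual WJ-III procedure; the norm means, SDs, and raw scores below are invented purely for illustration): a standard score locates a raw score within a norm group, and a broad composite that is re-normed from combined raw scores need not line up with the subtest standard scores printed next to it.

# Illustration only - NOT the WJ-III scoring algorithm. All norm-group
# means/SDs and raw scores below are invented.

def standard_score(raw, norm_mean, norm_sd):
    """Express a raw score as a standard score (mean 100, SD 15)
    relative to an assumed norm group."""
    z = (raw - norm_mean) / norm_sd
    return round(100 + 15 * z)

# Two hypothetical reading subtests, each normed against its own distribution.
letter_word = standard_score(raw=52, norm_mean=40, norm_sd=8)    # ~122
fluency     = standard_score(raw=35, norm_mean=20, norm_sd=6)    # ~138

# A hypothetical "broad" score normed against the distribution of the
# combined raw performance, not averaged from the two subtest SSs above.
broad = standard_score(raw=52 + 35, norm_mean=60, norm_sd=11)    # ~137
print(letter_word, fluency, broad)

Nothing here reproduces the real norming; it only shows why the broad number is best read against the full report rather than eyeballed against the subtest standard scores.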
Member | Joined: Feb 2013 | Posts: 1,228
I am almost 100% certain that there is an error somewhere with the reading scores. The numbers just don't make sense. OP, you need to focus on this, and firmly request the tester get this sorted out ASAP.
Member | Joined: Dec 2012 | Posts: 882
WJ comes with a scoring program. You plug in the raw scores and the software calculates the scores. I know the numbers look odd but they are not due to calculation errors - unless your tester decided to calculate scores manually, which would be a very strange thing to do.
ETA: OP, if I were in your place, I'd opt for a portfolio. I don't recall noticing a lot of differences between the two achievement tests, at least not enough to justify the extra expense.
Last edited by Mana; 04/16/14 01:48 PM.
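For what it's worth, a scoring program of the kind Mana describes is conceptually just a norm-table lookup keyed on the raw score and the child's age or grade band. A toy sketch (the subtest, age band, and table values are made up, not the publisher's tables):

# Toy sketch of a norm-table lookup - the real WJ-III software uses the
# publisher's proprietary norm tables; this tiny table is invented.

# raw score -> standard score, for one hypothetical subtest and age band
NORM_TABLE_AGE_6 = {48: 118, 50: 124, 52: 131, 54: 137}

def look_up_standard_score(raw, table=NORM_TABLE_AGE_6):
    """Return the tabled standard score for a raw score, if tabled."""
    if raw not in table:
        raise ValueError(f"raw score {raw} is not in this norm table")
    return table[raw]

print(look_up_standard_score(52))  # 131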
Member | Joined: Feb 2013 | Posts: 1,228
Quote: "WJ comes with a scoring program. You plug in the raw scores and the software calculates the scores. I know the numbers look odd but they are not due to calculation errors - unless your tester decided to calculate scores manually, which would be a very strange thing to do."

Regardless, it is essentially impossible for the Reading numbers in the OP to be correct. Something has gone wrong somewhere.
Member | Joined: Dec 2012 | Posts: 882
Quote: "Regardless, it is essentially impossible for the Reading numbers in the OP to be correct. Something has gone wrong somewhere."

What if their formula for calculating broad reading weights reading fluency more heavily than other skills?

ETA: I'm not sure if I'm making sense and I have to get back to work!
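Expressed as arithmetic, that hypothetical would look something like the sketch below (the weights and scores are invented; nothing here claims to be the actual WJ-III broad reading formula): if fluency carried extra weight, a fluency score well below the other subtests would drag the composite down.

# Invented weights and scores - not the WJ-III formula, just the shape
# of the "what if fluency is weighted more heavily" hypothetical.

def weighted_composite(scores, weights):
    """Weighted average of standard scores."""
    total_weight = sum(weights.values())
    return sum(scores[name] * weights[name] for name in scores) / total_weight

scores = {"letter_word": 145, "fluency": 118, "comprehension": 140}

equal_weights = weighted_composite(scores, {"letter_word": 1, "fluency": 1, "comprehension": 1})
fluency_heavy = weighted_composite(scores, {"letter_word": 1, "fluency": 2, "comprehension": 1})

print(round(equal_weights), round(fluency_heavy))  # ~134 vs ~130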
Member | Joined: Feb 2013 | Posts: 1,228
Quote: "Regardless, it is essentially impossible for the Reading numbers in the OP to be correct. Something has gone wrong somewhere. What if their formula for calculating broad reading weights reading fluency more heavily than other skills?"

Do you know this to be the case, or are you playing devil's advocate?
Member | Joined: Sep 2011 | Posts: 3,363
Quote: "WJ comes with a scoring program. You plug in the raw scores and the software calculates the scores. I know the numbers look odd but they are not due to calculation errors - unless your tester decided to calculate scores manually, which would be a very strange thing to do. Regardless, it is essentially impossible for the Reading numbers in the OP to be correct. Something has gone wrong somewhere."

Why do you think it's impossible to get the reading #s posted above, without seeing the raw scores and without having the computation formula? How do you think they are being calculated? Sorry, not trying to be picky, just curious what your thought process is in saying they can't be correct.

The reason I mentioned my ds' report is that his Broad #s also look odd if you are only looking at the part of the report that lists SS vs percentile. I'm fairly certain the Broad Total # isn't calculated from the subtest standard scores, but rather from a combination of the raw scores and then it's compared to the norm group to come up with a Broad Standard Score... which is why you need to look at the full report, and why the relative values of SS #s might seem odd when you first look at them.

polarbear

ps - if I'm wrong about how the scores are calculated, please let me know. I don't want to spread any misinformation!
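One plausible reason a "whole greater than the sum of the parts" pattern shows up in composites (offered as a general psychometric point, not as the documented WJ-III procedure) is that a composite is typically normed against its own distribution: when subtests are positively but not perfectly correlated, two high scores together are rarer than either one alone, so the composite standard score can land above both. A sketch with an assumed intercorrelation:

import math

# Assumes an equally weighted composite of standardized subtests with a
# common intercorrelation r; r = 0.6 below is invented for illustration.

def composite_ss(subtest_ss, r):
    """Standard score of a composite of subtests on the mean-100, SD-15 metric."""
    z = [(ss - 100) / 15 for ss in subtest_ss]
    n = len(z)
    # Standard deviation of a sum of n standardized variables with common correlation r.
    sd_of_sum = math.sqrt(n + n * (n - 1) * r)
    return round(100 + 15 * sum(z) / sd_of_sum)

# Two subtests both at SS 130 yield a composite above 130 under this assumption.
print(composite_ss([130, 130], r=0.6))  # ~134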
Member | Joined: Feb 2013 | Posts: 1,228
OP, in your score report do you have a column labelled "W"?
What are those scores?
OP | Junior Member | Joined: Mar 2014 | Posts: 9
Thanks for all your input! My powers of Google rarely fail me, but it is proving nearly impossible to determine exactly how broad scores are calculated. As others have noted, scoring on the WJ-III is by computer only, so there are no tables available to view. It's just confusing because with the other broad scores "the whole is greater than the sum of the parts" in a sense, but not the reading cluster.

Polarbear - thanks for the explanation. I wish we could find the exact formula! What are the most useful numbers on the full report? His Broad Reading score of 137 had a grade equivalent of 4.6, Broad Math 139 with GE 3.0, and Broad Written 139 with GE 3.3. Can anyone tell by comparing scores whether it was normed for kindergarten or 1st grade? Does it matter to DYS if scores are based on grade or age? I would love to find out it was normed for 1st, which would hopefully result in qualifying scores! Wishful thinking on my part, I know, but a girl can dream, can't she?
OP | Junior Member | Joined: Mar 2014 | Posts: 9
22B, I'm still waiting on the tester's supervisor to sign off on the report, but she let me see it and copy down all the numbers. There were no W scores, just Raw, standard, percentile, and grade equivalent. Should I ask for W? What would that tell me? Thanks!