Gifted Bulletin Board

Welcome to the Gifted Issues Discussion Forum.

We invite you to share your experiences and to post information about advocacy, research and other gifted education issues on this free public discussion forum.


#249881 07/18/22 07:17 PM
W'sMama (OP)
Does anyone have the WISC-V manual to look up the composites & percentiles for these? I did write to aeh to ask, but I'm not sure how often they're on here.

I have a list of the sums of scaled scores (left column), and I already filled in what I could find for composites/percentiles by looking at what others had posted about their own kids that matched mine. Thank you!

Index    SSS    Composite    %ile
GAI       82
FSIQ     114
EFI       58      130
VECI      66      143        99.8
FRI       35      144        99.8
VCI       32      133        99
VSI       31      132        98
WMI       36
PSI       25
QRI       29
NVI       96
EGAI     128      139        99.5
CPI       61
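
For context: WISC-V composites are standard scores with mean 100 and SD 15, so the percentile column can be closely approximated from the normal CDF. A minimal sketch in Python, assuming only that convention; the manual's tabled percentiles come from the norming sample and can differ by a point from the rounded values here:

    from statistics import NormalDist

    def percentile_from_composite(composite: float) -> float:
        """Approximate percentile rank for a standard score (mean 100, SD 15)."""
        return round(NormalDist(100, 15).cdf(composite) * 100, 1)

    for score in (130, 132, 133, 143, 144):
        print(score, percentile_from_composite(score))
    # 130 -> 97.7, 132 -> 98.4, 133 -> 98.6, 143 -> 99.8, 144 -> 99.8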

    W'sMama #249884 07/19/22 03:20 PM
    Joined: Apr 2014
    Posts: 4,047
    A
    aeh Offline
    Member
    Offline
    Member
    A
    Joined: Apr 2014
    Posts: 4,047
    I started a message to you, but hadn't finished...sorry!

    I can take a look, certainly, but before I do so, I should preface it with a reminder that scores are only one very small part of an individual, and even only a limited component of an individual's learning profile.

I'm curious, btw, why you have only sums of scaled scores, and no standard scores or percentiles. Typically, assessment professionals would provide the latter two (possibly without the former), along with, one hopes, a thoughtful interpretive narrative. It's a bit unusual to have only the SSS. If you're just waiting for the formal report to arrive, I would encourage you to wait for it first, since it would include more contextual clinical information.


    ...pronounced like the long vowel and first letter of the alphabet...
W'sMama #249890 07/20/22 03:05 PM
W'sMama (OP)
What I actually have is screenshots of the scaled scores from two different administrations, four and five years ago. My son was being evaluated for a 504 at the public school, and the following year at a university by supervised grad students as part of testing for ADHD.

I know I have a more complete report somewhere, probably even in my email, but I've been searching and haven't been able to find it.

I talked to Linda Silverman at a lunch roundtable at PGR this year. She looked at the scaled scores and said his testing was "screwy" and that I should get a real tester instead of getting it free through the school or cheaply through the university, because she questions the ability of both testers. A more experienced tester sounds $$$ to me, though.

The reason she said it was screwy is that his scores seesawed from one year to the next. There was a big spread between scaled scores within each administration, but also across the administrations. So, like, one year he got a 9 on a certain subtest and the next year he got a 16 or something. And vice versa: he got an 18 on Matrix Reasoning the first time and a 10 a year later. It was like that for every subtest except, I think, Vocabulary.

There were a few other things that were not ideal, like the school only administered (I think) seven subtests while the university did more. One thing the university did that the school didn't was Comprehension, and he got a 19 on that. I asked them about extended norms and they said no, although I don't remember whether they meant the extended norms weren't necessary or just that they didn't give him any more questions.

Another thing Linda said was that I could trust the high scores, and that she didn't know what was going on otherwise. I wonder if his ADHD was interfering with attention on certain subtests each time.

Anyway, since she said I could trust the high scores, I took the high score for each subtest and got the sums of scaled scores by looking up which subtests each index includes. I know combining the two administrations this way wouldn't be valid to submit to Davidson or other programs, but it might help show whether he actually has a huge spread across different abilities or whether attention was likely the problem during testing. BTW, my daughter has been a DYS since age 5, but since my son didn't qualify, we've never gone to any Davidson events, because I don't want him excluded because of his disability.
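
A sketch of that tally, using the subtest scaled scores posted later in the thread. The primary-index groupings (VCI, VSI, FRI, WMI, PSI) and the GAI/CPI/FSIQ groupings follow the published WISC-V structure; the extended groupings (VECI, EFI, QRI, NVI) are a reconstruction, though they do reproduce the sums in the table above (EGAI is omitted because its composition is less certain):

    # Cherry-pick the higher scaled score per subtest across the two sittings,
    # then sum within each index. Keys are the usual subtest abbreviations
    # (SI = Similarities, VC = Vocabulary, BD = Block Design, etc.).
    school = {"SI": 17, "VC": 15, "MR": 18, "FW": 15, "BD": 10, "DS": 19, "CD": 13}
    university = {"SI": 13, "VC": 15, "IN": 15, "CO": 19, "BD": 15, "VP": 16,
                  "MR": 10, "FW": 17, "PC": 11, "AR": 12, "DS": 17, "PS": 17,
                  "CD": 9, "SS": 12}

    # Higher score per subtest across both administrations.
    best = {s: max(school.get(s, 0), university.get(s, 0))
            for s in school.keys() | university.keys()}

    indexes = {
        "VCI": ["SI", "VC"], "VSI": ["BD", "VP"], "FRI": ["MR", "FW"],
        "WMI": ["DS", "PS"], "PSI": ["CD", "SS"], "QRI": ["FW", "AR"],
        "GAI": ["SI", "VC", "BD", "MR", "FW"],
        "FSIQ": ["SI", "VC", "BD", "MR", "FW", "DS", "CD"],
        "CPI": ["DS", "PS", "CD", "SS"],
        "VECI": ["SI", "VC", "IN", "CO"],
        "EFI": ["MR", "FW", "PC", "AR"],
        "NVI": ["BD", "VP", "MR", "FW", "PS", "CD"],
    }
    for name, members in indexes.items():
        print(name, sum(best[s] for s in members))
    # VCI 32, VSI 31, FRI 35, WMI 36, PSI 25, QRI 29, GAI 82,
    # FSIQ 114, CPI 61, VECI 66, EFI 58, NVI 96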

W'sMama #249892 07/21/22 03:15 PM
aeh
I see. That helps a bit. So a few other notes to keep in mind: administrations of the exact same instrument less than 24 months apart technically put question marks on the second administration (all other things being equal), because of practice effects. So I don't really encourage cherry-picking the higher scaled score from each subtest for this kind of estimated collection of index scores, especially if any of the higher scores come from the second administration. Of course, it sounds like you already had some doubts about his testing, especially because of the ADHD aspect.

A bit more general context: the district likely administered only seven subtests because that is the standard core battery on the WISC-V needed to obtain an FSIQ. Nothing notably suspicious about that. His suspected disability was ADHD, not an intellectual or learning disability, so there was no compelling reason for them to do supplementary or ancillary subtests when they were just documenting that he was not intellectually impaired. The university evaluators had different circumstances and made different decisions. (Other evaluators, including myself, might have made other choices.) The ExNorms probably were not applied because he does not appear to have obtained two 19s in any given index, which is generally considered the best-practice threshold for using them.

    (Incidentally, these preceding two paragraphs taken together mean that if you were really going to create a composite data set, the least concerning way would be to take the scaled scores from the seven subtests administered by the school and combine them with the additional subtests from the university to generate hybrid index scores. I still don't really recommend it, as the reliability of scores is highest for the FSIQ, lower for the index scores, and even lower for individual subtest scores.)
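
A sketch of that "least concerning" hybrid, again using the scores from the screenshots posted in the next reply; note how it differs from the cherry-picked set above:

    # Keep the school's seven core scores; add only the subtests the
    # university administered beyond them.
    school = {"SI": 17, "VC": 15, "MR": 18, "FW": 15, "BD": 10, "DS": 19, "CD": 13}
    extras = {"IN": 15, "CO": 19, "VP": 16, "PC": 11, "AR": 12, "PS": 17, "SS": 12}
    hybrid = {**school, **extras}
    print(sum(hybrid[s] for s in ("SI", "VC")))  # VCI sum: 32 (same as cherry-picked)
    print(sum(hybrid[s] for s in ("BD", "VP")))  # VSI sum: 26 (vs. 31 cherry-picked)
    print(sum(hybrid[s] for s in ("MR", "FW")))  # FRI sum: 33 (vs. 35 cherry-picked)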

    With regard to the variation, one should always start by repeating the caution that subtest scaled scores are substantially lower in test-retest reliability than index or IQ scores. But if there is real variation, you may very well be correct in your speculation that inconsistent attention affected performance. That wouldn't be surprising. I have already mentioned the possible artificial score-raising effects of practice, since the second administration was so close in time. We're not in a position to judge the skill of either set of evaluators, nor do I wish to cast aspersions on anyone, but that's always a possibility. Depending on his age at administration, it's also possible that some of the differences are related to differences in the items presented to him on each occasion, even for subtests with the same name. For example, the two PSI subtests are not actually the same respective tasks at age seven and eight. Even on subtests with more continuity across the ages, differences related to starting at a lower/higher level (due to age), or with different teaching items may affect performance for certain specific examinees. This is not, of course, a comprehensive list of explanations.
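
To put rough numbers on that caution: the chance band for a retest difference scales as SD * sqrt(2 - r1 - r2), so less reliable subtest scores can swing more between sittings without meaning anything. The reliability values below are round-number assumptions for illustration, not the actual WISC-V technical-manual coefficients:

    import math

    def chance_band(sd: float, r1: float, r2: float, z: float = 1.96) -> float:
        """95% band for a retest difference: z * SD * sqrt(2 - r1 - r2)."""
        return z * sd * math.sqrt(2 - r1 - r2)

    # Subtest scaled scores (mean 10, SD 3), assuming r ~ .80 each time:
    print(round(chance_band(3, 0.80, 0.80), 1))   # ~3.7 points
    # Index scores (mean 100, SD 15), assuming r ~ .92 each time:
    print(round(chance_band(15, 0.92, 0.92), 1))  # ~11.8 points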

And on a side note: I understand why Linda is a bit skeptical about the evaluators you had, based on her experience, since the population of examinees she sees is heavily weighted toward families who were not satisfied with their experiences elsewhere (and certainly, I have seen my share of less-than-optimal evaluations from all types of sources). But I do want to point out that many of the evaluators working in K-12 public schools, clinics, and hospitals came out of the same training programs. (As it happens, the current presidents of the National Association of School Psychologists (NASP) and the American Psychological Association (APA) were classmates and are close friends.)

    I understand that you are trying to get a better sense of what his real profile is (preferably without the $x000 expense of a fresh neuropsych!), but I think this might be a more nuanced and effective discussion starting from the two real datasets, if you are comfortable with that. (Or if not, feel free to add to our existing pm thread.)


aeh #249893 07/22/22 06:01 AM
W'sMama (OP)
Sure, here you go:
Testing @ school, age 8 yr 5 mo

Similarities        17
Vocabulary          15

Matrix Reasoning    18
Figure Weights      15

Block Design        10

Digit Span          19

Coding              13


Testing @ university, age 9 yr 9 mo

Similarities        13
Vocabulary          15
(Information)       15
(Comprehension)     19

Block Design        15
Visual Puzzles      16

Matrix Reasoning    10
Figure Weights      17
(Picture Concepts)  11
(Arithmetic)        12

Digit Span          17
Picture Span        17

Coding               9
Symbol Search       12

W'sMama #249897 07/22/22 06:56 PM
aeh
    Thanks.

So in terms of differences, the only subtest that is both more likely to have declined because of examiner differences and that actually declined in any notable way is Similarities. This doesn't definitively say it was examiner error, though, since we already know there is an alternate explanation (ADHD). The other subtests that decreased enough to make me wonder are pretty cut-and-dried in their scoring, with limited room for queries or clinical judgement (MR and Cd), but they do have readily identifiable vulnerabilities to unmanaged ADHD in common: MR because it is multiple choice and can be affected by impulsivity, and Cd because weaknesses in sustained attention can affect speed and accuracy.

    The resulting VECI, however, is nearly identical to the previous VCI. (Remember that index scores are more reliable than subtest scores.) The consistency across most VECI subtests, including the ones that weren't retests, suggests that this is probably generally in the ballpark of his real performance.

The VSI subtests went the other direction: BD rose quite a bit, and the similar score on VP suggests this might be a legitimate representation of his visual-spatial thinking (note that VP was not a retest, so there's no question about validity there). There are some students with dysregulated attention for whom BD's timed aspect is a particular vulnerability, which may be one explanation for the lower performance the first time around. Another possibility is score inflation from retest effects, since the novelty of the designs is a significant aspect of the test. In this case, I tend to think the similar score on VP is more confirmatory.

FRI is where the most dramatic fall occurred. I'll note that, taken in isolation, this EFI suggests the lower MR score is not a fluke, since three of the four EFI subtests are in the average range, while one is an outlier in the extremely high range. The Ar score doesn't appear to be lower because of working-memory artifacts, since both WMI subtests are in the extremely high range (and are consistent with the previous test). Notably, PC is also a variation of multiple choice. (To be fair, FW is too, and was strong both times.) So it's not entirely clear why the whole FRI is so much lower the second time, especially since these are not tasks with a lot of opportunity for examiner error. (This is where any clinical commentary from either set of examiners would be useful.) One can speculate that fluctuating attention, fatigue, or impulsivity may have been factors, given the diagnosis.

Based on the available data, I think it would be fair to say that the VCI, VSI, and WMI at the time were probably in the MG range, with PSI in the average range. One might argue that VCI/VECI should have been closer to the HG range. FRI officially was at the MG/HG border. Playing around with his highest score for each subtest from each administration really just slides him around from the lower to the upper end of the HG range.

    In sum, this looks very much like a mostly MG learner with slightly higher abstraction (both verbal and nonverbal), drifting toward HG, with age-appropriate processing speed. Not, of course, taking into account impacts of ADHD. As you already know, this is a profile with many strengths cognitively; I don't see huge spreads between (or even really within) ability areas, with the exception of the FRI results, which we've discussed already. Is it possible there was more cognition not captured anywhere in these two assessments? Of course. If one were reassessing, it would be informative mainly in documenting improvements in the management of his ADHD, or if either of the two usual reasons for evaluation were pertinent (1, to elucidate/respond to a presenting problem impacting IRL function; 2, for access to necessary resources).

    If you do decide to pursue updated evaluation, and he is receiving disability-based services from the district, have a thoughtful conversation with the evaluators he would have in-district for a 504- or IEP-based triennial before exploring private evals, and make sure to maintain clear communication, so that all assessment data is valid, and duplicative testing is minimized. I can't speak for others, of course, but I always try to work collaboratively with other evaluators; it's in the best interest of the child.


aeh #249898 07/22/22 09:43 PM
W'sMama (OP)
Cool, thank you!

