Member
Joined: May 2013
Posts: 2,157
On tests like NWEA MAP, my kid who scores in the 99th percentile for a fourth grader is testing around the 50th percentile compared to 12th graders. So this study would put him at 8 years ahead. Can he actually do 12th grade standards? No. But it shows that the average 12th grader is not actually proficient with 12th grade math standards.

Not sure if this makes any sense, but I think that's where they are going with this study. Basically, any student that tests above the 90th percentile on a test like MAP is testing like the average student in a much higher grade. Anyone scoring above the 75th percentile or so is probably scoring like the average student 2 or 3 grade levels ahead. It doesn't mean that those kids in the 4+ ahead schools are actually accelerated 4 years in terms of the work; they just score a lot better on the tests. That's my take on this, at any rate.
Member
Joined: Dec 2012
Posts: 2,035
I don't know MAP, not being in the US, but my impression is that it isn't adaptive?
Really, though, if you are aiming the instruction at +/- 1 SD, then there are going to be a lot of kids who are below or above the instructional level. A number of kids in DS9's class are 2-3 years ahead, and about 20% are 1 year ahead, according to teacher assessments for national standards.
Member
Joined: Sep 2007
Posts: 3,299 Likes: 2
I'm confused. What else is new? Anyway, it seemed odd to me that nearly half of California students were exceeding standards. This is not the California school system I know.

From the article: "In a recent policy brief, four colleagues addressed this question and found that very large percentages of students (between 15 percent and 45 percent) are performing above grade level—and that these percentages represent staggeringly large numbers of students. In California alone, for example, this group comprises more than 1.4 million pupils."

Hmm. They were looking at results of the Smarter Balanced test for school year 2014-15. It was easy to find the summary results for California. The results are clear: 3.15 million students took the English test, and 16% exceeded standards (about 504,000 kids). In math, 14% of 3.17 million students exceeded the standards (about 444,000). Note that 56% of students did NOT meet the standards in English, and 67% didn't meet them in math. This sounded more typical of statistics for this state.

Meanwhile, the OP's story linked to a "detailed" breakdown of percentages of students scoring beyond grade level by grade. Those numbers don't fit with the numbers published by the CA Department of Education. They aren't even close.

So where did the 1.4 million figure come from? Oh... 45% of 3.15 million is ~1.4 million. They (apparently) pulled a number out of the air, applied it to all of California, and exclaimed that nearly half of our students are performing above grade level --- when in reality, 2/3 are below grade level in math and 56% are below in English.

I call bogosity/lying on this one.
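The arithmetic being questioned here is easy to check. A quick sanity check, using only the figures quoted in this post:

```python
# Sanity check of the 2014-15 California Smarter Balanced figures quoted above.
ela_takers = 3_150_000   # students who took the English test
math_takers = 3_170_000  # students who took the math test

ela_exceeded = round(0.16 * ela_takers)    # 16% exceeded standards in English
math_exceeded = round(0.14 * math_takers)  # 14% exceeded standards in math
print(f"ELA exceeded:  ~{ela_exceeded:,}")   # ~504,000
print(f"Math exceeded: ~{math_exceeded:,}")  # ~443,800

# The brief's 1.4 million does match 45% (the top of its 15-45% range)
# applied to the full ELA test-taking population:
print(f"45% of 3.15M: ~{round(0.45 * ela_takers):,}")  # ~1,417,500
```

So the published "exceeded" counts (~504,000 ELA, ~444,000 math) are far below 1.4 million, while 45% of the test-taking population lands right on it.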
Member
Joined: Sep 2007
Posts: 3,299 Likes: 2
Can I just ask, what does "exceeds expectations" actually mean?
Do the Smarter Balanced tests include lots of above-level questions? Or do they have lots of grade-level questions? I suspect the latter.
If so, then "meets expectations" just means "score is at least equal to some minimum [say, 55%] on a grade-level test." "Exceeds" would therefore mean "score is at least equal to some other minimum [say, 70%] on a grade-level test." My son's results confirm part of this idea, but with score ranges that I can't interpret (meets in English = 2583-2681, with the lowest possible score being 2299 and the highest being 2795).
Thus, exceeding the standards would NOT mean that a student is skilled at above-level work. It just means s/he got more on-level questions right.
I mean, isn't that the whole point of the out-of-level testing that the Johns Hopkins CTY is always talking about --- that in-level testing doesn't discriminate between very good grade-level students and students who are ahead of grade?
Did I mention that this report was written by a guy from ...Johns Hopkins CTY? And another author is from Duke's G&T program?
Have I missed something? Seriously --- if I've got this all wrong, I'd like to know.
Last edited by Val; 09/20/16 12:39 PM. Reason: More detail added
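The four-band cut-score scheme described in this post can be sketched as a simple lookup. The "Met" range 2583-2681 quoted above gives two real lower bounds; the "Nearly Met" bound (2502) is a placeholder, not a published cut score:

```python
def band(score, cuts):
    """Return the achievement band for a scale score.

    cuts = ascending lower bounds for (Nearly Met, Met, Exceeded).
    """
    labels = ["Not Met", "Nearly Met", "Met", "Exceeded"]
    for i, cut in enumerate(cuts):
        if score < cut:
            return labels[i]
    return labels[-1]

# "Met" for this grade's ELA test is quoted as 2583-2681, so 2583 and 2682
# are taken from the post; 2502 below is an illustrative placeholder.
cuts = (2502, 2583, 2682)
print(band(2600, cuts))  # Met
print(band(2700, cuts))  # Exceeded
print(band(2450, cuts))  # Not Met
```

Note that every band is determined entirely by where a single scale score falls on one grade-level test, which is the post's point: "Exceeded" says nothing directly about above-level skills.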
OP
Member
Joined: Apr 2013
Posts: 5,261 Likes: 8
Thanks to everyone who is digging into this! I am excited about the study, as it focuses on "kids at the top," who may need more than the standard grade-level curriculum. I believe that the fact that they've identified this as a significant percentage (15%-45%) has great potential for creating future policy which would make it easier for top-performing students to receive instruction in their zone of proximal development (ZPD), with less advocacy required by parents and/or less resistance to advocacy. I find that the 3 Implications and the 3 Final Thoughts presented at the end of the study are realistic and pragmatic, and make it very worthwhile to pursue the further research suggested.

Posters have asked great questions, which had me digging deeper and fact-checking as well. As there is quite a bit of information presented, this is one of many times I wish we were discussing in person, so we could each point to the exact words, phrases, diagrams, footnotes, etc. which we are responding to at the moment (as some may find the routine back-and-forth of conversation comfortable in person, but rather painstaking and/or contentious when typed and read).

That said, I'll continue digging into the information presented in the article (and the research study upon which the article is based), and proceed with typing responses to a few things in posts, with what I find in the information presented. Please feel free to re-direct attention to other parts of the research study as needed.
OP
Member
Joined: Apr 2013
Posts: 5,261 Likes: 8
Northwest Evaluation Association (NWEA) offers Measures of Academic Progress (MAP), which is computer-adaptive testing (CAT). The study's cited resource 15 brings us to a webpage which touts that the MAP assessment is aligned to Common Core standards, and that MAP formative assessments provide data which predict (correlate to) standardized summative test results.

I will suggest that MAP assessments are valued for this alignment and correlation, as teachers/schools/districts can proactively anticipate the high-stakes standardized test scores upon which their own evaluations of teaching efficacy will be based.

I believe the study referenced the downloadable score correlation table found here, as the study states: "Because MAP has been aligned to the Smarter Balanced assessment, we were able to evaluate MAP scores using the Smarter Balanced criteria for grade-level proficiency.15"
OP
Member
Joined: Apr 2013
Posts: 5,261 Likes: 8
blackcat, consistent with what you mention about your child's scores in the 99th percentile of 4th graders nearly matching the 50th percentile of 12th graders... in the study's Figure 2, I see scores of 4th graders performing at the 90th percentile depicted as nearly matching scores of 8th graders performing at the 50th percentile and 12th graders at the 25th percentile.

When reading the study's Implication 2 ("The U.S. K-12 context, which is organized primarily around age-based grade levels, needs serious rethinking"), I began to wonder whether a MAP score in a particular subject area might be one criterion (among several) to consider when placing students in cluster groups by readiness and ability, without regard to chronological age. IMO, this does not mean that every academically advanced 4th grader (average age 10) should be in every class with every 8th grader (average age 14) and every 12th grader (average age 18) performing at the same level on MAP. Appropriate pacing, teaching strategies, and "fit" must also be considered. However, in raising awareness of these extreme examples, the multi-age placement which is often currently accomplished through much effort and advocacy may be more readily considered an option going forward.
OP
Member
Joined: Apr 2013
Posts: 5,261 Likes: 8
aeh, I believe the study stated they looked for "proficiency," using tests developed to measure according to Common Core standards. The study provides Table 4 and Table 6, which are "cut scores developed for each grade level and content area" for "Proficient-ELA" and "Proficient-Mathematics."

The downloadable "data linking table" on this webpage (found by following the study's linked reference #15) shows four categories for each grade level: Not Met, Nearly Met, Met, Exceeded. This corresponds to "California set four levels of cut scores for ELA and mathematics" (resource 12 shows these to be 1, 2, 3, and 4) and "Wisconsin provided cut scores on ELA and mathematics at four levels: below basic, basic, proficient, and advanced.11 The state set proficient to indicate performance that was on grade level."

This seems to indicate: Met = Level 3 = Proficient = on grade level, and Exceeded = Level 4 = Advanced = above grade level. The article states, "a MAP test score that is equivalent to ninth-grade performance is in fact based on ninth-grade content knowledge and skills."
OP
Member
Joined: Apr 2013
Posts: 5,261 Likes: 8
Thank you for sharing that link to summary results for California, the California Assessment of Student Performance and Progress (CAASPP). To be fair, please note there may be a bit of an apples-and-oranges comparison when looking at CAASPP and data from the study:
1) CAASPP includes Smarter Balanced and other test instruments: "CAASPP includes a number of assessments, but the most widely given are the Smarter Balanced Summative Assessments, which evaluate student progress on the California standards in mathematics and English language arts/literacy, often referred to as the Common Core."
2) The article stated that Smarter Balanced was one source of data for CA, but not the only source: "Based on the... California Smarter Balanced... and multistate MAP data." The article cites MAP tests, which are given more frequently and may show growth beyond the Smarter Balanced Assessment, yielding a different result.

"The results are clear: 3.15 million students took the English test, and 16% exceeded standards (about 504,000 kids). In math, 14% of 3.17 million students exceeded the standards (444,000)."
I agree these results which you cite from CAASPP do not sum to 1.4 million.

"Note that 56% of students did NOT meet the standards in English, and 67% didn't meet them in math. This information sounded more typical of statistics for this state."
Agreed. I believe the study also agrees:
1) Because this study is focused on performance above grade level, "at or below grade level" is treated as one aggregated group, and is cited as 65% for ELA, 86% for Mathematics.
2) The study showed the results from Florida exceeded those from Wisconsin, which exceeded those from California.

The CAASPP which you provided a link to includes Smarter Balanced and other measurement instruments. The study's Table 2 shows CA Smarter Balanced results by grade level. Here's how they compare for ELA ("Advanced"):

Grade 3: . . CAASPP 18% . . . SB 21% . . . (+3%)
Grade 4: . . CAASPP 19% . . . SB 27% . . . (+8%)
Grade 5: . . CAASPP 17% . . . SB 33% . . . (+16%)
Grade 6: . . CAASPP 13% . . . SB 33% . . . (+20%)
Grade 7: . . CAASPP 12% . . . SB 36% . . . (+24%)
Grade 8: . . CAASPP 12% . . . SB 37% . . . (+25%)
Grade 11:. . CAASPP 23% . . . SB not reported in Table 2
Total: . . . CAASPP 16%

The difference in the percentage reported as Advanced by Smarter Balanced grows each year as compared with the percentage reported as Advanced by CAASPP. I would be curious as to what other assessments are being given for CAASPP, and whether these other assessments may have a disproportionate number of students performing at levels 1, 2, and 3, as compared with the population of students taking Smarter Balanced. (For example, possibly the other assessments are not aligned to the curriculum being taught.)

"So where did the 1.4 million figure come from? Oh... 45% of 3.15 million is ~1.4 million. They (apparently) pulled a number out of the air, applied it to all of California"
I think the 1.4 million came from MAP data, as the article states: "Relying specifically on the MAP data, one out of every ten fifth-graders is performing at the high school level in reading, and nearly one child in forty at this age is performing at the high school level in mathematics... Converting these percentages to numbers of children provides a sobering picture of the number of students who are not well served under the current grade-based educational paradigm. In Wisconsin alone, somewhere between 278,000 and 330,000 public-school students are performing more than a full grade above where they are placed in school. And as mentioned above, in the much larger state of California, that number is between 1.4 million and 2 million students."
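The widening gap described in this post can be computed directly from the percentages as quoted (nothing here beyond the figures in the post itself):

```python
# "Advanced" in ELA: CAASPP (all assessments) vs. Smarter Balanced alone
# (the study's Table 2), percentages as quoted in the post above.
caaspp = {3: 18, 4: 19, 5: 17, 6: 13, 7: 12, 8: 12}
smarter_balanced = {3: 21, 4: 27, 5: 33, 6: 33, 7: 36, 8: 37}

gaps = {g: smarter_balanced[g] - caaspp[g] for g in caaspp}
for grade, gap in sorted(gaps.items()):
    print(f"Grade {grade}: gap +{gap} percentage points")

# The gap grows strictly with each grade from 3 through 8:
assert all(gaps[g] < gaps[g + 1] for g in range(3, 8))
```

The strictly increasing gap (+3 through +25 points) is what prompts the question above about which other CAASPP assessments are pulling the combined "Advanced" percentage down in the upper grades.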
OP
Member
Joined: Apr 2013
Posts: 5,261 Likes: 8
"Can I just ask, what does "exceeds expectations" actually mean?"
Part of a previous response posted upthread may address this: the study provides Table 4 and Table 6, which are "cut scores developed for each grade level and content area" for "Proficient-ELA" and "Proficient-Mathematics." The downloadable "data linking table" on this webpage (found by following the study's linked reference #15) shows four categories for each grade level: Not Met, Nearly Met, Met, Exceeded. This corresponds to "California set four levels of cut scores for ELA and mathematics" (resource 12 shows these to be 1, 2, 3, and 4) and "Wisconsin provided cut scores on ELA and mathematics at four levels: below basic, basic, proficient, and advanced.11 The state set proficient to indicate performance that was on grade level." This seems to indicate: Met = Level 3 = Proficient = on grade level, and Exceeded = Level 4 = Advanced = above grade level. The article states, "a MAP test score that is equivalent to ninth-grade performance is in fact based on ninth-grade content knowledge and skills."

"Do the Smarter Balanced tests include lots of above-level questions? Or do they have lots of grade-level questions? I suspect the latter."
I believe this is why MAP test score results were also used by this study.