Member | Joined: May 2009 | Posts: 282
I have to agree with Grinity. While I think that it is appropriate to demand more challenging curriculum, and while I am aware that some of our kids aren't good test takers (DD11 can NOT seem to get the rhythm of test taking down and will sit on a problem in a timed test rather than skip it and come back), I have deep concerns about prepping kids for these kinds of tests. I don't think that they are equivalent to college entrance tests which are widely prepped for in a very open way. Several concerns:
First, it masks data in a way which will perpetuate the underidentification of kids who do not come from middle/upper-middle-class backgrounds. As well-educated families artificially raise the scores of kids who are likely to already be advantaged in vocabulary and general exposure to content, it will become even harder to see the children with raw ability but disadvantaged backgrounds. I think part of the point of using a test like COGAT is to try to identify the kids who have achievement that is out of line with ability indicators.
Second, from a parent standpoint, it muddies the value of the test for those of us who are feeling our way through the "how-atypical-is-this?" puzzle. We can't all afford individual IQ testing, especially if we have multiple children. Aptitude measures and out-of-level testing are helpful to us, but only if our unprepared children are taking them along with other unprepared children. Prepping turns these tests into achievement tests, since it ends up measuring how well a child has learned to interpret a taught problem type.
Third, I think that the end impact for children who would have very high scores without being prepped is that they look equivalent to children who actually don't have the same educational needs. This, it would seem to me, would lead to less rigor than needed in follow up programming.
All of that said, the crux of the problem is that too many districts use tests like COGAT as a stand alone gatekeeper. It should be one of several possible indicators, and should not be used to exclude children from educational options.
It is scary to have a child who does not shine until properly programmed for, because it creates a vicious cycle where lack of opportunity begets underachievement and underachievement begets continued lack of opportunities. I get that in a big way. IMHO, though, prepping invalidates the value of an aptitude instrument and makes it hard to make progress in moving schools to better meet the needs of gifted kids. Kids can be prepped to get better scores, but it won't make them more gifted. What it will do is create a misleading impression of what gifted is. Many educators have very limited knowledge of or experience with giftedness. In part, these educators learn as they meet and work with kids who they are told are gifted. So what happens when they program for bright kids who they've been told are gifted and that programming works? How do they learn that what they are doing is inadequate for gifted learners?
Please note, I am not suggesting that a child is not gifted because a child has prepped for a test. I'm sure that there are both gifted and bright children who prep. My comments relate not to the impact or outcome for any single child, but to the potential danger when we look at it on a broader scale.
Member | Joined: Jun 2010 | Posts: 1,457
After reading through this thread, my thoughts on the ethics of prepping for tests in general run along these lines:

1. If the test materials are indicated as secret by the publisher, it's not OK to prep specifically for that test.

2. If the test materials are indicated as open by the publisher, it's OK to prep specifically for that test. (I'm not sure whether any of the tests we routinely discuss here fall into this category.)

3. However, buying any actual copy of a test is almost certainly not OK, unless the publisher has released previous versions themselves (e.g. past LSATs). With an IQ test like the SB5 or WISC-IV, where I am guessing that problems are released fairly infrequently, it would be extremely unethical. Buying any current copy of any test is obviously wrong.

4. If the test materials are not indicated as either secret or open by the publisher, the more that an array of third-party test prep materials and services is allowed to flourish unchecked, the greater the likelihood that it is OK to prep specifically for that test, as the publisher may implicitly authorize test prep by their inaction. (At the other end of the spectrum: if only a few independent, low-quality publishers claim to offer test prep, the materials are not sold through large outlets like Amazon that make some effort to police their offerings for legality, there are open lawsuits over the third-party prep materials, etc., it is less likely to be OK to prep specifically for that test.)

5. It's always OK to teach thinking skills. So, for example, the mere presence of analogies on different types of tests doesn't mean one can't expose a child to analogies. For another example, over-the-top stressful hothousing of things like vocabulary wouldn't be unethical test prep, though it would be bad parenting.

6. It's always OK to practice test-taking in general, to lower stress levels etc.

7. The fact that test prep will inevitably happen to some degree is a compelling reason not to rely just on numbers from one type of test, or maybe any type of preppable test.

I took the time to review some CogAT test prep materials available on Amazon, based on Bostonian's mention. Some of them seem to just be bundles of materials from, for example, the Critical Thinking Company, including Mind Benders and other materials that parents here sometimes buy for their children just to challenge them and develop thinking skills. Here's an example, and here's another non-Amazon offering along similar lines. Under my Rule #5 above, I consider these to be okay, unless we are prepared to insist that kids must be tested in their natural, untaught, vegetative state.

In a different category are prep materials like this Practice Test for the Cognitive Abilities Test CogAT, which includes in its "Editorial Reviews" description the following (and here's the publisher's website):

"Mercer Publishing has the only available practice materials in the format of the CogAT®* exam."

Most of the questions in this book are at the 2nd to 3rd grade level of difficulty. Please see our grade-specific books for additional grade options. The Cognitive Abilities Test (CogAT®*) is published by Riverside Publishing; Multilevel Edition tests are commonly given to 3rd grade students and above (and sometimes to students in 2nd grade), although it depends on what test your school/program provides and the test level that they use for your grade level. The A - H level tests expect that the child is able to read and answer the test questions themselves.
This practice test contains nine subtests in the three test areas found on the CogAT®:

VERBAL: Verbal Classification (20 questions), Sentence Completion (20 questions), Verbal Analogies (25 questions)
QUANTITATIVE: Quantitative Relations (25 questions), Number Series (20 questions), Equation Building (15 questions)
NONVERBAL: Figure Classification (25 questions), Figure Analogies (25 questions), Figure Analysis (15 questions)

This book contains a full-length practice test with answer key. The object of this practice test is to familiarize your child with sample questions they will face on test day, how the tests are formatted, the symbols used, and the number of questions in each test area. However, since this practice test has not been standardized with Riverside Publishing and the actual CogAT®* test, a valid CogAT®* test score cannot be concluded from their results on this practice test.

* CogAT® is a registered trademark of Houghton Mifflin Company. The Cognitive Abilities Test (CogAT®) is owned by Houghton Mifflin Company and published by Riverside Publishing, a Houghton Mifflin company. Neither Houghton Mifflin Company nor Riverside Publishing was involved in the production of, nor endorses, these practice tests.

Looking at the description, I again see some things that can't be off-base to teach children, like verbal analogies. These materials are also presumably not the subject of a lawsuit, or I'd expect Amazon to pull them after notification from test publisher Riverside. At this point I was thinking that perhaps the publisher implicitly authorized teaching to the test. I followed up to see if there was any information from the publisher on prepping for the CogAT, and found the following paper linked from Riverside's website, written by one of the two authors of the CogAT: Lohman, D. F. (2006). Practical advice on using the Cognitive Abilities Test as part of a talent identification system. From page 15:

Prepare the students for the test...
If at all possible, go over the directions for the test (especially those with unfamiliar item formats such as matrices) a day or two before the test. Make up additional practice items to ensure that ALL children understand what they are supposed to do. NEVER start the test unless you are sure that the children understand what they are supposed to do.

From all of this, I conclude that the Mercer Publishing materials are okay to use too. The test publisher seems to allow publication by third parties of extensive study materials, and insists that students be exposed not only to the rules of the test, but even to practice questions, at least days in advance of a test.

My personal opinion, short answer: you can apparently prep for the CogAT in any way, except for buying actual CogAT tests intended to be sold only to testers, without ethical worry. And in a highly competitive locale which uses the CogAT, based on the easy availability of high-quality third-party prep materials, prepping might actually be the closest approach to an apples-to-apples comparison.

Would I prep my son for the CogAT? Probably not, as I don't think I'd need to even if that test were used in our school district. However, I have bought him some books that include things like analogies in the past. I didn't buy them to prep specifically for a test, but rather because they were used in my GT program when I was a child. I refuse to feel bad for stimulating my child to think. I guess we could wrangle over whether the intent behind some teaching or other intellectual stimulation can make it wrong, where the content itself is not objectionable. I simply can't pass judgment on someone else for teaching their child general thinking skills for any reason. There should be more of it, not less.
My wife just proposed a hypothetical: a parent exposes their child to lots of analogies and vocabulary leading up to a test, resulting in a high score, and then afterward does nothing, resulting in much lower numbers a year or two later. My wife thinks that would be wrong. I would agree, but place the blame on the failure to stimulate except near a test time.

ETA: Based on finding practice tests for the OLSAT published by Pearson (next post), I'm wondering if it is more ethical / less unethical to prep for group-administered tests in general, barring of course some information from the publisher that they consider it to be wrong.

ETA 2: Pearson apparently does not produce any practice tests for the NNAT.

ETA 3: I noticed these pre-tests for the CogAT: http://www.riversidepublishing.com/products/cogAt/pricing_pretest.html#1 Note the language about supplying credentials in order to buy the tests (and my next post about pretests for the OLSAT); I'm not sure how much that applies to these pretests. However, I do think the mere existence of pretests from the publisher makes it more likely to be okay to prep for the CogAT.

ETA 4: IT IS POSITIVELY NOT OKAY TO PREP EXTENSIVELY AT HOME FOR THE COGAT. Grinity followed up with the test publisher, and found that they only release prep materials with the intent that they be used by schools.
Last edited by Iucounu; 07/17/11 05:50 PM.
Striving to increase my rate of flow, and fight forum gloopiness.
Member | Joined: Jun 2010 | Posts: 1,457
My husband got this book from Amazon - '4 Practice Tests for Cogat'. It was only $26 and has tons of content for the Grade 2 Cogat testing in the format of the test. We have seen a bunch of books and this one is good. I recommend it. My daughter looks forward to working on it every morning. Anyone know if there are similar books for OLSAT?

The publisher of the OLSAT, Pearson, apparently publishes OLSAT practice tests. However, during checkout this warning appears: "Your qualifications should be on file with us before you complete your order or there may be a delay." This is probably a general warning based on Pearson selling a lot of psychologist-only materials, but you might want to call them to see if parents are also intended to buy those OLSAT prep materials. I don't know why they wouldn't be; practice is practice, and Pearson offers a lot of practice test bundles for the OLSAT, more than I'd expect just to familiarize a student with the format of the test. If it appears that the publisher thinks practicing privately for the OLSAT is okay, here are a couple of additional searches that might help: Amazon, Google
Member | Joined: May 2009 | Posts: 282
Hmm. In looking at the CogAT information I also see they refer to it as a measure of "learned reasoning". So perhaps my concerns should really be aimed at a misuse of the test by schools? I doubt most schools are spending time making sure that all students are learning to do the kinds of problems that are on the test, so I see two problems. First, the equal access concerns that I raised previously, and second, the interpretation (misinterpretation?) of results when prepped students and unprepped students are being considered and compared within the same group. I see it a bit the way I see the grade level question that comes up periodically as related to EXPLORE. Grade accelerated students are supposed to be considered alongside students of the grade they've accelerated to because they are then being compared to kids who have been exposed to the same content. Yet there is certainly a difference between a 3rd grader who is instructed at a fifth grade level and a fifth grader who is instructed at a fifth grade level, even if they have identical EXPLORE scores. And there is a significant difference between two third graders who score at the same level but have been exposed to different levels of curriculum.
All of which begs the question....is there any good screener out there that schools can use to accurately identify students who are slipping under the radar due to learning style, lack of opportunity/exposure, language proficiency, etc.? And, without taking an individual IQ test, how do parents get an accurate picture of their children's typicality or atypicality? Maybe that's not possible? And is there any role for context in helping schools accurately identify needs? Or will that just lead to inaccurate parent reporting for fear that information they share will be used against them?
Member | Joined: Jan 2008 | Posts: 1,917
All of which begs the question....is there any good screener out there that schools can use to accurately identify students who are slipping under the radar due to learning style, lack of opportunity/exposure, language proficiency, etc.? And, without taking an individual IQ test, how do parents get an accurate picture of their children's typicality or atypicality? Maybe that's not possible? And is there any role for context in helping schools accurately identify needs? Or will that just lead to inaccurate parent reporting for fear that information they share will be used against them?

Well, I can't answer your question, but I can say that the most useful tests in terms of finding out where my kid is compared to his classmates and where he should be going next have been the NWEA MAP tests. The schools that use these best will group kids with similar scores together and provide appropriate content based on what the students already know. And though of course we're talking GT, these tests are useful to the education of all kids, GT or not. Avoids the whole issue of GT: teach to the readiness of the kid. (Yes, of course this is an oversimplified response.)
Last edited by st pauli girl; 07/10/11 02:16 PM.
Member | Joined: May 2009 | Posts: 282
Well, I can't answer your question, but I can say that the most useful tests in terms of finding out where my kid is compared to his classmates and where he should be going next have been the NWEA MAP tests. The schools that use these best will group kids with similar scores together and provide appropriate content based on what the students already know. And though of course we're talking GT, these tests are useful to the education of all kids, GT or not. Avoids the whole issue of GT: teach to the readiness of the kid. (Yes, of course this is an oversimplified response.)

Yes, I see the value in tests like MAP. Our district will begin using it and, while I'm reserving a smidge of judgement until I see it in action, I tend to think it's a good move. Conceptually, I prefer adaptive testing formats like MAP that don't require students to spend a lot of time answering questions that are significantly off-level.

I guess what I'm looking for (and this is where I worry about test prep) are ways to pick out some of those "caged cheetahs". I love the Stephanie Tolan essay, "Is It a Cheetah?", because it acknowledges that there are highly gifted students who are invisible in school because they never have the chance to show what they can truly do. Those students sit unidentified (even sometimes with aware, advocating parents) because their classroom performance looks unremarkable. They will not excel on tests like MAP because they have never seen some of the content--their unremarkable classroom performance doesn't suggest a need to accelerate them to what they could do in the right circumstances.

A personal example: when DD was 9 she took a math reasoning test. She was then assessed with some district assessments, based on district standards. I did not find out until much later that she scored above the 99th percentile on both the regular and gifted scales of the reasoning test. What I was told at the time is that she was hitting standards a year or two ahead, and therefore could be instructed in the regular classroom with some differentiation. In hindsight (and with the belated information from the reasoning test), I draw a different conclusion. I conclude that the discrepancy showed inadequate classroom instruction. It was improperly paced given what she was capable of acquiring.

For students who need something other than the opportunity to work with materials from later grades (or even in those higher grade level classrooms)--for students who actually need instruction delivered in a different manner and at a different pace--achievement tests will continue to be an inadequate identification tool. Achievement tests will particularly miss the gifted children who come from uneducated family backgrounds or who have had undifferentiated classroom instruction and opportunities. In many ways, I think these are the children who most need to be identified early on, and who most need intervention that is based on ability rather than achievement.

Which brings me back again to my discomfort with test prep for reasoning tests. How do we find those kids if they have peers who are able to prep for these tests when they are not able to do the same? Am I just not showing enough confidence in the quality of the tool? Does prep not make a significant enough impact to worry about?
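As an aside for anyone curious how adaptive formats like MAP home in on a student's level: here's a toy sketch of the basic idea. This is purely my own illustration (not NWEA's actual algorithm, which is proprietary and based on item response theory); the difficulty scale and step sizes are made up. The point is simply that each response moves the next item's difficulty up or down, so the student spends most questions near their own level instead of on far-off-level items.

```python
def adaptive_estimate(answers_correctly, start=50, step=16, rounds=5):
    """Estimate a student's level on a made-up 0-100 difficulty scale.

    `answers_correctly` is a function: given an item difficulty,
    return True if the (simulated) student gets that item right.
    """
    level = start
    for _ in range(rounds):
        if answers_correctly(level):
            level += step   # got it right: try a harder item next
        else:
            level -= step   # missed it: try an easier item next
        step = max(1, step // 2)  # narrow the search each round
    return level

# A simulated student who reliably answers items up to difficulty 72:
student = lambda difficulty: difficulty <= 72
print(adaptive_estimate(student))  # → 73, close to the student's true level
```

With only five items, the estimate lands within a point or two of the simulated student's true level, which is why adaptive tests can be short without wasting time on questions far above or below the child.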
Member | Joined: Jun 2010 | Posts: 1,457
Which brings me back again to my discomfort with test prep for reasoning tests. How do we find those kids if they have peers who are able to prep for these tests when they are not able to do the same? Am I just not showing enough confidence in the quality of the tool? Does prep not make a significant enough impact to worry about?

I think test prep probably does have a sizable impact, just as any sort of learning that increases ability is going to have an impact. How much? I have no idea. But I understand your worries and share them. It's obvious others here do too. When you have young children especially, the enrichment they've received will make some of them seem highly advanced compared to age peers (or perhaps they will be highly advanced, but not in a way that correlates fully to intelligence levels they will reach later). Hence similar concerns over parents hothousing their children. Learning works!

The thing is, I don't think you can easily draw a bright line with some of these issues. How much, exactly, would taught material have to resemble the specific format and/or content of items on a particular test to even constitute specific test prep? And how much teaching/enrichment/stimulation is unfair hothousing? People comment in negative, sometimes offhand ways about hothousing; for example, we've heard people here relate how some jealous parents comment that their children haven't been "given the same opportunities". The comments are often offered as commentary that a hothousing parent is essentially abusing their child, but what's really at the heart of the label? Jealousy and angst that someone else's children have or will have advantages over the speaker's own, and a violated sense of fairness based on the speaker's own choices.

And is it really desirable that kids not be taught, just to try to get a more valid benchmark on a test, especially knowing that some parents would always cheat the system? Or should the solution be to try to improve the imperfect assessment tools, rely on more than test numbers from a single test for identification, etc.? Testing is inherently imperfect, and you can't control what a parent will do in her own home. It would be more workable and fair, in my opinion, to expose all children to some standardized test prep before a test, rather than try to enforce a lack of prep.

They will not excel on tests like MAP because they have never seen some of the content--their unremarkable classroom performance doesn't suggest a need to accelerate them to what they could do in the right circumstances.

This is one of the reasons I'm uncomfortable with heavy reliance on achievement for identification of giftedness. Some reliance is okay, as it's some evidence, but underprivileged and other under-the-radar, unidentified children will always tend to be at a disadvantage.
Junior Member | Joined: Sep 2008 | Posts: 33
I just wanted to point out that the OP is most likely a spammer. He/she has only 2 posts on this forum, and while this one talks about using the book with her daughter, the other post, made just a few minutes later, talks about using it with his/her son.
I have seen very similar posts about this same book on a local city data forum, a local mothering forum, and a local gifted listserv all in the past week.
Member | Joined: Sep 2008 | Posts: 1,898
Yes, I remember thinking "probably spam" at the time, but it wasn't blatant enough to report as such - and look, it led to an interesting discussion! That's the best kind of spam :-)
Email: my username, followed by 2, at google's mail
Member | Joined: May 2009 | Posts: 282
The thing is, I don't think you can easily draw a bright line with some of these issues. How much teaching/enrichment/stimulation is unfair hothousing? How much, exactly, would taught material have to resemble the specific format and/or content of items on a particular test, to even constitute specific test prep?
And is it really desirable that kids not be taught, just to try to get a more valid benchmark on a test, especially knowing that some parents would always cheat the system? Or should the solution be to try to improve the imperfect assessment tools, rely on more than test numbers from a single test for identification, etc.?
Testing is inherently imperfect, and you can't control what a parent will do in her own home. It would be more workable and fair in my opinion to expose all children to some standardized test prep before a test, rather than try to enforce a lack of prep.

I would agree with your points here, although I would hate to see yet more time spent on test prep in school: it's hard enough to find the time to teach everything.... One of the more interesting ideas I've heard suggests looking at "x" percentage of each demographic tested. That makes sense to me, although the follow-up would have to be differentiated with current achievement levels in mind.

In my fantasy approach, schools identify and form several types of gifted clusters (each in a different classroom). One cluster would be students who need fully individualized learning opportunities/instruction. These students likely top out both aptitude and achievement testing. Another cluster would be kids who can be instructed as a group, but with an approach and materials that are actually different in both form and content from general education practices/curriculum. These students are near or at the top of achievement and aptitude testing, but aren't quite as far out there and/or are not the kinds of learners who want a completely individualized approach. I'm thinking that the fourth grader who has not been grade accelerated but tops 20 on everything on the EXPLORE test would be typical of the first group; the 16-20's or scattered scores would be typical of this second group. A third cluster would be a combination of bright/mildly gifted students. These students will excel given curriculum that is 1-2 years ahead and/or with differentiated materials and assignments. However, they can succeed and have needs met with a regular approach, as long as the material is sufficiently advanced.
Finally(?), I would envision a cluster that includes kids with high ability indicators relative to their demographic group, but lower than expected achievement. Depending on the specific students, school, etc., this might be two clusters--one in which kids are underachieving due to intrinsic differences which need to be addressed, and one in which kids are underachieving due to lack of an enriched lifestyle. In my fantasy world this also allows teachers at each grade level to learn and specialize in different types of gifted education. Sometimes it seems that even where schools use clustering, they lack an appreciation of the differences between gifted students and create clusters that don't work well together.

I wouldn't want parents to withhold what they think will meet the needs of their children, and I recognize that the types of problems on CogAT and other instruments are just plain fun for some of our children. If there were a variety of options for meeting the range of needs on the gifted end of the spectrum, maybe parents wouldn't feel a need to be so cautious about what they share with schools, and prepping wouldn't be an issue at all--it would just be part of the overall picture a school had of a child when trying to figure out which placement was the best fit.

I guess what I want is an all-cards-honestly-on-the-table approach to understanding typicality/atypicality. So many of us have had experiences in at least one subject area where we hear, "oh, we have lots of students who....your child will fit right in here". With some of the picture obscured by circumstances that are not readily shared, and with other parts of the picture obscured by instruments or learning opportunities with inadequate ceilings, it is no wonder that this is too often a false statement.
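For what it's worth, the "x percentage of each demographic" idea mentioned above is easy to make concrete. Here's a minimal sketch; it is purely illustrative (the names, groups, and cutoff are invented, and no real identification process should rest on one score), but it shows how selecting the top scorers within each group separately keeps a group with less test exposure from being crowded out entirely:

```python
from collections import defaultdict

def top_percent_by_group(records, pct):
    """records: list of (name, group, score); pct: e.g. 0.25 for top 25%.

    Returns the names selected, taking the top `pct` of scorers
    within each demographic group rather than across the whole pool.
    """
    by_group = defaultdict(list)
    for name, group, score in records:
        by_group[group].append((score, name))
    selected = []
    for group, members in by_group.items():
        members.sort(reverse=True)             # highest scores first
        k = max(1, round(len(members) * pct))  # at least one per group
        selected.extend(name for score, name in members[:k])
    return sorted(selected)

students = [
    ("Ana", "A", 95), ("Ben", "A", 88), ("Cal", "A", 80), ("Dee", "A", 70),
    ("Eli", "B", 75), ("Fay", "B", 68), ("Gus", "B", 60), ("Hal", "B", 50),
]
print(top_percent_by_group(students, 0.25))  # → ['Ana', 'Eli']
```

Note that Eli is selected even though his raw score is below Cal's: a whole-pool top-25% cut would have taken only group A students. That is exactly the trade-off the idea is meant to make, and why the follow-up programming would still need to account for the achievement differences between the selected students.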
Last edited by Taminy; 07/11/11 08:04 PM. Reason: typo