Gifted Bulletin Board

Welcome to the Gifted Issues Discussion Forum.

We invite you to share your experiences and to post information about advocacy, research and other gifted education issues on this free public discussion forum.
    ColinsMum (OP)
    Member
    Joined: Sep 2008
    Posts: 1,898
    Posting on this old thread since the URL of the paper it discusses has changed (and the thread is too old to allow editing the original entry). Here is the new one, for the benefit of anyone still interested. I'm still mystified by the paper, and would still like to hear from anyone who thinks they can explain the anomalies I started this thread with!


    Email: my username, followed by 2, at google's mail
    Val
    Member
    Joined: Sep 2007
    Posts: 3,296
    Hi CM,

    I'm not really sure what you're looking for. I skimmed through the paper (read more closely in parts), and my impression is that her study is flawed at best. Her writing is garbled in places, too, and I felt that the introduction rambled and failed to make her purpose clear.

    This paper has a lot of flaws that have been pointed out in this thread. I still find it very hard to believe that she could have had so many children at or above the 99th percentile, and especially that so many could have had IQs of at least 170. I don't get the control group structure at all (esp. the second "control" group). It seems messy.

    I don't know much about the Stanford-Binet test, but this article says that the revision released in 1986 (they administered it in the early 90s, right?) had a ceiling of 148. So I'm confused by her assertion that the ceiling of the test was 170. Other versions seem to have ceilings that are much higher than 170. She should have made this clear.

    In fact, in looking through the paper, I saw that it's heavy on generalizations and has no quantitative data, apart from the dubious reporting of IQs ("Most of them now say....", "Too many had dissipated....", "Others, though, felt that...."). I didn't find a single table in the entire paper. She didn't provide a copy of her survey questionnaire (if one even existed) or her basis for making judgments about "success." There was no statistical analysis, no reporting of percentages of subjects' answers to questions, no nothing. Just lots-of-people-felt-this-way stuff. The entire thing seems to be subjective.

    The study seems to be something of an edumacation project to me. I've seen these before (as a reviewer and reader): they're characterized by failure to use rigorous or even semi-rigorous methods, failure to structure the study properly, failure to provide data, and an apparent willingness to cherry-pick information that supports an original hypothesis, rather than allowing data to drive conclusions.

    HTH....

    Val



    ColinsMum (OP)
    Member
    Joined: Sep 2008
    Posts: 1,898
    Originally Posted by Val
    I'm not really sure what you're looking for.
    Really, an explanation as to how something that looked like a peer-reviewed journal could publish something that looks so bad. Explanations I can think of include:
    - this isn't really a peer-reviewed journal in the sense I understand it (nobody critically read the paper ever, or they did but nobody forced the author to rewrite); or
    - actually it isn't as bad as I think, e.g. because someone who worked in the field and understood the conventions of the field would be able to see things that to them are obvious explanations of the discrepancies reported here.

    Of these the first seems the more likely, but maybe I'm wrong or maybe there's another explanation I haven't thought of. I'd just like to know!


    Email: my username, followed by 2, at google's mail
    Val
    Member
    Joined: Sep 2007
    Posts: 3,296
    Originally Posted by ColinsMum
    Really, an explanation as to how something that looked like a peer-reviewed journal could publish something that looks so bad. Explanations I can think of include:
    - this isn't really a peer-reviewed journal in the sense I understand it (nobody critically read the paper ever, or they did but nobody forced the author to rewrite); or
    - actually it isn't as bad as I think, e.g. because someone who worked in the field and understood the conventions of the field would be able to see things that to them are obvious explanations of the discrepancies reported here.

    Of these the first seems the more likely, but maybe I'm wrong or maybe there's another explanation I haven't thought of. I'd just like to know!

    FWIW, the site I pulled it from classified it as a magazine. The guidelines for authors indicated that everything is peer-reviewed, so at least one or two people read it and approved of it. From my perspective, this reflects poorly on the publication.

    It's hard for me to see a legitimate way around the lack of things that I listed in my last message (study design, data, etc.). If people who work in this field were to argue that their conventions don't require any of those things, I'd question the validity of their research even more. Unfortunately, from what I've seen, standards in the education field can be very low ("can be," not "always"; obviously not making a blanket statement about everyone here).

    Val

    HowlerKarma
    Member
    Joined: Feb 2011
    Posts: 5,181
    I am not at all surprised at the publication of something that seems so poorly vetted. This is why ONE publication on a finding is "interesting" and five begin to be "convincing."


    Often the 'independent' reviewers are nowhere near as unbiased or detached as they theoretically ought to be (in a small field it is all the more potentially incestuous). It's possible that this was a soft-reviewed paper, or that the journal only managed to generate a single 'qualified' reviewer, who happened to be too busy to REALLY review the paper, or tossed it to a borderline-competent or green graduate student or post-doc. That happens-- even in big name journals like Analytical Chemistry or Neuroscience, occasionally you'll find a publication that is cringe-worthy.

    I wouldn't assume that it's you and that if you had enough expertise it would look 'better' to you. Chances are good that it would be even more obvious how awful the methodology or data analysis actually is. wink

    I agree with Val here in general terms. The social science disciplines all too often train people to look at correlation and assume causative linkages, and the physical sciences tend to train people to avoid that very natural human impulse at all costs. <SIGH>
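
    To make that concrete, here's a toy simulation (just an illustrative sketch; it assumes Python with numpy available, and every variable name is invented for the example): two quantities with no causal link at all will correlate strongly if a hidden common cause drives both, and the apparent relationship all but vanishes once you control for that common cause.

    Code:
    import numpy as np

    rng = np.random.default_rng(0)
    n = 1000

    # A hidden common cause drives both observed variables;
    # neither observed variable causes the other.
    confounder = rng.normal(0, 1, n)
    a = 2.0 * confounder + rng.normal(0, 1, n)
    b = 1.5 * confounder + rng.normal(0, 1, n)

    print(np.corrcoef(a, b)[0, 1])  # strong correlation, zero causation

    # Regress each variable on the confounder and correlate the residuals:
    # the "relationship" disappears.
    res_a = a - np.polyval(np.polyfit(confounder, a, 1), confounder)
    res_b = b - np.polyval(np.polyfit(confounder, b, 1), confounder)
    print(np.corrcoef(res_a, res_b)[0, 1])  # near zero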

    As every physical scientist learns: the plural of anecdote is NOT 'data' in any sense of the term.

    Therefore, the so-called conclusions drawn from cherry-picked anecdotes are little more than pet conjectures, because the experimental design is frequently biased to such an extreme that it isn't even POSSIBLE to call any of the variables truly dependent or independent. <shrug>

    I know a lot of physical scientists who privately snark pretty openly about social (airquotes) "scientists" and experimental design or statistics... ('oh, look, isn't that cute?? They tried to use a two-tailed analysis here... how sweet that they tried... too bad that they don't explain why they dumped four data points from each trial.... Hmmm...'). This is what gave birth to something called The Journal of Irreproducible Results. Think of it as MAD magazine for geeks. wink

    My own graduate group was going to do a study once for them, I recall. Head circumference against a number of other things-- caffeine consumption, IQ, shoe size, GPA, number of siblings... We got quite nice correlations by plotting inverses against one another... <giggling> And the nice thing is, we just invalidated anything that didn't fit-- oh, sure, sometimes we had to LOOK for a reason to disqualify a study participant. But most people have corrected vision or piercings, or eat meat, or have a family member who is a republican/atheist/nudist or something... grin

    In all seriousness, though, I am very wary of ANYTHING plotted against even a simple mathematical transform of another quantity. Those relationships are frequently artifacts if there isn't a clear mechanistic reason posited or known for why the relationship should exist in the first place.
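
    For anyone who wants to see how easily that kind of artifact appears, here's a sketch of one classic version -- ratios that share a common denominator. (This is only an illustration of the general trap, not the exact transform we used; it assumes Python with numpy, and the variables are made up.)

    Code:
    import numpy as np

    rng = np.random.default_rng(0)
    n = 500

    # Three mutually independent quantities
    x = rng.normal(100, 10, n)    # one measured trait
    y = rng.normal(50, 5, n)      # another, unrelated trait
    z = rng.lognormal(0, 0.5, n)  # a shared scaling factor

    print(np.corrcoef(x, y)[0, 1])          # near zero: genuinely unrelated
    print(np.corrcoef(x / z, y / z)[0, 1])  # strongly positive: artifact of the shared divisor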


    Now, does social science make for thought-provoking and sometimes insightful reading? Of course. I'm not saying that there aren't usually some interesting perspectives stated. But I tend to view most of it with a pretty jaded and critical eye. Mistakes/oversights in the statistics or in the experimental design are things I look for immediately in judging how much stock I should put in the conclusions section. I always consider whether a paper seems to be a reasonable example of a rigorous investigation... or if it's more of an op-ed bit written by an expert in the field. MOST of the literature in this field is the latter, unfortunately. That's an observation as much as a criticism-- there are reasons why sampling is so hard here. The same thing is true in the medical literature for rare genetic conditions.

    smile


    Schrödinger's cat walks into a bar. And doesn't.
    Grinity
    Member
    Joined: Dec 2005
    Posts: 7,207
    Originally Posted by HowlerKarma
    I am not at all surprised at the publication of something that seems so poorly vetted. This is why ONE publication on a finding is "interesting" and five begin to be "convincing."


    The social science disciplines all too often train people to look at correlation and assume causative linkages, and the physical sciences tend to train people to avoid that very natural human impulse at all costs. <SIGH>
    I particularly want to 'me2' these two points. I think they should be taught to every high school student, along with the finer points of why the scientific method is so beloved by its followers, even with all the flawed humans who use it.

    When I read Freeman's book, I was struck by all the 'association implies causation' assumptions. She sure did find plenty of bad parenting among parents of gifted kids back in the 1970s, but she never gets that the parents themselves were most likely gifted and grew up in even worse circumstances! And how difficult to raise would a child have had to be to drive a parent to seek identification in England in the 1970s?

    Remember the story of all those moms of kids with autism who got told that they were 'refrigerator moms' who caused their kids' behavior because they were cold and detached? Nowadays the causation arrow points 180 degrees in the opposite direction - raising an autistic kid without support is seen to cause moms to become cold and detached.

    Live and learn -
    Grinity


    Coaching available, at SchoolSuccessSolutions.com
    Member
    Joined: Mar 2010
    Posts: 487
    As a social scientist - Ouch!

    I agree that there are a lot of crappy papers being published - in all fields. I also agree that social scientists should have a better understanding of statistics. But condemning whole disciplines...

    HowlerKarma
    Member
    Joined: Feb 2011
    Posts: 5,181
    Sorry. There are good and critical researchers working in those fields. I've had the pleasure of working with some of them. I definitely don't intend to malign the people working in those disciplines as a whole. I do fault some of the training that they are given, however, since it seems to lead to misunderstandings in how the scientific method is supposed to work...


    I definitely think there is value in being able to trend-spot or to think outside of the causation box that scientists tend to live inside, though. That's where new ideas come from.

    It's just that too much social science research doesn't design experiments so as to allow for the hypothesis to be proven incorrect. (Yes, it's a problem in some science disciplines now, too, as I'm well aware-- positive results get published and funded, and nothing else seems to matter, which doesn't serve the discipline very well.) It's possible that it is merely publication bias, but I don't really think so, having seen what ed researchers cook up in experimental design.
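
    To illustrate just the publication-bias piece, here's a toy sketch (it assumes Python with numpy and scipy on hand; all the numbers are invented): simulate a pile of small studies of an effect that is truly zero, 'publish' only the ones that come out positive and significant, and the resulting 'literature' reports a healthy effect anyway.

    Code:
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    n_studies, n_per_group = 1000, 20

    published = []
    for _ in range(n_studies):
        treatment = rng.normal(0, 1, n_per_group)  # the true effect is exactly zero
        control = rng.normal(0, 1, n_per_group)
        t, p = stats.ttest_ind(treatment, control)
        effect = treatment.mean() - control.mean()
        if p < 0.05 and effect > 0:  # only positive, significant results get written up
            published.append(effect)

    print(f"'published': {len(published)} of {n_studies} studies")
    print(f"mean effect in the 'literature': {np.mean(published):.2f} (true effect: 0)")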


    But then again, experience also suggests that the maxim about scientists lacking basic social skills is generally more often true than not...

    so there's that. whistle Hopefully there are exceptions to THAT, as well.

    Again-- apologies to any of the good social scientists out there that I may have inadvertently offended. I really do have a great deal of respect for my friends and co-authors from the other side of the campus, I promise. wink


    Schrödinger's cat walks into a bar. And doesn't.
    Member
    Joined: Mar 2010
    Posts: 487
    smile It's all good.

    Actually I think training social scientists - and I have to be honest and say I'm still 'in training' - is where so much potential is wasted! There seems to be so much time spent on indoctrination into specific disciplines that the higher purpose of developing real, quality critical thinking skills is ignored.

    I also agree that statistics are so poorly understood in society at large and in some disciplines that it gives me a headache. Few things drive me as batty as trying to explain to someone that 97% of people being x doesn't mean EVERYONE is, and so their pointing out that their Great-Aunt Batty wasn't x doesn't change a thing! (Insert icon of someone running screaming in circles waving arms madly!)

    But all of that is completely off-topic, isn't it? I haven't actually read the article in question at all, so I have no on-track comment to make. smile
