Gifted Bulletin Board

Welcome to the Gifted Issues Discussion Forum.

We invite you to share your experiences and to post information about advocacy, research and other gifted education issues on this free public discussion forum.

    Page 4 of 4 < 1 2 3 4
    #88396 - 10/31/10 07:00 AM Re: Freeman research vs A Nation Deceived etc. [Re: ColinsMum]
    Wren Offline
    Member

    Registered: 01/14/08
    Posts: 1673
    But what is the point?

    Whether it is this Freeman research or the longitudinal research done with Hunter students, what makes a successful outcome for a child?

    There are PG kids without acceleration who are highly successful; look at Sotomayor. And there are PG kids who have all kinds of acceleration and challenge and still make nothing of their lives.

    After reading "The Element" by Ken Robinson, I think the ingredient that really matters, besides hard work, is passion. Without the passion and drive, you just won't be notable, or take advantage of opportunities, according to him.

    So when I see research that clearly defines what exactly the defining factor or factors were, I am interested.

    Ren

    #95342 - 02/23/11 02:41 AM Re: Freeman research vs A Nation Deceived etc. [Re: ColinsMum]
    ColinsMum Offline
    Member

    Registered: 09/19/08
    Posts: 1898
    Loc: Scotland
    Posting on this old thread since the URL of the paper it discusses has changed (and the thread is too old for editing the original entry to be permitted). Here is the new one, for the benefit of anyone still interested. I'm still mystified by the paper, and would still like to hear from anyone who thinks they can give an explanation for the anomalies I started this thread with!
    _________________________
    Email: my username, followed by 2, at google's mail

    #95905 - 03/02/11 10:40 AM Re: Freeman research vs A Nation Deceived etc. [Re: ColinsMum]
    Val Offline
    Member

    Registered: 09/01/07
    Posts: 3294
    Loc: California
    Hi CM,

    I'm not really sure what you're looking for. I skimmed through the paper (read more closely in parts), and my impression is that her study is flawed at best. Her writing is garbled in places, too, and I felt that the introduction rambled and failed to make her purpose clear.

    This paper has a lot of flaws that have been pointed out in this thread. I still find it very hard to believe that she could have had so many children at or above the 99th percentile, and especially that so many could have had IQs of at least 170. I don't get the control group structure at all (esp. the second "control" group). It seems messy.

    I don't know much about the Stanford-Binet test, but this article says that the revision released in 1986 (they administered it in the early 90s, right?) had a ceiling of 148. So I'm confused by her assertion that the ceiling of the test was 170. Other versions seem to have ceilings much higher than 170. She should have made this clear.

    In fact, in looking through the paper, I saw that it's heavy on generalizations and has no quantitative data, apart from the dubious reporting of IQs ("Most of them now say....", "Too many had dissipated....", "Others, though, felt that...."). I didn't find a single table in the entire paper. She didn't provide a copy of her survey questionnaire (if one even existed) or her basis for making judgments about "success." There was no statistical analysis, no reporting of percentages of subjects' answers to questions, no nothing. Just lots-of-people-felt-this-way stuff. The entire thing seems to be subjective.

    The study seems to be something of an edumacation project to me. I've seen these before (as a reviewer and reader): they're characterized by failure to use rigorous or even semi-rigorous methods, failure to structure the study properly, failure to provide data, and an apparent willingness to cherry-pick information that supports an original hypothesis, rather than allowing data to drive conclusions.

    HTH....

    Val



    #95908 - 03/02/11 11:38 AM Re: Freeman research vs A Nation Deceived etc. [Re: Val]
    ColinsMum Offline
    Member

    Registered: 09/19/08
    Posts: 1898
    Loc: Scotland
    Originally Posted By: Val

    I'm not really sure what you're looking for.

    Really, an explanation as to how something that looked like a peer-reviewed journal could publish something that looks so bad. Explanations I can think of include:
    - this isn't really a peer-reviewed journal in the sense I understand it (nobody critically read the paper ever, or they did but nobody forced the author to rewrite); or
    - actually it isn't as bad as I think, e.g. because someone who worked in the field and understood the conventions of the field would be able to see things that to them are obvious explanations of the discrepancies reported here.

    Of these the first seems the more likely, but maybe I'm wrong or maybe there's another explanation I haven't thought of. I'd just like to know!
    _________________________
    Email: my username, followed by 2, at google's mail

    #95915 - 03/02/11 12:06 PM Re: Freeman research vs A Nation Deceived etc. [Re: ColinsMum]
    Val Offline
    Member

    Registered: 09/01/07
    Posts: 3294
    Loc: California
    Originally Posted By: ColinsMum
    Really, an explanation as to how something that looked like a peer-reviewed journal could publish something that looks so bad. Explanations I can think of include:
    - this isn't really a peer-reviewed journal in the sense I understand it (nobody critically read the paper ever, or they did but nobody forced the author to rewrite); or
    - actually it isn't as bad as I think, e.g. because someone who worked in the field and understood the conventions of the field would be able to see things that to them are obvious explanations of the discrepancies reported here.

    Of these the first seems the more likely, but maybe I'm wrong or maybe there's another explanation I haven't thought of. I'd just like to know!


    FWIW, the site I pulled it from classified it as a magazine. The guidelines for authors indicated that everything is peer-reviewed, so at least one or two people read it and approved of it. From my perspective, this reflects poorly on the publication.

    It's hard for me to see a legitimate way around the lack of things that I listed in my last message (study design, data, etc.). If people who work in this field were to argue that their conventions don't require any of those things, I'd question the validity of their research even more. Unfortunately, from what I've seen, standards in the education field can be very low ("can be," not "always;" obviously not making a blanket statement about everyone here).

    Val

    #95921 - 03/02/11 12:39 PM Re: Freeman research vs A Nation Deceived etc. [Re: ColinsMum]
    HowlerKarma Offline
    Member

    Registered: 02/05/11
    Posts: 5181
    I am not at all surprised at the publication of something that seems so poorly vetted. This is why ONE publication on a finding is "interesting" and five begin to be "convincing."


    Often the 'independent' reviewers are nowhere near as unbiased or detached as they theoretically ought to be (in a small field it is all the more potentially incestuous). It's possible that this was a soft-reviewed paper, or that the journal only managed to generate a single 'qualified' reviewer, who happened to be too busy to REALLY review the paper, or tossed it to a borderline-competent or green graduate student or post-doc. That happens -- even in big-name journals like Analytical Chemistry or Neuroscience, you'll occasionally find a publication that is cringe-worthy.

    I wouldn't assume that it's you and that if you had enough expertise it would look 'better' to you. Chances are good that it would be even more obvious how awful the methodology or data analysis actually is. wink

    I agree with Val here in general terms. The social science disciplines all too often train people to look at correlation and assume causative linkages, and the physical sciences tend to train people to avoid that very natural human impulse at all costs. <SIGH>

    As every physical scientist learns: the plural of anecdote is NOT 'data' in any sense of the term.

    Therefore, the so-called conclusions drawn from cherry-picked anecdotes are little more than pet conjectures, because the experimental design is frequently biased to such an extreme that it isn't even POSSIBLE to call any of the variables truly dependent or independent. <shrug>
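    The confounding problem is easy to see in a toy simulation (purely hypothetical numbers, nothing to do with Freeman's data): two variables that never influence each other will still correlate strongly if a hidden third factor drives them both.

```python
import random

def pearson(xs, ys):
    """Plain Pearson correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
    vx = sum((a - mx) ** 2 for a in xs)
    vy = sum((b - my) ** 2 for b in ys)
    return cov / (vx * vy) ** 0.5

random.seed(1)
n = 10_000
# Hidden common cause c; x and y each depend on c plus independent noise,
# and neither has any causal effect on the other.
c = [random.gauss(0, 1) for _ in range(n)]
x = [ci + random.gauss(0, 1) for ci in c]
y = [ci + random.gauss(0, 1) for ci in c]

r = pearson(x, y)
print(f"corr(x, y) = {r:.2f}")  # roughly 0.5: strong correlation, zero causation
```

A naive reading of that correlation would "find" a link between x and y, when the only honest conclusion is that something upstream drives both.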

    I know a lot of physical scientists who privately snark pretty openly about social (airquotes) ''scientists'' and experimental design or statistics...('oh, look, isn't that cute?? They tried to use a two-tailed analysis here... how sweet that they tried... too bad that they don't explain why they dumped four data points from each trial.... Hmmm...'). This is what gave birth to something called The Journal of Irreproducible Results. Think of it as MAD magazine for geeks. wink

    My own graduate group was going to do a study once for them, I recall. Head circumference against a number of other things-- caffeine consumption, IQ, shoe size, GPA, number of siblings... We got quite nice correlations by plotting inverses against one another... <giggling> And the nice thing is, we just invalidated anything that didn't fit-- oh, sure, sometimes we had to LOOK for a reason to disqualify a study participant. But most people have corrected vision or piercings, or eat meat, have a family member who is a republican/atheist/nudist or something... grin

    In all seriousness, though, I am very wary of ANYTHING plotted against even a simple mathematical transform of another quantity. Those relationships are frequently artifacts if there isn't a clear mechanistic reason posited or known for why the relationship should exist in the first place.
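    The transform artifact is just as easy to demonstrate with made-up data (a toy sketch, not any real study): two independent variables correlate at essentially zero, but divide one by the other and plot the ratio against the inverse, and a strong "relationship" appears out of nowhere because both axes share the same 1/z factor.

```python
import random

def pearson(xs, ys):
    """Plain Pearson correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
    vx = sum((a - mx) ** 2 for a in xs)
    vy = sum((b - my) ** 2 for b in ys)
    return cov / (vx * vy) ** 0.5

random.seed(2)
n = 10_000
x = [random.uniform(1, 2) for _ in range(n)]
z = [random.uniform(1, 2) for _ in range(n)]

raw = pearson(x, z)  # x and z are independent by construction: near zero
# Same data after a "simple mathematical transform": x/z against 1/z.
# Both quantities contain the common factor 1/z, so they co-vary by arithmetic alone.
artifact = pearson([a / b for a, b in zip(x, z)],
                   [1 / b for b in z])
print(f"raw corr = {raw:.2f}, transformed corr = {artifact:.2f}")
```

Nothing mechanistic connects the transformed axes; the correlation is manufactured by the shared divisor, which is exactly why a transform-vs-transform plot needs a posited mechanism before it means anything.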


    Now, does social science make for thought-provoking and sometimes insightful reading? Of course. I'm not saying that there aren't usually some interesting perspectives stated. But I tend to view most of it with a pretty jaded and critical eye. Mistakes/oversights in the statistics or in the experimental design are things I look for immediately in judging how much stock I should put in the conclusions section. I always consider whether a paper seems to be a reasonable example of a rigorous investigation... or if it's more of an op-ed bit written by an expert in the field. MOST of the literature in this field is the latter, unfortunately. That's an observation as much as a criticism-- there are reasons why sampling is so hard here. The same thing is true in the medical literature for rare genetic conditions.

    smile
    _________________________
    Schrödinger's cat walks into a bar. And doesn't.

    #95933 - 03/02/11 01:57 PM Re: Freeman research vs A Nation Deceived etc. [Re: HowlerKarma]
    Grinity Offline
    Member

    Registered: 12/13/05
    Posts: 7207
    Loc: Connecticut
    Originally Posted By: HowlerKarma
    I am not at all surprised at the publication of something that seems so poorly vetted. This is why ONE publication on a finding is "interesting" and five begin to be "convincing."


    The social science disciplines all too often train people to look at correlation and assume causative linkages, and the physical sciences tend to train people to avoid that very natural human impulse at all costs. <SIGH>


    I particularly want to 'me2' these two points. I think they should be taught to every high school student, along with the finer points of why the scientific method is so beloved by its followers, even with all the flawed humans who use it.

    When I read Freeman's book, I was struck by all the 'association - causation' assumptions. She sure did find plenty of bad parenting among parents of gifted kids back in the 1970s, but she never gets that the parents themselves were most likely gifted and grew up in even worse circumstances! And how difficult to raise would a child have had to be to drive a parent to seek identification in England in the 1970s?

    Remember the story of all those moms of kids with autism who got told that they were 'refrigerator moms' who caused their kids' behavior because they were cold and detached. Nowadays the causation arrow points 180 degrees in the opposite direction: raising an autistic kid without support is seen to cause moms to become cold and detached.

    Live and learn -
    Grinity
    _________________________
    Coaching available, at SchoolSuccessSolutions.com

    #95939 - 03/02/11 03:27 PM Re: Freeman research vs A Nation Deceived etc. [Re: ColinsMum]
    GeoMamma Offline
    Member

    Registered: 03/10/10
    Posts: 487
    As a social scientist - Ouch!

    I agree that there are a lot of crappy papers being published - in all fields. I also agree that social scientists should have a better understanding of statistics. But condemning whole disciplines...

    #95942 - 03/02/11 04:22 PM Re: Freeman research vs A Nation Deceived etc. [Re: ColinsMum]
    HowlerKarma Offline
    Member

    Registered: 02/05/11
    Posts: 5181
    Sorry. There are good and critical researchers working in those fields. I've had the pleasure of working with some of them. I definitely don't intend to malign the people working in those disciplines as a whole. I do fault some of the training that they are given, however, since it seems to lead to misunderstandings in how the scientific method is supposed to work...


    I definitely think there is value in being able to trend-spot or to think outside of the causation box that scientists tend to live inside, though. That's where new ideas come from.

    It's just that too much social science research doesn't design experiments so as to allow for the hypothesis to be proven incorrect. (Yes, it's a problem in some science disciplines now, too, as I'm well aware -- positive results get published and funded, and nothing else seems to matter, which doesn't serve the discipline very well.) It's possible that it is merely publication bias, but I don't really think so, having seen what ed researchers cook up in experimental design.


    But then again, experience also suggests that the maxim about scientists lacking basic social skills is also generally more often true than not...

    so there's that. whistle Hopefully there are exceptions to THAT, as well.

    Again-- apologies to any of the good social scientists out there that I may have inadvertently offended. I really do have a great deal of respect for my friends and co-authors from the other side of the campus, I promise. wink
    _________________________
    Schrödinger's cat walks into a bar. And doesn't.

    #96027 - 03/03/11 03:17 PM Re: Freeman research vs A Nation Deceived etc. [Re: ColinsMum]
    GeoMamma Offline
    Member

    Registered: 03/10/10
    Posts: 487
    smile It's all good.

    Actually I think the training of social scientists - and I have to be honest and say I'm still 'in training' - is where so much potential is wasted! There seems to be so much time spent on indoctrination into specific disciplines that the higher purpose of developing real, quality critical-thinking skills is ignored.

    I also agree that statistics are so poorly understood in society at large and in some disciplines that it gives me a headache. Few things drive me as batty as trying to explain to someone that 97% of people being x doesn't mean EVERYONE is, so quoting that their Great-Aunt Batty wasn't x doesn't change a thing! (Insert icon of someone running screaming in circles waving arms madly!)

    But all of that is completely off-topic, isn't it? I haven't actually read the article in question at all, so I have no on-track comment to make. smile
