Originally Posted by Cricket2
Okay, number crunchers, can we figure out what a .6 correlation (a moderate correlation, if I remember my stats correctly) would likely mean in terms of how many IQ points apart siblings would be on average?

I don't think we can, at least not from the correlation alone. A correlation measures how scores move together, not how far apart they sit: you could conceivably have two data sets with identical distributions and a .6 correlation (or a 1.0 correlation, or no correlation at all), and conversely the same .6 correlation is compatible with very different typical gaps.
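Here's a quick simulation to make that concrete (Python, purely illustrative; the "copy-or-independent" construction in the second dataset is my own contrivance, not anything drawn from real sibling data). Both datasets have N(100, 15) marginals and a .6 correlation, yet their typical gaps differ by several IQ points:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
mu, sd, r = 100.0, 15.0, 0.6

# Dataset A: a bivariate normal pair with correlation r.
x = rng.normal(mu, sd, n)
y_a = mu + r * (x - mu) + np.sqrt(1 - r**2) * rng.normal(0.0, sd, n)

# Dataset B: with probability r the "sibling" is an exact copy of x;
# otherwise it's an independent draw. The marginal is still N(mu, sd)
# and the correlation is still r, but the gaps look very different.
copy = rng.random(n) < r
y_b = np.where(copy, x, rng.normal(mu, sd, n))

for label, y in [("bivariate normal   ", y_a), ("copy-or-independent", y_b)]:
    print(f"{label}: corr = {np.corrcoef(x, y)[0, 1]:.3f}, "
          f"mean |gap| = {np.abs(x - y).mean():.1f} IQ points")
```

Running it prints a correlation of about .6 for both, but a mean gap of roughly 10.7 points for the bivariate-normal pair versus roughly 6.8 for the copy-or-independent pair. Same correlation, different answer to the original question.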

If you assume for the sake of argument that Kid1 in each family lies precisely at the center of the sibling-cohort IQ distribution, and that the cohort's scores are roughly normally distributed, then about 68% of the other kids in the family will fall within one standard deviation of Kid1's score, most of them a good deal closer. A sibling cohort may or may not have an SD of 15, and it's entirely possible for two kids to be outliers at opposite extremes. But if you had a thousand-kid family, so the statistics actually meant something, most of the kids would fall into the same general IQ range. So we shouldn't be surprised when kids in a much smaller family often do, too. (Nor should we be surprised when they don't, because statistics don't tell you anything useful about a specific individual.)
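If you want to see the thousand-kid family in numbers, here's a sketch under the added assumptions (mine, not established facts about any real cohort) that the cohort is normally distributed with a mean of 100 and an SD of 15:

```python
import numpy as np

rng = np.random.default_rng(0)
mean, sd = 100.0, 15.0

# A hypothetical thousand-kid family; normality and SD = 15 are assumptions.
family = rng.normal(mean, sd, 1000)

# Kid1 sits exactly at the cohort center, per the thought experiment.
gaps = np.abs(family - mean)

print(f"within 1 SD of Kid1:   {np.mean(gaps <= sd):.0%}")      # ~68%
print(f"within 1/2 SD of Kid1: {np.mean(gaps <= sd / 2):.0%}")  # ~38% -- many sit much closer
print(f"largest gap:           {gaps.max():.0f} points")        # outliers at the extremes still happen
```

The exact percentages hang entirely on the normality assumption; the point is just that most of the cohort clusters near the center, with a handful of genuine outliers.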