~ A blog about IQ, the brain & success


My recent post about how readers of brainsize.wordpress.com have an average IQ of 147 was damaging to the self-esteem of many readers. For example, a reader named “Andrew” wrote:

I suppose I’m the dunce around here with an IQ of ~120!

Intelligence is relative, so even though an IQ of 120 is higher than 90% of the U.S. population, when you find yourself on a blog like this one where the average IQ is said to be 147, an IQ of 120 can feel extremely low.

Well the good news for all the “Andrews” out there is that my readers do not have an average IQ of 147 after all. That figure was arrived at using remarkably indirect evidence. A poll I conducted found that the average reader was about 0.62 standard deviations taller than others of their demographic group, and since height is thought to correlate 0.2 with IQ, I simply divided their average height (+0.62 SD) by 0.2 to estimate their average IQ at 3.1 SD above the mean (IQ 147).

There were a couple problems with this however. For starters, as more readers have started to vote in the height poll, it seems the average reader is 0.43 SD taller than normal, not 0.62 SD as previously estimated. Secondly, while 0.2 is the correlation between IQ and height, what I really want is the correlation between height and g (general intelligence) and I want it corrected for reliability. This figure would be about 0.26.

So now with the revised height (+0.43 SD) divided by the true g loading of height (0.26), it seems Brain Size readers have an average general intelligence of 1.65 SD above normal. In other words, Brain Size readers average an IQ of 125. This is a far more believable figure than 147, which is about what you’d expect from the average academic Nobel Prize winner.

Many people are confused by why I divided the height of my readers (+0.43 SD) by the true g loading of height (0.26), instead of just multiplying it. For example, NBA players are ridiculously tall (perhaps +3.88 SD taller than other American men in their age group). If I wanted to estimate the average IQ of NBA players from their average height, would I divide their height by the g loading of height? If so, their estimated intelligence would be +14.92 SD (an average deviation IQ of 324, making the average NBA player more than 100 IQ points smarter than any person who ever lived!). So instead of dividing their height (+3.88 SD) by 0.26, I would multiply, which gives an estimated intelligence of +1 SD (IQ 115).
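To make the divide-vs-multiply issue concrete, here's a quick Python sketch (the function names are mine; the 0.26 correlation and the height z-scores are just the figures from this post):

```python
def regress_iq_from_height(height_z, r=0.26, mean=100, sd=15):
    """Standard prediction: regress toward the mean by MULTIPLYING
    the observed z-score by the correlation r."""
    return mean + r * height_z * sd

def divide_iq_from_height(height_z, r=0.26, mean=100, sd=15):
    """The dividing approach discussed above: z / r extrapolates
    AWAY from the mean, which explodes for extreme z-scores."""
    return mean + (height_z / r) * sd

# Blog readers at +0.43 SD height:
print(round(regress_iq_from_height(0.43)))  # 102
print(round(divide_iq_from_height(0.43)))   # 125 (the post's estimate)

# NBA players at +3.88 SD height:
print(round(regress_iq_from_height(3.88)))  # 115
print(round(divide_iq_from_height(3.88)))   # 324 (the absurd result)
```

The dividing direction is only defensible if you assume the height difference is entirely downstream of an intelligence difference; for ordinary prediction from a correlate, multiplying by the correlation is the standard move.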

If the radical theory that (genetic) intelligence has been declining by the equivalent of 1.16 IQ points per decade is correct, scientists should hurry up and try to figure out how to clone the Victorians while they still have enough intelligence to do so, because at the rate genetic IQ is dropping, we may need Victorians to run our society before utter collapse. If this theory is correct, in just the last 125 years, we’ve set human evolution back at least 10,000 years and have done incredible genetic damage to our species.

And yet since the 19th century, intelligence as measured by IQ tests has increased at least 30 points, forcing the difficulty of IQ tests to be constantly increased to keep the average score from rising above 100. This is known as the Flynn effect.

Earlier this summer I thought I had the Flynn effect all figured out. Of course it wasn’t me who figured it out, it was scholar Richard Lynn who decades ago noted that the rise in IQ scores had been paralleled by a rise in height (and brain size) so obviously nutrition was causing both. The average Victorian man had an IQ below 70 (by modern standards) and a height around 5’5.8″. The average young white man today has an IQ around 100 and a height of 5’10.4″. So IQ has increased by about two standard deviations, and height has increased by a comparable amount (1.78 standard deviations). It seemed the nutrition theory explained everything quite nicely and I never quite understood why IQ experts kept violating Occam’s razor by searching for alternative explanations for the Flynn effect, when Lynn brilliantly explained it so long ago. And whenever the Flynn effect appeared bigger than nutrition could explain, Lynn noted that the increase in 20th century schooling probably spuriously added an additional 8 or so IQ points to modern test scores.

The problem is that this new theory that genetic intelligence declined by one standard deviation since the Victorian era means that we no longer have to only explain why moderns are scoring two standard deviations above Victorians on psychometric tests, we also have to explain why they’re not scoring one standard deviation lower. In other words we have a three standard deviation IQ difference to explain and nutrition (as measured by height) has only increased by 1.78 standard deviations.

But then I had an epiphany. If dysgenics has caused our genetic IQ to drop by one standard deviation, then why not our genetic height? Well, genetic intelligence has declined because in the absence of natural selection (survival of the fittest), low IQ people have more children than high IQ people because they have trouble controlling their sex drives, planning ahead and using birth control. There’s no similar reason to expect short people to have more children (and in fact the research shows that while short women have more kids, short men have fewer, so it’s a wash).

But dysgenic fertility is only half the proposed explanation for the one standard deviation drop in genetic IQ. The other half was explained by increased mutation load, and there’s reason to think height might be just as depressed by mutation load as IQ is. So just as mutations caused half the one standard deviation decline in genetic IQ, they likely depressed genetic height by half a standard deviation.

So I think if scientists were able to clone the Victorians, we would not only find that they score about 115 on IQ tests if reared in modern society (as opposed to the sub-70 IQ’s they would have obtained in the 19th century), but we would also find that Victorian men would be taller than white men today and clock in at 5’11.7″ (2.3 standard deviations higher than the 5’5.8″ height they had in their own era). In other words, when you control for genetic decline, we see nutrition has probably actually increased by 2.3 standard deviations, almost enough to explain the three standard deviation IQ difference we would expect between Victorians raised in our time compared to their time.
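The clone arithmetic can be checked in a few lines (a sketch using this post's assumptions: a 2.58-inch height SD and a 0.5 SD genetic height decline from mutation load):

```python
SD_INCHES = 2.58         # height SD for young white men
victorian_height = 65.8  # 5'5.8" in inches
modern_height = 70.4     # 5'10.4" in inches

observed_gain_sd = (modern_height - victorian_height) / SD_INCHES
genetic_decline_sd = 0.5                 # assumed mutation-load effect on height
nutrition_gain_sd = observed_gain_sd + genetic_decline_sd

# A Victorian genotype raised on modern nutrition:
clone_height = victorian_height + nutrition_gain_sd * SD_INCHES

print(round(observed_gain_sd, 2))   # 1.78
print(round(nutrition_gain_sd, 2))  # 2.28 (rounded to 2.3 in the text)
print(round(clone_height, 1))       # 71.7 inches, i.e. 5'11.7"
```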

How can the remaining 0.7 standard deviation be explained? Probably by the rise in schooling, media, parental socio-economic status and other cultural factors that artificially prop up IQ scores without actually increasing intelligence.

Height is an interesting metaphor for intelligence. Both are incredibly socially valued traits for a man to have, and if someone is taller or smarter than us, we look up to them (literally or figuratively), and we refer to people who are exceptionally bright or exceptionally dull as intellectual giants or mental midgets, respectively. Of course height is a physical variable that is measured directly in feet and inches on an absolute ratio scale, while intelligence is an abstract variable that is measured by comparing people to the norm of Western countries at the time that they live. So if you have the average intelligence of a Westerner of your generation, you’re assigned a deviation IQ (intelligence quotient) of 100. Of course most people deviate from the mean, and the standard amount that people deviate is defined as 15 IQ points, so the IQ range of 85 to 115 defines the limits of normal intelligence, and 2/3rds of Westerners fall within this range.

Height could be measured on the same relative scale as IQ; as Jensen once noted, we just don’t bother because we have an absolute scale to measure height on (feet and inches). But if we did measure height on this relative scale, we would call the scores HQ’s (height quotients). The height distribution of young white American men has a mean of 5’10.4″ and a standard deviation of 2.58 inches, so equating these values with 100 and 15 respectively allows us to assign men the following HQ’s based on their heights (if you’re older than 40 or younger than 20, you may want to add an age bonus to your HQ; if you’re a woman, add about 32 points to your HQ, so a 7 foot tall woman would have an HQ of about 211 instead of 179):
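The chart of HQ conversions doesn't reproduce here, but the mapping is easy to compute; here's a sketch using the figures above (the flat +32-point female adjustment is the one described in the text):

```python
def height_quotient(height_in, female=False, mean=70.4, sd=2.58):
    """Convert a height in inches to an HQ on an IQ-style scale
    (mean 100, SD 15), using the young-white-male distribution;
    women get the post's flat +32-point adjustment."""
    hq = 100 + 15 * (height_in - mean) / sd
    return hq + 32 if female else hq

print(round(height_quotient(72)))               # 6 ft man  -> 109
print(round(height_quotient(84)))               # 7 ft man  -> 179
print(round(height_quotient(84, female=True)))  # 7 ft woman -> 211
```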

Being six feet tall is considered a respectable height for a man, and men who are around six feet tall proudly proclaim it on message boards like this one. And yet 6 feet equates to an HQ of “only” 109. Few men would proudly proclaim an IQ of 109, even though men value their height just as much as their intelligence, if not more so. So why the double standard? I believe it’s because unlike height, no one can directly observe our intelligence, so it’s extremely tempting to lie about it, and since everyone else is lying or very selectively and disingenuously reporting their IQ’s, massive IQ inflation occurs, and we have to greatly inflate our own IQ’s just to keep them in the same rank order as everyone else’s self-reported IQ’s. As I’ve stated, a Promethean once knew a couple dozen people, all with alleged IQ’s above 170. His estimate for the actual IQ’s of this group? 115.

The above chart shows that a seven foot tall man has an HQ 139 points higher than a five foot tall man. Since height and IQ are correlated, does this mean seven footers are 139 IQ points smarter than five footers? No. Because the correlation between IQ and height is only 0.2, seven foot men would on average be 0.2(139) = 28 IQ points smarter than five foot men.

The other key point is that when men and women are the same height, the woman has an HQ that is 32 points higher (since these HQ scores are assigned relative to gender). Since men and women differ enormously in height but are virtually identical in IQ, and since height and IQ correlate 0.2 within both genders, when a man and woman are the same height, the woman will on average have an IQ that is 0.2(32) = 6 points higher.

Previously I blogged about research showing that Victorians had faster simple reaction times than modern people. Since simple reaction time partly reflects the basic physiological speed of the brain, some folks think Victorians were (genetically) smarter than people today.

In a paper documenting the 20th century decline in reaction speed, scholar Irwin W. Silverman considers the confounding role of height. Height has increased by 1.5 standard deviations over the last 150 years, and this may be producing spuriously slow reaction times because nerve impulses have further to travel in a taller body. However Silverman seems to dismiss this possibility, citing research showing taller people have faster reaction times.

However within generations, taller people tend to be genetically smarter than shorter people. This is thought to be because both height and intelligence (or at least its correlates: money, status) are socially valued, so people who have an above average amount of both, or either, tend to reproduce with one another, causing the genes for both to become associated. In addition, some of the same genes that influence height may also influence intelligence. A related point is that short stature and low intelligence may both reflect genetic mutation load, or inbreeding depression.

So the fact that the nerve impulse has further to travel in tall people may be completely negated by the fact that tall people have genetically faster brains. In other words, tall people may be so mentally quick, that they still perform well on reaction time tasks despite the test being physically biased against them.

However this genetic relationship between height and intelligence probably only holds within generations. Between generations, heights differ for nutritional reasons, probably not genetic reasons, so tall modern people do not have a genetic advantage over short Victorians with which to negate the fact that the reaction time tests are physically biased against the tall.

The confounding role of height may also explain why studies investigating the relationship between intelligence and nerve conduction velocity have yielded extremely inconsistent results. Speaking of which, has anyone investigated long-term changes in nerve conduction velocity? Measures of human NCV have been collected since the 19th century, though old studies may be crude.

Now to further confuse the issue, even though the Victorian sample from which scientist Francis Galton collected his reaction time data was short by modern standards, they were actually tall by 19th century standards. This is significant because attempts have been made to correct for the elite nature of Galton’s samples by adjusting the sample for occupational status. However even when you look at the subsets of Galton’s sample who were not elite (i.e. unskilled men, aged 26+) you find they were 66.47 inches tall (see table 10 in this HBD Chick blog post), even though the average 19th century man was, according to one major study, 166 cm (65.35 inches).

So why were even the non-elite men in Galton’s sample about 0.43 SD taller than the British average? Perhaps because, as HBD Chick explained, Galton’s sample was not just elite, it was extremely self-selected, and that may have biased the sample independently of occupational status. They had to come to Galton (he didn’t come to them). These were people who were intellectually curious and literate enough to even read about Galton’s research, and motivated enough to travel (perhaps in some cases from great distances) to the museum, find Galton’s test and pay good money to take it. And why would one want to take the test so badly unless, deep down, one had reason to believe in one’s own biological superiority? Is it really surprising that people who wanted so badly to demonstrate their superiority actually were superior, and that they would be high not just on intelligence, but on its weak genetic correlates: height and reaction time? So, if even adjusting for occupation, Galton’s sample was 0.43 SD taller than other Victorians, then perhaps they were also 0.43 SD mentally faster than other Victorians of the same occupation, skewing Galton’s data.

But above I argued that Galton’s sample did better than modern people because they’re shorter than us, but now I’m arguing they did better than other Victorians because they were taller than them. Sounds like ad hoc gibberish, and maybe it is. But remember, within generations, good genes make people both taller and mentally quicker so tall people have faster reaction times than short people. But between generations, nutrition improves height but does not appear to improve reaction time, so shorter generations should have an unfair advantage on reaction time tests because the nerve impulse has less distance to travel.

As I explained in a previous post, a popular belief is that the Flynn Effect disproportionately impacts the lower parts of the bell curve. The reason this theory is so popular is because hypothesized causes of the Flynn Effect (increased schooling, better nutrition) are assumed to help primarily the most disadvantaged people, but have diminishing returns as you move up the scale. It’s also popular because it nicely explains why past generations had so many impressive scientists, thinkers and writers despite the average test score being so low. And for culturally biased tests like the SAT, the Flynn Effect is indeed much smaller at the high end. The percent of all American 17-year-olds (not just those who finish high school) capable of scoring at the highest levels on the Verbal SAT has not been increasing according to the book “The Bell Curve”, even though at the low end of the distribution, literacy rates have been skyrocketing over the 20th century.

However when it comes to one of the best and most culture fair tests we have (the Raven Progressive Matrices), this theory absolutely collapses. In the previous post I talked about the dramatic Raven Flynn Effect in the brightest 10% of British men over the 20th century. But perhaps I wasn’t looking high enough. Perhaps the Flynn Effect disappears for the top 1%, or the top 0.1%. Fortunately, the always resourceful James Flynn long ago found the data. If you look at table 18 of a post by blogger Meng Hu, you can see Flynn’s data. What it shows is that in the Netherlands (where Raven scores jumped the equivalent of 21 IQ points in 30 years), getting even extremely high scores became a lot more common. By equating the changing frequency to the IQ distribution (mean 100, SD 15) we see that an IQ of 130 on the older norms equaled an IQ of 110 on the newer norms. An IQ of 140 on the older norms equals an IQ of 120 on the newer norms, and an IQ of 150 on the older norms equals an IQ of 130 on newer norms. So it seems the entire distribution increased by about 20 points from 1952 to 1982, including the extremely brilliant.

So what has caused these massive gains in such a short span of time? [UPDATE June 22/2014: I was informed in the comment section that the Flynn Effect gains among the gifted may have been theoretical rather than actually observed data, in which case, my above analysis is circular speculation]. James Flynn equated the older and younger generation on schooling and found that changes in education explained only about 1 IQ point of the massive rapid gains. Virtually nothing. And given the relatively high correlation between IQ and schooling (within generations), one can assume that people above IQ 150 were already finishing high school at high rates in 1952, so schooling seems an unlikely cause, especially for a test as seemingly culture fair as the Raven. So that brings us to Richard Lynn’s nutrition hypothesis, which argues that 20th century nutrition has been increasing both intelligence and height. This theory is especially compelling when one considers that the unusually large Flynn Effect the Dutch enjoyed (7 points a decade!) is nicely paralleled by unusually rapid height gains over the same span of time.

The Dutch were not noted for their height until recently. It was only in the 1950s that they passed the Americans, who stood tallest for most of the last 200 years, said John Komlos, a leading expert on the subject who is professor of economic history at the University of Munich in Germany. He said the United States has now fallen behind Denmark.

So, like the Raven IQ gains, the height gains seem to have taken off in the 1950s.

Many Dutch are much taller than average. So many, in fact, that four years ago the government adjusted building codes to raise the standards for door frames and ceilings. Doors must now be 7-feet, 6 1/2-inches high.

So like the Raven IQ gains, the height gains have been very strong at the highest levels.

In 1848, one man out of four was rejected by the Dutch military because he was shorter than 5-foot-2. Today, fewer than one in 1,000 is that short.

This is an astonishing statistic. It suggests that height among the Dutch has increased by 3.8 standard deviations since the 19th century. Does that imply that biological intelligence has increased by 57 IQ points?!! Does that mean the nutrition theory is wrong for implying such absurd conclusions? Not so fast. Two points must be understood:

1) Primarily NON-VERBAL intelligence is affected by nutrition. Even the malnourished mind preserves virtually all its verbal-numerical intelligence. And this is not because these are crystallized skills acquired before the onset of malnutrition. Even when malnutrition occurs before birth, the ability to acquire verbal-numerical skills is virtually unscathed, and these are the abilities that determine cultural achievements. Non-verbal disabilities are relatively invisible. So even if overall intelligence had increased by 3.8 SD, the increase would be extremely lopsided, with non-verbal IQ increasing much more, and verbal-numerical IQ hardly increasing at all. It’s unclear whether one should expect a 3.8 SD increase in overall IQ (verbal + non-verbal), or a 3.8 SD increase in just non-verbal IQ, in which case overall IQ would be much less increased. But either way, verbal-numerical ability (the building blocks of culture) would be virtually unchanged.

2) The reason height gains are relevant is that they parallel gains in brain size (and probably other properties of the brain too). But it’s unlikely that a 3.8 SD gain in height implies a 3.8 SD gain in brain size, since brain size probably has a much higher genetic floor than height does, which is why people of extremely short stature often appear to have big heads.

Lastly, the paradox of huge Raven gains (even at the high end) while no comparable high end gains on the Verbal SAT is resolved. Verbal-numerical ability is virtually untouched by nutrition, though the rise in education has improved crystallized forms of such skills at the low end (literacy rates) but not the high end because the gifted have always been educated. By contrast nutrition improves the ENTIRE distribution, hence even the gifted show huge Raven Flynn Effects. As for the Math SAT, that’s a mix of Verbal-numerical ability and NON-VERBAL problem solving, so it shows more high end gains than the Verbal SAT, but less than a pure non-verbal culture reduced test like the Raven.

Regardless of whether one thinks the Flynn Effect is caused by nutrition, schooling, video games, or all of the above, a common belief is that it disproportionately impacts the left half of the bell curve. A blog reader known as “Greying Wanderer” writes:

One aspect of the nutrition idea is nutrition might not have improved for everyone to the same degree i.e. nutrition among the upper and upper middle class may already have been fine and stayed the same while the average nutrition in the classes below went up.

Say for the sake of argument with a population average of 100 IQ those to the right of the median have an average of 115 and those to the left have an average of 85 then I guess you could have a situation where only the left side of a population like that was malnourished giving them say a depressed average of 70 IQ thus giving the combined population a depressed average of 93 (70+115). If so then improving the nutrition of the left side might lead to them reaching their natural average of 85 and the whole population increasing their average to 100 (85+115). So an increase in 7 points of the population’s average IQ even though the people on the right side of the Bell curve haven’t changed at all.

Just a thought.

That would fit an increase in average IQ without an increase in innovation.

Greying Wanderer’s argument is plausible in his example of only a 7 point Flynn Effect, but on the Raven IQ test, the Flynn Effect has been over 30 points. In order to get a 30 point gain driven only from the left of the curve, you would need to assume that before 20th century nutrition, IQ 85 people had non-verbal IQ’s of only 25! The notion that the left part of the curve was that severely disabled seems unlikely.
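Greying Wanderer's two-halves arithmetic, and the 30-point version of it, can be sketched like this (assuming, as his comment does, that the population mean is just the average of the two halves' means):

```python
def population_mean(left_mean, right_mean=115):
    """Mean of a population split into two equal halves."""
    return (left_mean + right_mean) / 2

# The commenter's 7-point scenario:
print(population_mean(70))   # 92.5 (the comment rounds to 93)
print(population_mean(85))   # 100.0

# For a 30-point gain with the right half fixed at 115, the
# pre-gain population mean must be 70, so the left half must be:
before_left = 2 * (100 - 30) - 115
print(before_left)           # 25
```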

But we don’t have to speculate because we have actual data. The average Briton born in 1967 got 55 items right on the Raven test. Only 5% got 39 or less right (see Figure 2 and Table 1 of this document). From these data points we can crudely equate a Raven raw score of 55 with IQ 100 (for this age/cohort) and a Raven raw score of 39 with IQ 75, and assign all other scores very rough IQ’s through linear extrapolation (they don’t recommend this, but what choice do we have?).

Now, for the British born in 1877, the average score (IQ 100 for their cohort) was 24 and the top 10% (IQ 119 for their cohort) scored 39. But by the standards of the newer cohort, those are IQ’s of 51 and 75 respectively. Now James Flynn suggests that the IQ’s of the older cohort may have been depressed by up to 10 points by age (they took the test at age 65, while the new cohort took it at 25). The new cohort also got to take the test home, while the old cohort had to sit supervised in some room, which may have depressed their IQ’s another 5 points. Correcting for these factors brings them up to IQ 66 and IQ 90 respectively on the new norms. So summing up, an IQ of 66 on new norms equals an IQ of 100 on old norms (a difference of 34 points) and an IQ of 90 on new norms equals an IQ of 119 on old norms (a difference of 29 points). So maybe the Flynn Effect is 15% smaller for the brightest 10%, but it’s still roughly 2 standard deviations over 90 years.
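The raw-score-to-IQ mapping above can be sketched as a linear extrapolation between the two anchor points (raw 55 ↔ IQ 100 and raw 39 ↔ IQ 75 on the 1967 cohort's norms); as noted, the test's publishers don't recommend this, so treat it as rough:

```python
def raven_to_iq(raw, raw_hi=55, iq_hi=100, raw_lo=39, iq_lo=75):
    """Linearly map a Raven raw score to an IQ on the 1967
    cohort's norms using two anchor points (crude extrapolation)."""
    slope = (iq_hi - iq_lo) / (raw_hi - raw_lo)  # 25 IQ points per 16 raw points
    return iq_lo + slope * (raw - raw_lo)

print(raven_to_iq(24))  # ~51.6: the 1877 cohort's mean (rounded to 51 above)
print(raven_to_iq(39))  # 75.0: the 1877 cohort's top 10%
# Adding the +10 age and +5 supervision corrections discussed above
# brings these to roughly 66 and 90 on the new norms.
```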

But haven’t height gains been stronger at the low end?

Because psychologist Richard Lynn hypothesized that nutrition is responsible for the 20th century rise in both height and IQ test performance, height is a useful analogy for the Flynn Effect. According to Arthur Jensen (see page 354 of his book The g Factor), a sample of 8,585 British adult males was measured without shoes in 1883 and had a mean height of 67.46 inches. Looking at the actual frequency table (see pg 25 of this document), I was able to infer the tallest and shortest 5% and compare these figures with a 21st century sample of non-Hispanic white men from a Western country (see table 12 of this document):

              1883            2003-2006       Difference
Shortest 5%   63 inches       65.9 inches     2.9 inches
Average       67.46 inches    70.4 inches     2.94 inches
Tallest 5%    72 inches       74.9 inches     2.9 inches

So it seems height gains have affected virtually the entire height distribution roughly equally, rather than disproportionately impacting the low end. However I can’t be too sure of that, because I don’t know how representative the 1883 sample was; another source claims British men in the 19th century were 166 cm tall (which equates to 65.35 inches), so maybe huge masses of very short people were missed in some studies. However James Flynn claims that at least in Norway, height gains have been greater for tall people than for short people.

There also seems to be a myth that only the poor lower classes showed height gains since the 19th century, and that elites have always been tall. But British scientist Francis Galton collected height data on various occupations and found that males aged 26+ of the Professional occupation (which was very elite in those days) averaged 67.91 inches tall (see table 10 in HBD Chick’s blog post) which is well below the average of young white men today and probably even more below the average if compared to today’s elite young white men. This is strong evidence that all social classes were strongly affected by nutrition.

There’s been much discussion about a paper by scholars Michael A. Woodley, Jan te Nijenhuis, and Raegan Murphy, published in the prestigious journal Intelligence, arguing that genetic IQ in Western populations has declined by about 1 standard deviation since the 19th century. Although conventional IQ tests indicate people are getting smarter, the paper argues that simple reaction time (measured in milliseconds) is better for comparing people across centuries because although it’s a very crude measure of intelligence, it’s much less sensitive to the non-genetic factors that have caused the Flynn Effect.

A blogger named HBD Chick argued the paper went wrong by using poor sampling. The oldest study the paper cites was conducted by Francis Galton circa 1889. The paper was criticized for using this study because the sample is too elite to represent Victorians (only 4% of the sample was unskilled laborers, even though 75% of Victorians were). But if the vast majority of Victorians were unskilled laborers, then I suggest we use the mean reaction time of just this occupation to represent Victorians. Yes, excluding the top 25% of occupations might bias the estimate downward, but the fact that these were unskilled workers intellectually curious enough to volunteer for Galton’s study would bias the estimate upward, so it cancels out.

An analysis of Galton’s sample (see figure 10 & 11) shows that unskilled laborers aged 14-25 averaged reaction times of 195 milliseconds in males and 190 in females (an average of 192.5).

Do we have a modern sample of similarly aged people that are equally representative of a Western population? Yes. A 1993 study by W.K. Anger and colleagues (see table 2) found that a sample of American postal, hospital, and insurance workers, aged 16-25 (in three different cities) had reaction times of 260 in males and 285 in females (an average of 273). Just as unskilled labor was an average job in the 19th century, working for the post office, hospital or insurance company seems pretty average in modern times. Thus, by subtracting 192.5 from 273, we can estimate the average Western reaction time has slowed by 80.5 milliseconds since the 19th century. Since the standard deviation for reaction time is estimated to be 160.4 milliseconds (see section 3.2), reaction time has slowed by 0.50 standard deviations in over a century (equivalent to a drop of about 8 IQ points). This is actually virtually identical to the effect size the paper found using far more data points, but then they statistically adjusted the effect size, making it implausibly large. So in my humble opinion, the problem with the paper was not the samples they cited, but the statistical corrections they made.
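The effect-size arithmetic in this paragraph, sketched in Python (the reaction times and the 160.4 ms SD are the figures cited above):

```python
victorian_rt = (195 + 190) / 2  # 192.5 ms: Galton's unskilled workers, aged 14-25
modern_rt = 273                 # ms: rounded average of 260 (men) and 285 (women)
rt_sd = 160.4                   # estimated SD of simple reaction time, ms

slowing_ms = modern_rt - victorian_rt  # 80.5 ms
slowing_sd = slowing_ms / rt_sd        # ~0.50 SD
iq_equiv = round(slowing_sd * 15)      # ~8 IQ points at a 1:1 SD mapping

print(slowing_ms, round(slowing_sd, 2), iq_equiv)
```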

The paper argued that since simple reaction time has a true correlation of NEGATIVE 0.54 with general intelligence, they needed to divide the effect size by 0.54 to estimate the true decline in general intelligence. The logic was that since reaction time is a very rough measure of intelligence, it underestimates the true decline in genetic intelligence. I disagree. Such inferences only make sense if you know a priori that there’s been direct selection for lower intelligence, thus dragging reaction time along for the ride, but that’s an assumption the paper was supposed to test, not rest upon. It could be the opposite: there’s been selection for slower reaction time, thus dragging intelligence along for the ride, in which case the effect size should be multiplied by 0.54, not divided. Most likely, there’s just been recent selection for more primitive traits in general, and both reaction time and genetic intelligence reflect this dysgenic effect to parallel degrees, so the change in one equals the change in the other.

To illustrate the point further, consider that height has increased by 1.5 SD since the 19th century. Height correlates only 0.2 with IQ. Would it make sense to argue that since height is such a weak proxy for intelligence, we need to divide the 1.5 SD increase by 0.2 to estimate how intelligence has changed since the 19th century? By such logic, intelligence would have increased by 7.5 standard deviations since the 19th century (equivalent to 113 IQ points)!
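Here is the same point numerically (a sketch; the 0.50 SD slowing is my estimate from the previous section, and the 0.54 correlation magnitude is the paper's):

```python
rt_decline_sd = 0.50   # estimated slowing of simple reaction time, in SD units
r_rt_g = 0.54          # magnitude of the paper's true RT-g correlation

divided = rt_decline_sd / r_rt_g      # the paper's adjustment
multiplied = rt_decline_sd * r_rt_g   # if selection acted on RT itself

print(round(divided * 15))     # 14 IQ points (roughly the paper's figure)
print(round(multiplied * 15))  # 4 IQ points
print((1.5 / 0.2) * 15)        # 112.5: the absurd height-based version
```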

So clearly, the paper was unjustified in dividing the effect size by 0.54.

Without the adjustment, Victorians were genetically 8 points smarter than moderns, which sounds a lot more believable than 14 points.

But if Victorians had a genetic IQ of 108 (by modern standards), how could they score only 70 on the Raven IQ test? In a previous post I argued that the Raven is a culture fair IQ test and thus the low Victorian scores must be biological (Richard Lynn’s nutrition theory). Citing Richard Lynn, I also argued that malnutrition stunts non-verbal IQ (the Raven) more than verbal-numerical IQ (by a factor of 31, I estimate). So if sub-optimum nutrition stunted their non-verbal IQ by 38 points, their verbal-numerical IQ’s would have been stunted by only 38/31 ≈ 1 point. Thus the Victorians had a verbal-numerical IQ of 107; however, because verbal tests are so culturally biased, their verbal scores would be artificially depressed by lack of schooling and exposure to mass media. But on a culture reduced measure of verbal-numerical ability like Backwards Digit Span, they might have scored the equivalent of 107.

So with a culture fair verbal-numerical IQ (Backwards Digit Span) of 107, and a culture fair non-verbal IQ (the Raven) of 70, their overall IQs would have been about 86, which is 1.5 standard deviations below their genetic IQ of 108. That 1.5 SD gap between phenotype and genotype is perfectly explained by Richard Lynn's nutrition theory, since average height in Western countries was also 1.5 standard deviations lower in the 19th century. And the great accomplishments of Victorians can be explained by their verbal-numerical IQ of 107.
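The arithmetic above can be laid out in a few lines. All the figures (genetic IQ of 108, 38-point non-verbal deficit, the 31:1 stunting ratio, the 1.5 SD overall gap) are the post's own estimates, not established values:

```python
SD = 15
genetic_iq = 108                    # estimated Victorian genetic IQ, modern norms
nonverbal_deficit = 38              # nutritional stunting on the Raven
verbal_deficit = nonverbal_deficit / 31   # verbal stunted ~31x less (my estimate)

raven_iq = genetic_iq - nonverbal_deficit         # culture fair non-verbal: 70
verbal_iq = round(genetic_iq - verbal_deficit)    # culture fair verbal: ~107

# Overall phenotypic IQ sits 1.5 SD below the genetic level,
# mirroring the 1.5 SD height deficit of the 19th century.
overall_iq = genetic_iq - 1.5 * SD                # 85.5, i.e. about 86
print(raven_iq, verbal_iq, overall_iq)
```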

One of the biggest mysteries in psychology is the Flynn Effect: the fact that over the 20th century, people have been performing better and better on IQ tests. Of course, the average IQ in Western countries is by definition always about 100, however because people keep scoring higher every decade, the tests routinely have to be made more difficult and the norms must be regularly updated. Now if this only happened on culturally loaded tests like General Knowledge and Vocabulary, we could simply conclude that the tests are culturally biased against past generations who had less access to schooling and media. But some of the biggest gains have been found on tests like the Raven which were explicitly designed to be culture fair.

In one study (see figure 2) the top 10% of British people born in 1877 (by definition those with IQs above 120 for their era) performed the same on the Raven as the bottom 5% of British people born in 1967 (by definition those with IQs below 75 for their era). In other words, performance on the Raven had increased by the equivalent of 45 points in less than a century! Of course it wasn't a level playing field, because those born in 1877 took the test when they were a somewhat elderly 65, while those born in 1967 took the test as sharp young 25-year-olds. However, Flynn cites longitudinal studies showing that Raven-type reasoning declines by no more than 10 points by age 65. That still leaves us with 35 points to explain.

Another source of inaccuracy was that although the test was not timed for either group, those born in 1877 took the test supervised while those born in 1967 got to take the test home. This could potentially make a large difference; not necessarily because the unsupervised group would cheat, but because they would probably take more breaks since they were in the comfort of their homes. They would probably return to challenging items after they had time to relax and see those items from a fresh perspective, while those who took the test supervised in some strange room were probably more likely to rush through the tasks so they could go home. I would estimate that being allowed to take a test home improves test performance by about 5 IQ points on average, though this is just a guess.

But that still leaves a huge difference of 30 IQ points. It’s important to note that the British born in 1877 probably completed no more than eight grades of schooling on average, while those born in 1967 probably averaged more than 12 years of schooling, and not attending high school may reduce IQ scores (though probably not real intelligence) by 8 points. It may seem unlikely that schooling could influence a test that seems as culture fair as the Raven, but some people argue that the Raven is actually culturally biased. Richard Lynn argues that it requires basic math skills like addition and subtraction and believes the rise in education explains part of the adult Flynn Effect. At the very least, people with more schooling might be more likely to take the test from a mathematical perspective or with more motivation, confidence, and persistence.

So if we subtract the 8 point schooling effect, that still leaves 22 IQ points unexplained. Is it possible that real intelligence could have improved by 1.5 standard deviations since 1877? Scientist Stephen Hsu recently noted that male height in Europe also increased by 1.5 standard deviations since 1870, and he compared this to the Flynn Effect. It may seem unlikely that a variable as genetic as intelligence could be so improved by the environment, but height is even more genetic than intelligence is, so it’s obviously quite plausible. This brings us to Richard Lynn’s nutrition hypothesis for the Flynn Effect, which claims that not only has better nutrition made us taller, but smarter too. Included in Lynn’s nutrition hypothesis is disease reduction, since diseases impede nutrients.
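The running subtraction in the last few paragraphs can be summarized in one short calculation. Every adjustment here is the post's own estimate (the take-home bonus is explicitly a guess):

```python
total_gain = 45     # raw Raven gain, 1877 cohort vs 1967 cohort
aging = 10          # Raven decline by age 65, per Flynn's longitudinal data
take_home = 5       # advantage of unsupervised at-home testing (a guess)
schooling = 8       # effect of ~4 extra years of schooling on scores

unexplained = total_gain - aging - take_home - schooling
print(unexplained)  # 22 points, roughly 1.5 SD, attributed to nutrition
```

The residual of 22 points is what the nutrition hypothesis is being asked to explain, and it happens to match the 1.5 SD rise in height since 1870.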

How does nutrition make us smarter? Most obviously by increasing brain size, and cranial capacity may have increased an astonishing 150 cubic centimeters in the last 150 years. Of course as Arthur Jensen noted in his book The g Factor, brain size is only moderately correlated with intelligence, so the increased brain size could only have caused less than half of the Flynn Effect. However if nutrition has improved brain size by over 1.5 standard deviations, it likely improved other properties of the brain to the same degree; they’re just harder to measure and notice than something as visible as head size, but perhaps IQ tests are detecting them. For example Lynn notes animal studies showing nutrition affects the growth and number of glial brain cells, as well as neuron myelination, dendrite growth, and synaptic connections and points to human autopsy studies suggesting similar effects.

But if biological intelligence has increased by 1.5 standard deviations, then the average Victorian would have a mean IQ below 80 by today’s standards. An intelligence researcher found this hard to believe given their superior scientific and literary output. In response I made some of these points:

1) The effects of nutrition may have much less impact on the parts of the brain responsible for language (literature) and academic learning. Despite the 20th century rise in schooling, the Flynn Effect appears to be much more pronounced on non-verbal tests of abstract reasoning than on tests of crystallized knowledge, vocabulary and arithmetic. Lynn discusses the same pattern in identical twins, where one twin is born less nourished than the other. By age 15, the twins are equal on verbal-academic abilities, but the less nourished, smaller brained twin is worse on non-verbal fluid problem solving. In my opinion this makes perfect sense. Human survival depends more on our ability to take advantage of generations of cultural knowledge (crystallized intelligence) than on our ability to figure things out for ourselves (fluid intelligence), so when malnutrition strikes, evolution would prioritize and preserve the crystallized functions of the brain at the expense of fluid abilities. And perhaps no crystallized ability is more important than language and literacy, so it’s no surprise that literary abilities would be relatively preserved in the malnourished mind.

2) By what objective standard can we conclude that 19th century humans were more intellectually accomplished? Since the 19th century, humans have invented television, put a man on the moon, grown human ears on mice, used DNA to solve human crimes and map ancient human migrations, invented the internet, iPhones, iPads, GPS systems, military drones, video games, word processors, microwave ovens, magnetic resonance brain imaging, computer animation, 3-D printing, nuclear weapons, holograms…the list goes on and on. Of course we are building on the accomplishments of those who came before us in the 19th century, but every generation builds on past generations. Extrapolating from the rate of accomplishments through most of history, I suspect that 20th century accomplishments would exceed mathematical predictions, though I have no idea how to test such a hypothesis.

In addition to scientific advances, since the 19th century humans have become less superstitious (declining religiosity), less violent, and more moral (i.e. civil rights movement, feminism). Moral non-violent behavior indicates intelligence. Also, popular movies and television shows have become far more sophisticated, subtle, and nuanced in the last half century.

3) The fact that average 19th century intellect was much lower does not preclude the existence of 19th century geniuses any more than the fact that average 19th century height was much lower precludes the existence of 19th century giants like John Rogan or Edouard Beaupre.

There was an interesting article in The Telegraph several years ago about an Australian study that found height was positively correlated with income. For example, 6ft tall men typically earned about 1.5% more than 5'10" men. Taller women also out-earned shorter women, though the effect was only half as large. The study also found that fat people earned just as much as skinny people, which is surprising because being fat would seem to lower productivity more than being short, particularly in jobs that require energy and stamina. This may suggest that there's more discrimination against short people than there is against fat people, perhaps because fatness is seen by many as a lifestyle choice while shortness may be seen as an intrinsic flaw. Obesity is also a relatively recent epidemic caused by the advent of modern mass food production, and there may not have been enough time for humans to evolve an innate prejudice against it.

Despite the correlation between height and income, the article notes that Bill Gates and Warren Buffet (both of whom have been America’s richest man) both have a fairly average height of 5’10”. I suspect this is because both men earned their wealth through business income, not salary. It’s the latter that’s probably more correlated with height because high salaries often require being promoted to positions of leadership where people want someone they can look up to (figuratively and literally) while successful entrepreneurs are often shorter because they started the business themselves and don’t have to be promoted.