Heritability of IQ


Although the word “intelligence” has no single definition, it is universally associated with terms such as knowledge, mental capability, reasoning, judgment, imagination, cognitive function, and, occasionally, adaptation. Over the years, the most common notion of “intelligence” has simply been that it is a desirable, advantageous, and sought-after attribute.

Studies on the heritability of intelligence began over a century ago, but their conclusions can be questioned in light of new research in molecular biology, cell biology, and neuroscience connecting the brain to the immune system. Immune molecules actually help define how the brain functions. So, what is the overall impact of the immune system on our brain development and function?

The truth is that if you catch a particularly nasty virus when you are young, you are a prime candidate for developing an overactive brain that edges toward schizophrenia-like patterns, depending on which proteins are over- or under-produced once the virus starts operating its control functions.

After digging through the research on various "geniuses" throughout history, I found that a higher percentage of them had childhoods that more than likely produced similarly over-expressed memory-encoding activity in the brain, driven by viral infection. The resulting anxiety of a heightened immune system pushed them to solve the problems in front of them, any and all problems they could think about, because that is how the brain works when it is stressed: it tries to solve problems. However, the energy required to maintain that excessive brain activity and immune response will likely kill you. I don't think humans were meant to ponder the whole universe their entire lives, that is, to philosophize and grandiosize their own take on what it means to be alive. They are meant to learn new things and how to survive, but the brain isn't supposed to be on all the time.

The hereditarian theory of IQ is basically a math equation built on conjecture from old sociological and psychological papers. Heritability is defined as the proportion of variance in a trait that is attributable to genetic variation within a defined population in a specific environment.
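That variance-partitioning definition can be made concrete with a toy simulation. This is only an illustration of the arithmetic, not of how heritability is actually estimated (real studies infer it indirectly from resemblance among relatives, since the genetic component of a trait is not directly observable), and the component variances below are arbitrary numbers chosen for the example:

```python
import random

random.seed(0)

# Toy model: phenotype P = G + E, with independent genetic and
# environmental components. Heritability is h2 = Var(G) / Var(P).
N = 100_000
G = [random.gauss(0, 3) for _ in range(N)]  # genetic component, Var ~ 9
E = [random.gauss(0, 4) for _ in range(N)]  # environmental component, Var ~ 16
P = [g + e for g, e in zip(G, E)]

def var(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

h2 = var(G) / var(P)
print(f"h2 = {h2:.2f}")  # close to 9 / (9 + 16) = 0.36
```

Note that the number is a property of this particular population in this particular environment: shrink the environmental spread and h2 rises with no genetic change at all, which is exactly why the definition carries the "in a specific environment" clause.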

What we immediately notice is a long list of enormous variations in the tested IQs of genetically indistinguishable European peoples across temporal, geographical, and political lines, variations so large as to raise severe doubts about the strongly genetic-deterministic model of IQ favored by white supremacists and perhaps also quietly held by many others.

Consider, for example, the results from Germany obtained prior to its 1991 reunification. Lynn and Vanhanen present four separate IQ studies from the former West Germany, all quite sizable, which indicate mean IQs in the range 99–107, with the oldest 1970 sample providing the low end of that range. Meanwhile, a 1967 sample of East German children produced a score of just 90, while two later East German studies in 1978 and 1984 came in at 97–99, much closer to the West German numbers.

These results seem anomalous from the perspective of strong genetic determinism for IQ. To a very good approximation, East Germans and West Germans are genetically indistinguishable, and an IQ gap as wide as 17 points between the two groups seems inexplicable, while the recorded rise in East German scores of 7–9 points in just half a generation seems even more difficult to explain.

Can we really increase our intelligence? The answer is yes: at bottom, it is a matter of adjusting our thought-flow to work far more efficiently. A renowned article published in the journal Nature by Price and her colleagues challenged the immutable view of intelligence. The study followed 33 adolescents, who were 12 to 16 years old when it began. Price and her team gave them IQ tests, tracked them for four years, and then tested them again with the same instruments. The fluctuations in IQ were remarkable: not a couple of points, but 20-plus IQ points. These changes in IQ scores, according to the researchers, were not random; they tracked elegantly with structural and functional brain imaging. Thus, there is also an important group of scientists who maintain that many of the changes in IQ are correlated with changes in the environment, particularly schooling.

“It’s analogous to fitness. A teenager who is athletically fit at 14 could be less fit at 18 if they stopped exercising. Conversely, an unfit teenager can become much fitter with exercise.”

Furthermore, a number of studies have shown brain changes after several kinds of educational regimens. The study of London taxi drivers is an especially well-known one. Scientists gave memory, visual, and spatial-information tests to 79 male trainee London taxi drivers at the beginning of their training regimen and took MRI brain scans. At that point, no differences were found in their brain structure or memory. Three to four years later, however, the scientists found a considerable increase in grey matter in the posterior hippocampi of the 39 trainees who had qualified as taxi drivers. Naturally, this change was not observed in those who did not become taxi drivers. Studies of this kind suggest that the brain can change to accommodate new knowledge, so future programs for lifelong learning are possible.

To sum up, the old paradigms are crumbling under the onslaught of new research. Neuroscientists have, in a sense, simply taken over the elite, almost clerical office once held by analysts. It is not fully clear what intelligence is, and hence how to directly increase it. Nonetheless, for practical purposes we can consider intelligence a starting point in life. I tend to use the term "smart" for people who are quick-witted and use what they know with high accuracy: smart enough to pass tests that require quick and correct answers. Remember that "computer" was a term originally applied to human workers doing math work. Smart, knowledgeable, and intelligent all just describe applying what is learned to some set of problems. Knowledgeable is just how much one learns and can reference, though not necessarily a usable level of knowledge, as so many college students seem to demonstrate. And intelligence, without a qualifier, is just a term for knowing enough to survive in an environment; any agent that doesn't make the cut isn't very intelligent.

Naturally, we are born with certain capacities and particular features, but it is later in life that we discover and develop them, regardless of our individual genetic background. Thus, instead of frustratingly trying to increase your “G” factor (since we do not yet have a general consensus or determinative scientific evidence), what you can do is focus on your multiple crystallized intelligences: the ability to use skills, knowledge, and experience. If you are a scientist, observe and analyze information; if you are a philosopher, organize it and turn it into knowledge; if you are an artist, interpret it. Different areas of intelligence carry different weights in each person’s occupational life, and you can definitely get better at specific activities through practice and discipline.





They have found zero genes. Those studies on intelligence are not genetic studies, they are simply associations based on assumption. They assume effect sizes and then squeeze out as much as they can get with all kinds of formulas based on more assumptions. They provide zero genetic evidence, they are basically allele counting and making unsubstantiated claims.

Polygenic scores may be all the rage, but they are a fudge. A decade’s efforts to identify the individual bricks that build the house have failed; so let’s just throw a few thousand suspicious bits of material together and see if they at least make a start. Resting on a host of assumptions such as additivity and linearity of effects, the approach entails a highly naïve, indeed outmoded, view of the gene and of biological systems.

Polygenic scores can be slightly interesting from the point of view of animal and plant breeding, where they might actually be usable and measurable, but human GWAS is too much of a hot mess for these things to be even slightly interpretable. What conceivable use does a measure have when it can, maybe, sometimes explain 1% of the variance? Recall that polygenic scores are basically linear combinations of dozens of features, with no plausible reason to believe that those effects are statistically independent; so you’ve basically overfit the crap out of what little signal there is, and, gee whiz, it turns out the prediction is crap as well. The method entails a formidable battery of assumptions, data corrections, and statistical maneuvers. But the most fatal assumption is that human societies can be treated as random breeding populations in randomly distributed environments with equally random distributions of genes.
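The overfitting complaint is easy to demonstrate. The sketch below is a deliberately crude caricature, not a real GWAS pipeline (actual polygenic scores are built from per-SNP association estimates with various corrections); the sample sizes, SNP counts, and effect sizes are all invented to make the failure mode visible. Fit a linear combination of many weak predictors on a small training set, then score held-out data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Many candidate SNPs, few training samples, only a handful of tiny
# true effects: the classic recipe for an overfit linear score.
n_train, n_test, n_snps = 200, 2000, 150
X_train = rng.integers(0, 3, size=(n_train, n_snps)).astype(float)  # genotypes 0/1/2
X_test = rng.integers(0, 3, size=(n_test, n_snps)).astype(float)

beta = np.zeros(n_snps)
beta[:5] = 0.1  # true signal explains only a few percent of variance
y_train = X_train @ beta + rng.normal(0, 1, n_train)
y_test = X_test @ beta + rng.normal(0, 1, n_test)

# Fit one linear combination over all SNPs at once (ordinary least squares)
w, *_ = np.linalg.lstsq(X_train, y_train, rcond=None)

def r2(y, yhat):
    ss_res = np.sum((y - yhat) ** 2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    return 1 - ss_res / ss_tot

print(f"train R^2 = {r2(y_train, X_train @ w):.2f}")  # looks impressive in-sample
print(f"test  R^2 = {r2(y_test, X_test @ w):.2f}")    # collapses out of sample
```

The in-sample fit looks respectable while the out-of-sample score explains essentially nothing, because with nearly as many free weights as samples the model memorizes noise rather than signal.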

On the contrary, human populations reflect continuous emigration and immigration. Immigrants with related genetic backgrounds tend not to disperse randomly in the target society; in their flow to jobs they concentrate in particular social strata. This creates an entirely coincidental, non-causal correlation between social class and genetic background that persists across many generations. For example, the Wellcome Trust’s “genetic map of Britain” shows strikingly different genetic admixtures among residents of different geographic regions of the United Kingdom.
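That confound can be simulated in a few lines. In this toy model (all numbers invented for illustration), an allele has no effect on the trait within either subpopulation, but because allele frequency and environment both differ between the two groups, pooling them manufactures an association out of thin air:

```python
import random

random.seed(1)

# Two subpopulations: the allele is more common in pop B, and pop B also
# has a lower trait mean for purely environmental reasons. The allele has
# NO causal effect on the trait in either group.
def genotype(freq):
    return (random.random() < freq) + (random.random() < freq)  # 0, 1, or 2 copies

pop_a = [(genotype(0.2), random.gauss(100, 15)) for _ in range(5000)]
pop_b = [(genotype(0.8), random.gauss(90, 15)) for _ in range(5000)]

def corr(pairs):
    xs, ys = zip(*pairs)
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return sum((x - mx) * (y - my) for x, y in pairs) / (sx * sy)

print(f"within pop A: r = {corr(pop_a):+.2f}")          # near zero: no real effect
print(f"within pop B: r = {corr(pop_b):+.2f}")          # near zero: no real effect
print(f"pooled:       r = {corr(pop_a + pop_b):+.2f}")  # clearly nonzero: spurious
```

Within each group the correlation is statistical noise; only the pooled sample shows an "effect," and it is entirely an artifact of stratification.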

This is what is called “population structure.” As Evan Charney notes, it is “omnipresent in all populations and it wreaks havoc with assumptions about ‘relatedness’ and ‘unrelatedness’ that cannot be ‘corrected for’ by the statistical methods [devised]”. History shows that anyone committed to a “genes as destiny” narrative, and a mythological meritocracy, based on nothing but mountains of correlations, needs to tread very cautiously.


Ken Richardson’s new book, Genes, Brains and Human Potential: The Science and Ideology of Intelligence, will be published in early 2017 by Columbia University Press. Can’t wait!
