Heritability of IQ


Although the word “intelligence” has no single definition, it is universally associated with terms such as knowledge, mental capability, reasoning, judgment, imagination, cognitive function, and, occasionally, adaptation. Over the years, the most common notion of “intelligence” has simply been that it is a desirable, advantageous, and sought-after attribute.

Studies on the heritability of intelligence, which began over a century ago, can be dismissed in light of new research in molecular biology, cell biology, and neuroscience connecting the brain to the immune system. Immune molecules actually help define how the brain functions. So what is the overall impact of the immune system on our brain development and function?

The truth is that if you catch a particularly nasty virus when you are young, you are a prime candidate for developing an overactive brain that edges toward schizophrenia-like patterns, depending on which proteins are over- or under-produced once the virus starts operating its control functions.

After digging through the research on various "geniuses" throughout history, I found that a high percentage of them had childhoods that more than likely produced similarly over-expressed memory-encoding activity in the brain due to viruses. The resulting anxiety of a heightened immune system pushed them to solve the problems in front of them, any and all problems they could think about, because that is how the brain works when it is stressed: it tries to solve problems. However, the amount of energy required to maintain that excessive brain activity and immune activation will likely kill you. I don’t think humans were meant to ponder the whole universe their entire lives, that is, to philosophize and grandiosize their own take on what it means to be alive; they are meant to learn new things and how to survive, but the brain isn’t supposed to be on all the time.

The hereditarian theory of IQ is basically a math equation built on conjecture from old sociological and psychological papers. Heritability is defined as the proportion of variance in a trait that is attributable to genetic variation within a defined population in a specific environment. The old paradigms are crumbling under the onslaught of new research. Neuroscientists have, in a sense, simply taken over the elite, almost clerical office once held by analysts.
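
For reference, that definition reduces to simple arithmetic. Below is a minimal sketch of the classical twin-study calculation (Falconer's formula), with the twin correlations invented purely for illustration; it shows how the estimate is produced by assumption-laden algebra rather than by any measured gene.

```python
# Minimal sketch of the classical twin-study arithmetic (Falconer's formula).
# The correlation values below are made up for illustration only.

r_mz = 0.74  # hypothetical IQ correlation between monozygotic (identical) twin pairs
r_dz = 0.46  # hypothetical IQ correlation between dizygotic (fraternal) twin pairs

# Heritability estimate: twice the difference between the twin correlations.
h2 = 2 * (r_mz - r_dz)

# Shared-environment estimate: whatever MZ similarity is not "explained" by h2.
c2 = r_mz - h2

# Non-shared environment (plus measurement error): the remainder.
e2 = 1 - r_mz

print(f"h^2 = {h2:.2f}, c^2 = {c2:.2f}, e^2 = {e2:.2f}")
# The whole estimate rests on assumptions (equal environments, additivity,
# no gene-environment interaction), not on any identified gene.
```

Change the assumed correlations slightly and the resulting "heritability of IQ" swings accordingly.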

Naturally, we are born with certain capacities and particular features, but it is later in life that we discover and develop them, regardless of our individual genetic background. Thus, instead of trying in frustration to increase your “g” factor (since we do not yet have a general consensus or conclusive scientific evidence), what you can do is focus on your multiple crystallized intelligences: the ability to use skills, knowledge, and experience. If you are a scientist, observe and analyze information; if you are a philosopher, organize it and turn it into knowledge; if you are an artist, interpret it. Different areas of intelligence carry different weights in each person’s occupational life, and you can definitely get better at specific activities through practice and discipline.

Data:

https://nootropix.com/can-we-really-increase-our-intelligence/

Exactly

They have found zero genes. Those studies on intelligence are not genetic studies; they are simply associations based on assumptions. They assume effect sizes and then squeeze out as much as they can with all kinds of formulas built on more assumptions. They provide zero genetic evidence; they are basically allele counting and making unsubstantiated claims.

Polygenic scores may be all the rage, but they are a fudge. A decade’s efforts to identify the individual bricks that build the house have failed, so the move is: let’s just throw a few thousand suspicious bits of material together and see if they at least make a start. With its host of assumptions, such as additivity and linearity of effects, the approach entails a highly naïve, indeed outmoded, view of the gene and of biological systems.

Polygenic scores can be slightly interesting from the point of view of animal and plant breeding, where they might actually be usable and measurable, but human GWAS is too much of a hot mess for these things to be even slightly interpretable. What conceivable use does a measure have when it maybe, sometimes, can explain 1% of the variance? Recall that polygenic scores are basically linear combinations of dozens of features, with no plausible reason to believe that these effects are statistically independent, so you have basically overfit the crap out of what little signal there is, and, gee whiz, it turns out the prediction is crap too. The method entails a formidable battery of assumptions, data corrections, and statistical maneuvers. But the most fatal assumption is that human societies can be treated as random-breeding populations in randomly distributed environments with equally random distributions of genes.
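
To make the "allele counting" point concrete, here is a minimal sketch of what a polygenic score computation amounts to; the SNP identifiers, effect sizes, and genotype below are invented for illustration only.

```python
# Minimal sketch of a polygenic score: a weighted sum of allele counts.
# SNP identifiers, effect sizes, and genotype are invented for illustration.

# GWAS "effect sizes" (betas) per SNP, each explaining a tiny sliver of variance.
effect_sizes = {"rs0000001": 0.021, "rs0000002": -0.013, "rs0000003": 0.008}

# One person's genotype: count of the "effect" allele at each SNP (0, 1, or 2).
genotype = {"rs0000001": 2, "rs0000002": 1, "rs0000003": 0}

# The entire method is this linear, additive combination, nothing more.
polygenic_score = sum(effect_sizes[snp] * genotype[snp] for snp in effect_sizes)

print(f"polygenic score = {polygenic_score:.3f}")
# Every assumption criticized above is baked in here: additivity across loci,
# linearity of each effect, and independence of the SNPs from one another
# and from the environment in which the betas were estimated.
```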

On the contrary, human populations reflect continuous emigration and immigration. Immigrants with related genetic backgrounds tend not to disperse randomly in the target society. In their flow to jobs, they concentrate in different social strata. This creates entirely coincidental, non-causal correlations between social class and genetic background that persist across many generations. For example, the Wellcome Trust’s “genetic map of Britain” shows strikingly different genetic admixtures among residents of different geographic regions of the United Kingdom.

This is what is called “population structure.” As Evan Charney notes, it is “omnipresent in all populations and it wreaks havoc with assumptions about ‘relatedness’ and ‘unrelatedness’ that cannot be ‘corrected for’ by the statistical methods [devised]”. History shows that anyone committed to a “genes as destiny” narrative and a mythological meritocracy, based on nothing but mountains of correlations, needs to tread very cautiously.
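
The stratification problem is easy to demonstrate with a toy simulation: give two subpopulations different allele frequencies (ancestry) and different average outcomes (for purely social reasons), and a causally inert allele will "predict" the outcome in the pooled sample. All numbers below are invented.

```python
# Sketch: how population structure manufactures a gene-trait correlation
# with zero causal effect. All frequencies and means are invented.
import random

random.seed(0)

def simulate_person(allele_freq, outcome_mean):
    """One person: allele count (0/1/2) drawn from allele_freq, and an
    outcome driven purely by the group's environment, never by the allele."""
    allele_count = sum(random.random() < allele_freq for _ in range(2))
    outcome = random.gauss(outcome_mean, 1.0)
    return allele_count, outcome

# Two subpopulations: different allele frequencies (ancestry) and different
# average outcomes (social stratification), but the allele itself does nothing.
group_a = [simulate_person(0.7, 1.0) for _ in range(5000)]
group_b = [simulate_person(0.3, 0.0) for _ in range(5000)]
pooled = group_a + group_b

def correlation(pairs):
    """Pearson correlation between allele count and outcome."""
    n = len(pairs)
    mx = sum(x for x, _ in pairs) / n
    my = sum(y for _, y in pairs) / n
    cov = sum((x - mx) * (y - my) for x, y in pairs) / n
    sx = (sum((x - mx) ** 2 for x, _ in pairs) / n) ** 0.5
    sy = (sum((y - my) ** 2 for _, y in pairs) / n) ** 0.5
    return cov / (sx * sy)

print(f"pooled correlation: {correlation(pooled):.2f}")   # clearly nonzero
print(f"within group A:     {correlation(group_a):.2f}")  # approximately zero
print(f"within group B:     {correlation(group_b):.2f}")  # approximately zero
# The pooled "association" is an artifact of stratification, exactly the
# confound that GWAS corrections are supposed to, but cannot fully, remove.
```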

---

Ken Richardson’s new book, Genes, Brains and Human Potential: The Science and Ideology of Intelligence, will be published in early 2017 by Columbia University Press. Can't wait!