Human Intelligence And Economic Growth From 50,000 B.C. To The Singularity

  • Rising human intelligence has been the underlying driver of long-term economic growth. Before the Industrial Revolution, gains in intelligence were the product of cultural, and perhaps genetic, changes.
  • Once humanity was freed from the Malthusian trap, a virtuous cycle began in which higher levels of material prosperity led to improved environmental conditions, including better nutrition and education. This, in turn, led to further gains in intelligence.
  • Although these environmental factors, at least in the developed world, appear to be stalling, technology offers a way forward.
  • If Moore’s Law continues to hold, computers will be able to simulate a human brain by 2030. This, along with breakthroughs in genetics and nanotechnology, could boost human intelligence by a previously unfathomable magnitude, ultimately triggering a “technological singularity”.
  • Long-term investors should overweight tech stocks, whose valuations now stand at multi-decade lows compared with the broader market. Within the tech sector, BRAIN stocks – Biotech, Robotics, Artificial Intelligence, Nanotechnology – will be the leaders of the next great tech boom.

The Dawn Of Plenty

For over 50,000 years, humans lived on the edge of subsistence. Then, within the course of a few hundred years, living standards skyrocketed. Why?

This question has occupied economic historians for decades. The reigning explanation, which has not changed much since the writings of Adam Smith, hinges on the interaction between the expansion of market institutions and technological change. Sometime around the turn of the 18th century, a virtuous circle began to emerge. A rise in agricultural yields, largely on account of better farming techniques, permitted more people to migrate to cities and towns. There, they engaged in a division of labor that allowed them to specialize in all manner of work, helping to further expand the technological frontier. The resulting economic growth allowed even more people to move to cities, leading to an even greater expansion of commerce, and so on.

There is much truth to this theory, but it has some loose ends. Why, in particular, did this dynamic unfold in the 18th century rather than, say, the 12th? One answer is that England at the dawn of the Industrial Revolution was a very different country from the England of earlier centuries. For example, by 1800, 60% of men and 40% of women in England were literate, up from 10% three centuries earlier.

The threat of violence also receded. Chart III-1 shows that homicide rates in England fell by over 90% between 1200 and 1800, a pattern that was repeated in continental Europe. In addition, real interest rates declined significantly, as property rights improved and as cultural norms that emphasized thrift over instant gratification took hold. Lower real rates, in turn, allowed for greater capital accumulation. The groundwork for the Industrial Revolution had been laid.

Chart III-1: The Long-Term Decline In Homicide Rates

All these observations, however, just push the question back one layer. What exactly caused this social transformation? It was not because people became richer. As Chart III-2 shows, these developments took place during a period when England was still stuck in a Malthusian trap, with gains in output being largely absorbed by a rising (though more urbanized) population. The average English worker in 1800 may have been more literate, peaceful and thrifty, but as measured by daily calorie consumption, household possessions, life expectancy and physical height, he was no better off than a worker 600 years earlier.

Chart III-2: Escape From The Malthusian Trap

The invention of the printing press in 1439 clearly helped to facilitate the spread of literacy. But Gutenberg’s invention did not occur in a vacuum. Rather, it was part of an acceleration in scientific knowledge that began with the Renaissance and continued through the Enlightenment. What was the underlying driver of these developments?

In his book, A Farewell To Alms, economic historian Gregory Clark offered a novel theory. Using historical data gathered from thousands of wills, Clark showed that members of skilled professions, who tended to be the most literate, had about twice as many surviving children as unskilled workers. Indeed, the fledgling middle class of the time had even more surviving children than the aristocracy, who were often out fighting wars. Since real incomes were broadly unchanged in the centuries leading up to the Industrial Revolution, there must have been downward social mobility within the population, with sons, on average, being less wealthy than their fathers. This downward mobility, in turn, spread the virtues of literacy and thrift to the bottom rungs of the income ladder. Indeed, so powerful was this transmission mechanism that the wages of craftsmen relative to laborers fell by a third between 1200 and 1800, implying that the supply of skilled labor grew more quickly than the demand for it over this period.

A Smarter World

Although Clark’s work focuses on how middle class values diffused throughout the population, he does note that the transformation may have also occurred at a genetic level. The idea is not as crazy as it sounds. A few years ago, a group of scientists, using newly available data from the HapMap project, published a paper arguing that human evolution accelerated 100-fold starting around 10,000 years ago and has continued unabated to this day.1

This acceleration was the result of two things. Firstly, the shift from a hunter-gatherer to an agrarian society created a massive evolutionary disequilibrium, causing all sorts of new selective pressures to emerge. Secondly, the accompanying surge in the human population increased the number of potentially favorable mutations. Some of these mutations – such as the one for lactose tolerance that allows modern humans to consume milk into adulthood – are well known. Others are just now being identified. Considering that about half the human genome relates to various aspects of brain function, it is likely that many of these mutations had a bearing on how humans think and act.

Could people have actually gotten smarter in the centuries leading up to the Industrial Revolution? It is a provocative question and, so far, the evidence is far from conclusive. But one thing we know is that IQ is highly heritable. In a comprehensive 1996 review of the available literature, the American Psychological Association concluded that heredity accounts for about 75% of the variation in IQ after adolescence. A more recent meta-analysis of various IQ studies concluded that genes account for about 85% of the variation in adult IQs, suggesting that IQ is about as heritable as height.2 We do not know if more intelligent people, on average, were richer in the Middle Ages, but we do know that this is true today (Chart III-3). If the same relationship existed in the pre-Industrial era, as Clark’s work suggests was the case, then it is possible that human intelligence evolved in a way that facilitated the eventual economic explosion that we associate with the Industrial Revolution.

Chart III-3: IQ Tends To Be Positively Correlated With Income And Wealth

The Flynn Effect

Over the course of the 19th century, fertility rates in the U.K. and many other countries declined among higher income households. At the same time, lower income households saw a rapid improvement in health, mainly on account of better nutrition and sanitation. This led to a reversal in historic fertility trends, as the poor began to have more surviving children than the rich. This transformation was not lost on much of the intelligentsia of the time, and with the publication of Darwin’s On the Origin of Species, the eugenics movement was born. Although the movement is now largely associated with the far-right, in its heyday, it was mainly a progressive obsession, with Theodore Roosevelt, George Bernard Shaw, Margaret Sanger and John Maynard Keynes among its adherents.

Yet, a funny thing happened on the way to idiocracy. Contrary to the eugenicists’ fears, instead of becoming dimmer, human beings became even smarter. Psychologist James R. Flynn was amongst the first to document this phenomenon. Flynn noted that each generation to take an IQ test tended to do better than the previous generation. He calculated that, on average, IQ scores were increasing by several points per decade, and that the gains in IQ scores were the same for men and women as well as across all racial groups.

What accounts for the Flynn effect? Since natural selection, if anything, would have worked in the other direction over recent history, the Flynn effect must have been entirely driven by environmental factors. But which ones? Better schooling is clearly part of the explanation. The human brain is a bit like a muscle: the more you use it, the stronger it gets. However, there appears to be more to the story than that. For one thing, most of the IQ gains that Flynn documented show up by the age of four, which is about as early as IQ can be reliably measured and before most children start school. In addition, the increase in scores has been concentrated in components of IQ tests that deal mainly with abstract logic and spatial skills. In contrast, scores in arithmetic, writing and general knowledge – the sort of subject matter most affected by schooling – have seen only modest improvements.

Perhaps most tellingly, the increase in IQ scores has occurred alongside an increase in average brain mass. Forensic evidence from the U.S. suggests that the average volume of adult human skulls has increased by 7% since the late 1800s – an increase roughly the size of a tennis ball. Considering that dozens of studies have confirmed a positive correlation between IQ and brain mass, this suggests that the Flynn effect may be partly physiological in nature.

The increase in brain mass, in turn, was almost certainly driven by better health and nutrition. Over the past century, the average height of adults in developed economies has increased by around 10 centimeters, largely, one presumes, because of better health (Chart III-4). A reduction in the prevalence of infectious diseases – made possible by better sanitation and vaccinations – has also helped (Chart III-5). For example, a number of studies have documented a strong relationship between the timing of malaria eradication in the U.S. and other parts of the world and subsequent observed gains in childhood IQs.

Chart III-4: Health Gains Have Led To Height Gains

Chart III-5: Living Longer

The End Of The Flynn Effect?

The problem with environmental effects is that, sooner or later, they run into diminishing returns. This appears to be happening with the Flynn effect. Several studies of Danish and Norwegian conscripts suggest that IQ scores stabilized in the mid-1990s and have declined somewhat since then. Interestingly, this seems to have coincided with a leveling off in the average height of incoming recruits. A recent study by Flynn himself indicated that IQ among British teenagers has fallen by about six points since the 1980s.3

How much of a problem does this pose for the long-term outlook for economic growth? To be sure, despite some evidence that the Flynn effect has stalled, underlying productivity growth has held up reasonably well over the past 20 years. That said, one should keep in mind that the composition of today’s labor force reflects the educational achievement of graduates from as far back as the 1950s. A recent analysis by James Heckman suggests that the U.S. high-school graduation rate, if properly measured, peaked about 40 years ago.4 The current generation will almost certainly be the first in American history that is less educated than the preceding one. This is bound to have an adverse impact on growth.

Better schooling can help matters. However, one should be realistic about what can be achieved. Scores on standardized tests of math, science and reading among U.S. high school students have shown no improvement over the past 20 years, despite a 50% increase in real spending per pupil (Chart III-6). More money, of course, is not always the answer. Yet, despite all the rhetoric, the available data on the efficacy of ‘school reform’ are not particularly encouraging. A recent high-level report by the Department of Education implicitly concluded as much, noting that “The panel did not find any empirical studies that reached the rigor necessary to determine that specific turnaround practices produce significantly better academic outcomes.”5 For the most part, if there is a problem with U.S. schools, the problem is with the students and their parents. Public policy can do little about that.

Chart III-6: Spending More But Not Learning More

The analysis above would seem to support the views of productivity-sceptics such as Robert Gordon, who recently argued that trend U.S. growth was likely to fall by at least half over the coming decades.6 Yet, there is another side to the coin, a potentially much more optimistic one.

Firstly, while the Flynn effect may be disappearing in developed countries, it likely has much farther to go in developing economies. For example, two recent studies have documented significant gains in measured IQs in Kenya and Brazil. Even small steps such as fortifying salt with iodine (which costs about five cents per person per year) have been shown to boost IQ by as much as 15 points. As health standards in developing countries improve, further gains in IQ are likely. And to the extent that the world economy increasingly draws on a global pool of talent, this will help prop up productivity growth.

Secondly, while the environmental catalysts for the Flynn effect in industrialized economies may have disappeared, another catalyst – a potentially much greater one – looms on the horizon: technology.

The Path To The Singularity

Twenty miles outside Shenzhen, the future is unfolding. BGI-Shenzhen, formerly known as the Beijing Genomics Institute, is one of China’s leading genomics firms. It employs 4,000 scientists and owns more next-generation DNA sequencers than any other firm in the world. The company’s rather lofty goals include extending the human lifespan by five years, decoding half of all genetic diseases, and increasing global food production by 10%. But it is its cognitive genomics project that may prove to be its most revolutionary. BGI is sequencing the entire genomes of 1,000 ultra-high-IQ individuals, hoping to isolate the alleles that contribute to human genius.

They will probably succeed. We know that certain gene variants tend to generate higher IQs. In some cases, scientists have stumbled upon these genes because they are associated with known genetic diseases. For example, carriers of the gene for torsion dystonia appear to have IQs that are around ten to twenty points higher than those of the general population. In keeping with the analysis in the previous section, many of these gene variants appear to be quite new (in many cases, no more than 1,000 years old) and so have not had time to be optimized by evolution. No doubt, scientists will be looking for ways to push nature along.

In 2001, it cost $95 million to sequence a human genome. By 2008, the cost had fallen to $3 million. As Chart III-7 shows, the cost is now only $7,000. As plunging prices make genetic technologies increasingly accessible, they will be deployed on an ever grander scale. Databases will be created that correlate thousands of individual genomes with life outcomes. Using existing statistical techniques, prospective parents will be able to receive information about the likely physical and psychological traits of their children well before they are born.
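To get a sense of just how fast these prices are collapsing, the short sketch below computes the implied halving time of sequencing costs from the three figures cited above; the only added assumption is that “now” means roughly 2013.

```python
# Implied halving time of genome-sequencing costs, using the figures
# cited above: $95 million in 2001, $3 million in 2008 and ~$7,000
# "now" (assumed here to be roughly 2013).
import math

points = [(2001, 95e6), (2008, 3e6), (2013, 7e3)]

for (y0, c0), (y1, c1) in zip(points, points[1:]):
    halvings = math.log2(c0 / c1)          # number of times the cost halved
    months = (y1 - y0) * 12 / halvings
    print(f"{y0}-{y1}: cost halved every {months:.0f} months")
```

On these figures, sequencing costs halved roughly every 17 months between 2001 and 2008, and roughly every 7 months thereafter – a pace far faster than Moore’s Law.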

Chart III-7: Massive Decline In DNA Sequencing Costs

Given the horrific history of eugenics in the 20th century, concern about how such new technologies will be used (or abused) is entirely justified. But the point of this report is not to discuss what should happen, but what will happen. Once people have the ability to select for traits in their children that they find desirable, the reality is that many will do so.

Some governments may try to forestall such developments. Others may actively encourage them, if only due to geopolitical imperatives. If widely adopted by an entire country, even an incredibly blunt strategy that selects for the “highest potential IQ” embryo from a sample of ten has the potential to raise average IQs by five points per generation. Over the course of a century, that would correspond to a 40-fold increase in the share of a population that could be classified as “genius” (IQ over 145). In the 20th century, battles were fought with tanks and missiles. In the 21st century, they will be waged with test tubes and gene sequencers.
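The arithmetic linking a five-point gain per generation to a roughly 40-fold rise in the genius share is easy to check. The sketch below takes the five-point gain as given and layers on two illustrative assumptions that do not come from this report: IQ is normally distributed with a mean of 100 and a standard deviation of 15, and a century contains four generations.

```python
# Back-of-the-envelope check of the "40-fold" claim. The five-point
# gain per generation is taken from the text; the normal IQ
# distribution (mean 100, SD 15) and four generations per century
# are illustrative assumptions.
from scipy.stats import norm

GAIN_PER_GENERATION = 5.0   # IQ points, as stated above
GENERATIONS = 4             # assumed generations per century
GENIUS_CUTOFF = 145.0       # "genius" threshold used above

shift = GAIN_PER_GENERATION * GENERATIONS              # +20 points
share_now = norm.sf(GENIUS_CUTOFF, loc=100, scale=15)  # ~0.13% of people
share_then = norm.sf(GENIUS_CUTOFF, loc=100 + shift, scale=15)

print(f"share above {GENIUS_CUTOFF:.0f}: {share_now:.3%} -> {share_then:.2%}")
print(f"increase: ~{share_then / share_now:.0f}-fold")
```

With four 25-year generations, the genius share rises roughly 35-fold; with five shorter generations, it rises nearly 70-fold. The 40-fold figure sits comfortably within that range.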

In any case, many of these developments will happen as a by-product of medical progress, making it difficult for governments to impede them. For instance, by adding an additional copy of the NR2B gene, scientists have been able to significantly boost the memory of common field mice. There are now over 30 variants of such genetically modified Doogie mice (yes, named after Doogie Howser, MD). This research will likely help treat debilitating diseases such as Alzheimer’s. But, in time, it will also be used to boost the memory of otherwise perfectly healthy people.

Rise Of The Machines

Progress is sometimes hard to see when it is unfolding before one’s eyes. When new technologies become available, we are quick to notice their flaws. Yet, by the time the bugs have been ironed out, we are often so used to the new technology that we no longer regard it as novel. Such was the case with the first generation of cell phones, which were as big as briefcases and notoriously unreliable. The fact that people now walk around with mobile phones that resemble Star Trek “communicators” just does not seem all that special.

The next generation of cell phones will begin to blur the distinction between man and machine. Google is set to launch a pair of “internet glasses” early next year. Eventually, such devices will evolve to the point that images are projected directly onto the retina. After a while, neural implants will allow users to surf the web from the comfort of their own brains. As the famed futurist Ray Kurzweil has noted, people in the future might think it bizarre that anyone could go a day without backing up their memories and thoughts to the cloud.

The driving force behind these advancements has been the declining cost of information technology. Most people are aware that the number of transistors on a computer chip has doubled roughly every two years since the 1960s, an observation known as Moore’s Law. However, as Kurzweil points out, the era of integrated circuits is the fifth in a series of computing paradigms that stretch back to the late-nineteenth century. The first paradigm, best exemplified by the punch-card tabulators used in the 1890 U.S. census, was based on fairly simple electromechanical devices. These were eventually supplanted by relays, followed by vacuum tubes, then transistors, and finally integrated circuits. The upper limits of microprocessor technologies, based on ever-finer photolithography, will probably be exhausted by the end of this decade, ushering in a sixth paradigm, perhaps based on optical, quantum or DNA computing.

As computers become more powerful, they will also shrink in size. IBM’s Watson, which made history by beating two of Jeopardy!’s all-time champions in 2011, was the size of a large bedroom. Now it is the size of a pizza box, has nearly three times the processing speed of its predecessor, and spends much of its day diagnosing cancer.7 Miniaturization, in turn, will lead to a whole host of new applications. Already, neural implants are successfully being employed to treat Parkinson’s disease. Specially-engineered nanoparticles have also been used to cure Type 1 diabetes in mice. Eventually, nanobots will course through our bloodstreams, repairing defective genes and obliterating cancer cells.

A Phase Transition For Humanity?

Nobody cares if you can run the 100 meter dash in 11 or 12 seconds. However, people will care if you run it in 9 seconds rather than 10, because the difference between the two is an Olympic gold medal. By the same token, water is just water whether its temperature is 80 or 90 degrees Celsius. But when the temperature hits 100 degrees, a “phase transition” occurs: it becomes steam. Humanity may be approaching such a phase transition.

A modern desktop computer barely has the computational power of a mouse. It took over a century just to achieve that. Given the nature of exponential progress, however, computers will have the processing power of a human brain by the mid-2020s (Chart III-8). This, it should be noted, is within the time horizon of many long-term money managers.
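A minimal sketch of this extrapolation follows. Every number in it is an assumption rather than data from this report: the brain’s capacity (~10^16 operations per second) is a Kurzweil-style estimate, the desktop figure is a rough order of magnitude, and the one-year doubling time reflects Kurzweil’s argument that the price-performance of computation has lately been doubling faster than the classic two-year Moore’s Law cadence.

```python
# When does a desktop computer match the human brain? All three
# parameters below are assumptions, not data from this report.
import math

BRAIN_OPS = 1e16       # assumed human-brain capacity, ops per second
DESKTOP_OPS = 1e11     # assumed capacity of a circa-2013 desktop
DOUBLING_YEARS = 1.0   # assumed doubling time of price-performance

doublings = math.log2(BRAIN_OPS / DESKTOP_OPS)   # ~16.6 doublings needed
year = 2013 + DOUBLING_YEARS * doublings
print(f"{doublings:.1f} doublings -> brain-scale desktop around {year:.0f}")
```

On these particular assumptions, the crossover arrives around 2030; a modestly faster doubling time, or a lower estimate of the brain’s capacity, pulls it into the mid-2020s.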

Chart III-8: Moore’s Law: Over 100 Years And Going Strong

Once this threshold is crossed, the goal of creating human-like artificial intelligence – once a science-fiction dream – may become a reality. In fact, scientists have already modeled and simulated over 20 regions of the brain, including the cerebellum and the auditory and visual cortexes. By 2030, whole-brain emulation will be possible.

The widespread use of genetic technologies and implantable neural chips that enhance mental performance will feed on itself. Once humans become smarter, they will figure out ways to become smarter still, leading to unimaginable exponential technological progress. The resulting “technological singularity”, an idea first articulated by John von Neumann – who himself may have been the smartest person to ever live – will be the most profound development in human history.

Investment Conclusions

We do not know, and cannot know, what will happen once we cross this event horizon. Technology is a double-edged sword. Fire can keep you warm, but it can also burn down your home. For all its promise, exponential technological progress also means that humans may invent increasingly ghastly ways to annihilate themselves. The fact that there are 300 billion stars in our galaxy and we have yet to find evidence of intelligent life near any of them suggests that post-singularity civilizations, if they occur, do not last very long. Then again, the earth has been around for 4.5 billion years, roughly a third of the age of the universe, and our galaxy will keep pumping out stars for another 10 trillion years. Humans may simply have arrived at the cosmic party very early.

If we do survive the singularity, the resulting economic boom will be spectacular. Humanity has, in fact, experienced two quasi-singularities in the past: the Agricultural Revolution around 10,000 B.C., which raised real output growth by around 100-fold over the Paleolithic era; and the Industrial Revolution, which raised growth by a further 30-fold, from around 0.1% between 1000 B.C. and 1850, to about 3% over the past century (Chart III-9). If the singularity leads to a further 15-fold increase in global growth, that would allow real GDP to double every two years. Such a rate of growth would be tantamount to applying Moore’s Law to the entire economy. If corporate profits grew in line with GDP, then after 20 years, the Dow would reach roughly 30 million. Booyah.
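The arithmetic is easy to verify, as in the sketch below. The only input not taken from the paragraph above is the Dow’s starting level, assumed here to be its approximate mid-2013 value of 15,000.

```python
# Checking the growth arithmetic above. The 3% baseline and the
# 15-fold multiplier come from the text; the Dow's ~15,000 starting
# level is an assumed mid-2013 value.
import math

growth = 0.03 * 15                             # 45% real growth per year
doubling = math.log(2) / math.log(1 + growth)
print(f"doubling time: {doubling:.1f} years")  # ~1.9 years

dow = 15_000 * (1 + growth) ** 20              # profits assumed to track GDP
print(f"Dow after 20 years: ~{dow / 1e6:.0f} million")
```

That works out to roughly 25 million; start the Dow a little higher, or let growth run slightly hotter, and the “roughly 30 million” figure follows.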

Chart III-9: Global Growth From The Dawn Of Humanity

Such rates of growth may seem unimaginable, if not outright silly. Perhaps, but from the standpoint of someone living in 1800, most everything that transpired over the subsequent 200 years would have seemed pretty silly too. The future is unlikely to be any different. Take one concrete example: every time a cell divides, the bits of DNA at the end of each chromosome – the telomeres – get shorter. Stop the shortening, and you stop aging. Already, gene therapy using telomerase, an enzyme that slows this process, has been shown to safely extend the lifespan of mice by 24%. What would be the GDP impact if humans could stop, or even reverse, aging?

As the singularity draws near, the market will take notice. Chart III-10 shows that the price-earnings ratio of U.S. tech shares relative to the broader market is at a multi-decade low, despite the fact that tech EPS has grown at twice the market’s rate since 1980. Accordingly, long-term investors should overweight technology stocks. And within the tech universe, investors should increasingly focus on “BRAIN” stocks – Biotech, Robotics, Artificial Intelligence, Nanotechnology. These are likely to be the leaders of the next great tech boom.

Chart III-10: U.S. And Global I.T. Stocks Are Relatively Cheap

Peter Berezin
Managing Editor


1. John Hawks, Eric Wang, Gregory Cochran, Henry Harpending, and Robert Moyzis, “Recent Acceleration of Human Adaptive Evolution”, Proceedings of the National Academy of Sciences of the United States of America, 2007, 104 (52).

2. Thomas Bouchard, “Genetic Influence On Human Psychological Traits – A Survey”, Current Directions in Psychological Science, 2004, 13 (4).

3. James R. Flynn, “Requiem For Nutrition As The Cause Of IQ Gains: Raven’s Gains In Britain 1938–2008”, Economics and Human Biology, 2009, Vol. 7.

4. James Heckman and Paul LaFontaine, “The American High School Graduation Rate: Trends and Levels”, The Review of Economics and Statistics, 2010, 92 (2).

5. “Turning Around Chronically Low-Performing Schools”, Institute of Education Sciences, U.S. Department of Education, May 2008.

6. Robert J. Gordon, “Is U.S. Economic Growth Over? Faltering Innovation Confronts The Six Headwinds”, Centre for Economic Policy Research, Policy Insight No. 63, September 2012.

7. “IBM’s Watson Gets Its First Piece of Business in Healthcare”, Forbes.com, February 8, 2013.