Dare to Be 100: Brain, Cost/Benefit Study, Complexity


The Oct. 16, 2015 issue of Science contains an article, "Virtual Rat Brain Fails to Impress Its Critics," by Kai Kupferschmidt. This report captured my interest because the issues it raises bear on many large topics in science. The article examines in depth the Human Brain Project, funded by a €1 billion grant from the European Commission. The stated intent of this enormous grant, according to principal investigator Henry Markram, was to simulate the entire human brain in a computer. As such it represents a build-out of a prior effort, the Blue Brain Project, also led by Prof. Markram and funded generously by the Swiss government.

The first results of this smaller, more modest rat-brain study were recently published in Cell. They represent the most detailed digital reconstruction ever reported: a simulation of 30,000 neurons connected by almost 40,000,000 synapses in a piece of rat brain about the size of a grain of sand. The human brain, by contrast, is 2 million times larger. Christof Koch commented on the remarkable progress that has been achieved in synthesizing the behavior of neuronal networks through hard-nosed engineering.

However, this work has come under severe criticism from a cascade of neuroscientists. In an open letter voicing concern about poor management and the status of the scientific data, the critics prompted a curtailing of the grandiose aim whose immensity of connections had shocked everyone. Consequently the project markedly narrowed its reach to a small focus. This retrenchment has blunted much of the concern, but it inevitably reduces the hoped-for generalizability, since many other relevant influences are omitted.

The fundamental issue here concerns the complexity of the entire proposition itself, as the numbers involved are so huge as not to be susceptible to reasonable inquiry. This has previously been termed the "catastrophe of complexity," and it raises the question of whether reductionism has gone too far. Not everything is computable, particularly when the conditions are in constant flux. This was the topic of Laughlin and Pines, who concluded that reductionism has reached its limit (1). The bottom line for me is simply that we must restrain our insistence on knowing everything, always. IMPOSSIBLE.

Those charged with the responsibility for judicious expenditure of the public treasury must be wary when allocating societal resources to projects of such daunting complexity.

For me, the issue becomes even more un-doable because of the central role played in all things human by the vagaries of human behavior. All the king's computers are inept when behavior is excluded from our search for universal knowledge.

Modesty is an essential ingredient in progress.


1. Laughlin, R.B., Pines, D. The Theory of Everything. PNAS 2000; 97:28-31.