The Brain Supremacy

This post was published on the now-closed HuffPost Contributor platform.

Imagine if, once every century, a person were born who could travel forward one hundred years in time. For most of human history the learning curve faced by those intrepid travelers would have been manageably shallow. Not so the last few centuries. We live in a world transformed by science and technology. The rate of transfer from science fiction to science fact is amazing, and accelerating. Scientists can now probe individual atoms, see objects round corners, put a robot on Mars, and much else besides.

My examples come from physics. In our science-saturated world, however, the balance of power is shifting towards the life sciences, and especially brain research. With modern neuroimaging techniques like fMRI (functional magnetic resonance imaging), plus advances in genetics, and greater computer power, the study of human brains is at last becoming a fully-fledged science. Neuroscience has grown from a subdiscipline of biology to a field in its own right, with its own proliferating subdisciplines. In coming decades, it will rival and then surpass the influence of the older physical sciences. This is the era of the brain supremacy.

What neuroscientists can do is already astonishing. Brain cells can be activated or shut down using light. Proteins can be tagged and tracked. The cross-talk of many neurons can be recorded in real time at the level of individual cells. Researchers can detect genes switching on or off within a neuron. They can also analyze the epigenetic processes and complex regulatory networks which affect whether those genes are 'read' to make proteins.

The motive for much of this work is clinical. Neuroscience offers hope, at last, for some of our most feared diseases. Slowly but surely, the mechanisms of disorders like Alzheimer's and Parkinson's are being unpicked. The ability to decipher and alter brain function with unprecedented precision will open up even more startling possibilities. From decoding brain activity with scanners to thought-controlled wheelchairs, from neural implants for pain to selective memory erasure for post-traumatic stress, ancient fantasies of mind reading and mind control are taking shape as medical realities.

Yet the prospects offered extend far beyond the clinic. One day, among other things, we may be able to record and share dreams, buy artificial experiences -- 'mind movies' -- and take self-improvement to a new level by editing unwanted thoughts and desires. The age of physics has brought remarkable changes, but compared with what will happen as we turn the power of science on our minds, they were only a beginning.

There's a problem, though. The ethics developed by doctors over centuries to deal with human suffering are different from those developed by scientists trying to understand how the world works. They're still more different from the ethics of businesses keen to cash in on the new technologies, for example by marketing fMRI 'lie-detectors.' And as the products of the brain supremacy have begun to move from clinic and lab to marketplace, the ethical principles don't necessarily move with them. Techniques created to heal can also be employed for other purposes, and the ability to get data from living brains is a holy grail for many interested parties other than neuroscientists and doctors.

Why the discrepancy? At base, our ethical instincts are of two kinds, depending on whether we grant the object in question some moral status or treat it as merely something to be used -- or sold. Until recently, Western science was mainly physical science, and the things it investigated were not of moral interest. Even in the special case of medicine there has always been tension between utilitarian impulses and the demands of human dignity, especially when resources are stretched.

The utilitarian model of user-versus-world was challenged, or rather reorganized, by the animal rights movement, which aimed to move animals into the 'ingroup' of moral entities. Wrongs done to people, like the human vivisections and mass killings of the Second World War, also drove ethical advances which emphasized human dignity and rights. Yet the assumption that scientist X can study independent object Y without needing to consider Y's feelings on the matter still works for much of physics, chemistry and biology. Even a large part of research on living things is done on bacteria, worms, flies, and other organisms whose feelings, if they have any, concern only the most committed activists.

When studying humans and their brains, however, the model fails, not only for ethical but for pragmatic scientific reasons. Human systems are always changed by their interactions with others, and in hard-to-calculate ways. What a volunteer says and does in a research lab may be altered not only by the lab environment or the phrasing of a question, but by who the experimenters are and how they behave. The human person thus needs to be considered. Technologies which directly scan or manipulate brains cannot be neutral tools, as open to commercial exploitation as any new gadget. The brain supremacy offers chances to improve human dignity, but it also risks abuse.

Privacy is an example. What if the claims already being made for neuroimaging's ability to read minds can be extended to portable, covert surveillance? If we've already revealed ourselves on Facebook, would it matter if companies could scan our brains in real time, observe that we're hungry, and change their targeted marketing accordingly? Would those companies be obliged to tell us, or our insurers, if they found evidence of brain disease -- or of dubious beliefs? Would governments be justified in having ideologically hostile individuals 'adjusted' before they committed any terrorist act? And so on.

As the brain supremacy takes hold, neuroscience will affect us like never before. Ultimately, we may be able to manipulate brains -- ours and other people's -- as easily as we now manipulate electronic information. If we had time travel, jumping forward a century from now might land us in an almost unimaginable world.

We already live, as the old saying goes, in interesting times. The brain supremacy will make them more interesting still.

'The Brain Supremacy' is due to be published by Oxford University Press on 25 October.
