We have grown so accustomed to the vast collection of our personal data, and to breaches of our privacy by both government agencies and private companies, that new revelations no longer come as a surprise. However, we cannot pass over in silence the secret agreement made in September 2015 (but first reported by New Scientist on April 29), giving Google's subsidiary DeepMind access to confidential medical records of 1.6 million Britons.
These records from the U.K.'s National Health Service (NHS) apparently include such details as a person's full name, HIV status, results of pathology and radiology tests, past drug overdoses, and logs of their hospital stays, including who visited them and when. The NHS has given assurances of "anonymity" for these data, but that offers little comfort. In fact, the agreement states that "pseudonymisation is not required," and besides, even anonymized data may be used to reveal private information.
Concerns have been raised over why DeepMind needs such a wide range of data when its stated goal is to develop an app for the prevention of a specific kidney disease. Furthermore, these sensitive data were given to DeepMind without either informing patients or seeking their consent. Questions have also been raised about whether, in order to access these data, Google was obligated to obtain regulatory approval under a U.K. law, the NHS Act 2006. Google has not applied for such approval and has argued that it was not necessary, but critics believe that in fact the law does require it.
What makes this case especially troubling is that Google is the world's largest information technology company and one of the world's two biggest companies overall. Today, given our collective addiction to technology, Google wields tremendous power over our tastes, our behavior, and our future.
Health care is surely an area in which technology companies have an important role to play. In particular, properly supervised data analysis using novel learning algorithms has the potential to improve the diagnosis and treatment of major diseases.
Google has done the world much good with its search engine and other products. However, the company's secrecy and lack of oversight are of grave concern. When it acquired the artificial intelligence startup DeepMind two years ago for a reported $650 million, Google promised to set up an ethics board to deal with the issues of artificial intelligence. Well, I Googled "Google ethics board" -- and there is still no information about it. That is troubling.
Artificial intelligence is invading our lives through various programs and devices, but it's a double-edged sword. For example, a facial recognition program can be used to quickly organize your photo album, but it can also be used in a lethal autonomous weapons system that identifies suspects and strikes them without human supervision. Thus, along with great promise, AI holds the potential for unprecedented risks to humankind. That's why transparency and oversight are crucial in this area.
Alas, Google hasn't been particularly forthcoming with information about its AI projects. But here's what we know about DeepMind. Founded by three brilliant young scientists, the company was in the news recently because AlphaGo, a deep-learning algorithm it had developed, beat a human Go champion -- no doubt, an impressive achievement. Unfortunately, the company's CEO then proceeded to declare that DeepMind's algorithms could lead to a "meta-solution to any problem."
That is a wild exaggeration, given what we know about the limitations of such algorithms. As a mathematician, I can attest that although there are some elements of 20th-century math in them, they are really based on 19th-century mathematics, cleverly adapted. The math itself is beautiful, and I salute my fellow mathematicians and computer scientists pushing the boundaries of what such algorithms can do. But to believe that everything about life can be explained in this way is akin to the exuberance of an 11-year-old who has learned trigonometry and is so excited that he thinks the whole world is trigonometry.
Pure and simple, this is hubris. And, I am sorry to say, it is reflected in how DeepMind has acted in acquiring the NHS medical data: not bothering to ask for people's consent and not following ethics rules and regulations. What these actions communicate is that DeepMind views people's medical histories merely as a bunch of data it wants to feed into a learning algorithm, in the same way as it used the old Go games for training the AlphaGo algorithm. And if a company treats people as pieces in a board game, why would it care about privacy and ethics? Well, that is precisely why we shouldn't give DeepMind and its parent company a free hand in using our private data without proper supervision.
Unlike a human, a company is an algorithm, set to maximize a utility function: profit. Therefore, companies often become secretive, evading oversight. But we shouldn't set the bar low. We must demand transparency and oversight -- especially of technology companies that exert such profound influence on our lives. Ultimately, it all depends on us. This is not about robots or algorithms. It's about us, humans; it's about who we are, who we want to be.
Science and technology are essential parts of our culture, as well as the means for improving our lives. Humans have an insatiable desire to explore and to understand. In this quest for knowledge, we keep pushing the envelope. But life is not a game of Go. Simplistic solutions can, and will, lead us astray, especially when driven by big money.
In his prescient book The Myth of the Machine, Lewis Mumford warned us: "On the terms imposed by technocratic society, there is no hope for mankind except by 'going with' its plans for accelerated technological progress, even though man's vital organs will all be cannibalized in order to prolong the megamachine's meaningless existence."
DeepMind's motto is "Solve intelligence." But more than intelligence, we need wisdom. Without it, our proverbial "left brain" will run amok, with devastating results.
Technology companies will treat us as pieces in a board game only if we let them do so. It's time to wake up to the new reality and create a system of checks and balances in which the kind of secret agreement that Google made to access citizens' private data will not be allowed.