With the release of the feature film The Imitation Game, starring Benedict Cumberbatch, a long-overdue spotlight is being shone on Alan Turing, one of the most unsung heroes of scientific research.
Alan Turing was a British mathematician and a principal architect of the team of code-breakers at Bletchley Park during World War II. Turing and his team faced the seemingly impossible task of analyzing the big data of their day: the billions of possible combinations of codes generated by Nazi Germany's Enigma machines. Turing helped develop the "Bombe," an electro-mechanical machine that was able to decipher these codes. A key part of Turing's brilliance was using the information the Bombe yielded to thwart the enemy's plans without revealing that the Enigma codes had been broken, thus hastening the Allied victory. The Bombe is primitive by today's standards -- High Performance Computing Cluster (HPCC) Systems, for example, is an open source big data processing platform that can handle 30 million transactions per hour -- but it was revolutionary in its day, and it is still regarded as a breakthrough technological achievement.
Turing's World War II work and his 1937 paper on the theoretical computing machine now known as the "Turing Machine" are remarkable on many levels. His concept of universal computation laid the foundation for modern computing and the IT revolution, and it informed his far-sighted predictions about artificial intelligence. The so-called "Turing Test," a well-known benchmark for machine intelligence, challenges a panel of human judges conversing with an unknown entity (via keyboard, for example) to decide whether that entity is human. If the entity is believed to be human but is actually a computer, it has passed the "Turing Test." Turing's papers are still widely cited in today's computer science research, and his life and work are the subject of several recent books, such as the Elsevier-published Alan Turing: His Work and Impact and Alan Turing: The Enigma, upon which The Imitation Game is based.
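For readers curious about what a Turing machine actually is, the idea can be captured in a few lines of code: a finite control, an unbounded tape, and a table of transitions. The sketch below is a toy illustration of the concept (the machine and its transition table are invented for this example, not taken from Turing's paper); it simply scans a binary string and flips each digit.

```python
def run_turing_machine(tape, transitions, state="start", accept="halt"):
    """Run a one-tape Turing machine until it reaches the accept state.

    transitions maps (state, symbol) -> (new_state, write_symbol, move),
    where move is -1 (left), +1 (right), or 0 (stay).
    Cells beyond the input hold the blank symbol '_'.
    """
    cells = dict(enumerate(tape))  # sparse tape: position -> symbol
    head = 0
    while state != accept:
        symbol = cells.get(head, "_")
        state, write, move = transitions[(state, symbol)]
        cells[head] = write
        head += move
    # Read the tape back in order, dropping blank cells.
    return "".join(cells[i] for i in sorted(cells) if cells[i] != "_")

# Toy transition table: move right, flipping 0 <-> 1, and halt on a blank.
FLIP = {
    ("start", "0"): ("start", "1", +1),
    ("start", "1"): ("start", "0", +1),
    ("start", "_"): ("halt", "_", 0),
}

print(run_turing_machine("1011", FLIP))  # -> 0100
```

Trivial as this machine is, the same loop can run any transition table it is given; that interchangeability of "program" and "data" is the heart of Turing's idea of universal computation.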
Turing and his team worked in Hut 8 at Bletchley Park, a far cry from today's sophisticated scientific cloud computing, whose virtually infinite environment allows modern-day "Turings" to "rent" space as needed and to share and analyze data on cloud-based platforms. The value that scholarly publishers like Elsevier can provide to researchers working in this environment lies in cloud-based tools offering better, faster, and more efficient search capabilities. These can include online collaboration tools, research data repositories, newsfeeds, and recommended reading lists, as well as access to vast numbers of cutting-edge papers that have been properly vetted through a recognized peer review process.
The management of such data, and how one uses it within the cloud on smaller but ever more powerful, nimble devices, is still at a nascent stage. Computing science and big data analysis have certainly advanced further than those clever World War II code breakers likely imagined, but at the end of the day it still comes down to the human factor to provide context. Important questions about what can, or even should, be shared, and the related privacy concerns -- an issue that was undoubtedly a painful part of the analysis behind Turing's WWII work -- are still being debated and may never be fully resolved to everyone's satisfaction.
At a recent UK meeting with prominent leaders of the cancer research community, I reflected upon how cloud-based tools and billions of data points are being used to evaluate research performance. In some countries, the number of publications (and the quality of the journals in which they appeared), along with citation counts, are part of the ecosystem of performance evaluation metrics. But Alan Turing and his legacy are pivotal reminders of the limitations of data analysis without context. How would he be evaluated today on those metrics alone? He published just a few articles in his too-short life, yet Turing's work has had a profound impact on computer science that still resonates. Even in our own domain, Alan Turing continues to teach us valuable lessons.
Turing's life did not have a happy ending. Convicted of the (then) crime of homosexuality, Turing committed suicide in 1954 at age 41. He was granted a posthumous pardon by Queen Elizabeth II in 2013, and in March 2014 the British government announced funding for the Alan Turing Institute, intended to help make the UK a world leader in big data and algorithm research. In November 2014, it was announced that the ACM A.M. Turing Award, the most prestigious award in computer science, would be raised to $1 million the following year, thanks to Google.
As of this writing, The Imitation Game has just been nominated for a Golden Globe and is likely to be an early favorite in the upcoming Academy Awards. As someone in the science and technology profession who sees the important work that researchers do every day, I am delighted to see this recognition of Turing's impact on computing. His tragic death deprived the world of an extraordinary mind at the very apex of its power, but his legacy lives on. As Turing himself wrote in his 1950 Mind paper, "Computing Machinery and Intelligence": "We can only see a short distance ahead, but we can see plenty there that needs to be done."
Sixty years on, Alan Turing is rightfully getting his day in the sun.