POLYMATH MINDS: British mathematician, code-breaker, and pioneering computer scientist Alan Turing (1912-1954) (left) was just 41 when he died. Widely considered the father of theoretical computer science, he was cited Sunday at a conference on theoretical computation at the Institute for Advanced Study, along with the Hungarian-born John von Neumann (1903-1957), who also died at the relatively young age of 53. While the conference speakers focused on current and future challenges, they paid homage to the ground-breaking work of their predecessors.
In an age when four-year-olds have their own handhelds and their grandparents are Facebooking and Tweeting, it’s astounding to think that the digital revolution has come about in just the last four decades. And it’s hard to imagine future scientific progress without computers.
In the past few decades many natural processes in biology and physics have come to be viewed as “information processes.” Examples include the workings of the cell and the immune system, and even the flocking of birds.
A day-long conference at the Institute for Advanced Study on Sunday took a look at the impact of computational methods on a range of scientific disciplines including economics and social science. Titled “Lens of Computation on the Sciences,” it was hosted by Avi Wigderson, the Herbert H. Maass Professor in the Institute’s School of Mathematics.
According to Mr. Wigderson’s introduction in the conference brochure, “Interactions with biologists, physicists, economists, and social scientists have found that this computational lens on processes and algorithms occurring in nature sheds new light on old scientific problems in understanding, e.g., evolution and markets.”
Top theorists in computation from Harvard and MIT joined their IAS peers to examine the impact of their youthful discipline and to discuss the challenges and benefits of interactions between computation and other fields of study.
In turn, each speaker first paid homage to the founding fathers whose work brought about the current revolution: the British mathematician, code-breaker, and pioneering computer scientist Alan Turing and the Hungarian-born polymath John von Neumann.
Alan Turing (1912-1954) earned his PhD at Princeton University and is widely considered the father of theoretical computer science for his model of a general-purpose computer, known as the “Turing machine.” During World War II, Mr. Turing worked at Britain’s code-breaking center, Bletchley Park. Winston Churchill said that Turing had made the single biggest contribution to Allied victory in the war against Nazi Germany.
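The “Turing machine” Mr. Turing proposed is simple enough to sketch in a few lines: a finite control that reads and writes symbols on a tape, one cell at a time, following a fixed transition table. The sketch below (a toy illustration, not any particular historical machine; the `FLIP` table is hypothetical) simulates a one-state machine that inverts every bit on its tape.

```python
def run_turing_machine(tape, transitions, state="start", blank=" "):
    """Simulate a one-tape Turing machine until it enters the 'halt' state.

    transitions maps (state, symbol) -> (next_state, symbol_to_write, move),
    where move is "R" (right) or "L" (left).
    """
    tape = list(tape)
    head = 0
    while state != "halt":
        symbol = tape[head] if head < len(tape) else blank
        state, write, move = transitions[(state, symbol)]
        if head < len(tape):
            tape[head] = write
        else:
            tape.append(write)  # the tape is unbounded to the right
        head += 1 if move == "R" else -1
    return "".join(tape).strip()

# A toy transition table: in state "start", flip each bit and move right;
# halt on reaching the blank cell past the end of the input.
FLIP = {
    ("start", "0"): ("start", "1", "R"),
    ("start", "1"): ("start", "0", "R"),
    ("start", " "): ("halt", " ", "R"),
}

print(run_turing_machine("1011", FLIP))  # → 0100
```

Despite its austerity, this model captures everything a general-purpose computer can do, which is why it remains the standard reference point for questions about the limits of computation.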
Prosecuted as a homosexual in 1952 and treated with estrogen injections as an alternative to prison (homosexual acts then being a crime in Britain), Mr. Turing committed suicide two years later. His life is the subject of a new film, The Imitation Game, starring Benedict Cumberbatch.
John von Neumann (1903-1957) needs little introduction in Princeton, where he was one of the first faculty appointed to the Institute for Advanced Study. He is celebrated for the electronic computer project he led there and the IAS machine it produced. Mr. von Neumann was a principal member of the Manhattan Project and a key figure in the development of game theory and the concept of cellular automata. His mathematical analysis of the structure of self-replication preceded the discovery of the structure of DNA.
“It all began with Turing’s 1936 paper ‘On Computable Numbers, with an Application to the Entscheidungsproblem,’” said Mr. Wigderson. “Turing gave birth to the computer revolution but unlike physics, computer science was born with the knowledge of its own limitations,” he added before introducing guest speakers: Leslie Valiant, Tim Roughgarden, Jon Kleinberg, and Scott Aaronson.
Such scientists study the mathematical foundations of computer science and technology. But it wasn’t fancy devices that were being discussed at the conference; rather, it was the power and the limits of solving natural computational problems in fields such as cryptography (the field that gives us the Internet and e-commerce) and machine learning (the science that enables “big data” applications).
According to Leslie Valiant, the author of Circuits of the Mind (Oxford University Press, 1994) and Probably Approximately Correct (Basic Books, 2013), the idea that computation has its own laws and limitations emerged in the 1930s. “Some of the early computing pioneers, most notably Turing and von Neumann, already understood that this idea had far reaching implications beyond technology. It offered a new way of looking at the world, in terms of computational processes.”
Speaking on “The Computational Universe,” Mr. Valiant said that since Turing and von Neumann had pursued this new way of looking at the world in such areas as genetics, biological development, cognition, and the brain, there has been much progress. “The question now is how to exploit this increasing knowledge to obtain insights into the natural world that cannot be obtained otherwise,” he said.
Addressing the connections between computer science and biology, Mr. Valiant said: “Some natural phenomena are actually computational,” and went on to describe ways in which computation can be used to understand natural phenomena in the same way as physics, and with its own laws too.
Referencing the 19th century question of how evolution, which Darwin described as a slow process, could have achieved so much in so short a time, Mr. Valiant described machine learning to explain how complex mechanisms can arise by a process of adaptation rather than by design.
Tim Roughgarden of Stanford University looked at points of contact between theoretical computer science and economics with details of the challenges of auction design. He cited a flawed example from New Zealand which had brought in just $36 million when it had been expected to yield $250 million.
Social media and the possibility of gaining insight in social science from studying collective behavior was discussed by Jon Kleinberg of Cornell University. “The emergence of cyberspace and the World Wide Web is like the discovery of a new continent,” said Mr. Kleinberg, quoting the late Jim Gray, winner of the 1998 Turing Award. “The online world is a phenomenon to be studied with new computational perspectives on social science questions; online social systems are partly organic, partly designed,” he said.
“The collective behavior and social interactions of hundreds of millions of people are being recorded at unprecedented levels of scale and resolution. Modeling and analyzing these phenomena computationally offers new insights into the design of online applications, as well as new perspectives on fundamental questions in the social sciences,” said Mr. Kleinberg.
Scott Aaronson of the Massachusetts Institute of Technology amped up the fun in his talk titled “Computational Phenomena in Physics.” A popular blogger (www.scottaaronson.com/blog), Mr. Aaronson has written about quantum computing for Scientific American and the New York Times. His quirky style captured the IAS audience, even if this reporter was not always “clued-in” on the insider humor.
“I am a theorist not an engineer and this is one of the few places where I don’t have to apologize for that,” said Mr. Aaronson as his first slide showed cartoon images of some scientific advances that are the stuff of science fiction. “Why don’t we have Warp Drive, the Perpetual Motion Machine, or the UberComputer? Well, we know why we don’t have the first two but why can’t we have the third?” he asked, and launched into the quest to understand the limits of efficient computation in the physical universe. That quest, he said, has given us new insights into physics over the last two decades.
Further insight, he said, would come from questions such as: “Can quantum computers be built? Can they teach us anything new about physics? Is there some new physical principle that explains why they can’t be built? What would quantum computers be good for? Can quantum computing help us resolve which interpretation of quantum mechanics is the right one?”
As the last speaker of the day, Mr. Aaronson ended with panache: a closing riff about Alice (yes, Lewis Carroll’s Alice) and black holes that brought the house down. To know more, see Mr. Aaronson’s first book, Quantum Computing Since Democritus, published by Cambridge University Press in 2013.
The conference talks can be viewed on the IAS website at www.ias.edu/computationconference/2014/.