Monday, 20 March 2023

George Dyson: Turing's Cathedral

Turing's Cathedral: The Origins of the Digital Universe from 2012 begins at an unlikely point in history: William Penn and the American Revolutionary War. This narrative choice eventually leads to Olden Farm in Princeton, where the Institute for Advanced Study (IAS) would be established.

This historical prelude also leads the author to compare the explosion of computing capacity to that of a nuclear reaction: both were defining "revolutions" of the 20th century, their histories intertwined.

The book, from George (son of Freeman) Dyson, is already much heavier going than some of the "airport books" I've recently read, but it's still not exceedingly academic.

The IAS era is revisited repeatedly from the angles of different personalities and computational topics, and of the invention of ENIAC and EDVAC. The narrative at times zooms in on the details of a particular vacuum-tube solution, and at other times widens to the connections and conflicts between personalities and faculties.

Some of the heavy-hitters here are John von Neumann, Alan Turing, Stanislaw Ulam, Nils Barricelli, Norbert Wiener, Julian Bigelow... Women feature less, but at least Klára von Neumann gets credit for being one of the "first coders", and she was not uninformed about the substance of the code either.

An image emerges of a nascent digital world. A time when people with experience of using actual digital electronic computers could still be counted in the dozens. A bunch of intellectuals gravitated to the topic, their thoughts projecting far ahead of what the computers could actually do.

Computers at War

It is nothing new to say that war and computation were related in many ways. But it is interesting to read in more detail about the people involved and how the first computers connected to these tasks.

Automated computational approaches were needed for many brute-force calculations, such as firing tables, nuclear fusion calculations and their optimization, and the development of thermonuclear bombs. Cybernetics arose from the air-defense problem of predicting aircraft motion.

To some extent the topic of weather prediction also connects to war and to choosing the moment of attack. The initial successes led to optimism about predicting the weather over longer timescales and about using that knowledge for large-scale weather control. Other themes followed the same pattern: thinking machines, machine life, self-replicating machines all seemed to portend even more radical changes to human life, all just around the corner.

Clashes emerged from the pure mathematicians' and logicians' attitudes towards the engineering sciences and practical problems, which the real tubes-and-steel computers concretely represented. Despite, or because of, these difficulties, a sort of "golden age" of computing took place at the IAS during and after the war.

“The war had scrambled the origins of new inventions as completely as a message passing through an Enigma machine. Radar, cryptanalysis, antiaircraft fire control, computers, and nuclear weapons were all secret wartime projects that, behind the security barriers, enjoyed the benefit of free exchange of ideas, without concern for individual authorship or peer review.” (258-259)

War also muddled the issue of who did what in relation to the first practical computers, as they were also war secrets. In the case of Turing and the Brits, untangling this took decades. Von Neumann's "First Draft of a Report on the EDVAC" turned out to be an influential paper, and the IAS became a kind of distribution point for others to build their own "clones" in the ENIAC/EDVAC mould.

The invention of the computer is a matter that can be sliced in different ways. Forgetting Leibniz and Babbage, here we follow the thread of one of Hilbert's mathematical challenges, the Entscheidungsproblem. An extension of, or an alternative take on, Kurt Gödel's refutation resulted in the Church-Turing thesis.

Turing's paper provided the proof of a universal (logic) machine, which through von Neumann gained an architecture for actually building something like it in reality. It is hinted that the way von Neumann's own brain worked, with perfect recall of texts read years before, also influenced the idea of this superior digital brain.

Eckert and Mauchly were instrumental in getting ENIAC actually working, and are nowadays more fully credited with the achievement. While the von Neumanns of this world were lauded, much of Eckert and Mauchly's work remained obscured behind war secrecy and its supposedly mere practicality. (I recall Joel Shurkin's "Engines of the Mind" is a book that explored this point specifically.)

The British developments included the Manchester Baby and Mark I, involving, from the cast of characters here, Max Newman and Turing, although they did not design the computers themselves. People in the US could only suspect why and how the Brits had such a good head start on the topic.

The work led to the later commercialized Ferranti Mark I. To bring this all down to the measly level of my personal experience and the themes of my blog, I can only point out that Ferranti is a familiar name from the ULA chip that drives the visuals of the ZX Spectrum and various other 8-bit computers.


The digital explosion

Turing's Cathedral gives an intriguing view into history, showing that the germs of many currently hot topics in computing were already at least in the thoughts of the great minds of the 1940s and 1950s. For example, the insight that infallible machines might not be intelligent, and that machines should instead make mistakes and learn, is not a new one. It later turned out to be a crucial insight for getting computers to "self-learn" with neural nets.

Given that the book came out in 2012, this is actually quite insightful, but some things have advanced very fast since then, and at times the author's extrapolations read as if they dismiss decades of "lesser" research in the areas of machine intelligence and autonomous agents.

Interestingly, Dyson says that Turing's O-machine does not get much attention, even though it is closer to what we now understand as machine "intelligence". The O-machine is in fact somewhat tricky to understand, but I suppose it can be considered a Turing machine with a non-Turing component, the "Oracle". The author again extrapolates: in the Internet we sort of have a giant machine, linking answers to questions. The human clicks represent the Oracle, and in time the "machine" grows up, a massive analog computer existing in its own right.
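To make the oracle idea a little more concrete for myself, here is a minimal toy sketch in Python. This is my own illustration, not something from the book, and the tiny instruction set and the oracle function are invented for the example: an ordinary step-by-step machine that, on one special instruction, defers to an external black box it cannot compute itself.

def human_click_oracle(question):
    # Stand-in for the non-computable component; in Dyson's analogy the
    # aggregate of human clicks plays this role.
    return question.endswith("?")  # placeholder "answer"

def run(program, oracle):
    tape = []
    for op, arg in program:
        if op == "WRITE":
            tape.append(arg)          # ordinary, mechanical step
        elif op == "ASK":
            tape.append(oracle(arg))  # oracle step: answered from outside
    return tape

program = [("WRITE", "hello"),
           ("ASK", "is this machine intelligent?"),
           ("WRITE", "done")]

print(run(program, human_click_oracle))  # ['hello', True, 'done']

The point of the sketch is only the division of labour: everything except the ASK step is plain computation, while the oracle is simply consulted, not simulated.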

Whether this analogy still works in an internet further divided into platforms, distorted by commercial interests and app ecosystems, I'm not so sure.

Machine life is explored from various angles. Nils Barricelli was concerned with life-games inside the computer and with modelling evolution. The structure of DNA had only just been discovered, and it was perhaps attractive to see parallels between the atoms of life and the bits of the digital world. In the limited space of early computer memory, his critters largely "died".
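To give a flavour of such experiments, here is a toy Python sketch of my own. These are not Barricelli's actual reproduction rules, just a minimal illustration of numbers competing for a small, fixed memory.

import random

MEMORY_SIZE = 32
memory = [random.choice([0, 1, 2, 3]) for _ in range(MEMORY_SIZE)]

def step(memory):
    new = list(memory)
    for i, gene in enumerate(memory):
        if gene == 0:
            continue                    # empty cell, nothing to reproduce
        # Each nonzero number tries to copy itself "gene" cells away.
        target = (i + gene) % MEMORY_SIZE
        if new[target] == 0:
            new[target] = gene          # successful reproduction
        elif new[target] != gene:
            new[target] = 0             # collision with a different number: that cell dies
    return new

for generation in range(10):
    memory = step(memory)
    print(generation, memory)

In a memory this small, collisions quickly wipe most of the numbers out, which is roughly the fate of the early critters described above.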

In hindsight, Barricelli's research pointed towards ideas about future AI, and possibly towards the significance of horizontal gene transfer. For Dyson, this provides another vision of how sequences of numbers necessarily live and die inside computer systems and on the internet, in symbiosis with humans. Whether inside a program that's perceived as useful or within a pornographic image, survive they must. (I'm paraphrasing a lot here.)

It all does have its dark side, which is also explored in the book. The calculations needed for the nuclear bombs were machine-led, and the insights came from people orbiting the first digital computers. Von Neumann contributed to the understanding of shock waves, to the implosion method of detonating nuclear weapons, and to optimizing the altitude of airburst nuclear explosions.

When it comes to describing the lifestyle and practices at the IAS, the book appears to send a clear message: the brightest mathematical and logical minds of their generation needed their own space, free from direct obligations such as administration and teaching undergraduates.

Not insignificantly, both in person and through his position, von Neumann was a kind of major node between many other intellectuals, directing people to examine the work and findings of others. Von Neumann's political opinions arose from having a front-row seat to the nuclear developments, and these views could be rather brash: the nukes had to be built, as "they" would certainly build them.


Sweet beginnings and bitter endings

Von Neumann's death seems like the passing of a small universe, leaving open the mystery of whether the singular thinker still had something up his sleeve or whether the ideas had been exhausted. Turing's contributions to the war effort and other hidden developments were recognized much later.

With von Neumann gone, the high-energy collaboration and the mixing of fields at the IAS also diminished rapidly. Many saw their pet projects dwindle into obscurity, to be re-invented by others once more practical developments caught up and made a reassessment of the original ideas possible.

The widespread computer architecture remains a child of Turing and von Neumann. The author asks why this wasn't questioned more strongly afterwards. It could be considered a massive "legacy" choice that imposes itself on every new platform.

The concepts of writing and reading were influential in the invention of the universal computer. The idea of an "unerring scribe" was already a component of philosophical debate in mathematics and logic. In practice, data was usually tabulated and inspected piecemeal by human "computers". The digitization of this task resulted in the electronic computer.

As the author notes, memory cells are largely passive, whereas a super-fast read/write head parses the memory contents one by one. Such computers are already at some disadvantage when examining photographs. Perhaps multi-threading and recent GPUs have begun to erode the outlines of this architecture, with massively parallel operations working directly on graphics memory.
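As a rough illustration of that serial picture, here is a minimal fetch-decode-execute loop in Python. It is a toy of my own invention, not the IAS machine's actual instruction set: memory is a passive list, and a single loop, the "head", touches one cell at a time. Instructions are tuples and data are plain numbers, loosely mirroring the stored-program idea of keeping both in the same memory.

# Passive memory: instructions at addresses 0..3, data at 7..9.
memory = [
    ("LOAD", 7),    # acc = memory[7]
    ("ADD", 8),     # acc = acc + memory[8]
    ("STORE", 9),   # memory[9] = acc
    ("HALT", 0),
    0, 0, 0,        # unused cells
    2, 3, 0,        # data at addresses 7, 8, 9
]

acc = 0             # single accumulator register
pc = 0              # program counter: where the "head" currently reads

while True:
    op, addr = memory[pc]        # fetch one instruction
    pc += 1
    if op == "LOAD":
        acc = memory[addr]
    elif op == "ADD":
        acc += memory[addr]
    elif op == "STORE":
        memory[addr] = acc
    elif op == "HALT":
        break

print(memory[9])    # prints 5

Everything funnels through that one loop and the single accumulator, which is exactly the bottleneck that the GPU-style parallelism mentioned above chips away at.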

The question, then, is what definition of "universal" is required: one based on a late 19th-century understanding of mathematical logic? Or are there other understandings of universality? What is life, what is complexity, and how does it travel across the universe? Are they little green men? Or code, a cypher to be unraveled, so to speak? And where do the aliens hide, anyway?

The early days of computing were followed by the task of taming computers into banal office assistants. The book gives some feel for the motivations and lives of the people who worked on and with computers when it was still a highly academic topic, suggestive of a parallel, unlimited alien intelligence.
