Monday 20 March 2023

George Dyson: Turing's Cathedral

Turing's Cathedral: The Origins of the Digital Universe from 2012 begins at an unlikely point in history: William Penn and the US revolutionary war. This narrative choice eventually leads to Olden Farm, Princeton, where the Institute for Advanced Study (IAS) would be established.

This historical prelude also leads the author to compare the explosion of computing capacity to a nuclear chain reaction: both defining "revolutions" of the 20th century, their histories intertwined.

The book, from George (son of Freeman) Dyson, is much weightier than some of the "airport books" I've recently read, but it's still not exceedingly academic.

The IAS era is approached repeatedly from the angles of different personalities and computational topics, and the invention of ENIAC and EDVAC. The scale of the narrative at times zooms in on the details of a chosen vacuum tube solution, at other times expands to the connections and conflicts between personalities and faculties.

Some of the heavy-hitters here are John von Neumann, Alan Turing, Stanislaw Ulam, Nils Barricelli, Norbert Wiener, Julian Bigelow... Women feature less, but at least Klára von Neumann gets credit as one of the "first coders", and she was not uninformed of the code's substance either.

An image emerges of a nascent digital world, a time when the people with experience of using actual digital electronic computers could still be counted in the dozens. A bunch of intellectuals gravitated to the topic, their thoughts projecting far ahead of what the computers could actually do.

Computers at War

It is nothing new to say that war and computation were related in many ways. But it is interesting to read in more detail about the people involved and how the first computers connected to these tasks.

Automated computational approaches were needed for many brute-force calculations, such as firing tables, nuclear fission and its optimization, and the development of thermonuclear bombs. Cybernetics arose from the problem of predicting aircraft motion from the perspective of air defense.

To some extent the topic of weather prediction also connects to war and choosing the moment of attack. The initial successes led to optimism about predicting weather on longer timescales and about using the knowledge for large-scale weather control. Other themes followed the same pattern: thinking machines, machine life, self-replicating machines, all seemed to portend even more radical changes to human life, all just around the corner.

Clashes emerged from the pure mathematicians' and logicians' attitudes towards the engineering sciences and practical problems, which the real tubes-and-steel computers concretely represented. Despite, or because of, these difficulties, a sort of "golden age" of computing took place at the IAS during the war.

“The war had scrambled the origins of new inventions as completely as a message passing through an Enigma machine. Radar, cryptanalysis, antiaircraft fire control, computers, and nuclear weapons were all secret wartime projects that, behind the security barriers, enjoyed the benefit of free exchange of ideas, without concern for individual authorship or peer review.” (258-259)

War also muddled the issue of who did what in relation to the first practical computers, as they were war secrets too. In the case of Turing and the Brits, untangling this took decades. Von Neumann's "First Draft of a Report on the EDVAC" turned out to be an influential paper, and the IAS became a kind of distribution point for others to build their own "clones" in the ENIAC/EDVAC mould.

The invention of the computer is a matter that can be sliced in different ways. Forgetting Leibniz and Babbage, here we follow the thread of one of Hilbert's mathematical challenges, the Entscheidungsproblem. An extension of, or an alternative interpretation of, Kurt Gödel's refutation resulted in the Church-Turing thesis.

Turing's paper provided the proof of a universal (logic) machine, which through von Neumann gained an architecture for actually building something like it in reality. It's hinted that the way von Neumann's brain worked, with perfect recall of texts read years before, also influenced the idea of this superior digital brain.

Eckert and Mauchly were significant in getting ENIAC actually functioning, and nowadays they are more fully credited with the achievement. While the von Neumanns of this world were lauded, much of the Eckert-Mauchly work remained obscured behind war secrecy and its supposed practicality. (I recall Joel Shurkin's "Engines of the Mind" is a book that explored this point specifically.)

The British developments included the Manchester Baby and Mark I, involving Max Newman and Turing from the cast of characters here, although they didn't design the computers. People in the US could only suspect why and how the Brits had such a good head start on the topic.

The work led to the later commercialized Ferranti Mark I. To bring this all down to the measly level of my personal experience and the themes of my blog, I can only note that Ferranti is a familiar name from the ULA chip that drives the visuals of the ZX Spectrum and various other 8-bit computers.


The digital explosion

Turing's Cathedral gives an intriguing view into history, showing that the germ of many currently hot topics in computing was already present, at least in the thoughts of the great minds of the 1940s and 1950s. For example, the insight that infallible machines might not be intelligent, and that machines should instead make mistakes and learn, is not a new one. It later turned out to be a crucial insight when using neural nets to have computers "self-learn".

Given the book came out in 2012, this is actually quite insightful, but some things have advanced very fast since then, so the author's extrapolations can read as dismissing decades of "lesser" research in the areas of machine intelligence and autonomous agents.

Interestingly, Dyson says that Turing's O-machine does not get much attention, even if it is closer to what we now understand as machine "intelligence". The O-machine is in fact somewhat tricky to understand, but I suppose it could be considered a Turing machine with a non-Turing component, the "oracle". The author again extrapolates that in the Internet we sort of have a giant machine linking answers to questions. The human clicks represent the oracle, and in time the "machine" grows up into a massive analog computer, existing in its own right.

Whether this still works in an internet further divided into platforms, distorted by commercial interests and app ecosystems, I'm not so sure.

Machine life is explored from various angles. Nils Barricelli was concerned with life-games inside the computer and with modelling evolution. The structure of DNA had only just been discovered, and it was perhaps attractive to draw parallels between the atoms of life and the bits of the digital world. In the limited space of early computer memory, his critters largely "died".

In hindsight, Barricelli's research pointed towards ideas about future AI and possibly towards the significance of horizontal gene transfer. For Dyson, this provides another vision of how series of numbers necessarily live and die inside computer systems and on the internet, in symbiosis with humans. Whether inside a program that's perceived as useful or within a pornographic image, survive they must. (I'm paraphrasing a lot here.)

It all does have its dark side, which is also explored in the book. The calculations needed for the nuclear bombs were machine-led, and the insights were made by people orbiting the first digital computers. Von Neumann contributed to the understanding of shock waves, the implosion method of detonating nuclear weapons, and the optimization of the altitude of airburst nuclear explosions.

When it comes to describing the lifestyle and practices at the IAS, the book appears to send a clear message: the brightest mathematical and logical minds of their generation needed their own space, free from direct obligations such as administration and teaching undergraduates.

Not insignificantly, both in person and through his position, von Neumann was a major node connecting many other intellectuals, directing people to examine each other's work and findings. Von Neumann's political opinions arose from having a front-row seat to the nuclear developments, and these views could be rather brash. The nukes had to be built, as "they" would certainly build them.


Sweet beginnings and bitter endings

Von Neumann's death seems like the passing of a small universe, leaving open the mystery of whether the singular thinker still had something up his sleeve or whether the ideas had been exhausted. Turing's contributions to the war and other hidden developments were recognized much later.

With von Neumann gone, the high-energy collaboration and the mixing of fields at the IAS also diminished rapidly. Many saw their personal pet projects dwindle into obscurity, to be re-invented by others after more practical developments caught up and made a reassessment of the original thought possible.

The widespread computer architecture remains a child of Turing and von Neumann. The author asks why this wasn't more strongly questioned afterwards; it could be considered a massive "legacy" choice that imposes itself on every new platform.

The concepts of writing and reading were influential in inventing the universal computer. The idea of an "un-erring scribe" was also already a component of philosophical debate in mathematics and logic. In practice, data was usually tabulated and inspected piecemeal by human "computers". The digitalization of this task resulted in the electronic computer.

As the author notes, memory cells are largely passive, whereas a super-fast read/write head parses the memory contents one by one. Such computers are already at some disadvantage when, for example, examining photographs. Perhaps multi-threading and recent GPUs have begun to erode the outlines of this architecture, with graphics memory able to perform operations on itself.

The question is then what definition of "universal" is required: one based on a late 19th century understanding of mathematical logic? Or are there other understandings of universality? What is life, what is complexity, and how does it travel across the universe? Are they little green men? Or code, a cypher to be unraveled, so to speak? Where do the aliens hide, anyway?

The early days of computing were followed by the task of taming computers into banal office assistants. The book gives some feel for the motivations and lives of the people who worked on and with computers when it was still a highly academic topic, suggestive of a parallel, unlimited alien intelligence.

Saturday 4 March 2023

Cheating in Wordle

A few years back, I used Processing to examine palindromes in 5-letter English words, in order to explore the Sator Square.

Weird that even random things like that might find re-use.

Evil uses, that is! Muahahaha!


The popular daily word-guessing game Wordle uses 5-letter words, and I already had a ~10,000-word dictionary for generating those squares.

You know that annoying moment when there are a couple of letters in place, but alternative words just don't come to mind. (Wordle doesn't allow words outside its dictionary.)

Sometimes I cop out and suggest words that already have letters known not to belong to the solution.

But occasionally I can't even come up with a valid English word to fit! All this variety in such a small game is what makes it exciting, I guess.

The "Cheat"

I have no motive to extensively cheat in Wordle, but found it interesting enough to try. I bet there are already similar articles and blog posts elsewhere.

In Wordle, you have to guess the word of the day in six guesses. After typing a suggestion, the app shows which letters are in the correct position (green), which correct letters are in a wrong position (yellow), and which letters are not in the word at all (grey).

The keyboard view is also updated to reflect the situation, so you'll always see which letters have been used.
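As an aside, this feedback can be computed with a simple two-pass rule: mark the greens first, then hand out yellows from the letters that remain, so duplicated letters are handled correctly. A minimal Processing-style sketch of the game's logic (my own illustration, not part of the game or of my cheat program, assuming lowercase a-z words):

    // 'G' = green, 'Y' = yellow, '.' = grey.
    String score(String guess, String answer) {
      char[] result = {'.', '.', '.', '.', '.'};
      int[] remaining = new int[26];               // counts of unmatched answer letters
      for (int i = 0; i < 5; i++) {
        if (guess.charAt(i) == answer.charAt(i)) result[i] = 'G';
        else remaining[answer.charAt(i) - 'a']++;
      }
      for (int i = 0; i < 5; i++) {
        int c = guess.charAt(i) - 'a';
        if (result[i] != 'G' && remaining[c] > 0) {  // misplaced letter still available
          result[i] = 'Y';
          remaining[c]--;
        }
      }
      return new String(result);
    }

    void setup() {
      println(score("tunes", "vague"));  // prints ".Y.Y.": U and E present but misplaced
    }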

Here I have guessed LOWER, then CLIMB, which is stupid as L was already known. Also, at the 4th step I forgot that A should feature.

First I make a genuine guess or two, to get an idea of what letters there are, what positions they are in, and, importantly, which letters are unused.

There's some common sense about which words are more likely, but as the words are chosen by humans, they can also go other ways. It's still usually worth picking a starting word with no repeating letters. Exhausting the common vowels in the first two words might also help.
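That heuristic is easy to mechanize, too. A sketch along these lines, where "wordlist.txt" stands in for my dictionary file, would list starting-word candidates with five distinct letters and decent vowel coverage:

    String[] words;

    void setup() {
      words = loadStrings("wordlist.txt");  // placeholder file name
      for (String w : words) {
        if (allDistinct(w) && vowelCount(w) >= 3) println(w);
      }
    }

    // True if no letter repeats within the word.
    boolean allDistinct(String w) {
      for (int i = 0; i < w.length(); i++) {
        for (int j = i + 1; j < w.length(); j++) {
          if (w.charAt(i) == w.charAt(j)) return false;
        }
      }
      return true;
    }

    // Number of common vowels in the word.
    int vowelCount(String w) {
      int n = 0;
      for (int i = 0; i < w.length(); i++) {
        if ("aeiou".indexOf(w.charAt(i)) >= 0) n++;
      }
      return n;
    }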

If the second guess yields a correctly placed letter, this, together with the bunch of not-present letters, can narrow the possible words in my dictionary down to about ten.

If the second guess only reveals a letter but no location, the list can still be quite long.

I solved the word KIOSK on 19.2.2023, SWEAT on 20.2.2023, RIPER on 22.2.2023 and VAGUE on 23.2.2023, improving the program a little each time. SYRUP, WORSE, MOOSE, ABOVE and TREND followed.

Example: Solving VAGUE

On 23rd of February, 2023, I started my guess with TUNES.

I learned that U and E are present but at incorrect positions.
My next guess was IMBUE, which showed that U and E are at the end.

This was already good enough to run the first routines with.

I set the used-letters string to "tnsimb" and the word filter to "***ue".

The dictionary is run through so that a word has to match the filter but must not contain any of the used letters. The used-letter list doesn't include the discovered letters, because the word may contain more of them.
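In code, the core of this is only a few lines. Here is a minimal Processing-style sketch of the idea (the function name and "wordlist.txt" are placeholders, not necessarily my exact program):

    String[] words;

    void setup() {
      words = loadStrings("wordlist.txt");  // placeholder for the dictionary file
      String filter = "***ue";              // '*' marks an unknown position
      String used = "tnsimb";               // letters known to be absent
      for (String w : words) {
        if (matches(w, filter, used)) println(w);
      }
    }

    boolean matches(String w, String filter, String used) {
      for (int i = 0; i < 5; i++) {
        char c = w.charAt(i);
        if (filter.charAt(i) != '*' && c != filter.charAt(i)) return false;  // fixed letter mismatch
        if (used.indexOf(c) >= 0) return false;                              // excluded letter present
      }
      return true;
    }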

This gave me:

value
argue
vague
queue
vogue
rogue
fugue
revue
deque
roque

Deque and roque I suspect would not be plausible Wordle solutions, despite being perfectly cromulent words.

I went with ROGUE as I like Rogue-likes.

This wasn't correct, but it gave me G in the correct position, so I revised the used-letter string to "tnsimbro" and the word filter to "**gue".

My dictionary only had these two to offer:

vague
fugue

...from which I picked VAGUE, which was correct at step 4. Even if it had been incorrect, I would have been right at step 5.

Improvements

I thought at first the filter could be improved by checking whether a consonant or a vowel is impossible at a particular location. But this probably wouldn't achieve anything, as the dictionary only contains valid words anyway.

A simpler and better addition was to exclude words that have letters at positions where they are known not to be, and to require that dictionary words contain all the discovered letters.
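Sketching that extension on top of the earlier matches() function (the parameter names are again my own placeholders): notHere[i] collects the letters ruled out at position i by yellow results, and required collects the letters known to be somewhere in the word:

    boolean matchesExtended(String w, String filter, String used,
                            String[] notHere, String required) {
      for (int i = 0; i < 5; i++) {
        char c = w.charAt(i);
        if (filter.charAt(i) != '*' && c != filter.charAt(i)) return false;  // green mismatch
        if (used.indexOf(c) >= 0) return false;                              // grey letter present
        if (notHere[i].indexOf(c) >= 0) return false;                        // yellow letter retried in same spot
      }
      for (int i = 0; i < required.length(); i++) {
        if (w.indexOf(required.charAt(i)) < 0) return false;                 // a known letter is missing
      }
      return true;
    }

After TUNES alone, for example, the call would use filter "*****", used "tns", notHere {"", "u", "", "e", ""} and required "ue".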

Even then the puzzle doesn't become a total pushover, because the first two guesses might not yield anything conclusive, and the suggestion list can remain dozens of words long. Often the way forward is to prefer probable Wordle-style candidates and ignore really obscure words (like the above deque).

Generally the solution is found on the 4th or 5th guess.

After about 10 solved puzzles I was satisfied and stopped doing this.