James Gleick wrote Chaos, the book that inspired me to write my own fractals (the iterated function z -> z² + c, where z and c are complex numbers and c is a constant). I wrote it in Java and displayed it in an applet in the browser. Slow as hell, but it worked. Chaos is a great read, up to the last word.
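For the curious: a minimal Python sketch of the same escape-time idea (my original was a Java applet; the grid size and iteration cap below are arbitrary choices of mine, not from the book):

```python
# Escape-time view of the Mandelbrot set: iterate z -> z**2 + c and
# count how quickly |z| escapes beyond 2 for each point c in the plane.
def escape_time(c: complex, max_iter: int = 50) -> int:
    z = 0j
    for n in range(max_iter):
        if abs(z) > 2:           # once |z| > 2, divergence is guaranteed
            return n
        z = z * z + c
    return max_iter              # never escaped: treat c as inside the set

# Coarse ASCII rendering over the classic viewing window.
for im in range(20, -21, -2):
    row = ""
    for re in range(-40, 21):
        row += "#" if escape_time(complex(re / 20, im / 20)) == 50 else "."
    print(row)
```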
Reading The Information is a similar experience. Here are my notes.
From African drums to the OED
Gleick guides the reader through the development of information and communication systems over the past centuries.
The book sets off with the long-distance messaging system of African tribes: the talking drums. Gleick then continues to writing, and how it forms and changes the process of thinking.
“The written word – the persistent word – was a prerequisite for conscious thought as we understand it.”
The next steps in the development are logic and mathematics, the evolution of language and dictionaries, and the formalization of spelling.
Gleick tells the story of the development of the Oxford English Dictionary (the OED). Its first creators used Milton and Shakespeare as the foundation for this English dictionary. Shakespeare stands out as a massive contributor to English: as the inventor, or at least first recorder, of thousands of words, he is also the most quoted author in the OED, with more than thirty thousand references.
As a sidenote (and not the last in this article): where Shakespeare is a central foundational reference for the English language, the Statenvertaling of the Bible holds a similar position for Dutch. You could write a PhD thesis on the cultural consequences of this fundamental difference in language foundations: one with a creative, theatrical, literary background, the other a formal, religious one.
Computation and logic
Gleick continues with the development of computation, from the creation of logarithmic tables to Charles Babbage, whom we could view as the prophet of the modern computer. Babbage thought of programming languages and memory in the 19th century, long before these terms existed in such a context. Gleick tells the story of Babbage’s working relationship with Ada Lovelace, Lord Byron’s daughter. Where Babbage was the inventor of the computing machine before its existence, Ada was the programmer of this non-existent machine, running into programming problems that could only be exercised on real computers a hundred years later.
“How multifarious and how mutually complicated are the considerations which the working of such an engine involve. There are frequently several distinct sets of effects going on simultaneously; all in a manner independent of each other, and yet to a greater or lesser degree exercising a mutual influence.”
(As a sidestep: two recent books have been published on Lovelace and Babbage that I have not yet had time to read. The Thrilling Adventures of Lovelace and Babbage by Sydney Padua – a graphic novel I am really looking forward to. And Ada Byron Lovelace and the Thinking Machine by Laurie Wallmark.)
Leaving Babbage, Gleick brings us to the development of the telegraph, the first electric apparatus to speed up communication over distance. Communications were coded, with Morse code at some point becoming the standard.
The limitations of logic
The need for secrecy led to the development of cryptography. Enter Claude Shannon, who introduced the science of information theory. Shannon worked on predictability and redundancy in streams of data representing encrypted texts. He showed how logical operations could process information and how to build these operations into systems of relays. Shannon wanted to build such systems to prove theorems.
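Shannon’s insight was that relay circuits implement Boolean algebra: switches wired in series behave as AND, in parallel as OR. A toy Python sketch of that idea (the half-adder is my own illustration, not an example from the book):

```python
# Shannon's observation: a relay is a switch, and networks of switches
# implement Boolean logic. Series wiring = AND, parallel wiring = OR.
def AND(a: bool, b: bool) -> bool:   # two relays in series
    return a and b

def OR(a: bool, b: bool) -> bool:    # two relays in parallel
    return a or b

def NOT(a: bool) -> bool:            # a normally-closed relay contact
    return not a

def XOR(a: bool, b: bool) -> bool:
    return OR(AND(a, NOT(b)), AND(NOT(a), b))

# Composing gates into a half-adder: relays doing arithmetic.
def half_adder(a: bool, b: bool) -> tuple[bool, bool]:
    return XOR(a, b), AND(a, b)      # (sum, carry)

for a in (False, True):
    for b in (False, True):
        s, c = half_adder(a, b)
        print(f"{int(a)} + {int(b)} = carry {int(c)}, sum {int(s)}")
```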
At about the same time, Kurt Gödel came around and proved that the ideal mathematical world of Russell and Whitehead’s Principia Mathematica, where all mathematical theorems could be proved by logic, was an illusion. Gödel proved that any logical system powerful enough to express arithmetic is either inconsistent or incomplete. Douglas Hofstadter has explained this counter-intuitive conclusion extensively and illustratively in Gödel, Escher, Bach, and Gleick makes no attempt to improve on that.
At the same time, Turing proved a similar notion. Hilbert’s Entscheidungsproblem asks: can every proposition be mechanically proven to be true or false? Turing’s answer was no. He reached it through the invention of a theoretical computer, the Turing machine.
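A Turing machine is nothing more than a tape, a read/write head and a table of rules. A minimal Python sketch (the bit-flipping machine below is my own toy example, not Turing’s):

```python
# A minimal Turing machine: state + tape + transition table.
# Rules map (state, symbol) -> (new_symbol, move, new_state).
def run(tape: list[str], rules: dict, state: str = "start") -> list[str]:
    head = 0
    while state != "halt":
        write, move, state = rules[(state, tape[head])]
        tape[head] = write
        head += 1 if move == "R" else -1
        if head == len(tape):
            tape.append("_")        # extend the "infinite" tape on demand
    return tape

# Toy example: walk right, inverting bits, halt on a blank.
flip_bits = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}
print(run(list("10110_"), flip_bits))   # ['0', '1', '0', '0', '1', '_', '_']
```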
Interestingly, the main protagonist of the book, Claude Shannon, was a secluded mathematician working for Bell Labs. Coincidentally or not, he and Alan Turing both worked on cryptanalysis during the war without knowing it of each other (the work was classified; Turing in England, Shannon in the US). They even worked at Bell Labs at the same time for a while and met up for lunch now and then. (The same Bell Labs that is the subject of Douglas Coupland’s Kitten Clone, and the company that still today provides the backbone of our information highway, the Internet.)
The computer
All this work eventually culminated in the creation of the information processing machine, nowadays known as the computer.
Shannon continued to develop his information theory, quantifying predictability and redundancy to measure information content. He conducted some tests with his wife Betty, using Raymond Chandler’s detective story Pickup on Noon Street:
“… put his finger on a short passage at random, and asked Betty to start guessing the letter, then the next letter, then the next. The more text she saw, of course, the better her chances of guessing right.”
Shannon and Schrödinger brought physics and information theory together in the notion of entropy. From there, notions of information processing, thinking and artificial intelligence develop.
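Shannon’s measure is entropy, H = −Σ p·log₂(p), the average number of bits needed per symbol. A quick Python sketch of that formula on letter frequencies (the sample text is my arbitrary pick; any English text shows the same redundancy Betty’s guesses exploited):

```python
# Shannon entropy H = -sum(p * log2(p)): average bits per symbol.
# English letters land well below the 4.70-bit uniform maximum;
# that gap is the redundancy a guesser can exploit.
from collections import Counter
from math import log2

def entropy(text: str) -> float:
    letters = [ch for ch in text.lower() if ch.isalpha()]
    counts = Counter(letters)
    total = len(letters)
    return -sum((n / total) * log2(n / total) for n in counts.values())

sample = "the written word, the persistent word, was a prerequisite for conscious thought"
print(f"{entropy(sample):.2f} bits/letter (uniform over 26 letters: {log2(26):.2f})")
```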
DNA and information theory
Information theory is found to apply to nature itself: DNA is discovered. The development of thinking about biology in terms of computability, algorithms and procedures gives more insight into the building blocks of life itself. (And as an aside: if we are able to think of biological mechanisms in terms of algorithms, can we do so too for the societal mechanisms to which a human belongs? And for intellectual development: can we also build a recipe for the development of information into knowledge into intelligence? That would be logical in the context of life’s characteristic movement towards negative entropy.)
Richard Dawkins develops his ideas about the selfish gene, which have much in common with the antifragility thinking of Taleb. Chaitin and Kolmogorov develop a theory to measure how much information is contained in a given ‘object’. Complexity is described in computability terms. And complexity has computability problems of its own, like Gödel’s theorem: this is Chaitin’s version of incompleteness.
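Kolmogorov complexity itself is uncomputable, so compressed size is often used as a rough practical proxy for how much information an ‘object’ contains. A sketch of that proxy idea (with zlib standing in for the ideal shortest description):

```python
# Kolmogorov complexity (the shortest program producing an object) is
# uncomputable; compressed length is a common practical stand-in.
# A repetitive string carries little information, a random one a lot.
import random
import zlib

def approx_complexity(data: bytes) -> int:
    return len(zlib.compress(data, 9))   # level 9 = best compression

regular = b"ab" * 500                                        # highly patterned
chaotic = bytes(random.randrange(256) for _ in range(1000))  # no pattern
print("regular:", approx_complexity(regular), "bytes")       # very small
print("random: ", approx_complexity(chaotic), "bytes")       # close to 1000
```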
Lastly, Gleick brings us to quantum computing: performing computations at the atomic scale.
Dealing with information abundance
The book closes with a view on the proliferation of information, describing the development of Wikipedia. The amount of information we have access to nowadays is becoming a challenge in itself. There is information in abundance; the trick is finding the useful information in the overwhelming pile. Dissemination, filtering, ordering and search become essential tools. This is something we still do not have under control.
Gleick leaves the reader with a challenge to self: learn to deal with the amount of information available. By that I mean not managing the information, but being psychologically able to handle information abundance. The FOMO and the threat of total information procrastination are real. We will need to learn to ignore. We will also need to find our own ways to store, record and share the information we find useful or interesting.
How to manage Borges’ Library of Babel.
The book is an achievement in itself. It is admirable how much information (no pun intended) Gleick has been able to pack into one book.
This website is an attempt to record, organise and share information that comes to me.