Every time I delve into the topic of computer science, I can’t help but recall the perplexing answer “42” given by Deep Thought, the supercomputer that appears in Douglas Adams’ renowned novel “The Hitchhiker’s Guide to the Galaxy”.

It was the answer to the ultimate question of “life, the universe, and everything”, an answer the computer took seven and a half million years to compute. Interestingly, the history of computing is a perfect candidate for humorous and mystical fictionalization, bearing a remarkable resemblance to that of agnostic religion. Both are characterized by a labyrinth of questions so profound that the very meaning of existence hinges upon their answers.

I believe that at the end of the century the use of words and general educated opinion will have altered so much that one will be able to speak of machines thinking without expecting to be contradicted

Alan Turing

Electronics has gone through two fundamental transitions in the last hundred years: the first one from analog to digital, and the second one from vacuum tubes to solid-state devices (transistors).

However, just because these transitions happened at the same time does not mean they are inherently connected. Just as digital computing was originally implemented using vacuum tube components, it is possible to implement analog computing using solid-state devices.

There is no clear-cut distinction between analog and digital computing. Generally, digital computing works with integers, binary sequences, deterministic logic, and time broken down into discrete increments. Analog computing, in contrast, works with real numbers, non-deterministic logic, and continuous functions, including time as it exists in the real world: a continuum. Hybrid systems that combine the two are common in nature.
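To make the contrast tangible, here is a toy sketch of my own in Python (nothing more than an illustration, with made-up numbers): a digital accumulator that counts discrete pulses exactly, next to a simulated analog integrator whose real-valued state follows a continuous law, approximated by tiny time steps.

```python
# A toy contrast, not a real device: the same "accumulate over time" task,
# done digitally (discrete, exact) and in an analog style (continuous, approximate).

def digital_accumulator(pulses):
    """Digital: integer state, discrete time steps, deterministic and exact."""
    count = 0
    for pulse in pulses:              # time advances in discrete increments
        count += 1 if pulse else 0
    return count

def analog_integrator(signal, dt=0.001, leak=0.1):
    """Analog (simulated): a real-valued state governed by a continuous law,
    dv/dt = x(t) - leak * v(t), approximated here with small Euler steps."""
    v = 0.0
    for x in signal:                  # x(t): a continuously varying input, finely sampled
        v += (x - leak * v) * dt
    return v

if __name__ == "__main__":
    print(digital_accumulator([1, 0, 1, 1, 0, 1]))   # always exactly 4
    print(analog_integrator([0.5] * 10_000))         # a real number creeping toward 0.5 / leak
```

The digital answer is always the same integer; the analog answer is a real number whose precision depends on how finely you sample time.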

Credits: Massimiliano Vurro & GenAI – A space and time traveller

To simplify this comparison, consider a tree as a hybrid system that takes in a wide range of inputs described by continuous functions. Yet if you were to cut the tree down, you would find that it has always measured time digitally, counting the years in its growth rings. The tree’s complexity lies not in any code, but in the topology of its network, which is analog.

We must continue to invent new techniques and earn our bread not only with the sweat of our muscles, but with the metabolism of our brains

Norbert Wiener

The genetic system present in every living cell can be compared to a stored program computer, as proposed by John von Neumann. Nature uses digital encoding to store, replicate, and recombine sequences such as nucleotides. However, it relies on analog computing for functions like the nervous system, intelligence, and control.

Credits: Massimiliano Vurro & GenAI – A genetic system present in every living cell compared to a stored program computer

Digital computation is intolerant of error and ambiguity, and therefore requires error correction at every stage of the process. Analog computation, in contrast, tolerates errors, degrading gracefully and allowing us to coexist with them.
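Just to illustrate what I mean, a small sketch of my own (the probabilities are arbitrary): a digital channel has to actively detect and correct a flipped bit, here with the classic triple-redundancy majority vote, while an analog channel simply lives with the noise and returns an answer that is close rather than exact.

```python
import random

def send_bit_with_tmr(bit, flip_prob=0.1):
    """Digital: a flipped bit is a hard error, so send three copies and
    restore the value with a majority vote (triple modular redundancy)."""
    copies = [bit if random.random() > flip_prob else 1 - bit for _ in range(3)]
    return 1 if sum(copies) >= 2 else 0

def send_value_analog(value, noise=0.05):
    """Analog: noise does not break the computation, it only blurs it."""
    return value + random.gauss(0.0, noise)

if __name__ == "__main__":
    print(send_bit_with_tmr(1))    # almost always exactly 1
    print(send_value_analog(0.7))  # roughly 0.7, almost never exactly 0.7
```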

Digital computers perform transformations between two types of bits – bits that represent variations in space and bits that represent variations in time. The transformations between these two types of information are governed by computer programming, and as long as computers require human programmers, we will remain in control of the machines.

Analog computers, on the other hand, mediate transformations between two forms of information – structure in space and behavior in time. In this case, there is no code or programming involved. This is incredibly similar to how human beings function.

Nature has evolved nervous systems that resemble analog computers in ways we don’t yet fully understand. These systems gather information from the external world and learn to control it.

Nervous systems have the ability to learn to control their own behavior and their environment, to the extent that they are able to act freely.

While people discuss the intelligence of digital computers, analog computing is quietly advancing through neuromorphic processors inspired by human neurons. This is similar to how thermionic valves were repurposed to build digital computers right after World War II.

Finite-state, deterministic processors, each executing its own finite code, are coming together to form large-scale, non-deterministic, non-finite and, above all, free-to-roam multicellular organisms in the real world.

The resulting hybrid analog-digital systems process bit streams both collectively, like the flow of electrons through a thermionic valve, and individually, like the bits processed by the discrete-state devices that generate that flow in the first place.

In simple terms, bits have become the new electrons, and analog computing can take control of everything from the flow of goods in daily life to the flow of ideas.

If we are concerned about the rise of artificial intelligence, we should pay attention to the emergence of control.

An example is the systems that map highway traffic from users reporting their speed and position in real time, in exchange for access to the map. The result is a decentralized control system in which there is no external model of the traffic other than the system itself.
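Here is a hypothetical sketch of that feedback loop (the class, the road segment, and the numbers are all mine, not any real navigation service): each driver reports its speed on a road segment, the aggregate of those reports becomes the map, and the map in turn becomes the control signal that steers the drivers.

```python
from collections import defaultdict
from statistics import mean

class TrafficMap:
    """Decentralized control: no model of the traffic exists anywhere
    except the traffic itself, as reported by its own participants."""

    def __init__(self):
        self.reports = defaultdict(list)   # segment_id -> recent speeds (km/h)

    def report(self, segment_id, speed_kmh):
        """A driver contributes its own state in exchange for the map."""
        self.reports[segment_id].append(speed_kmh)

    def congestion(self, segment_id, free_flow_kmh=100.0):
        """The control signal: how slow a segment is relative to free flow."""
        speeds = self.reports.get(segment_id)
        if not speeds:
            return 0.0
        return max(0.0, 1.0 - mean(speeds) / free_flow_kmh)

if __name__ == "__main__":
    m = TrafficMap()
    for speed in (95, 30, 25, 40):          # four drivers report in
        m.report("A1-km42", speed)
    print(f"congestion on A1-km42: {m.congestion('A1-km42'):.0%}")
```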

Credits: Massimiliano Vurro & GenAI – Cyberpunk Mona Lisas

In conclusion, the future of computing will see a rise in analog systems that lie beyond the control of digital programming. Have you heard of prompt engineering and Generative AI lately?

To create a machine capable of processing the meaning of all human knowledge, we would need to decode the non-logical information that carries the most significance for humans, and that takes us beyond Moore’s law.

As the system collects and maps out possible responses and connections, it will also construct meaning, eventually controlling it the way a traffic map controls traffic flow. This brings us to the third law of artificial intelligence: any system simple enough to be understandable will not be complicated enough to behave intelligently, while any system complicated enough to behave intelligently will be too complicated to understand. The law provides comfort to those who believe that as long as we don’t comprehend intelligence, we don’t need to worry about machines becoming intelligent. There is, however, a loophole in the third law: it is possible to build something perfectly functional without understanding it.

Which happens to me frequently.