In accounting for the furniture of the universe, several intellectual traditions have settled upon two great organizing principles:
Materialism: the world's made of stuff and only stuff. If you can't kick it, it don't exist.
Idealism: the world's conjured by the mind. If you can dream it, it exists.
While the two positions are at odds with each other, there's always been a dialectical dance between them. Hegel was an idealist. Marx a materialist; yet Marx borrowed heavily from Hegel. Those German thinkers 🙃.
It doesn't have to be so. Today, energy (aka matter stuff) and information (aka mind stuff) are increasingly understood as two sides of the same coin, and we have the language and the ideas to set aside the materialist-idealist distinction entirely. Further, we have machines that constantly challenge our desire to swerve left toward materialism and right toward idealism.
I am talking about computers, of course.
I am not talking about the theoretical computing devices conjured by Turing and his followers. I am also not talking about the machines explicitly called "computer," i.e., the devices in your pocket or in front of you. Turing and Jobs have roles in my movie but the star is yet to be discovered. There's no hurry to discover the hero either. We have just found a new idiom for describing the world around us; let's get comfortable with it before we start labeling things for real.
Marc Andreessen, the infamous Silicon Valley mogul, has a saying: "software is eating the world." He means it in the way capitalists usually mean such things: that anyone making money selling some widget or other is going to be disrupted by a kid out of California writing a program on which zillions of people can sell those widgets to zillions of other people. Cue Airbnb, Uber, Google....
We don't have to agree with Silicon Valley triumphalism to recognize the importance of computing to our lives. Most white-collar work is done in front of a screen, and we increasingly spend the rest of our day flipping through another. We live, love, and work on a computer; so much so that the matrix has gone from being a shocker to a taken-for-granted presence in our lives.
So let's agree we are in a Software Eaten World. A natural question arises: how to chronicle it?
The story of the digital is primarily told inside-out, i.e., through the lens of the computer and how it's changing everything. The computer is undoubtedly a revolutionary device - it's the first human artifact that combines pointing 👉🏾 and pushing 🤜🏾. By pointing, I mean the function of text, which is to represent the world; by pushing, I mean the function of machines, which is to move stuff from A to B. Robots are both pointers and pushers. They can be surrogate eyes on the surface of Mars, looking for gravel that has traces of water, and they can also transport water-laden stones back to Earth.
However, to focus on the specific technical advances is to miss the woods for the trees. Consider the printing press: it too was a revolutionary device in its day and went through several technological breakthroughs before it became the foundation of modernity. But do we view the modern world as a press-eaten world (which it is, by the way)?
No, we don't. Instead, we focus our attention on the world itself, on acts that require the written word as background infrastructure. Writing equations, telling stories, keeping accounts - these are the activities that absorb our attention.
You might say that print took a few hundred years to eat the world while the computer revolution is both new and ongoing. While that may be true, consider that there are several momentous shifts for which the computer is necessary but whose influence remains in the background. I am talking about the two most important developments of the past fifty years:

Globalization: the worldwide coordination of production, trade, and finance.

Climate change: the planetary consequence of all that making and moving.
Neither is possible without the computer. We can't coordinate worldwide supply and distribution chains without IT infrastructure. Moving stuff from A to B - transportation, food, etc. - is one of the biggest causes of carbon emissions, and information management is key to organizing such complex systems of demand and supply.
When you think of climate change you think Exxon or Saudi Aramco, but Microsoft never comes to mind, does it? How does Aramco fulfill its orders? Which spreadsheets are its financial projections written in? When the crown prince travels abroad to market his vision of the kingdom, what presentation software does he use? When his minions texted Jamal Khashoggi and lured him to the consulate, which messaging app did they use?
Inquiring minds want to know.
The point being that a hereditary aristocrat's machinations are as mediated by computing as this year's startup rocketship. In answering why the world's made that way, we need an outside-in understanding of the digital: not as a series of technical advances but as a series of material and mental transformations that assume the presence of computing. It's no different from how the printing press is best understood by studying the wars between Catholics and Protestants - the primary story isn't about the technology itself but about its role in changing social (i.e., human-human) relations and the relations between humans and the non-human world.
For these reasons, we need to study computing not as a thing-in-itself but as a thing-in-the-world: as both digital and physical, as both material and ideal. While it's important to borrow ideas from computer science and physics - note the interest in quantum computing as a way of bringing the material and the informational together - those ideas are abstract, living in equations and laboratories. In contrast, our focus will be on the world, i.e., the actual reality that we share with other creatures on the only planet we have ever known.