Computing Climate Change
In my previous essay, I arrived at an unoriginal idea: that the computer is a mental telescope through which we can view the world. I didn’t say the universe is a computer, because that would be really unoriginal, but more importantly, because I am not interested in the universe. I am interested in the world, or more accurately, worlds: the personal and collective realities experienced by humans and other creatures.
To be honest, my love for the world can be traced to the German word for it: Umwelt. If you tell someone you are interested in the world, they are likely to look at you funny, but tell them you’re a weltist, and suddenly you are halfway important. I make worlds for a living, what about you?
Back to the mental telescope: now that I have a hammer, I am looking for nails.
I am reading Paul Edwards’ wonderful book on the role of computer models in the creation of climate science. There are many, many things to say about the book. If I have the time, I might review it in detail one day, but for now, let me stick to the main insight I gleaned from reading the tome: all the structures of modern climate science are products of the computer era. That’s obviously true for the data collected in observatories and sensors across the world and the computer models that crunch that data, but it’s less obviously true for the international organizations created to make sense of that data and the political controversies surrounding the impact of global warming.
Don’t believe me? Consider two pieces of evidence: the design of the IPCC report, and the main line of criticism, that climate change is “only a theory.”
The White Space of Truth
Click on the link to the IPCC report, check out the summary for policy makers and, if you’re feeling really masochistic, download the PDF and read it. And then step back and ask yourself: how come I am able to access years of work by a network of scientists across the world at the click of a button? Do you think any of this would have been possible without the internet?
Computing has also changed, via design, our perception of reliability and truthfulness. This is what the IPCC website used to look like:
IPCC website October 2018
I am not talking about 1995. This is what the website looked like in October 2018, when the IPCC released the blockbuster 1.5°C report. It’s as if the scientists intoned to themselves: “we are in the business of producing the truth and we don’t care how it looks.” When I first downloaded the 1.5°C report, the document was clearly a PDFed Word document. Yes, the future of the planet depends on the design sensibilities of Bill Gates.
Here’s what the website looks like today:
IPCC website April 2019
They must have hired a UX consultant: clean layout, lots of white space, readable fonts. The new design sensibility is reflected in the report as well; it’s no longer a PDFed Word document. Bill Gates has morphed into Steve Jobs.
What’s my point?
The IPCC report is an artifact of the computer era: its manufacture and distribution follows the patterns of knowledge production in the 21st century. Truth needs the facts, but it also needs a feeling: in the age of alternative facts, any vehicle of the truth should feel trustworthy and accessible. White space and clean lines promote trustworthiness and accessibility. The shift in design sensibilities reflects a new awareness of the terrain the report inhabits: that it’s inherently a political document and therefore must seduce its readers as much as it conveys the facts.
It’s Only a Theory
The US right wing pioneered a line of critique that one might call “it’s only a theory,” first to dispute evolution and then to cast doubt on anything that endangered the bottom line of their capitalist masters: smoking, pollution, climate change. The doubters recognize that everyone believes two things:
1. If a claim is deductively true (or appears close to being so), everyone believes it.
2. If a claim can be directly verified, everyone believes it.

This is not the place to ask about the meaning of “deductively,” “verify” and “belief.” I will assume that these are terms you may not be able to define, but you know a case of deduction, verification or belief when you see it.
Magical science happens when 1 and 2 above combine seamlessly. Physics manages to make that happen every once in a while. The General Theory of Relativity is a good example: Einstein himself proposed three tests of the theory that followed directly from his theoretical principles, and in 1919 the bending of light was directly verified, to Einstein’s everlasting fame. Theoretical physics of the Einsteinian variety is a priestly science: rational magic that dazzles the faithful.
In comparison, climate science is a proletarian discipline, grounded in thousands of data collection efforts and an even larger number of computer models. Unlike General Relativity, which sprang unaided from the mind of a genius, climate science exposes its innards to the world. Like other entrails, the sight isn’t pretty. The data sets are noisy. They are in different, incompatible formats. Ocean acidity measurements collect one kind of data. Air temperature readings collect another kind of data. Historical records are full of gaps. Similarly, the computer models have to simplify the real world in order to be tractable. You need higher order models to calculate whether the simplifications of the lower order models omit important parameters. The edifice is laid one brick at a time. Infinite regress looms on the horizon. Both the facts and the models are manufactured with great effort.
Climate science is an archetype of the late-twentieth and early-twenty-first-century knowledge economy, where the widget is manufactured via the labor of thousands (if not millions) of unknown workers, mediated by computing devices and managed by large multinational institutions that not only have marketing and sales budgets; marketing and sales may be as important as manufacture.
It’s not romantic science but it’s eminently useful and eminently political. That’s a big part of what the computer is doing to knowledge.
No different from Google.