Some weekend thoughts on causation and climate

I’m reading The Book of Why by Judea Pearl and Dana Mackenzie, which explains Pearl’s career-long struggle to put causation back into the science that was once statistics and is now becoming something larger. By “was once statistics” I mean the school devoted to radical empiricism: there is nothing but the data, a view (perhaps unfairly) summed up by the clichéd phrase, “correlation does not imply causation”. (I once had an anonymous reviewer write this to me on a manuscript I published.) I say “something larger” because I suspect Pearl isn’t attacking the correlation-causation barrier so much as he’s setting up an argument for ignoring it and looking for something more interesting. In some fundamental sense nothing really can be said to be causal in a way that would satisfy a layman, or even someone like David Hume. Yet there are other things — macroscopic ones — that are self-evidently causal. So it’s not so much a great mystery as it is a paradox, and therefore due not to complete misunderstanding but to incomplete knowledge.

Pearl, helpfully, points out that choosing to see “only the data” requires arbitrarily deciding where to set walls and how to ordinate the data. Arranging data in a co-ordinate space and fitting a function to it (what statisticians call a model) requires selecting so-called independent and dependent variables; and then — in the final form of complete understanding of, and mastery over, the data — writing the function down as an equation. Equations are wonderful things, but they’re premised on an axiomatic understanding of … not nature but, I guess, the rational nature of the universe. For the ancient Greeks, who didn’t use symbolic equations as we do, it was important that abstract numbers, and the algorithms that operated on them, could be equated to geometry. It took more than 2000 years, and Bertrand Russell, to try to universalize these self-evident, axiomatic “truths” in mathematics.
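To make that concrete, here is a minimal sketch in Python (my own toy illustration, not an example from Pearl’s book): even for the simplest dataset, the analyst decides which variable is “independent”, which is “dependent”, and what functional form the equation takes, before the data get any say at all.

```python
import numpy as np

# Toy data. We decide, before any fitting, that x is the "independent"
# variable and y the "dependent" one; that choice is ours, not the data's.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 10.0, 50)
y = 2.5 * x + 1.0 + rng.normal(scale=2.0, size=x.size)

# We also decide the functional form: here, a straight line y = a*x + b.
a, b = np.polyfit(x, y, deg=1)
print(f"fitted equation: y = {a:.2f} * x + {b:.2f}")

# Swapping the roles of x and y (or choosing a curve instead of a line)
# produces a different "model" of the very same data.
a_rev, b_rev = np.polyfit(y, x, deg=1)
print(f"reversed fit:    x = {a_rev:.2f} * y + {b_rev:.2f}")
```

Nothing in the numbers tells you which of these two fits describes a cause and which merely a correlation; that information has to come from outside the data.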

My dog bothering me in my hammock.

I discovered math through physics, and physicists are infamous (among mathematicians) for playing fast and loose with math rules. (Young physicists learn that the surest way to understand a thing is to take it apart and build it again, hoping it works. In terms of mathematics, as applied to physical theories of nature, this amounts to intentionally breaking theories, beauty be damned.) In physics, self-evident truthy-truths have a history as old as natural philosophy. Galileo set up his famous experiment showing that all objects fall at the same rate — that we are all subject to some universal ruleset — and thereby disproved what Aristotle had claimed. (This must be Aristotle’s laziest statement about observed reality, and I’ve always had trouble believing it was he who wrote it.) Copernicus saw, and Kepler computed, that the planets orbit the sun in regular patterns tied to their distances from it, patterns Newton would later trace to a force that weakens as the inverse square of distance. But no one knew why the planets should move at all. They speculated about some Prime Mover, leaving room for gods, though who knows whether they believed it. For Galileo and Copernicus, and to some degree Kepler, who was quite a wily survivor, merely pointing out cracks in Aristotle’s elegant theory, which was by then doctrinal in the Roman church, was dangerous. Newton showed that the same rules were self-consistent everywhere and that the Prime Mover of the planets was the same force that caused Galileo’s weights to fall. But he never explained where this force — gravity — came from.

Almost. Actually there was some indication of an underlying source, or what we would today call the structure of the universe: the gravitational constant, written as G. Newton and his contemporaries were frustrated by their inability to measure the absolute mass of large objects, like the sun and planets. Instead, they came up with proportionalities. It was Cavendish’s experiment, which was (I think?) the first in a long trend toward ever more complex and careful measurement apparatuses, that put a number on G. And we still believe G speaks for the fundamental structure underlying nature. One of the greatest moments in the history of human knowledge must have been when Coulomb wrote his equation for the interaction between charged bodies in the same form as Newton’s gravity. Suddenly two utterly unlinked (it seemed then) forces of nature were shown to operate in basically the same way, and with a similarly baffling constant, k.
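Written out side by side (the standard textbook forms, not anything quoted from the book), the parallel is hard to miss:

```latex
\[
F_{\text{gravity}} = G\,\frac{m_1 m_2}{r^2},
\qquad
F_{\text{electric}} = k\,\frac{q_1 q_2}{r^2},
\qquad
k = \frac{1}{4\pi\varepsilon_0}.
\]
```

Same inverse-square geometry, same product of two “charges” (mass in one case, electric charge in the other), and in each case a measured constant standing in for something deeper that no one could yet explain.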

James Clerk Maxwell was the brilliant physicist from Edinburgh who unified the strange phenomena of electricity and magnetism mathematically. Maxwell didn’t even have the math in his day to represent electromagnetism compactly, so later physicists, notably Oliver Heaviside, recast his theory into the four inter-related (field) equations we use today. Maxwell’s equations were regarded by Einstein as the most beautiful and complete theory in science. (Until perhaps his own General Relativity, although I don’t know that he did, or would have, said as much, as his esteem for Maxwell was so great. I also placed “field” in parentheses because they do not all deal with fields directly, but their representation of a single unified E&M theory presaged the direction physics would take to this day — its pure essence captured in equations.) In combination, Maxwell’s equations produced a new number with the dimensions of speed. This was the first time the speed of light, termed c, which had previously been measured roughly by some ingenious methods, emerged from pure theory. So here are at least three important constants: G, k (which can be written in terms of the vacuum permittivity ε0, related to electricity), and c (which falls out of ε0 and the vacuum permeability μ0, related to magnetism).
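Here is that result in its modern form (standard SI values, rounded; my addition for illustration):

```latex
\[
c = \frac{1}{\sqrt{\mu_0\,\varepsilon_0}}
  \approx \frac{1}{\sqrt{(4\pi\times 10^{-7})\,(8.854\times 10^{-12})}}
  \approx 3.0\times 10^{8}\ \text{m/s}.
\]
```

Two constants measured on lab benches with coils and capacitors combine into the speed of light, with no light involved anywhere in the measurement.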

I’ve used the universal constants above to illustrate how raw numbers have emerged from experimentation and contributed to an ever-growing understanding of the underlying structure of nature. This empirical-theoretical model of nature is as fundamental as it gets: the Standard Model. The long process of assembling the Standard Model, finding new particles experimentally and linking these with new symmetries theoretically, has depended on quantum mechanics (QM). QM lays out recipes for calculating particle (more generally, field) interactions that scaffold our understanding of nature. And yet no one really knows what QM means, in a way that I would have considered satisfying before learning all about it. Instead, it requires that we accept some basic axioms that appear to be true and, the joke goes, “shut up and calculate”. Feynman’s commonly repeated warning that no one really understands QM isn’t exactly true; rather, we just don’t understand it in a way that’s satisfying.

Roughly speaking, QM followed from a guess Erwin Schrödinger made that particles behaved sufficiently like waves that their measurable quantities (usually energy) could be calculated from a wave equation: the rate of oscillation of the wave was proportional to its energy. The equation worked if one plugged in an abstract “wave function” (ψ) with non-continuous, quantized (hence, quantum) frequencies of oscillation. Max Born conjectured that the squared magnitude of the wave function (|ψ|²) could be interpreted as the probability density for a particle, i.e., the likelihood that a particle would be found in some state, |ψ(x)|². In nature, we tend to observe things (like particles) in localized states. For instance, opening a box and finding a dead cat would be unpleasant but unremarkable; opening a box and finding a fuzzy image of a cat in a simultaneous live-dead state would be impossible. This so-called Schrödinger’s cat paradox was meant to highlight the absurdity of the “measurement problem” in the Copenhagen Interpretation, and was pointed out by Schrödinger himself. A QM description of the cat in the box required it to be a fuzzy superposition of probabilistic alive- and dead-states; making a measurement by opening the box caused the wave function ψ to “collapse” into a single (local) alive- or dead-state, necessarily one or the other. This has profoundly weird implications.
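In modern textbook notation (my paraphrase, not a quotation of anyone above), the three ingredients of that story are Schrödinger’s equation for how ψ evolves, the proportionality between oscillation rate and energy, and Born’s rule for turning ψ into probabilities:

```latex
\[
i\hbar\,\frac{\partial \psi}{\partial t} = \hat{H}\,\psi,
\qquad
E = \hbar\,\omega,
\qquad
P(x,t) = |\psi(x,t)|^{2}.
\]
```

The first two are uncontroversial machinery; it is the third, the jump from a smoothly evolving ψ to a single observed outcome, that carries all the weirdness.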

Einstein felt this instinctively; or so I was taught: that he felt it, which is to say he did not know it. I have since come to see Einstein’s objections to QM (the “God does not play dice” stuff) as more serious than I was taught, and at least as well grounded in logic as orthodox QM itself. As an undergraduate, I was only ever taught orthodox QM, later called the Copenhagen Interpretation. I regret this a lot, because it was the moment I ceased to believe anyone could ever really understand physics, and I fell out of love with it for a couple of decades. I credit Sean Carroll’s excellent Something Deeply Hidden with re-introducing me to a more complete way of appreciating QM; namely, in terms of the so-called Many-Worlds interpretation first proposed by Hugh Everett (Everettian QM). But there are others, too. The so-called pilot wave interpretation (or de Broglie-Bohm QM) is one I’ve not spent sufficient time on. Back then, I had to convince myself that the universe really was just statistical in nature. This heaped enormous significance on statistics, because it meant that statistical rules are somehow fundamental.

When I came back to using statistical methods in my professional life, I treated statistical tests like mathematical proofs. But in fact they are nothing of the sort. Statistics is a human-derived means of making quantifiable arguments. And this is why I’m finding Pearl’s book so interesting. As a computer scientist who needed machines to reason efficiently under uncertainty, he was not afraid to push back against orthodoxy; to risk heresy by layering causal inference on top of correlated data.
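Here is a toy simulation of my own (not an example from the book, though it is in Pearl’s spirit) showing why that layering matters: a hidden confounder Z drives both X and Y, so the two are strongly correlated even though X has no causal effect on Y, and only an intervention (Pearl’s do-operator, simulated here by setting X ourselves) reveals the difference.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# A hidden confounder Z drives both X and Y; X has NO causal effect on Y.
z = rng.normal(size=n)
x = 2.0 * z + rng.normal(size=n)
y = 3.0 * z + rng.normal(size=n)

# "Seeing": in observational data, X and Y look strongly related.
print("observational corr(X, Y):", round(float(np.corrcoef(x, y)[0, 1]), 2))

# "Doing": intervene by setting X ourselves, severing its link to Z,
# while Y keeps being generated by the same mechanism as before.
x_do = rng.normal(size=n)
y_do = 3.0 * z + rng.normal(size=n)
print("interventional corr(X, Y):", round(float(np.corrcoef(x_do, y_do)[0, 1]), 2))
```

The first correlation comes out around 0.85, the second around zero; the data alone could never have told you which number answers the question “what happens to Y if I change X?”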

Back on planet Earth now.

I just read a Washington Post piece, “The climate scientists are not alright”, that got me thinking — and apparently writing down some thoughts. When I became a climate scientist, I brought with me my physicist’s skepticism and instinct to break established theories. Although I believed in what the IPCC was telling the world, the “scientific consensus” smacked to me of the “Copenhagen Interpretation”. I was deeply bothered by how some climate scientists had politicized themselves. I took comfort in orthodox statistics: that measurements are sacred and causation is impossible to prove from the data alone. But while I understood there were well-funded fossil fuel interests pushing back against the science, I didn’t really grok what that meant in practice. Their strategy was not to disprove the science but to sow doubt; to draw the scientists into a political, litigious arena where the fossil fuel lobby was strongest.

Now I am comforted, somewhat, by the explosive growth of climate change attribution science, particularly by the World Weather Attribution (WWA) initiative. They have used reams of new climate data to show, in the simple language of orthodox statistics, that the observed changes, and many individual extreme events, would have been next to impossible without anthropogenic GHGs. That GHG emissions can now be connected to sectors, and soon sectors to individual industries (and major legacy emitters), is all part and parcel. So, to my fellow humans, don’t despair over climate change too much.
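The arithmetic underneath those statements is refreshingly plain (these are the standard attribution quantities, sketched here from memory rather than taken from any particular WWA study): compare the probability P1 of an event in the climate we have with its probability P0 in a modelled counterfactual climate without anthropogenic GHGs.

```latex
\[
\text{PR} = \frac{P_1}{P_0},
\qquad
\text{FAR} = 1 - \frac{P_0}{P_1}.
\]
```

A probability ratio (PR) of 5 means the event has become five times more likely; the corresponding fraction of attributable risk (FAR) of 0.8 says four-fifths of today’s risk of that event is down to us.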

That’s it. No inspired conclusion. Just thoughts.

Featured image: Along the pathway between Pershing Park and Santa Barbara City College, nasturtiums on the hillside.
