Surprise! Trump may have gotten something right
2 June 2017 - In another non-news development, future ex-President Trump made good on another pointless campaign promise by announcing (again) that he would pull the United States out of the Paris Climate Accord. As usual with political activity in the third millennium A.D., this action is all smoke and no fire. Howsomever, since the Paris Accord, in fact the whole climate change debate, is all smoke and mirrors to begin with, walking away from it amounts to a perversely positive step. I'm going to break with current journalistic practice by giving you some background about why I think so.
As with most chaotic systems, Earth's climate depends on a constantly shifting balance between forces of light (solar power coming in) and dark (heat radiated away into space).
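That balance can be made concrete with a textbook back-of-the-envelope calculation (my illustration, not from the original article): set the sunlight a planet absorbs equal to the blackbody power it radiates away, and solve for the equilibrium temperature. The constant names and round-number values below are standard but chosen by me for illustration.

```python
# Illustrative energy-balance sketch: absorbed sunlight vs. heat radiated
# to space. Values are standard round numbers, not precision data.

SOLAR_CONSTANT = 1361.0  # W/m^2, mean total solar irradiance at 1 AU
ALBEDO = 0.30            # fraction of incoming sunlight reflected to space
SIGMA = 5.670e-8         # Stefan-Boltzmann constant, W/(m^2 K^4)

def equilibrium_temp_kelvin(irradiance: float, albedo: float) -> float:
    """Temperature at which absorbed and radiated power balance.

    Absorbed:  (1 - albedo) * S * pi * R^2   (a disk intercepts sunlight)
    Radiated:  sigma * T^4 * 4 * pi * R^2    (the whole sphere radiates)
    Setting them equal and cancelling pi * R^2 gives
               T = ((1 - albedo) * S / (4 * sigma)) ** 0.25
    """
    return ((1.0 - albedo) * irradiance / (4.0 * SIGMA)) ** 0.25

T = equilibrium_temp_kelvin(SOLAR_CONSTANT, ALBEDO)
print(f"Equilibrium temperature: {T:.0f} K")  # roughly 255 K
```

The answer comes out near 255 K, well below Earth's observed ~288 K surface mean (the greenhouse effect supplies the difference), and because T scales as the fourth root of irradiance, small changes in solar output move the balance point only slightly.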
It all started for me in the early 1970s when, as an astrophysics graduate student bent on understanding how the Sun works, I started paying attention to how much power our favorite celestial campfire was putting out. Back then, folks generally imagined that the Sun was a stable star. They used silly terms like "The Solar Constant" to refer to the Sun's total output.
We knew even then that stars' total outputs change to various extents and over various time scales. We imagined, however, that since the Sun was in a relatively stable part of its life cycle called "the main sequence," and appeared about equally bright from day to day and year to year, it was probably stable. Real scientists (not politically motivated hacks) know from experience that whenever something seems reasonable and obvious, it's probably just not true.
There were hints, of course, that the so-called solar constant wasn't actually constant at all. For hundreds of years, we'd known that sunspot activity varied from nothing to looking like a bad case of pubescent acne on an approximately 11-year cycle. Astronomers speculated that this might hint at some variability in total solar output, but didn't know what the connection might be. They imagined that sunspots were like clouds, and that more sunspots obscuring the surface meant lower total output reaching out toward Earth.
As usual, they were both right and wrong. One of the many things we learned by the 1980s was that sunspots were a symptom of increased solar output, but that's not germane to the climate debate. What's important is that we learned thirty years ago that, for sure, the Sun's output is not constant, and anyone who tells you it is either doesn't know what they're talking about or is actively blowing smoke.
So, how much power does the Sun put out? The only way a scientist can know that is to measure it. And, the first thing a trained scientist does when trying to measure something is find out what others have already done to measure it.
It turned out that folks had been trying to measure the Sun's output - and publishing their results - since before the invention of the telescope. In researching the history of solar-output measurement, I found that two things stood out:
Intrigued, I spent a couple of years looking into solar irradiation vs. climate records. Most significantly, the historical records show two decidedly cold-weather periods, which coincided with periods of notably absent sunspot activity (called the "Maunder" and "Sporer" minima). Nobody'd paid much attention because they were blinded by the notion that more sunspots correlated with less total output. When it dawned on astronomers that the actual correlation went the opposite way, the penny dropped!
That was the time when mass media stopped blubbering about pollution causing a catastrophic "nuclear winter," and started ranting about global warming.
Of course, folks trying to drive us into panic for their own political and economic gain pointedly ignored the fact that the Maunder and Sporer minima were decidedly disastrous for the agricultural economies of their times. Mass media simply promoted the syllogism:
GLOBAL WARMING = BAD
Of course, history shows that ain't necessarily so. Then again, what liberal politician ever paid attention to history?
By the 1990s, things had progressed to the point where scholarly literature was full of pseudo-scientific reports about climate research. Still trying to keep an open mind, I read them carefully.
Again, two things stood out:
Finally, Al Gore promoted his famous "hockey stick" graph, which coupled a long period of clearly flat historical climate data with a just as clearly fictitious exponential ramp-up of future global temperatures. The former was apparently compiled from past measurements. The latter was necessarily made up from whole cloth, because nobody can measure what hasn't happened yet.
After a year or so of being bombarded by such drivel, I stopped reading.
When you find somebody's lied to you in the past, the only rational response is to expect anything they say to you in the future to be a lie as well.
Fast forward to the second decade of the third millennium.
We have an international contract that says we're supposed to take a bunch of actions we don't really know how to do, and give away a bunch of money to bribe people to make technology decisions that they aren't capable of implementing anyway, all based on pseudoscience published by people who have already been caught lying to us. Note that the lion's share of what's to be given away comes out of U.S. taxpayers' pockets.
And, oh yeah, the contract's stated goal is to do something we have no power to control.
Walking away sounds like the right thing to do.