The stock market’s recent gains have at least three plausible explanations: corporate earnings growth, the prospect of tax reform, and deregulation. Tax reform and deregulation are stated priorities of the Trump Administration and have the potential to lift the economy and generate additional earnings. Investors obviously like that prospect, though regulation itself is a tool used subversively by crony capitalists to stifle competition in their markets. Conceivably, some of the large firms that dominate major stock indices could suffer from deregulation. And I have to wonder whether the equity markets are taking the economic threat of Trumpian trade protectionism seriously. Let’s hope their apparent complacency is justified.

It’s no mystery that high taxes and tax complexity can inhibit economic growth. Let’s face it: when it comes to productive effort, we can all think of better things to do than tax planning, crony capitalist or not. The same is true of regulation: the massive diversion of resources into non-productive compliance activities stifles innovation, growth, and even the stability of the status quo. Regulation creates obstacles to activities like new construction and the diffusion of telecommunications services. And it discourages the creation of new products and services like potentially life-saving drugs and slows their introduction to market. The sheer number of federal regulations is so spectacular that one wonders how anything productive ever gets done! Patrick McLaughlin of The Mercatus Center and several coauthors tell of “The Impossibility of Comprehending, or Even Reading, All Federal Regulations”.

Regulation is more than a mere economic burden. It is the product of an administrative apparatus that is not subject to the checks and balances that are at the very heart of our system of constitutional government. That is a threat to basic liberties. Barry Brownstein offers an instructive case study of “The Tyranny of Administrative Power” involving violations of property rights in New Hampshire. The case involves the administrative machinations surrounding an installation of high-power lines.

Governmental efforts to spur innovation ordinarily take the form of spending on research, subsidies for certain technologies or favored industries (e.g., alternative energy), and large government programs dedicated to the achievement of various technological goals (e.g., NASA, DARPA). Together with regulatory rules that influence the allocation of resources, these governmental efforts are called industrial policy. An unfortunate recent example is Trump’s decision to retain the renewable fuel standard (RFS), but on the whole, industrial policy does not seem central to Trump’s effort to stimulate innovation.

It’s clear that a deregulatory effort is well underway: the so-called “deconstruction of the administrative state” hailed by Steve Bannon not long after Trump took office. First came Trump’s 2-for-1 executive order (also see here) requiring the elimination (or modification) of two rules for every new rule. In the Wall Street Journal, Greg Ip writes about changes at the FDA and the FCC that could dramatically alter the pace of innovation in the pharmaceutical and telecom industries. (If the link is gated, you can access the article on the WSJ’s Facebook page.) Speedier and less burdensome reviews of new drugs will greatly benefit consumers. An end to net neutrality rules will support greater investment in broadband infrastructure and access to innovative services. There is a new emphasis at the FCC on enabling innovative solutions to communications problems, such as Google’s effort to provide cell phone service in Puerto Rico by flying balloons over the island. The Trump Administration is also reining in an aggressive EPA, the source of many questionable rules that weaken property rights and inhibit growth. (Again, the RFS is a disappointing exception.) Health care reform could offer much needed relief from overzealous insurance regulation and high compliance costs for physicians and other providers.

But deconstructing the administrative state is hard. Regulations just seem to metastasize, so deregulatory gains are offset by continued rule-making. This is partly from new legislation, but it is also a consequence of the incentives facing self-interested regulators. With that in mind, it’s impressive that regulation has not grown, on balance, thus far into Trump’s first year in office. According to Patrick McLaughlin, zero regulatory growth has been unusual going back at least to the Carter Administration. In quoting McLaughlin, The Weekly Standard says that Trump might well earn the mantle of “King of Deregulation”, but he has a long way to go. Brookings has this interactive tool to keep track of his deregulatory progress. One item on the Brookings list is the President’s intention to withdraw from the Paris Climate Accord. That represents a big save in terms of avoiding future regulatory burdens.

I can’t help but be wary of other avenues through which the Trump Administration might regulate activity and undermine economic growth. Chief among these is Trump’s negative attitude toward foreign trade. Government interference with our freedom to engage in transactions with the rest of the world is costly in terms of both foreign and domestic prices. With something of a history as a crony capitalist himself, Trump is not immune to pressure from private economic interests, as illustrated by his recent kowtow to the ethanol lobby. Nevertheless, I’m mostly encouraged by the administration’s deregulatory efforts, and I hope they continue. The equity market apparently expects that to be the case.

“If the facts don’t suit your agenda, change them! The 18-year “hiatus” in global warming, which has made a shambles of climate model predictions, is now said to have been based on “incorrect data”, according to researchers at the National Oceanic and Atmospheric Administration (NOAA). Translation: they have created new data “adjustments” that tell a story more consistent with their preferred narrative, namely, that man-made carbon emissions are forcing global temperatures upward, more or less steadily.”

The last link provides detail on the nature of the manipulations. Perhaps surprisingly, rather large downward adjustments have been made to historical temperature data, reinforcing any upward trend in the late 20th century and hiding the current 18-year pause in that trend. Suffice it to say that the “adjustments” made by these agencies are at fairly detailed levels; some of the before-and-after comparisons shown by gifs at this link are rather astonishing. Some climate researchers have started to refer to the temperature series as “reconstructions” instead of “data”, out of respect for the legitimacy of actual data.

In the meantime, the “warmist” propaganda keeps flowing from NOAA and NASA, and it is hungrily swallowed and then regurgitated by media alarmists. The media love a good scare story. They are so complicit in reinforcing the warmist narrative that they will ignore the revelation of a faulty temperature sensor at National Airport in Washington, D.C. (another hat tip to John Crawford). It has been recording temperatures averaging 1.7 degrees Fahrenheit too warm for the past 19 months. Now that the sensor has been changed, NOAA states that it will not make any adjustments to the past 19 months of recorded temperatures from the National Airport weather station, despite the fact that they have routinely made many other changes, often without any real explanation.

Here is a recent opinion from Duke University Professor Robert Brown on the divergence of satellite and NASA/NOAA surface temperatures and the adjustments to the latter:

“The two data sets should not be diverging, period, unless everything we understand about atmospheric thermal dynamics is wrong. That is, I will add my “opinion” to Werner’s and point out that it is based on simple atmospheric physics taught in any relevant textbook. …

This does not mean that they cannot and are not systematically differing; it just means that the growing difference is strong evidence of bias in the computation of the surface record.”

Every new report issued by NOAA/NASA on record warm temperatures should be severely discounted. They are toiling in the service of a policy agenda; it will cost you dearly, and it will severely punish the less fortunate here and especially in less developed parts of the world; and it will reward the statist elite, bureaucrats and Green crony capitalists. Ronald Bailey in Reason recently weighed in on the consequences of this “apocalyptic anti-progress ideology”. Or read the wise words of Matt Ridley on “The recurrent problem of green scares that don’t live up to the hype”. Hey greens, relax! And don’t waste our resources and our well-being on precautions against exaggerated risks.

If we are ever visited or contacted by agents from an extraterrestrial civilization, what kind of society will they come from? The issue is given scant attention, if any, in discussions of extraterrestrial life, at least according to this interesting piece in The Freeman by B.K. Marcus. The popular view, and that of many scientists, seems to be that the alien society will be dominated by an authoritarian central government. Must that be the case? Marcus notes the negative views taken by such scientific authorities as Neil deGrasse Tyson toward laissez faire capitalism, and even Carl Sagan “… could only imagine science funded by government.” Of course, Tyson and Sagan cannot be regarded as authorities on economic affairs. However, I admit that I have fallen into the same trap, assuming that extraterrestrial visitors will come from a socialist society with strong central command. On reflection, like Marcus, I do not think this view is justified.

One explanation for the default view that extraterrestrial visitors will be socialists is that people uncritically accept the notion that an advanced society is a planned society. This runs counter to mankind’s experience over the past few centuries: individual freedom, unfettered trade, capitalism and a spontaneous social order have created wealth and advancement beyond the wildest dreams of earlier monarchs. Anyone with a passing familiarity with data on world economic growth, or with F.A. Hayek, should know this, but it is often overlooked. Central planners cannot know the infinitely detailed and dynamic information on technologies, resource availability, costs and preferences needed to plan a society with anything close to the success of one arranged through the voluntary cooperation of individual actors.

Many of us have a strong memory of government domination of space exploration, so we tend to think of such efforts as the natural province of government. Private contractors were heavily involved in those efforts, but the funding and high-level management of space missions (NASA in the U.S.) was dominated by government. Today, private space exploration is a growth industry, and it is likely that some of the greatest innovations and future space endeavors will originate in the private sector.

Another explanation for the popular view is the daunting social challenges that would be faced by crews in interstellar travel (IST). Given relatively short life spans, a colonizing mission would have to involve families and perhaps take multiple generations to reach its destination. There is a view that the mini-society on such a ship would require a command and control structure. Perhaps, but private property rights and a certain level of democratization would be advantageous. In any case, that carries no implication about either the society on the home planet or the eventual structure of a colony.

A better rationale for the default view of socialist ETs involves a public goods argument. The earth and mankind face infrequent but potentially catastrophic hazards, such as rogue asteroids and regions of strong radiation as the sun orbits the center of the Milky Way galaxy. These risks are shared, which implies that technological efforts to avert such hazards, or to perpetuate mankind by colonizing other worlds, are pure public goods. That means government has a classic role in providing for such efforts, as long as the expected benefits outweigh the costs. The standard production tradeoff discussed in introductory economics classes is “guns versus butter”, or national defense (a pure public good) versus private consumption. IST by an alien civilization could well require such a massive diversion of resources to the public sector that only an economically dominant central government could manage it. Or so it might seem.

As already noted, private entrepreneurs have debunked the presumed necessity that government must dominate space exploration. In fact, Elon Musk and his company SpaceX hope to colonize Mars. His motives sound altruistic, and in some sense the project sounds like the private provision of a public good. Here is an interpretation by Tim Urban quoted at the link (where I have inserted a substitute for the small time-scale analog used by the author):

“Now—if you owned a hard drive with an extraordinarily important Excel doc on it, and you knew that the hard drive pretty reliably tended to crash [from time to time] … what’s the very obvious thing you’d do? You’d copy the document onto a second hard drive. That’s why Elon Musk wants to put a million people on Mars.”

Musk has other incentives, however. The technology needed to colonize Mars will also pay handsome dividends in space mining applications. Moreover, if they are successful, there will come a time when Mars is a destination commanding a fare. Granted, this is not IST, but as technology advances through inter-planetary travel and colonization, there is a strong likelihood that future Elon Musks will be involved in the first steps outside of our solar system.

While SpaceX has raised its capital from private sources, it receives significant revenue from government contracts, so there is a level of dependence on public space initiatives. However, the argument made by Marcus at the first link above, that IST by ETs is less likely (or impossible) if they live under a socialist regime, is not based primarily on recent experience with private entrepreneurial efforts like Musk’s. Instead, it has to do with the inability of socialist regimes to generate wealth, especially the massive wealth necessary to accomplish IST.

Discussions of ETs (or the lack thereof) often center around a question known as the Fermi Paradox, after the physicist Enrico Fermi. He basically asked: if the billions and billions of star systems, even in our own galaxy, are likely to harbor a respectable number of advanced civilizations, where are they? Why haven’t we heard from them? My friend John Crawford objects that this is no paradox at all, given the vastness of space and the difficulty and likely expense of IST. There may be advanced civilizations in the cosmos that simply have not been able to tackle the problem, at least beyond their own stellar neighborhood. No doubt about it, IST is hard!

I have argued to Crawford that there should be civilizations covering a wide range of development at any point in time. In only the past hundred years, humans have increased the speed at which they travel from less than 50 miles per hour (mph) to at least 9,600 mph. The speed of light is roughly 70,000 times faster than that! At our current top speed, it would take roughly 50% longer to reach our nearest neighboring star, Alpha Centauri, than the entire span of human existence to date. With that kind of limitation, there is no paradox at all! But I would not be surprised if, over the next 1,000 years, advances in propulsion technology bring our top speed to within one-tenth of the speed of light, and perhaps much more, making IST a more reasonable proposition, at least in our “neighborhood”. There may be civilizations that have already done so.
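For readers who want to check the arithmetic, here is a minimal back-of-the-envelope sketch in Python. The 9,600 mph top speed comes from the paragraph above; the distance to Alpha Centauri (about 4.37 light-years) and a rough 200,000-year span of human existence are my own illustrative assumptions, not figures from the original sources.

```python
# Back-of-the-envelope check of the travel-time comparison above.
# Assumptions (mine, for illustration): Alpha Centauri is ~4.37 light-years away,
# and "the span of human existence" is taken to be roughly 200,000 years.

LIGHT_YEAR_MILES = 5.879e12          # miles in one light-year
DISTANCE_MILES = 4.37 * LIGHT_YEAR_MILES
HOURS_PER_YEAR = 24 * 365.25

def years_to_alpha_centauri(speed_mph: float) -> float:
    """Travel time in years at a constant speed, ignoring acceleration and deceleration."""
    return DISTANCE_MILES / speed_mph / HOURS_PER_YEAR

current_top_speed_mph = 9_600               # cited in the text above
tenth_of_light_mph = 0.1 * 670_616_629      # one-tenth the speed of light, in mph

trip_now = years_to_alpha_centauri(current_top_speed_mph)
trip_fast = years_to_alpha_centauri(tenth_of_light_mph)

print(f"At {current_top_speed_mph:,} mph: about {trip_now:,.0f} years")
print(f"At one-tenth of light speed: about {trip_fast:.0f} years")
print(f"Trip time / 200,000 years of human existence: {trip_now / 200_000:.2f}")
```

Run as written, the sketch gives a trip of roughly 300,000 years at today’s top speed, about 1.5 times the assumed span of human existence, versus around 44 years at one-tenth of light speed.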

Answers to the Fermi Paradox often involve a concept called the Great Filter. This excellent HuffPo article by Tim Urban on the Fermi Paradox provides a good survey of theories on the Great Filter. The idea is that there are significant factors that prevent civilizations from advancing beyond certain points. Some of these are of natural origin, such as asteroids and radiation exposure. Others might be self-inflicted, such as a thermonuclear catastrophe or some other kind of technology gone bad. Some have suggested that the Large Hadron Collider in Switzerland could be a major hazard to our existence, though physicists insist otherwise. Another example is the singularity, when artificial intelligence overtakes human intelligence, creating a possibility that evil machines will do us in. The point of these examples is that some sudden or gradual development could prevent a civilization from surviving indefinitely. These kinds of filters provide an explanation for the Fermi Paradox.

More broadly, there could be less cataclysmic impediments to development that prevent a society from ever reaching an advanced stage. These would also qualify as filters of a sort. Perhaps the smart ETs lack, or failed to evolve, certain physical characteristics that are crucial for advancement or IST. Or their home planet might be light on certain kinds of resources. Or perhaps an inferior form of social organization has limited development, with inadequate wealth creation and technologies to transcend the physical limitations imposed by their world. On a smaller-than-planetary scale, we have witnessed such an impediment in action many times over: socialism. The inefficiencies of central planning place limits on economic growth, and while high authorities might dictate a massive dedication of resources toward science, technology and capital-intensive space initiatives, the shift away from personal consumption would come at a greater and greater cost. The end game may involve a collapse of production and a primitive existence. Such an effort may therefore be unsustainable and could lead to social upheaval; a more enlightened regime would attempt to move the society toward a more benign allocation of resources. Whether they can ever accomplish IST is at least contingent on their ability to create wealth.

Socialism is a filter on the advancement of societies. ETs capable of interstellar travel could not be spawned by a society dominated by socialism and central planning. While government might play a significant role in a successful ET civilization, one capable of IST, only a heavy reliance on free-market capitalism can improve the odds of advancing beyond a certain primitive state. Capitalism is a relatively easy ticket to the wealth required for an advanced and durable civilization, and conceivably to the reaches of the firmament.

Unfortunately, there is absolutely no guarantee that capitalistic ETs will be friendly toward competing species, or that they will respect our property rights. They might be big, smart cats and find us mouse-like and quite tasty. Their children might make us perform circuses, like fleas. In any case, if ETs get this far, it’s probably because they want our world and our resources. My friend Crawford says that they won’t get here in any case. He believes that the difficulty of IST will force them to focus on their own neighborhood. Maybe, but on long enough time scales, who knows?

I would add a caveat to conclusions about the strength of the filters discussed above. A capitalistic society might reach a point at which it could send artificially intelligent, self-replicating machines into space to harvest resources. Those machines might well survive beyond the end of the civilization that created them. Conceivably, those machines could act autonomously or they could take coordinated action. But we haven’t heard from them either!

It’s easy to make big headlines that serve a policy agenda when you can control the process generating “scientific” data. Here’s the latest in an ongoing fraud perpetrated by NASA, NOAA and a few other organizations. The disinformation is happily scooped up and reported by the unsuspecting news media, in this case The Wall Street Journal. The headline says that 2014 was the warmest year on record back to 1880, but there are several important respects in which the report from NASA and NOAA is misleading.

The surface temperature records maintained by NASA and NOAA (and others) utilize the same source data (despite NASA’s claim that the two series are “independent”), but they are heavily adjusted by the respective agencies. We can all probably agree that more recent temperature measurements (the raw data) are more reliable due to the availability of better and more numerous instruments (particularly for ocean surface temperatures). However, combining recent measurements with older data in a way that assures comparability is difficult over more than a few decades. Weather stations come, go, and relocate, environmental conditions around stations change with urbanization and airport expansions, and new measurement techniques are introduced.

Constructing a consistent temperature series over 130+ years at the world or regional level is therefore subject to much controversy. Here is a page with links to several good posts on the problems inherent in these efforts. Data is “infilled” and sometimes deleted, and statistical techniques are often applied in an effort to achieve consistency over time. However, it is curious that the NASA and NOAA adjustments over time seem to pivot around the levels of the 1950s and 1960s, as if to suggest that the temperatures measured in those decades are the most reliable part of the series. Take a look at the “gifs” in this post, which show temperatures before and after adjustments. An apparent consequence of the NASA / NOAA statistical techniques, which may seem even more curious to the casual observer, is that new observations can influence the entire temperature series. That is, adding 2014 temperatures to the series may lead to fresh downward adjustments to 1936 temperatures, if it suits the agencies. By the way, 1936 was a very warm year, but according to these agencies, it’s been getting less warm.
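To make the mechanics a bit more concrete, here is a minimal sketch of one simple step in building such a series: converting raw readings into anomalies relative to a fixed base period. The temperatures below are made up, and this is emphatically not the agencies’ actual homogenization code (NOAA’s pairwise homogenization algorithm is far more elaborate); it only illustrates why an adjusted series naturally “pivots” around whatever base period is chosen, here a 1951-1980 window of the kind GISS uses.

```python
# A minimal, hypothetical illustration of re-expressing a temperature series as
# anomalies relative to a fixed base period. Not the agencies' actual method.
import statistics

# Made-up annual mean temperatures (deg C) for 1931-1990, one value per year,
# with a gentle artificial warming trend.
years = list(range(1931, 1991))
raw_temps = [14.0 + 0.01 * (y - 1931) for y in years]

BASE_START, BASE_END = 1951, 1980     # a GISS-style base period

base_mean = statistics.mean(
    t for y, t in zip(years, raw_temps) if BASE_START <= y <= BASE_END
)

anomalies = [round(t - base_mean, 3) for t in raw_temps]

# By construction, values inside the base period hover near zero, while earlier
# and later years are shifted down or up relative to that mid-century anchor.
print(f"Base-period mean: {base_mean:.3f} deg C")
print("1931 anomaly:", anomalies[0], "  1990 anomaly:", anomalies[-1])
```

The real adjustments involve far more than a baseline shift (infilling, breakpoint detection, pairwise station comparisons), which helps explain why appending new observations can ripple back and alter estimates for earlier decades, as noted above.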

To return to the specific respects in which the 2014 report is misleading (a simple numerical illustration of the first point appears after this list):

1) The range of uncertainty cited by NOAA in background documents indicates that the small margin (0.04 deg C for NOAA, 0.02 deg C for NASA) by which the reported 2014 global temperature exceeds the previous high is within the confidence interval around the previous high. By their own standard, it was “more unlikely than likely” that the 2014 temperature was the warmest on record, but that is not what the agencies report in their “Highlights.”

2) The report states that “This is the first time since 1990 the high temperature record was broken in the absence of El Niño conditions at any time during the year in the central and eastern equatorial Pacific Ocean….” Yet there were El Niño conditions elsewhere in the Pacific in 2014.

3) “NOAA failed to discuss the actual causes of the elevated global sea surface temperatures in 2014, while making it appear that there was a general warming of the surfaces of the global oceans.”
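To put point 1) in perspective, here is a rough sketch of the kind of probability calculation involved. The 0.04 deg C margin is NOAA’s figure cited above; the ±0.09 deg C (95%) uncertainty attached to each annual value and the assumption of independent, normally distributed errors are mine, purely for illustration.

```python
# Rough illustration of point 1): when the margin of "record" warmth is of the same
# order as the measurement uncertainty, the record claim is far from definitive.
# The 0.04 C margin is cited in the text; the 0.09 C (95%) per-year uncertainty
# and the independent-normal-error assumption are illustrative assumptions only.
import math

margin = 0.04                    # deg C, reported 2014 value minus previous high (NOAA)
sigma = 0.09 / 1.96              # one standard deviation implied by an assumed 95% interval

# The difference of two independent normal errors has standard deviation sqrt(2) * sigma.
sigma_diff = math.sqrt(2) * sigma

# Probability that the true 2014 temperature exceeds the true previous record,
# given the reported margin and the assumed uncertainties.
p_warmer = 0.5 * (1.0 + math.erf(margin / (sigma_diff * math.sqrt(2))))
print(f"P(2014 truly warmer than the previous record) is roughly {p_warmer:.2f}")
```

Even this simplest two-year comparison yields a probability of only about 0.7, far from a definitive record; comparing 2014 against all of the statistically close contending years, as the agencies’ own uncertainty analysis evidently did, pushes the chance below one half and produces the “more unlikely than likely” language quoted above.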

Tisdale notes elsewhere that the tiny margins of “record warmth” reported by NASA and NOAA contribute to a growing disparity between reported “actual temperatures” and those projected by climate warming models. The “Warmist” community will view the NASA / NOAA findings favorably, as the new “record high” supports their narrative, providing new fodder for the agenda to end the use of fossil fuels and to regulate activities deemed “unsustainable.” Unfortunately, the misleading reports are likely to seem credible to the general public, which is largely ignorant of the agencies’ rampant manipulation of temperature data.

A new paper reported here debunks an important feature of IPCC climate models: that the oceans absorb infrared radiation from greenhouse gases, thus heating the oceans and accounting for the “missing heat” those models predict. No, they do not. The research, which appeared in the Proceedings of the National Academy of Sciences, identified several physical reasons that ocean warming from CO2 is all but impossible. From the link above:

“For all … of these physical reasons… ocean warming can only be related to solar activity and modulators of sunshine at the surface like clouds, and not increased far-IR radiation from increased greenhouse gases.

This is a death knell for conventional climate models, which falsely assume the opposite of the … physical reasons above, thus falsely claiming IR from greenhouse gases can heat the oceans (70% of Earth’s surface area) and where allegedly 90% of the ‘missing heat’ has gone.”

One of those physical reasons is related to whether water and water vapor act as “blackbodies,” which is assumed by climate models embodying AGW. They do not:

“The significance to the radiative ‘greenhouse effect’ is that the climate is less sensitive to both CO2 and water vapor since both are less ‘greenhouse-like’ emitters and absorbers of IR radiation as temperatures increase.”

So the oceans are not the massive AGW heat sinks that we hear about so often. And much of that “nasty” CO2 finds eager vegetative consumers: this article reports research suggesting that 90% of CO2 emissions are stimulating forest growth around the world:

“Even NASA’s own satellite data shows that the planet is steadily greening, by as much as 1.5 percent a year in northern latitudes. Yet in May last year, the world’s media mournfully reported that atmospheric CO2 had just passed the 400ppm mark for the first time in three to five million years, with NASA clamouring to paint the news in a calamitous light. …

Nova says ‘the northern Boreal forests are probably drawing down something like 2 – 5 gigatons of CO2 every year, and because the seasonal amplitude is getting larger each year, it suggests there is no sign of saturation. Those plants are not bored of extra CO2 yet. This fits with Craig Idso’s work on plant growth which demonstrates that the saturation point — where plants grow as fast as possible (and extra CO2 doesn’t help) is somewhere above 1000 and below 2000ppm. We have a long way to go.’”

I believe a greener world is preferable to a less green one. In fact, I believe a somewhat warmer world is preferable. That would bring many obvious benefits to mankind, not least of which is a reduction in weather-related misery and death. (No, severe weather is not an implication of a warmer climate.) I therefore find it bizarre that so many have been successfully propagandized to believe that we should sacrifice vast amounts of resources to prevent AGW. It is not a danger of much significance. There are explanations for the propaganda, of course, but they will have to be the subject of another post.

In advanced civilizations the period loosely called Alexandrian is usually associated with flexible morals, perfunctory religion, populist standards and cosmopolitan tastes, feminism, exotic cults, and the rapid turnover of high and low fads---in short, a falling away (which is all that decadence means) from the strictness of traditional rules, embodied in character and enforced from within. -- Jacques Barzun