A “grand minimum” could overtake the sun perhaps as soon as 2020 and last through 2070, resulting in diminished magnetism, infrequent sunspot production… — all bringing a cooler period to the planet that may span 50 years [link]

Insights into Atlantic multidecadal variability using the Last Millennium Reanalysis framework [link]

Not long ago, the idea of New England relying on Russian heating fuel shipped halfway around the world would have been laughed right off Boylston Street. But the fuel situation in New England is ominous. [link]

Interesting article on Patrick Allitt’s new book A Climate of Crisis: America in the Age of Environmentalism [link]

Current estimates indicate the initial impacts of climate change may be positive (~1°C), but in the long run the negative impacts dominate. Negative impacts are substantially greater in poorer, hotter, & lower-lying countries. @RichardTol [link]…

Engineering put New Orleans below sea level. Now the city’s future is at risk, [link]

“We worry whether a 2°C pathway is feasible, but we need to apply the same thinking & logic to baseline scenarios.” Large spread in baseline scenarios, & many good arguments that high-end scenarios are infeasible. [link]…

About science and scientists

Andrew Sullivan: problems with the university identity-based social justice movement [link]

Assimilating a study can take days, weeks or months. Quality takes time and thought.

While I am here – the hydrological rainfall post-processing paper uses ACCESS-S, a model not yet operational, but one that uses components from a number of sources in a global project to improve seasonal forecasting.

“A WWRP/THORPEX-WCRP joint research project has been established to improve forecast skill and understanding on the subseasonal to seasonal timescale, and promote its uptake by operational centres and exploitation by the applications community. Specific attention will be paid to the risk of extreme weather, including tropical cyclones, droughts, floods, heat waves and the waxing and waning of monsoon precipitation. To achieve many of these goals the establishment of an extensive data base of subseasonal (up to 60 days) forecasts and reforecasts (sometimes known as hindcasts) has been advocated, modelled in part on the THORPEX Interactive Grand Global Ensemble (TIGGE) database for medium range forecasts (up to 15 days) and the Climate-System Historical Forecast project (CHFP) for seasonal forecasts. Developing an extensive data base for the subseasonal time scale will be a challenging task since consensus still needs to be reached on how to produce these forecasts (start dates, length of the forecasts, averaging periods, update frequency of the forecasts). For NWP forecasts, model error is not usually so dominant that a reforecast set is needed but for the subseasonal to seasonal range model error is too large to be ignored. Therefore an extensive reforecast set spanning several years is needed to calculate model bias, which in some cases can also be used to evaluate skill.”

“We will also be adding our own data assimilation and ensemble generation scheme, which is appropriate for generating multi-week forecasts. Data assimilation can be at least as complex as the model itself! Going forwards, we will be playing a significant role in contributing to the development of the UKMO model, specifically focussing on atmospheric and oceanic processes that are critical to the Australian region.” http://poama.bom.gov.au/info/access-s.html

A “grand minimum” could overtake the sun perhaps as soon as 2020 and last through 2070, resulting in diminished magnetism, infrequent sunspot production… — all bringing a cooler period to the planet that may span 50 years

Of course the linked article offers no explanation as to why a grand minimum could take place from 2020, so the date is given for no particular reason. The article is then about a paper that says absolutely nothing about when a grand minimum could take place. As usual, the scientific journalism is appalling.

“Using past variations of solar activity measured by cosmogenic isotope abundance changes, analogue forecasts for possible future solar output have been calculated. An 8% chance of a return to Maunder Minimum-like conditions within the next 40 years was estimated in 2010 (ref. 2). The decline in solar activity has continued, to the time of writing, and is faster than any other such decline in the 9,300 years covered by the cosmogenic isotope data1. If this recent rate of decline is added to the analysis, the 8% probability estimate is now raised to between 15 and 20%.” https://www.nature.com/articles/ncomms8535

It is relevant to changes in polar surface pressure and feedbacks in the world’s oceans and atmosphere potentially leading to significant cooling in higher northern latitudes and more upwelling in the eastern Pacific.

Ah, yes. Except that the Sun spends 17% of its time in a grand minimum state, according to Usoskin. So saying 8–20% probability is saying nothing, especially since more than 1,000 years can pass between SGM, as they tend to occur in clusters. And as we all know, “the fastest rate in X thousands of years” is an empty claim when the rate is not measured in the same way and with the same resolution. This ain’t science.

Ah yes, the problem of the sources. Lockwood, 2010, “Solar change and climate: an update in the light of the current exceptional solar minimum”, is a very weak paper fraught with problems.

First, he bases the current situation on a solar grand maximum that has been shown to be an artifact of sunspot accounting. The new sunspot number, in use since 2015, does not support a current solar grand maximum.

Second, he uses Steinhilber et al., 2008 data. Steinhilber redid his reconstruction in 2012 to correct for errors from using a single isotope.

Third, and most important, he assumes solar grand minima follow a random distribution, and thus that their probability can be calculated from their frequency. But Usoskin has already demonstrated that solar grand minima are not randomly distributed. Lockwood’s probabilities are not correct.

And fourth and last, Lockwood does not conclude what Ineson et al., 2014 (yes, including Lockwood) are saying. From Lockwood 2010: “Hence, although there is a chance that the current solar minimum heralds a rapid return to MM conditions in the next century, a smaller fall is actually more likely and MM-like conditions are most likely to be 100–200 years into the future.”

Since we know SGM are not randomly distributed, it is most likely to be 400–500 years into the future.

And as for the claim that the decline is “faster than any other such decline in the 9,300 years covered by the cosmogenic isotope data,” it is pure fantasy.

Clustering in fact suggests an analogy with Hurst effects in Nile River flows. Hurst effects are the result of chaotic dynamics that are ubiquitous – including within the Sun. And the statistics of this are approached in an entirely different way.

Hydrologists are aware of ‘clustering’ – there are a number of names for it, but the origin is chaos in internal planetary variability! In statistical terms it simply means that the series is stationary over a much longer period than usually contemplated. It doesn’t invalidate traditional stochastic analysis, but it does have implications for water resources management in the shorter term. As in the broken link to GRACE and groundwater.
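The Hurst effect hydrologists talk about can be illustrated numerically. Below is a minimal rescaled-range (R/S) sketch in Python; the test series is plain white noise (an illustrative assumption, not real Nile or solar data), for which the estimated exponent should land near 0.5, whereas a persistent, clustered series gives H > 0.5:

```python
import random, math

def rescaled_range(series, n):
    """Average R/S statistic over non-overlapping windows of length n."""
    rs = []
    for start in range(0, len(series) - n + 1, n):
        w = series[start:start + n]
        mean = sum(w) / n
        dev = [x - mean for x in w]
        # running sum of deviations from the window mean
        cum, z = 0.0, []
        for d in dev:
            cum += d
            z.append(cum)
        r = max(z) - min(z)                     # range of cumulative deviations
        s = math.sqrt(sum(d * d for d in dev) / n)  # window standard deviation
        if s > 0:
            rs.append(r / s)
    return sum(rs) / len(rs)

def hurst(series, sizes=(16, 32, 64, 128, 256)):
    """Hurst exponent: slope of log(R/S) against log(window size)."""
    xs = [math.log(n) for n in sizes]
    ys = [math.log(rescaled_range(series, n)) for n in sizes]
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
           sum((x - mx) ** 2 for x in xs)

random.seed(42)
white = [random.gauss(0, 1) for _ in range(4096)]
print(round(hurst(white), 2))   # white noise: estimate should be near 0.5
```

For a clustered series the same slope comes out well above 0.5, which is exactly the long-memory signature Hurst found in the Nile flows.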

And – ah yes – I am aware of the Clette et al revision of the group number.

“We finally conclude with future prospects opened by this epochal revision of the Sunspot Number, the first one since Wolf himself, and its reconciliation with the Group Number, a long-awaited modernization that will feed solar cycle research into the 21st century.”

One problem is that it doesn’t seem compatible with the isotope record. Or with other recent TSI reconstructions.

“These estimated solar irradiances for the last 400 years are based on the NRLTSI2 historical TSI reconstruction model by J. Lean and described by Coddington et al. The values of the NRLTSI2 model have been offset a small amount for agreement with recent SORCE/TIM values and replaced by SORCE/TIM annual averages from 2003 onward. The historical reconstruction provided here was computed using TIM V.17 data in January 2017. It is updated annually as new TIM data are available or as improved historical reconstructions are created.”

And – ah yes – I am aware of the Clette et al revision of the group number.

Perhaps you are also aware that after a series of workshops to discuss the revisions, the new number has been accepted as the official international sunspot number by SILSO (Sunspot Index and Long-term Solar Observations): “Since July 1st 2015, the original Sunspot number data have been replaced by a new entirely revised data series.” http://www.sidc.be/silso/datafiles

One problem is that it doesn’t seem compatible with the isotope record. Or with other recent TSI reconstructions.

Of course it is compatible with isotope records, otherwise it would have not been accepted.

The new revision suggests the lack of a 20th-century solar grand maximum. It is clearly not consistent with the reconstruction on the SORCE/TIM page.

I have bookmarked the isotope paper for a more in-depth perusal – although with my new digital research assistant that hardly seems necessary. You may have overstated the correspondence with the isotope record.

The obvious example is that just 3 people take over 30% of the comments, and you are one of them. Then some people wonder why there is less participation in the comments section of articles. Who wants to read that, article after article? I don’t. I just don’t feel like participating in the comments section if it is taken over by people like you. Bye now. Don’t bother replying, as I won’t be reading it.

Lubin said that an upcoming grand minimum would not stop the current trend of planetary warming but might slow it somewhat. That’s because the cooling effect of a grand minimum is only a fraction of the warming effect caused by the increasing concentration of carbon dioxide in the atmosphere.

That’s pretty much how all this government-funded climate science goes – e.g., we just learned of the new God-factor involved in global warming, but the human factor still trumps nature. Honest (they made me say that).

I have to admit that I have thought about the putative meaning of the cited p-values and confidence intervals in the last post about sea level rise, especially for the few long time series. What do these actually refer to? What is the null hypothesis? Is it always the same?
I was left scratching my head. No idea.
So what was the meaning of that post?

What may be the underlying statistical time series model that is supposed as a reference for sea level rise questions? What may be its validity?

I’d like to understand the assumptions that are needed for this conclusion (in the Abstract):

However, in the long run the negative impacts dominate the positive ones.

I’d also like to understand what proportion of the impacts are contributed by the main impact sectors; from Tol (2013) the main impact sectors were:
agriculture, health, sea level rise, storms, fresh water, ecosystems, energy.

Tol has a flaw in his model (all of them have the same flaw, so this isn’t a big deal other than all of them get it wrong). The model fails to deal with fossil fuel depletion and the price increases beyond inflation needed to produce them in growing quantities as is assumed in the baseline scenarios. There is one other flaw in all IAMs, related to how they deal with human dynamic response (they do a poor job, but this issue is too technical for me to discuss it here). However Tol is brighter than most of the individuals doing this type of model, so maybe some day he will grasp these concepts.

Yes. BIG for oil (see several essays in ebook Blowing Smoke), medium for coal ( has to do with declining seam thickness,increasing depth, and declining rank), and least for nat gas thanks to shale source rock fracking.

Thank you for your comment. Can you please point out where the error is in the FUND impact equations; the energy impact equations are here: http://www.fund-model.org/versions – see FUND3.9 Documentation pp 9-10. Specifically, which parameter is incorrect, what is its current value, and what should it be?

Peter, none of these models include a module which takes into account fossil fuel depletion, its impact on fossil fuel prices as well as the industry ability to meet demand at any price (this last one is important, in some circumstances you can pay an infinite price and the oil won’t be produced no matter what because physical processes require time).

There’s also a need to estimate how fossil fuels are replaced as fossil fuel prices increase, by either a known or unknown source. For example, a recent document written by Spanish energy economists had a plug-in called “biofuel or biomass”. They were mapping out the change needed to decarbonize, realized it wasn’t feasible with existing technology, so they created a politically correct source (biofuel and biomass are politically correct in Europe; nuclear, or just writing “unknown new technology”, wasn’t kosher because this work was for a government committee).

I have worked on dynamic model structures since 1991 (I forget the exact date), and I realize it’s very difficult to turn the logic around to allow a module to control the hypothetical disconnect between economic growth and fossil fuel emissions, but if this isn’t done the result can be RCP8.5, which simply blew a gasket and is useless because it has emissions from very unrealistic fossil fuel burn rates.

And notice I write burn rates, because not all the oil and gas get burned. A fraction is used for plastics, and some asphalt is used to pave roads and parking lots. This is an issue nobody discusses, even though we see that a large fraction of the increase in demand for what you call “oil” is used to make plastics.

In conclusion, an IAM that’s worth using should consider and document properly, in detail, the assumed fossil fuel sources and prices, and the way they are replaced by either existing or new technologies. When you do this, you’ll see that most of the change away from fossil fuels is caused by market forces, and that market forces almost alone will result in an optimum path. The best way to spur change towards the optimum is to do more research in three areas: renewables, nuclear, and geoengineering, and possibly to help finance projects such as hydropower and geothermal, which may need a lift in some countries.

Thank you. I am not convinced. The IAMs are ‘big picture’ projections. They cannot get down to the level of detail you are suggesting. They’d have to do the same for the impact functions of all impact sectors. That’s impossible. And it is highly speculative what energy transitions will take place over the next 100 years or so. They already rely on a huge number of assumptions.

It’s better to do as Tol and the other IAMs have done – apply big picture projections of rates of change.

We have virtually unlimited cheap energy available. It’s just blocked by ideology and politics. It will come eventually.

Nuclear could now be 10% of current cost http://www.mdpi.com/1996-1073/10/12/2169 if not for the anti nuclear protest movement’s success since the 1960’s in scaring the public. Transport fuels we use now (petrol, diesel, jet fuel) are also unlimited with cheap electricity and sea water.

Surging output of US shale oil has once again shattered the assumptions of Opec and the global energy establishment, pushing the crude market back into surplus and smothering the nascent boom. The International Energy Agency said the world is witnessing the second great wave of US shale growth, with production soaring by 1.3 million barrels per day (b/d) over the last year. Extra US supply threatens to match the entire growth in crude demand from China, India, and the rest of the world in 2018. …

America’s irrepressible shale frackers have been able to halve costs since the peak of the last bubble with a mix of ‘multi-pad’ sites, longer lateral bores, and precision drilling with ‘smart bits’ linked to computers. Most can flourish in a price band from $45 to $55, with the best locations in the Permian Basin of West Texas closer to $25. Output is growing as fast today as it was when oil was trading at over $100.

CNN? Did you notice the article doesn’t show a decent graph of sea level over the satellite record? It has a teeny graph showing an illegible historical record with the extrapolated forecast, but the graphics can’t be used.

Want to know something else? The University of Colorado sea level web page hasn’t been updated in over a year, probably because sea level has been dropping (I think this happens as a result of the change from El Niño to La Niña). If the work used to back the CNN propaganda piece used the hump we saw in 2014–16 due to El Niño to bias results, then we can conclude it’s just another case of #fakenews.

Since it is a known truth that the MWP only occurred in Europe and the North Atlantic we will have to reassess where exactly Africa was physically located 1,000 years ago. Perhaps it was all this tectonic activity that contributed to the LIA and the inaccurate maps created at the dawn of the age of exploration [/sarc].

I often see statements like this: “Global average temperature fell by about a degree during the Little Ice Age, although scientists have struggled to quantify local cooling.” From “Why did Greenland’s Vikings disappear?” by Eli Kintisch in Science, 10 Nov 2016 (cited above).

But pre-instrument era global temperatures can only be reconstructed from surveys of local temperature records. How can the global numbers be more accurate or reliable than the local numbers?

Satellite altimetry has shown that global mean sea level has been rising at a rate of ∼3 ± 0.4 mm/y since 1993. Using the altimeter record coupled with careful consideration of interannual and decadal variability as well as potential instrument errors, we show that this rate is accelerating at 0.084 ± 0.025 mm/y2, which agrees well with climate model projections. If sea level continues to change at this rate and acceleration, sea-level rise by 2100 (∼65 cm) will be more than double the amount if the rate was constant at 3 mm/y.

Abstract

Using a 25-y time series of precision satellite altimeter data from TOPEX/Poseidon, Jason-1, Jason-2, and Jason-3, we estimate the climate-change–driven acceleration of global mean sea level over the last 25 y to be 0.084 ± 0.025 mm/y2. Coupled with the average climate-change–driven rate of sea level rise over these same 25 y of 2.9 mm/y, simple extrapolation of the quadratic implies global mean sea level could rise 65 ± 12 cm by 2100 compared with 2005, roughly in agreement with the Intergovernmental Panel on Climate Change (IPCC) 5th Assessment Report (AR5) model projections.
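The abstract’s arithmetic can be checked in a few lines. A minimal sketch, assuming (as the abstract implies) that the fitted quadratic is h(t) = r·t + (a/2)·t² with t in years after 2005 — the function name and variable names are mine, not the paper’s:

```python
# Quadratic extrapolation from the abstract (assumed form:
# h(t) = r*t + 0.5*a*t^2, with t in years after 2005).
r = 2.9     # mm/yr: average climate-change-driven rate over the 25-y record
a = 0.084   # mm/yr^2: estimated acceleration

def rise_mm(t):
    """Sea level relative to 2005, in mm, t years after 2005."""
    return r * t + 0.5 * a * t ** 2

print(round(rise_mm(95)))                     # 2100 vs 2005: ~655 mm, i.e. ~65 cm
print(round(rise_mm(95) / (3.0 * 95), 2))     # vs constant 3 mm/yr: ratio ~2.3
```

The ratio above is the abstract’s “more than double the amount if the rate was constant at 3 mm/y”.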

We could also say: the average is 1.1 mm/yr from 1900 to 1994, with a fair amount of uncertainty in 1950, getting worse going backwards, as you had some guy with a notebook writing things down while thinking about when this job would be replaced by a better one that didn’t involve seagulls.

Of course there are criticisms; that is part of the scientific process. The point is there are other views on the question of acceleration. It is not a slam dunk like the establishment wants the public to believe. It’s no different from other issues. If someone does a thorough investigation of the claims in the MSM, there is evidence for levels of uncertainty no one wants to admit. Some authors believe 60 to 100 years of record are needed before clear acceleration is apparent. 25 years doesn’t cut it.

The seagulls are currently sitting on the adjacent roofs looking for good spots to nest. It is illegal to disturb their nests and due to their size and viciousness soon those fixing tiles or clearing out gutters will refuse to go up for fear of being attacked.

I can just visualise your scenario as the person noting the tide heights looks nervously around him….

By not having a physical underpinning to your curve fitting, one gets quite perverse results. In this case, if you test the model by hindcasting (just graph the quadratic), you suddenly find that sea level is rising heading back in time from the 1970s. By 1910 (as far back as this paper goes forward in its projections) you find sea level more than 100 mm above the 2005 satellite-measured level, whereas according to tide gauges it was in practice probably 150 mm below.
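The hindcast claim is easy to verify by evaluating the same quadratic at negative t. A sketch using the rate and acceleration from the quoted abstract (names are mine):

```python
r, a = 2.9, 0.084       # mm/yr and mm/yr^2, from the abstract

def h(t):
    """Sea level relative to 2005 in mm, t in years after 2005 (negative = hindcast)."""
    return r * t + 0.5 * a * t ** 2

t_min = -r / a                   # vertex of the parabola: minimum of h(t)
print(2005 + t_min)              # ~1970.5: before this, hindcast sea level RISES going back
print(round(h(1910 - 2005)))     # 1910: ~+104 mm ABOVE 2005, vs roughly -150 mm from tide gauges
```

So the fitted parabola bottoms out around 1970 and puts 1910 sea level about 10 cm above the 2005 level, which is the perverse result described above.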

This is a post about global sea-level rise, but I put that message up front so that you’ve got it even if you don’t read any further.

…

Conclusion

So remember: don’t fit a quadratic to data that do not resemble a quadratic. Instead, look at the time evolution of the rate of sea level rise. And remember there is something called physics: this time evolution must be expected to have something to do with global temperature. And indeed it does.

Sealevel.Colorado.edu hadn’t updated their data from 2016 until today. Steve Nerem replied to my email asking why, and said it was due to a migration to a new database. It now shows SLR at 3.1 mm/yr. Last week the SLR was at 3.4 mm/yr.

Figure 1 of the new Nerem-Fasullo article shows that they have gone from blaming Pinatubo for the lack of acceleration in the 2016 article to actually adding a Pinatubo correction to the data so it shows acceleration in the new article.

Shameful. A new case of conforming the data to the hypothesis. That this is allowed and lauded casts a bad image on science.

Without questionable corrections, the data still shows no acceleration in the altimeter age.

Using a 25-y time series of precision satellite altimeter data from TOPEX/Poseidon, Jason-1, Jason-2, and Jason-3, we estimate the climate-change–driven acceleration of global mean sea level over the last 25 y to be 0.084 ± 0.025 mm/y2. Coupled with the average climate-change–driven rate of sea level rise over these same 25 y of 2.9 mm/y, simple extrapolation of the quadratic implies global mean sea level could rise 65 ± 12 cm by 2100 compared with 2005, roughly in agreement with the Intergovernmental Panel on Climate Change (IPCC) 5th Assessment Report (AR5) model projections.

Anyone who believes that a quadratic curve-fit to a 25-yr time series of geophysical data can be realistically extrapolated from now to the year 2100 is either analytically naive or an outright charlatan.

The dip was caused by rain on continents (transferring water from the oceans to the continents), which means a great deal of that water would cycle right back to the oceans; the underlying trend is caused by AGW. That is why the trend rebounded so quickly.

Taleb’s Incerto. He is almost becoming a ‘black swan’ cult. It is true, ever since Benoit Mandelbrot of IBM (fractal fame) analyzed commodity and stock prices, that fat-tail ‘black swans’ were likely in non-normal probability distributions. Benoit just proved ‘normality’ was rare in nature. What matters greatly is the shape of the statistical non-normality, not some alarmist BS.
Decades ago, I did the grunt programming for HBS prof Lindner’s log-normal S&P 500 stock return hypothesis. Mainframe – boy, dating myself! Upon which Taleb’s investment prowess is based. There is no magic here. There are many well-known fat-tailed distributions. The log-normal for stocks is symmetrical. The logistic was Hubbert’s (false) symmetric peak oil. Hubbert got the function wrong, but not the peak. The best probability function based on all fields/basins (conventional) is a gamma function – same peak, but a slower decline tail. That makes common economic sense given prices will rise sharply with scarcity. See essay Peeking at Peaks for more math details in footnotes.
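The “same peak, slower decline” point can be illustrated numerically. This is a sketch, not a fit to any field data: the parameters (tau, k, theta) are arbitrary, chosen only so the two curves peak at the same place. A Hubbert curve (the derivative of a logistic) is symmetric about its peak, while a gamma curve with the same mode decays with a polynomially heavier tail:

```python
import math

def hubbert(t, peak=20.0, tau=10.0):
    """Hubbert curve: derivative of a logistic, symmetric about its peak."""
    x = math.exp(-(t - peak) / tau)
    return x / (1.0 + x) ** 2          # peak value 0.25 at t = peak

def gamma_curve(t, k=3.0, theta=10.0):
    """Unnormalised gamma shape; mode at (k-1)*theta = 20 for these parameters."""
    return t ** (k - 1) * math.exp(-t / theta)

# Normalise each curve to its own peak and compare well past the peak.
h_rel = hubbert(80.0) / hubbert(20.0)
g_rel = gamma_curve(80.0) / gamma_curve(20.0)
print(round(h_rel, 4), round(g_rel, 4))   # gamma tail is several times heavier
```

Both curves peak at t = 20, but at t = 80 the gamma curve still retains roughly four times as much of its peak value as the symmetric Hubbert curve does, which is the slower decline tail described above.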

That nails it. Unfortunately the investment biz was slow to accept implications of the non-normality of stock prices.

In 1987 tactical asset allocation was hot (aka “portfolio insurance”, convex portfolio rebalancing). Jack Treynor spoke at the San Francisco Society of Security Analysts, informally discussing several things – mostly ad lib. He mentioned that he had tested one of the major TAA packages — but fed it a randomized set of stock prices (i.e., not in chronological order). It generated a wonderful but dysfunctional set of recommendations.

The problem, he said, was that TAA inadequately adjusted for the non-normality of stock prices. In October 1987 the industry received experimental validation of this analysis.

It was a wonderful example of a simple disproof of a complex theory. Like Richard Feynman dunking the Space Shuttle’s O-ring in ice water.

LK, yup. The Feynman Space Shuttle booster O-ring dunk is one of my true science YouTube favorites. Even written up as a subexample with hyperlinks in my sarcastic critical-thinking 2014 ebook, The Arts of Truth. Last subexample in the Disclosure chapter.
The last major Arts of Truth chapter was on climate change, critiqued months before publishing by Lindzen himself at MIT, weeks before he retired, at the cost of buying his ‘wet’ lunch that day at the MIT faculty cafeteria (therein a deeper hidden message for those who know about the MIT faculty building, its ‘infamous’ architect, and its resulting roof leaks). See Richard Lindzen’s ‘hidden’ influence in, for example, footnote 122 to the climate chapter, or footnote 5 (Svalbard continental drift) to the recognition chapter.
Highest regards.

From the story- they had to buy it from the Russians because American climate change activists keep preventing gas pipelines to New England. And they really need those pipelines because the climate change activists want to shut down the clean, CO2-free nuclear power plants and something needs to power Boston.
A few more stories like this and people may start to realize this country doesn’t have any climate change activists who know or care anything about climate change.

On the Greenland Vikings, there is a more comprehensive telling of the story as researchers now see it published in the Smithsonian magazine. It is more complicated than climate change. In fact the Vikings were there for centuries and building a church with stained glass windows well into the little ice age. Synopsis with images and link to article:

Black carbon and sulfate are co-emitted aerosols and the warming potential of black carbon is amplified by up to 200% depending on the mixing ratio.

“Many ignore the internally mixed state of BC with other aerosols. Such mixing enhances forcing by a factor of two (ref. 39). Field observations have consistently shown that BC is well mixed with sulphates, organics and others” http://www-ramanathan.ucsd.edu/files/pr160.pdf

This would make black carbon mixed with sulfates the largest source of anthropogenic warming in the 20th century – and the one most easily and completely controlled with off the shelf technology.

I so love hydrology –
because the buckets are so obvious. People have started to make major and global increases in surface water storage (SWS) and groundwater storage (GWS) – with implications for the hydrology of streams, the surface temperature record, sea levels, economic development, food security, and flood and drought resilience.

Andrew Sullivan: We All Live on Campus Now.
“problems with the university identity-based social justice movement”

Before his death in 1935, Democratic Senator Huey Long remarked that America probably would have Fascism some day; but, he added, “when we get it we won’t call it Fascism—we’ll call it anti-Fascism.”

There are several Fascist youth movements today spilling out of universities that brand themselves as anti-Fascist. Sadly these groups don’t know what Fascism is, but they think they know, because they’ve been purposefully educated to believe a distorted, confused, misdirected definition of it. Probably many of their professors believe it too; by default echo chambers birth intellectually inbred thought.

Huey Long was left of FDR, his sensibilities weren’t meant to warn of Fascism, but rather to taunt; he would have embraced Fascism any convoluted way he could get it. Anti-Fascism as a euphemism for Fascism, and few will know it, including the Fascists! Perfect.

In the mid 1920’s Leftist elites around the world were gushing with praise and want over Fascism, completely enamored with it. This included FDR himself; he once called Mussolini “admirable,” adding that he was “deeply impressed by what he has accomplished.” Later in the 1930’s these elites did cartwheels to distance their hard Left beliefs from Fascism; the writing was on the wall with what was going on in Europe. Thus the definition of Fascism was redirected as being of the Right. Our youth in the U.S. don’t even know what the Constitution is about. That’s how good our education system is, assuming this is purposeful. And it is.

George Will described, in a New York Post essay, an American Council of Trustees and Alumni study – based on requirements and course offerings at 75 leading colleges and universities – which found that “the overwhelming majority of America’s most prestigious institutions do not require even the students who major in history to take a single course on United States history or government.” I suppose professors fret Fascism has little chance if students carry around all that unnecessary Republic nonsense about individual freedom. https://nypost.com/2016/11/20/college-kids-are-proving-trumps-point/

So while the original brand of the Fascist Party showed itself to be unhinged, the Left has no intention to throw the baby out with the bathwater; simply rehinge it under a new brand until it works! This is the way evolutionary Socialism has expressed itself (Fascism is a brand of Socialism, as was National Socialism). The antecedents to formal Fascism can be found everywhere one looks in our institutions, the strong arm of collectivism attempts to overwhelm them in a relentless drive towards hegemony. This is how I define the deep state.

The idea of “fascism” has always been somewhat vague and took different forms. Mussolini coined the word by naming his party the Fascist Party. Under Mussolini, it meant extreme nationalism, a strong centralized state, and dictatorial rule by a charismatic leader. FDR and other political leaders liked him in the beginning mainly because they were hopeful that Italy’s weak monarchy would be replaced by a strong, effective state. They soon learned that fascism was mainly an excuse for a brutal dictator.

It seems your notion of the “deep state” has more to do with a nebulous idea of “socialism” as promoted by various right wing conspiracy theorists. The most offensive part of the deep state, of course, is its leadership – democrats like Obama and the Clintons. But if you look at definitions of words like “socialism” or “collectivism”, and fascism, the “deep state” has almost nothing to do with those terms. They are mainly used as scary epithets in an effort to tar undesirable politicians, and of course, anybody who’s critical of Donald Trump.

The vagaries of Italy’s fascism don’t translate to how it’s been intentionally redefined by the Left. Communism, socialism, and their derivatives are forms of Marxist collectivism; they’re all based on the planned society in one form or other. Fascism is a more militant derivative of socialism, which, too, is colored differently depending on the society expressing it.

But there are certainly vagaries in defining fascism, and I use the term ‘antecedents’ in expressing fascist tendencies as they relate to Huey Long’s prognostications. Certainly, in many instances what’s going on in our universities is more militant in nature, more spiritually driven, and expressed with cultish fervor, all hallmarks of fascism in particular; but orthodox fascism of course ultimately requires a dictator to govern a planned society.

It makes sense that after Mussolini was kicked out of Italy’s socialist party, supposedly for his hawkish stance in WWI, he would build his new-found Fascist Party on the socialism and Marxist philosophy he had been a student of, but suited to his own predispositions, i.e. embellished with his love for the military. In Mussolini’s wandering manifesto, he describes fascism as spiritual, centered around one’s inner spiritual desire to work for the good of the state. Individualism had to be stripped away to achieve such purity; the military played a role in this regard, among other things. Mussolini railed against individualism and hated capitalism, which makes it impossible for him to be associated with the right; he was a central planner all the way through.

You’re wrong about how the Left felt about fascism in its early years. The Left expressed love for the efficiency of it; as stated previously, FDR was among those who admired its efficiency. And Mussolini complimented Roosevelt’s many reforms, stating of FDR’s policies: “Reminiscent of Fascism is the principle that the state no longer leaves the economy to its own devices … Without question, the mood accompanying this sea change resembles that of Fascism.” There are many other examples to pull from.

When you state that words like “socialism”, “collectivism”, and fascism have almost nothing to do with the “deep state”: of course there’s no formal tie-in; “deep state” is a relatively new term, so your comment is moot.

The tie-in of fascism to the “deep state” relative to government is a bit off base, I think, but it’s certainly Marx-based in its fervor and desire for hegemony. Much of the community organizing we see uses coercive fascist tactics, including agitation and propaganda leveraged through a corrupt, synergistic media. These mirror the tactics of fascism.

I’ve provided some of the provenance; that you don’t like it is understandable, but that you dismiss it reverts to the usual dishonest stance, and this is the seed of the Left. Obamacare architect Gruber: “Lack of transparency is a huge political advantage. And basically, call it the stupidity of the American voter or whatever, but basically that was really really critical for the thing to pass.” This is more or less the same philosophy used to sell CAGW to the masses. The deep state is everywhere.

Interesting comment.
Certainly the slur of ‘fascist’ from the left is ridiculous. In reality it is just the result of decades of mind control that started with the Frankfurt School, through works like ‘The Authoritarian Personality’ and ‘The Psychology of Fascism’. All of the social ‘sciences’ are continuations of this kernel, basically programming kids to think anyone who doesn’t rebel against their parents, hate Western society, do lots of drugs, and have sex outside of marriage is a proto-fascist.
It’s all a big lie promoted by a particular powerful ethnic group within America.
But I don’t know that I completely agree with your assessment of the youth today. I don’t really view any of these people, either the Alt Right or the Antifa types, as any kind of actual ‘movement’, unlike the communists and fascists of the 1930s. I just think they are complete mindless puppets of the programming being directed at them by the Oligarchs – through the schools, the media, and funded groups on the ground – and by controlling the reactions on both sides.
The Oligarchs want everyone in America fighting rather than asking questions like: Why have all our jobs been shipped overseas for corporate labor arbitrage? Why do we spend $1,000 billion on a military when no threat exists? Why do we allow company after company to combine into monopolies, destroying capitalism and the market?

We do seem to be entering into some sort of Corporate Totalitarianism. Whether we can call it fascist or not is hard to say.
But the brainless Antifa dweebs are certainly acting like shock troops!

“A burst of widely publicized research over the past decade found that the depleted Arctic sea ice could be part of a chain of events weakening the stratospheric polar vortex and hiking the risk of cold outbreaks in northern midlatitudes. These findings have been far from unanimous, though.”

I’d say that they have it backwards: negative AO/NAO episodes are driving the sea ice loss. Take the almost-summer warmth of March 2012 in the midlatitudes: sea ice extent was high, with a delayed onset of melting. During the Little-Ice-Age-like cold of March 2013, when the jet stream was around 1,000 miles south of normal, sea ice extent was lower, with melting starting earlier and proceeding faster.

It’s pretty rich to see Andrew Sullivan, as a defender of “enlightenment values”, writing about the bullying and coarsening of public debate and pinning it on Donald Trump.
Sullivan is famous for his unhinged screeds against Sarah Palin in 2008 (when Donald Trump was a Democrat supporting Hillary). He used The Atlantic as a platform to question the parentage of Palin’s children and grandchildren and to demand her OB/GYN records.
He’s also famous for declaring that anyone who disagreed with any of his pronouncements on gay marriage, for any reason, was a bigot to be shunned and removed from the marketplace of ideas.

Yes, exactly.
Most likely he is playing ‘guard the flank’, trying to capture the reaction from the insanity and keep it hemmed into the Enlightenment liberalism that is causing the problem to begin with.
Nobody can be as stupid as the line he tries to take.

5. Conclusions
This study first identified the tree-ring chronologies with coherent multi-decadal climate variations for the areas near the northern Pacific Ocean over the past 430 years and then interpreted their climate significance. This method confirms that the climate signals in tree rings are stable through the entire period. We found that the coherent multi-decadal tree-ring variations surrounding the northern Pacific Ocean agree quite well with the multi-decadal climate variability of the areas in and around the tropical and southern Pacific Ocean. This indicates that the multi-decadal variations for the entire Pacific Ocean are coherent over the past 430 years. The dominant Pacific multi-decadal climate variability resembles the existing oceanic and atmospheric mode of the IPO, which reflects the dominant climate variability of the whole Pacific areas without the warming trend. This suggests the linkages between the Pacific multi-decadal climate variability and the IPO-like oceanic and atmospheric anomalies. Further proxy and modeling based studies are needed to investigate the regimes of this Pacific multi-decadal climate variability.

Versus:

“We find that the AMO is linked to continental warming, Arctic sea ice retreat, and an Atlantic precipitation shift. Low clouds decrease globally. We find no distinct multidecadal spectral peak in the AMO over the last 2 millennia, suggesting that human activities may have enhanced the AMO in the modern era.”

“The Last Millennium Reanalysis (LMR) employs a data assimilation approach to reconstruct climate fields from annually resolved proxy data over years 0–2000 CE. We use the LMR to examine Atlantic multidecadal variability (AMV) over the last 2 millennia and find several robust thermodynamic features associated with a positive Atlantic Multidecadal Oscillation (AMO) index that reveal a dynamically consistent pattern of variability: the Atlantic and most continents warm; sea ice thins over the Arctic and retreats over the Greenland, Iceland, and Norwegian seas; and equatorial precipitation shifts northward. The latter is consistent with anomalous southward energy transport mediated by the atmosphere. Net downward shortwave radiation increases at both the top of the atmosphere and the surface, indicating a decrease in planetary albedo, likely due to a decrease in low clouds.” https://www.clim-past.net/14/157/2018/

“As the world’s longest instrumental record of temperature, the Met Office Hadley Centre’s CET time series represents the monthly mean surface air temperature averaged over the English midlands and spans the period January 1659 to December 2013. The record covers several episodes of natural and anthropogenic warming of multidecadal durations, and has value as one of the longest instrumental temperature records, even though it is limited in spatial extent.” https://www.nature.com/articles/srep46091

The limited spatial extent may be a strength – as is the relative precision of the data. The area is within the zone of influence of these major modes of NH atmospheric and ocean circulation. They find 3.36- and 22.6-year signals – which they associate with ENSO and the Hale cycle – and harmonics thereof, as well as a change on a 1,000-year scale that they associate with anthropogenic greenhouse gases. The LMR contains teleconnected proxies from a much broader geographic area, with some interesting results. But the broad coverage may obscure spectral peaks, as a signal propagates through the climate system with leads and lags.
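For readers who want to try this kind of spectral-peak hunting themselves, here is a minimal sketch. It is not the paper’s method: the series, amplitudes, and noise level below are synthetic assumptions standing in for CET-like data, and a plain FFT periodogram stands in for whatever spectral technique the authors actually used.

```python
import numpy as np

def dominant_periods(series, dt=1.0, n_peaks=2):
    """Return the n_peaks strongest spectral periods (in years) of a
    mean-removed annual series, using a plain FFT periodogram."""
    x = np.asarray(series, dtype=float)
    x = x - x.mean()
    power = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(len(x), d=dt)
    # Rank bins by power, skipping the zero-frequency (mean) bin
    order = np.argsort(power[1:])[::-1] + 1
    return [1.0 / freqs[i] for i in order[:n_peaks]]

# Synthetic 355-year annual series with 3.36- and 22.6-year cycles plus noise
rng = np.random.default_rng(0)
t = np.arange(355)
temps = (0.5 * np.sin(2 * np.pi * t / 3.36)
         + 0.3 * np.sin(2 * np.pi * t / 22.6)
         + 0.1 * rng.standard_normal(355))

# The two strongest recovered periods land near 3.36 and 22.6 years
print(sorted(dominant_periods(temps)))
```

Note the recovered periods are quantized to the FFT bin spacing (1/355 cycles per year here), so they come out close to, not exactly at, the injected values.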

The Pacific tree ring study is an effort – but not all that convincing or useful. The alternative is to use hydrology-based proxies, or to base the reconstruction on patterns of polar winds preserved in ice. They did connect the IPO with the Southern Annular Mode but failed to develop the idea. Surface pressure variability at the poles has been linked to solar UV/ozone chemistry and so varies at the scales of solar variability. Varying polar surface pressures spin up stronger or weaker sub-polar winds and gyres in both hemispheres. It seems a Lorenzian forcing that biases the system toward specific states of the Atlantic Multidecadal Variation and toward coherent changes in upwelling on the eastern margins of both the north and south Pacific. Because it merely biases the system rather than driving it as such – simple correlations are problematic.

This proxy is based on salt content in a Law Dome ice core. The variation in salt content is caused by enhanced or diminished circumpolar wind fields that are coupled with the state of the Pacific. More salt is associated with a cool Pacific. Correlations of the IPO with Australian rainfall in the instrumental era were used to calibrate two different models of millennial IPO evolution. It shows extended droughts in the early part of the record and in the 20th century. The Pacific state does of course have global implications for temperature, hydrology and biology.

“The reconstructed IPO has an average positive phase (IPO>0.5) duration of 14 years and a negative phase (IPO<−0.5) duration of 9 years (calculated from the PLF reconstruction). This means that not only are positive phases predominant (23 positive and 13 negative phases over A.D. 1000–2003) but they also last, on average, 5 years longer than negative phases.”
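The phase-duration bookkeeping in that quote is straightforward to reproduce for any reconstructed index. Below is a minimal sketch, assuming an annual index already in normalized units; the ±0.5 thresholds follow the quote, but the toy series is invented purely for illustration (the PLF reconstruction itself is not reproduced here).

```python
import numpy as np

def mean_phase_duration(index, threshold=0.5):
    """Mean run length (in samples) of values above +threshold ('positive'
    phases) and below -threshold ('negative' phases); values in between
    are treated as neutral and break any run."""
    durations = {"positive": [], "negative": []}
    state, run = None, 0
    for v in np.asarray(index, dtype=float):
        s = "positive" if v > threshold else "negative" if v < -threshold else None
        if s == state and s is not None:
            run += 1
        else:
            if state is not None:
                durations[state].append(run)  # close the previous run
            state, run = s, (1 if s is not None else 0)
    if state is not None:
        durations[state].append(run)  # close the final run
    return {k: (sum(v) / len(v) if v else 0.0) for k, v in durations.items()}

# Toy annual index: 4 years positive, 2 neutral, 3 negative, 5 positive
toy = [0.8, 0.9, 0.7, 0.6, 0.1, -0.2, -0.8, -0.9, -0.6,
       1.0, 0.8, 0.7, 0.9, 0.6]
print(mean_phase_duration(toy))  # → {'positive': 4.5, 'negative': 3.0}
```

Run over an actual annual IPO reconstruction, the same bookkeeping would yield the quoted 14- and 9-year averages.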

Pacific or Atlantic? It is all a globally coupled system with Lorenzian forcing.

This is great:
“Only priests and rulers, whether pharaohs or later, Roman or Arab leaders, were allowed to monitor the nilometers (measuring devices), and their ability to predict the behavior of the Nile was used to impress the common people. (And to determine how much money would be collected in taxes.)”

Maybe not new. Joseph told Pharaoh that his dreams came from God telling him to prepare for seven years of plenty followed by seven years of famine. The task of Pharaoh was to find a wise and honest man to put some of the abundance of the years of plenty away to provide for the years of need and avert a terrible tragedy. So who would you believe on rainfall in that region – Joseph or Obama?

D. Kondrashov and colleagues collated a record of Nile River water levels spanning 622 AD to 1922 AD. They calculated the mean of high water levels at 18 cubits. This suggests that life in ancient Egypt might best be described as lived on the edge. Perhaps not surprising, given Joseph’s source of information, is that they found a strong 7-year signal in the data.

Whatever the cause – global hydrological variability – extreme drought, extreme floods and extreme temperature changes such as has not been seen in the 20th century – will occur again. The solution – such as it is – is to build prosperous and resilient communities. As Joseph tells us – to avoid catastrophe in the times of need requires a wise and honest person to manage things in the times of abundance.

Susan Kaspari, from Geology, and her colleague from Geography, Megan Walsh, are working separately on related topics. These links explain what they do and how. They teamed up for a double presentation at the university not long ago. http://www.cwu.edu/geography/megan-walsh