Month: January 2010

‘This is of course a common situation; that the people who have failed to clean up a subject then don’t believe that it can be cleaned up … And then if somebody comes along and says, “Look, it works,” they don’t believe. … All the old people … including Max Born and Heisenberg and Schroedinger … had radical proposals which turned out to be totally useless [but which distracted them from taking new ideas seriously, and generally cluttered up the stage with useless junk, preventing more useful ideas from being seen] … Feynman … and … Schwinger … and … I were conservative in the sense that we … actually made the mathematics work and got the right answers.’

Above: in 1996, this spin-1 (repulsive) mechanism of gravity correctly predicted that the universe is accelerating at a = Hc, which is correct within existing experimental error. The measurement was only made two years later, in 1998. Nature was unable to publish the prediction in 1996 (presumably due to the mainstream belief in the false claims of string theorists like Ed Witten, about gravity being predicted by spin-2 gravitons), but it was published elsewhere (see the letter below). If the distance R between two particles is much larger than their effective radii r for graviton scatter (exchange), then geometrically, the area of the shadow cast on the surface area 4*Pi*r^2 of one fundamental particle of mass by another of similar size is Pi*r^2*(r/R)^2 = Pi*r^4/R^2, so the fraction of the total surface area of the particle which is shadowed is simply (Pi*r^4/R^2)/(4*Pi*r^2) = (1/4)(r/R)^2.
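As a sanity check, the shadow geometry can be sketched in a few lines of Python; the effective graviton-scatter radius r and the separation R below are purely illustrative numbers, not physical values:

```python
# A minimal sketch of the shadow geometry described above; the radius r
# and separation R are illustrative values only, not physical ones.
import math

def shadow_fraction(r, R):
    """Fraction of a particle's total surface area 4*pi*r^2 that is
    shadowed by a similar particle of radius r at distance R >> r."""
    shadow_area = math.pi * r**2 * (r / R)**2   # = pi * r^4 / R^2
    total_area = 4 * math.pi * r**2
    return shadow_area / total_area             # = (1/4) * (r/R)^2

# The fraction scales as 1/R^2, which is the inverse-square law:
f_near = shadow_fraction(1.0, 100.0)
f_far = shadow_fraction(1.0, 200.0)
print(f_near / f_far)   # doubling R cuts the fraction by a factor of ~4
```

Doubling R quarters the shadowed fraction, which is exactly the inverse-square dependence claimed in the text.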

This fraction automatically contains the inverse square law of gravity, and merely has to be multiplied by the inward force generated by the distant mass m undergoing the radially outward observed cosmological acceleration a, i.e. force F = ma, in order to predict the gravitational force. The theory works! The inward force produces the contraction term of general relativity, as proved in the earlier post linked here. One of many false objections is the claim that general relativity predicts a departure from the inverse-square law. In fact, the effective force of gravity on the planet Mercury during its elliptical orbit can be represented as a departure from the inverse-square law only as a rough approximation. The average orbital velocity of Mercury is about 48 km/s, but it speeds up when near the sun and slows down when far from it. The variation in speed causes a relativistic variation in inertial mass, which by Einstein’s equivalence principle of general relativity is equivalent to gravitational mass. Hence, since gravitational force is proportional to the product of the masses involved, the variation in the mass of Mercury as it orbits the sun causes a slight departure from the Newtonian force predicted by the inverse-square law: the force of gravity falls off faster than the inverse-square law alone predicts, since at greater distances the planet moves more slowly and thus has less mass. The accurate way to allow for this is simply to include the relativistic term, but as a rough approximation the inverse-square law can be modified.

Charles-Augustin de Coulomb’s 1785 work, Deuxième Mémoire sur l’Électricité et le Magnétisme, presents his law that opposites attract while like charges (or like magnetic poles) repel each other. This is obviously not the case for Newton’s law of gravity, where two masses with similarly signed gravitational charges appear to attract (each creates a gravitational field which makes small test particles fall towards it).

Although this acceleration is tiny, it is important for large masses spread over immense distances, such as the distant 3 × 10^52 kg mass of the 9 × 10^21 stars in galaxies observable by the Hubble Space Telescope (page 5 of the NASA report linked here), giving an immense outward force of the big bang under Newton’s 2nd law of F = ma = 1.8 × 10^43 Newtons.
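The order of magnitude can be checked in a few lines, assuming a Hubble parameter of roughly 70 km/s/Mpc (a conventional value; the text does not specify one) together with the quoted mass figure:

```python
# Rough check of the outward-force estimate above. Assumed values: a Hubble
# parameter of ~70 km/s/Mpc (not specified in the text) and the quoted
# 3e52 kg mass of the observable galaxies.
Mpc = 3.086e22        # metres per megaparsec
c = 3.0e8             # speed of light, m/s
H = 70e3 / Mpc        # Hubble parameter in SI units, 1/s
a = H * c             # the predicted cosmological acceleration a = Hc, m/s^2
m = 3e52              # mass of observable galaxies, kg (figure quoted above)
F = m * a             # Newton's 2nd law: outward force of the big bang
print(f"a = {a:.1e} m/s^2, F = {F:.1e} N")   # F comes out ~2e43 N
```

The result is of order 10^43 Newtons, consistent with the 1.8 × 10^43 Newtons quoted (the exact figure depends on the value of H assumed).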

This inward force due to the repulsion of the entire mass of the surrounding universe is much greater than the mutual repulsion of the relatively trivial mass in an apple and the relatively trivial mass in the Earth. A fairly relevant example of the physical mechanism of gravity is to be found in the implosion mechanism of nuclear weapons (this analogy first suggested the predictive gravity mechanism), where an ordinary chemical explosion around the core forces the atoms in the core to approach one another more closely (i.e. compression) despite the repulsive “normal reaction force” when you squeeze the metal.

The result is apparent “attraction” between small masses: the graviton field from the immense masses of the universe gets very slightly shadowed by small particles of mass, which are thus pushed towards one another, with not just the well known Newtonian inverse square geometric law, but also with the radial contraction of mass which is correctly predicted by general relativity but not by Newtonian gravity. (For the geometrical details see the post linked here.)

So quantum gravity, like the force between similar charges in electromagnetism, is universally repulsive between similar charges: we’re simply being pushed down by the graviton exchange force converging inwards on us as the recoil from the immense outward force of the terrific mass of accelerating galaxies above us, which exceeds the smaller repulsion between our mass and that of the Earth! The planet Earth below us determines how much asymmetry there is: i.e. how much cancellation there is of the upward force coming from the galaxies receding from us below our feet (the gravitons are very, very slightly shielded by the Earth). The bigger the mass of the Earth, the more shielding, so the force is more asymmetric and the bigger is the downward acceleration we experience. If there were no Earth below us, we wouldn’t be accelerated at all, because the downward force on us would simply equal the upward force, cancelling out. A bigger mass below us allows more of the constant downward force from above us to go uncancelled, i.e. to become a net downward force!

The more mass below us, the more asymmetry, so the greater is the fraction of the downward force that is uncancelled and which thus causes the effects of gravity we observe. A 1 kg book sitting on a table is an example of the need for a net force. It is still in the Earth’s 9.8 m/s^2 accelerative field, but isn’t falling because its 9.8 Newtons of downward-directed gravitational force is completely cancelled by the 9.8 Newtons of upward electromagnetic force due to the repulsion between the electrons of the atoms of the table and those of the book. Things are only moved by an accelerative field if there is a net force. Similarly, there has to be an asymmetry in the isotropic inward reaction force of 1.8 × 10^43 Newtons in order for the effects of gravity (other than the contraction predicted by general relativity) to occur.
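The force balance in the book example is trivial to write down, but it makes the "net force" point explicit:

```python
# The book-on-table force balance made explicit: equal and opposite forces
# give zero net force, so no acceleration despite the 9.8 m/s^2 field.
m, g = 1.0, 9.8      # 1 kg book in Earth's 9.8 m/s^2 field
weight = m * g       # downward gravitational force, Newtons
normal = 9.8         # upward electromagnetic reaction from the table, Newtons
net = weight - normal
print(net)           # 0.0 -> the book does not accelerate
```

Both forces are individually 9.8 Newtons; only their difference, the net force, produces acceleration.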

The point I’m making is this: two gravitational charges of similar sign (each composed of mass-energy) accelerate towards one another because they’re being pushed by spin-1 gravitons converging inwards from the distant, surrounding universe. The acceleration of the universe is small but the accelerating mass is immense, so we can easily estimate the order of magnitude of the inward reaction force from Newton’s 2nd and 3rd laws: 1.8 × 10^43 Newtons. Only a trivial fraction of this isotropic inward force needs to be shadowed by the fundamental particles in the Earth in order to produce the 9.8 Newtons gravitational force on a 1 kg book on your table. That’s provably how gravity works. It makes checkable predictions which are accurate to within experimental error. We made the predictions and published them in 1996, two years before Perlmutter confirmed from supernovae data that masses do repel over large distances, where the universal repulsion predominates over local shadowing. Like the examples in the previous post, political prejudice in physics favours the Pauli-Fierz spin-2 graviton paradigm – which has not one shred of evidence and is promoted with the same pseudoscientific lies as the claim that “the sun’s apparent daily orbit of the Earth disproves Galileo” – in order to secure funding for spin-2 graviton “research” (false speculations) like Witten’s stringy M-theory and Smolin’s spin-2 loop quantum gravity. Fortunately, it’s funding money they’re after, not just power of the abusive thuggery sort of fascist eugenics. Spin-1 gravitons completely change – and simplify – the nature of the problem involved in bringing gravitation into the Standard Model. Instead of looking for a universal attraction arising by symmetry breaking from some more fundamental theory, we merely need to modify the electroweak force symmetries U(1) × SU(2). All this is virtually totally censored out and ignored by everyone else.

They find the proved facts unattractive in the standard presentation, because they believe that they can’t be wrong since they have so much crackpot research behind them, and they’re hoping to employ the AdS (anti-de Sitter space, i.e. spacetime with a negative cosmological constant, not the positive one observed) to conformal field theory conjecture from “string theory” in order to help calculate the strong QCD attractive force over long distances where it is universally attractive. This correspondence between AdS and CFT is mathematically exact, but it’s not true that the effects of the strong force are exactly analogous to AdS: the strong force is not universally attractive over all distances. So the AdS/CFT conjecture, when applied to practical nuclear physics, is at best only going to be an approximation valid for the range of distances where the QCD force between hadrons is attractive and thus similar to AdS. We already have approximations like lattice QCD to make calculations in this area. Even if it does help with calculations, it will just be a computational trick, not proof that the AdS/CFT conjecture vindicates string theory. It has nothing to do with confirming the spin-2 graviton conjecture. Anyway, back to the title of this blog post:

Being pushed together, rather than attracting: Lori Gottlieb’s new book Marry Him

Gottlieb says that a girl, if still single by age 30, should be less fussy and should push her way into the arms of the first guy who will take her. I.e., people need to be pushed together, not wait in the eternal forlorn hope of a mythical romantic attraction that in the real world either doesn’t exist or, at least, is universally insignificant at long distances. There is an arm-waving connection between this and the spin-1 graviton (which I’m exploiting in this post, as it hopefully makes for more interesting reading than purely stating the facts on gravity again and again). At long distances, gravity is repulsive as proved by cosmological repulsion of supernovae accelerating outward; only at short ranges where the masses involved are relatively small, does an apparent “attraction” occur because the push inward from all the surrounding masses presses two small masses together more strongly than they can repel each other (their gravitational charges – e.g. masses – are trivial compared to the masses in the surrounding universe, the converging inward flux of gravitons from which pushes them together).

Many dating problems arise in the same way: many people naturally repel at first when far apart, rather than attract, so it takes special circumstances to produce apparent “attraction”. People have to be pushed together by either circumstances or friends: something has to break the ice, and to break ice someone has to push it, not pull.

Lori Gottlieb’s new book which says girls should be less fussy and more pushy to get married: waiting for perfect, fairy tale romantic attraction is like demanding a miracle. Like charges fundamentally repel each other unless pushed together (it’s a law of nature well known from electric charges and magnetic poles, but it holds true even for gravity over large distances).

“The radiation is a few millidegrees hotter in the direction of Leo, and cooler in the direction of Aquarius. The spread around the mean describes a cosine curve. Such observations have far reaching implications for both the history of the early universe and in predictions of its future development. Based on the measurements of anisotropy, the entire Milky Way is calculated to move through the intergalactic medium at approximately 600 km/s.”

“… In this essay we ask ourselves the following question: In a homogeneous condensed matter medium, is there a way for internal observers, dealing exclusively with the low-energy collective phenomena, to detect their state of uniform motion with respect to the medium? By proposing a thought experiment based on the construction of a Michelson-Morley interferometer made of quasi-particles, we show that a real Lorentz-FitzGerald contraction takes place, so that internal observers are unable to find out anything about their ‘absolute’ state of motion. Therefore, we also show that an effective but perfectly defined relativistic world can emerge in a fishbowl world situated inside a Newtonian (laboratory) system. This leads us to reflect on the various levels of description in physics, in particular regarding the quest towards a theory of quantum gravity. …

“… Remarkably, all of relativity (at least, all of special relativity) could be taught as an effective theory by using only Newtonian language. …”

A People’s Archive interview with Freeman Dyson (number 78) discusses Dyson’s work on QED showing the equivalence of the Feynman and Schwinger approaches. Dyson wanted to show Oppenheimer and the Princeton IAS the QED success. Oppenheimer’s reaction reminds me of the attitude of the superstring establishment toward alternative models (including mine). Here are some excerpts from interview 78 with Dyson:

“… we met Oppenheimer and I wanted to talk about this in the seminar at the Institute … Oppenheimer wasn’t enthusiastic at all. It came as a big shock to me that we’d done this wonderful stuff and I desperately wanted to tell Oppenheimer about it, that was the whole point in coming to Princeton. And Oppenheimer just brushed us off and said, “Well, you know, that’s not leading anywhere,’ …

… This is of course a common situation; that the people who have failed to clean up a subject then don’t believe that it can be cleaned up … And then if somebody comes along and says, “Look, it works,” they don’t believe.

So that was how it was, and so we had a very hard time to get Oppenheimer’s attention. …

All the old people … including Max Born and Heisenberg and Schroedinger … had radical proposals which turned out to be totally useless …

Feynman … and … Schwinger … and … I were conservative in the sense that we … actually made the mathematics work and got the right answers. And that came as a surprise to Oppenheimer. It was very hard for him even to listen to it. …

finally Uhlenbeck interceded with Oppenheimer … “Let’s listen to Dyson,” and so Oppenheimer put on a seminar series for me …”.

Sadly, it seems that today there is no Uhlenbeck who will listen to alternatives that work.

Oppenheimer’s arrogant intolerance of physics views differing from his own was further illustrated by his statement (acting as the then-current Pope of Princeton’s IAS) about David Bohm’s interpretation of Quantum Theory:

“… if we cannot disprove Bohm, then we must agree to ignore him. …”.

The source of that quote was Max Dresden (in my opinion impeccably honest) and the Bohm biography Infinite Potential, by F. David Peat (Addison-Wesley 1997), page 133. Here are some relevant excerpts from that book:

Dresden … present[ed] Bohm’s work in a seminar to the Princeton Institute …

The reception he received came as considerable shock to Dresden. Reactions to the theory were based less on scientific grounds than on accusations that Bohm was a fellow traveler, a Trotskyite, and a traitor. It was suggested that Dresden himself was stupid to take Bohm’s ideas seriously. … all in all the overall reaction was that the scientific community should “pay no attention to Bohm’s work” … Abraham Pais also used the term “juvenile deviationism”. Another physicist said that Bohm was “a public nuisance” …”.

It seems that the silent treatment plus ad hominem attacks has been used by the USA physics community against non-conformists for at least 50 years.

All in all, now I have feelings somewhat like those described by Richard Feynman about his talk at the 1948 Pocono conference, about which he said:

“… My way of looking at things was completely new, and I could not deduce it from other known mathematical schemes, but I knew what I had done was right.

… For instance, take the exclusion principle … it turns out that you don’t have to pay much attention to that in the intermediate states in the perturbation theory. I had discovered from empirical rules that if you don’t pay attention to it, you get the right answers anyway ….

Teller said: “… It is fundamentally wrong that you don’t have to take the exclusion principle into account.” …

… Dirac asked “Is it unitary?” … Dirac had proved … that in quantum mechanics, since you progress only forward in time, you have to have a unitary operator. But there is no unitary way of dealing with a single electron. Dirac could not think of going forwards and backwards … in time …

… Bohr … said: “… one could not talk about the trajectory of an electron in the atom, because it was something not observable.” … Bohr thought that I didn’t know the uncertainty principle …

… it didn’t make me angry, it just made me realize that … [they] … didn’t know what I was talking about, and it was hopeless to try to explain it further.

I gave up, I simply gave up …”.

The above quotation is from The Beat of a Different Drum: The Life and Science of Richard Feynman, by Jagdish Mehra (Oxford 1994) (pp. 245-248).

“… wisdom itself cannot flourish, and even the truth not be established, without the give and take of debate and criticism. The facts, the relevant facts … are fundamental to an understanding of the issue of policy.”

“Fascism is not a doctrinal creed; it is a way of behaving towards your fellow man. What, then, are the tell-tale hallmarks of this horrible attitude? Paranoid control-freakery; an obsessional hatred of any criticism or contradiction; the lust to character-assassinate anyone even suspected of it; a compulsion to control or at least manipulate the media … the majority of the rank and file prefer to face the wall while the jack-booted gentlemen ride by.”

– Frederick Forsyth, Daily Express, 7 October 2005, p. 11.

Frederick Forsyth was fired from the BBC for his TRUTHFUL reporting of the Biafran story in the Nigerian Civil War. The British Government was selling arms to the Nigerian Federal Government to commit genocide against the practically unarmed Biafran side, so the BBC fired him. My father was working in Nigeria during the war and saw the effects of the BBC cover-up.

Hence, the current rate of rise of the oceans (0.2 cm/year) is less than one third of the average rate which naturally occurred over the past 18,000 years (0.67 cm/year). This tells you that the current rate of sea level rise, and the flooding risk it implies, is not record-breaking and is not unprecedented in history. This is a fact you won’t hear from the propaganda cranks.
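The quoted average rate follows from the standard figure of roughly 120 m of sea-level rise since the last glacial maximum (an assumption, but one consistent with the 0.67 cm/year average quoted above):

```python
# Reproducing the quoted rates. Assumption: roughly 120 m of sea-level rise
# since the last glacial maximum ~18,000 years ago (a standard figure,
# consistent with the 0.67 cm/year average quoted above).
total_rise_cm = 120 * 100             # ~120 m expressed in cm
years = 18_000
average_rate = total_rise_cm / years  # ~0.67 cm/year natural average
current_rate = 0.2                    # cm/year, the current rate quoted above
print(f"average: {average_rate:.2f} cm/yr, "
      f"ratio: {current_rate / average_rate:.2f}")
# the current rate is ~0.30 of the natural average, i.e. under one third
```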

(This sea level rise is due to the expansion of heated water and to glacier ice running off land into the sea; ice floating in the sea, such as the ice floating at the North Pole, does not cause any increase in sea levels when it melts because – as Archimedes discovered – a floating body like ice will only displace its own weight of water, and that weight is of course unaffected whether its form is liquid or solid.)
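Archimedes’ point can be put in numbers (freshwater density is assumed for simplicity; the small freshwater/seawater density difference is ignored here):

```python
# Archimedes' point in numbers: floating ice displaces its own weight of
# water, so melting it releases exactly the volume it already displaced.
rho_water = 1000.0     # kg/m^3, freshwater density assumed for simplicity
mass_ice = 1.0e6       # kg of floating ice (arbitrary illustrative mass)

displaced_volume = mass_ice / rho_water  # water pushed aside while floating, m^3
melted_volume = mass_ice / rho_water     # meltwater volume produced, m^3
print(displaced_volume == melted_volume) # True: sea level is unchanged
```

The displaced volume and the meltwater volume are both fixed by the same mass and the same water density, so melting floating ice leaves the level unchanged.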

THERE has been no global warming for 15 years, a key scientist admitted yesterday in a major U-turn. Professor Phil Jones, who is at the centre of the “Climategate” affair, conceded that there has been no “statistically significant” rise in temperatures since 1995. …

Researchers said yesterday that warming recorded by weather stations was often caused by local factors [local industrial and city heat emissions] rather than global change.

The revelations will be seized upon by sceptics as fresh evidence that the science of global warming is flawed and climate change is not man-made … the UN’s Intergovernmental Panel on Climate Change was forced to admit its key claim that Himalayan glaciers would melt by 2035 was “speculation” lifted from a 1999 magazine article. …

Professor Jones also conceded for the first time that the world may have been warmer in medieval times than now. Sceptics have long argued the world was warmer between 800 and 1300AD because of high temperatures in northern countries.

Climate change advocates have always said these temperatures cannot be compared to present day global warming figures because they only apply to one specific zone.

But Professor Jones said: “There is much debate over whether the Medieval Warm Period was global in extent or not. The MWP is most clearly expressed in parts of North America, the North Atlantic and Europe and parts of Asia.

“For it to be global in extent, the MWP would need to be seen clearly in more records from the tropical regions and the southern hemisphere. There are very few climatic records for these latter two regions.

“Of course, if the MWP was shown to be global in extent and as warm or warmer than today, then obviously the late 20th century warmth would not be unprecedented.” [He ignores the fact that the Earth has been warming since the last ice age peaked 18,000 years ago.]

Professor Jones first came under scrutiny when he stepped down as director of the University of East Anglia’s Climatic Research Unit after leaked emails were said to show scientists were manipulating data.

Researchers were accused of deliberately removing a “blip” in findings between 1920 and 1940, which showed an increase in the Earth’s temperature.

John Christy, professor of atmospheric science at the University of Alabama and a former lead author on the IPCC, said: “The apparent temperature rise was actually caused by local factors affecting the weather stations, such as land development.”

Ross McKitrick, of the University of Guelph, Canada, who was invited to review the IPCC’s last report said: “We concluded, with overwhelming statistical significance, that the IPCC’s climate data are contaminated with surface effects from industrialisation and data quality problems. These add up to a large warming bias.”
***

“More than £3.5 million has gone on recruiting a worldwide network of young ‘climate activists’ in over 70 countries to engage in climate change propaganda – what Marxists used to call agitprop – and to pressure their politicians to join the worldwide struggle. Under a programme called Challenge Europe, £1.1 million has been paid out to fund young ‘climate advocates’ in 17 countries across Europe, including Britain itself. But £2.5 million has been spent on a more ambitious project to recruit a global network of 100,000 activists in 60 countries across the world, led by 1,300 young ‘International Climate Champions’, to participate in “international peer networks, both in person and online, to share ideas, projects and experiences.”

* Data for vital ‘hockey stick graph’ has gone missing
* There has been no global warming since 1995
* Warming periods have happened before – but NOT due to man-made changes
* Professor Phil Jones admitted his record keeping is ‘not as good as it should be’

The academic at the centre of the ‘Climategate’ affair, whose raw data is crucial to the theory of climate change, has admitted that he has trouble ‘keeping track’ of the information.

Colleagues say that the reason Professor Phil Jones has refused Freedom of Information requests is that he may have actually lost the relevant papers.

Professor Jones told the BBC yesterday there was truth in the observations of colleagues that he lacked organisational skills, that his office was swamped with piles of paper and that his record keeping is ‘not as good as it should be’.

The data is crucial to the famous ‘hockey stick graph’ used by climate change advocates to support the theory. Professor Jones also conceded the possibility that the world was warmer in medieval times than now – suggesting global warming may not be a man-made phenomenon.

And he said that for the past 15 years there has been no ‘statistically significant’ warming.

… Professor Jones has been in the spotlight since he stepped down as director of the University of East Anglia’s Climatic Research Unit after the leaking of emails that sceptics claim show scientists were manipulating data.
The raw data, collected from hundreds of weather stations around the world and analysed by his unit, has been used for years to bolster efforts by the United Nation’s Intergovernmental Panel on Climate Change to press governments to cut carbon dioxide emissions.

Following the leak of the emails, Professor Jones has been accused of ‘scientific fraud’ for allegedly deliberately suppressing information and refusing to share vital data with critics.

Discussing the interview, the BBC’s environmental analyst Roger Harrabin said he had spoken to colleagues of Professor Jones who had told him that his strengths included integrity and doggedness but not record-keeping and office tidying.

Mr Harrabin, who conducted the interview for the BBC’s website, said the professor had been collating tens of thousands of pieces of data from around the world to produce a coherent record of temperature change.

That material has been used to produce the ‘hockey stick graph’ which is relatively flat for centuries before rising steeply in recent decades.

According to Mr Harrabin, colleagues of Professor Jones said ‘his office is piled high with paper, fragments from over the years, tens of thousands of pieces of paper, and they suspect what happened was he took in the raw data to a central database and then let the pieces of paper go because he never realised that 20 years later he would be held to account over them’.

Asked by Mr Harrabin about these issues, Professor Jones admitted the lack of organisation in the system had contributed to his reluctance to share data with critics, which he regretted.

But he denied he had cheated over the data or unfairly influenced the scientific process, and said he still believed recent temperature rises were predominantly man-made.

Asked about whether he lost track of data, Professor Jones said: ‘There is some truth in that. We do have a trail of where the weather stations have come from but it’s probably not as good as it should be.

‘There’s a continual updating of the dataset. Keeping track of everything is difficult. Some countries will do lots of checking on their data then issue improved data, so it can be very difficult. We have improved but we have to improve more.’

He also agreed that there had been two periods which experienced similar warming, from 1910 to 1940 and from 1975 to 1998, but said these could be explained by natural phenomena whereas more recent warming could not.
He further admitted that in the last 15 years there had been no ‘statistically significant’ warming, although he argued this was a blip rather than the long-term trend.

And he said that the debate over whether the world could have been even warmer than now during the medieval period, when there is evidence of high temperatures in northern countries, was far from settled.

Sceptics believe there is strong evidence that the world was warmer between about 800 and 1300 AD than now because of evidence of high temperatures in northern countries.

But climate change advocates have dismissed this as false or only applying to the northern part of the world.

Professor Jones departed from this consensus when he said: ‘There is much debate over whether the Medieval Warm Period was global in extent or not. The MWP is most clearly expressed in parts of North America, the North Atlantic and Europe and parts of Asia.

‘For it to be global in extent, the MWP would need to be seen clearly in more records from the tropical regions and the Southern hemisphere. There are very few palaeoclimatic records for these latter two regions.

‘Of course, if the MWP was shown to be global in extent and as warm or warmer than today, then obviously the late 20th Century warmth would not be unprecedented. On the other hand, if the MWP was global, but was less warm than today, then the current warmth would be unprecedented.’

Sceptics said this was the first time a senior scientist working with the IPCC had admitted to the possibility that the Medieval Warming Period could have been global, and therefore the world could have been hotter then than now.

Professor Jones criticised those who complained he had not shared his data with them, saying they could always collate their own from publicly available material in the US. And he said the climate had not cooled ‘until recently – and then barely at all. The trend is a warming trend’.

Mr Harrabin told Radio 4’s Today programme that, despite the controversies, there still appeared to be no fundamental flaws in the majority scientific view that climate change was largely man-made.
But Dr Benny Peiser, director of the sceptical Global Warming Policy Foundation, said Professor Jones’s ‘excuses’ for his failure to share data were hollow as he had shared it with colleagues and ‘mates’.
He said that until all the data was released, sceptics could not test it to see if it supported the conclusions claimed by climate change advocates.
He added that the professor’s concessions over medieval warming were ‘significant’ because they were his first public admission that the science was not settled.

“Colleagues say that the reason Professor Phil Jones has refused Freedom of Information requests is that he may have actually lost the relevant papers.”
——————-
May have lost them ? MAY have lost them ? May have LOST them ?
Doesn’t he know if he has lost them or not ?
Why did he not offer that as a reason for not producing them ? (That he does not hold any such papers is a valid reason for failing to supply them.)
This late and very inventive “explanation” by his colleagues stinks.
– Trevor, Ipswich, 14/2/2010

During the 1960s the temperature was falling, sparking some 1970s propaganda about a possible new ice age, supported by false simplistic models of a cooling feedback effect from expanding ice caps, which would reflect ever more sunlight back into space and accelerate global cooling! Now that the temperature is rising again, we get the predictable global warming propaganda for research grants and funding for environmentalists. Recent changes can only falsely be made to look “unprecedented” if the factual historical evidence is suppressed by corrupt science, as explained previously (two posts ago), a fact well known to Al Gore:

– former Vice President Al Gore (now, chairman and co-founder of Generation Investment Management – a London-based business that sells carbon credits), in interview with Grist Magazine May 9, 2006, concerning his book, An Inconvenient Truth.

This is a good clear analogy to how spin-2 lies spread in modern physics, demonstrating that mainstream pseudo-science is alive and well today. As we will see, Dr Zagoni and Dr Miskolczi proved that temperature rises due to carbon emissions warm the sea, increasing evaporation, which regulates the climate by increasing cloud cover (reflecting sunlight back) rather than causing further warming. As a result of this natural thermostat (which doesn’t occur in a greenhouse, where there are no clouds caused by the rise of evaporated water), there is only a very weak, limited correlation between carbon emissions and climate. This means that the fiddled doomsday predictions are all lies. There is evidence of this in the natural history of climate change on this planet, which is literally suppressed and covered up by the “hide the decline” mainstream media in order to falsely make recent changes appear unprecedented.

STRIKING parallels between the BBC’s coverage of the global warming debate and the activities of its pension fund can be revealed today. …

The £8billion pension fund is likely to come under close scrutiny over its commitment to promote a low-carbon economy while struggling to reverse an estimated £2billion deficit.

Concerns are growing … huge sums of employees’ money is invested in companies whose success depends on the theory being widely accepted.

The fund, which has 58,744 members, accounts for about £8 of the £142.50 licence fee and the proportion looks likely to rise while programme budgets may have to be cut to help reduce the deficit.

The BBC is the only media organisation in Britain whose pension fund is a member of the Institutional Investors Group on Climate Change, which has more than 50 members across Europe.

Its chairman is Peter Dunscombe, also the BBC’s Head of Pensions Investment.

Prominent among its recent campaigns was a call for a “strong and binding” global agreement on climate change – one that fell on deaf ears after the UN climate summit in Copenhagen failed to reach agreement on emissions targets and a cut in greenhouse gases.

Veteran journalist and former BBC newsreader Peter Sissons is unhappy with the corporation’s coverage.

He said recently: “The corporation’s most famous interrogators invariably begin by accepting that ‘the science is settled’ when there are countless reputable scientists and climatologists producing work that says it isn’t. It is, in effect, BBC policy, enthusiastically carried out by the BBC’s environment correspondents, that those views should not be heard.

“I was not proud to be working for an organisation with a corporate mind so closed on such an important issue.”

Some of the most alarmist claims about how the world is hurtling towards disaster unless we drastically curb carbon emissions have been shown to be nothing but, well, hot air. These damaging exposés may have shocked the public but were no surprise to Britain’s foremost climate change sceptic, Lord Lawson. … Now, at the age of 77, Lawson has a new goal: to save the world from itself.

His new think tank, the Global Warming Policy Foundation, aims “to bring reason, integrity and balance to a debate that has become seriously unbalanced, irrationally alarmist”.

… Their scepticism has been stoked by the recent stories about cover-ups at the University of East Anglia revealed in leaked emails between climate scientists, the IPCC’s claim that the Himalayan glaciers would “disappear by the year 2035”, which was based on a magazine article, and a claim that global warming could raise sea levels by 6ft by 2100, which was based on pure conjecture.

Yet these have done nothing to dampen the Government’s zeal for the green agenda. “I don’t think there’s any sign so far that they’ve had any impact on either the Government or the official Opposition but it has certainly had an impact on public opinion,” says Lawson. “I do think that public opinion does embody a great deal of practical common sense.”

The Climate Change Secretary Ed Miliband is, says Lawson, one of many politicians who have failed to get to grips with the real issues. “He’s bright but he’s not taken the trouble to really inform himself about the issues.”

And what about the Tory leader? “David Cameron has taken a view that it is important for the Conservatives to be green. I have no quarrel with that but the environment is not one issue. It’s a whole lot of separate issues and the global warming issue is just one of many which, like the others, has to be judged on its own merits. The idea that you have to buy the whole green package is stupid.

“I don’t see how climate change can be a big issue at the next General Election because all the parties are on the same side and the voters are on the other side. Voters should tell candidates that they don’t buy this alarmism and its policy consequences and won’t support it.”

It is, says Lawson, the policy consequences that really matter. The 80 per cent cut in carbon emissions by 2050 that the Government is pursuing, along with other G8 nations, will cause, he says, widespread economic misery for both the developed and the developing world.

The chief argument for cutting carbon emissions is that the costs of climate change are so great that, even if the chances of it happening are very small, the economic costs of slowing down global warming are worth it. But what if, Lawson asks, the costs of slowing down global warming are far greater than the costs of global warming itself? Would it not be better to invest our resources in adapting to a warmer climate than in trying to stop it happening?

His basic stance is this: the climate change argument is forever being bundled up as one huge great issue, when in fact it boils down to four entirely separate questions. First, is the world warming up? Second, is the warming being caused by man? Third, even if it is warming, is this necessarily a bad thing for humankind? And fourth, what should we be doing about it? Even if the climate scientists can tell us what is happening, and why they think it is happening, they cannot tell us what governments should be doing about it.

Yet, his critics argue, if most scientists and governments accept climate change is happening and man-made, what makes him think he is right and they are all wrong? “Well, first of all, there is a very significant minority, including some very distinguished scientists, who disagree about climate change.

“There are a lot of uncertainties. I don’t pitch my tent on the science. I say: ‘let us suppose that the majority view is right. Then the ­question is: what should we do about it?’

“There is by no means political or economic agreement around the world on what to do. Governments in the West tend to go through the motions because it is felt they are wicked or evil if they don’t, but there is a huge disconnect, even in the West, between what governments say and what they do.

“The collapse of the Copenhagen summit was largely because China and India, quite rightly, are saying they will not pursue these policies.

“Since the particular policy which this country and others are committed to only could make sense, even if you believed it did, in the context of a global agreement, that completely kiboshes it anyway. When we had this great G20 meeting in London, what they were trying to do was find how they could work together to get out of this recession. If they really believed that the most important thing in the whole world was curbing the growth of carbon emissions, surely they would have had a meeting about how they can prolong the recession?”

Global warming, scientists claim, will lead to mass starvation and disease. Lawson points out that these claims ignore man’s ability to adapt to his changing environment. “There have always been droughts, there have always been floods, there has always been disease. Even the projected effects of global warming only exacerbate these to a very limited extent.

“The single biggest cause of death in developing countries is poverty. So if you are going to slow down economic development in countries like India and China and slow down their emergence from poverty it’s going to make more people die from disease and malnutrition and make them less able to deal with the problems of drought and flooding.”

So what should governments be doing?

“Four things. First, we need to attack the ­specific problems, like disease. Secondly, we need to support the market in the development of new technologies. Thirdly, we should do research into geo-engineering. And finally, we’ll do what humans have always done: we will adapt to whatever nature throws at us.”

ONE tries to be forbearing but honestly, the science behind the “global warming” campaign is beginning to look like roadkill.

If you are going to start and mastermind a science-based global campaign urging governments to require their exhausted taxpayers to fork out not billions but trillions to prevent a catastrophe that may be 50 to 90 years away, you had better be sure your science is impeccable.

First we are told via leaked e-mails that the leading “scientists” have been cherry-picking only the convenient data that suit their case. That is not science: that is propaganda.

Then we learn the fanatics have tried to airbrush from history chronicled facts such as the Medieval Warm Period (still unexplained).

Next the scare story about the melting Himalayan glaciers is revealed as having been made up, and now we learn that the linkage between droughts and floods with global warming is also bunkum.

Would it not be smart to get the facts right first and then spend the trillions?

Robert Watson, chief scientist at Defra, the environment ministry, who chaired the Intergovernmental Panel on Climate Change (IPCC) from 1997 to 2002, was speaking after more potential inaccuracies emerged in the IPCC’s 2007 benchmark report on global warming.

The most important is a claim that global warming could cut rain-fed north African crop production by up to 50% by 2020, a remarkably short time for such a dramatic change. The claim has been quoted in speeches by Rajendra Pachauri, the IPCC chairman, and by Ban Ki-moon, the UN secretary-general.

This weekend Professor Chris Field, the new lead author of the IPCC’s climate impacts team, told The Sunday Times that he could find nothing in the report to support the claim. The revelation follows the IPCC’s retraction of a claim that the Himalayan glaciers might all melt by 2035.

The African claims could be even more embarrassing for the IPCC because they appear not only in its report on climate change impacts but, unlike the glaciers claim, are also repeated in its Synthesis Report.

This report is the IPCC’s most politically sensitive publication, distilling its most important science into a form accessible to politicians and policy makers. Its lead authors include Pachauri himself.

In it he wrote: “By 2020, in some countries, yields from rain-fed agriculture could be reduced by up to 50%. Agricultural production, including access to food, in many African countries is projected to be severely compromised.” The same claims have since been cited in speeches to world leaders by Pachauri and Ban.

Speaking at the 2008 global climate talks in Poznan, Poland, Pachauri said: “In some countries of Africa, yields from rain-fed agriculture could be reduced by 50% by 2020.” In a speech last July, Ban said: “Yields from rain-fed agriculture could fall by half in some African countries over the next 10 years.”

Speaking this weekend, Field said: “I was not an author on the Synthesis Report but on reading it I cannot find support for the statement about African crop yield declines.”

Watson said such claims should be based on hard evidence. “Any such projection should be based on peer-reviewed literature from computer modelling of how agricultural yields would respond to climate change. I can see no such data supporting the IPCC report,” he said.

The claims in the Synthesis Report go back to the IPCC’s report on the global impacts of climate change. It warns that all Africa faces a long-term threat from farmland turning to desert and then says of north Africa, “additional risks that could be exacerbated by climate change include greater erosion, deficiencies in yields from rain-fed agriculture of up to 50% during the 2000-20 period, and reductions in crop growth period (Agoumi, 2003)”.

“Agoumi” refers to a 2003 policy paper written for the International Institute for Sustainable Development, a Canadian think tank. The paper was not peer-reviewed.

Its author was Professor Ali Agoumi, a Moroccan climate expert who looked at the potential impacts of climate change on Tunisia, Morocco and Algeria. His report refers to the risk of “deficient yields from rain-based agriculture of up to 50% during the 2000–20 period”.

These claims refer to other reports prepared by civil servants in each of the three countries as submissions to the UN. These do not appear to have been peer-reviewed either.

The IPCC is also facing criticism over its reports on how sea level rise might affect Holland. Dutch ministers have demanded that it correct a claim that more than half of the Netherlands lies below sea level when, in reality, it is about a quarter.

The errors seem likely to bring about change at the IPCC. Field said: “The IPCC needs to investigate a more sophisticated approach for dealing with emerging errors.”

How the cloud water reflection feedback mechanism is ignored and natural climate changes are deliberately censored out by lying doom-mongering propaganda to secure research grants for politicians with a money-wasting, crackpot agenda: an analogy to spin-2 graviton propaganda in physics, and to fascist/communist eugenics “science”

Once upon a time, it was widely believed that the Earth’s climate was unstable and finely balanced between:

(1) runaway glaciation caused by increased ice caps causing increased reflection of sunlight back into space, cooling the Earth even more (this was hyped in the 1970s when global temperatures seemed to be falling due to clouds of pollution which reflected sunlight back rather than absorbing it), and

(2) runaway heating due to the greenhouse effect. It is generally believed that a small increase in global temperatures due to CO2 emissions would cause ever more heating by evaporating more water from the ocean, since water vapor [I’m using the American spellings here] is a strong greenhouse gas, effectively absorbing infrared radiation.

This consensus model ignored the fact that not all vapor remains as a gas; some of it condenses into small droplets to form things technically called “clouds”, which reflect back into space more sunlight than they absorb, as we shall explain below: water turns out to be an anti-greenhouse gas, regulating the climate like a thermostat, rather than amplifying small changes into a runaway greenhouse effect.

These beliefs led even serious scientists to worry about small variations in global temperature. Venus, the next planet closer to the sun than Earth, allegedly has a runaway greenhouse effect: it has an atmosphere which is 96.5% CO2 and a surface temperature of 462 °C. Its surface partial pressure of nitrogen (the major constituent of Earth’s atmosphere) is only four times that on Earth, although its total pressure (mainly CO2) is about 93 Earth atmospheres. It is therefore tempting to imagine that Venus is simply a planet like Earth on which a runaway greenhouse effect has driven nearly all of the water off into space (H2O molecules are less massive than CO2 molecules, so the principle of conservation of momentum means that they acquire greater velocities in collisions and are thus more likely to be in the tail of the Maxwell distribution of speeds, exceeding the planetary escape velocity), and on which all of the carbon has been oxidized into atmospheric CO2. Once sufficient heating has occurred, for instance, limestone rock (CaCO3) is decomposed by the high temperature into lime (CaO) plus CO2, releasing further CO2 into the air.
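The escape argument above can be sketched numerically. This is only an illustrative back-of-the-envelope check, not a calculation from the text: the Venus surface temperature (735 K, i.e. 462 °C) and escape velocity (about 10.4 km/s) are assumed round figures, and the rms thermal speed v = sqrt(3kT/m) stands in for the full Maxwell distribution.

```python
import math

K_B = 1.380649e-23    # Boltzmann constant, J/K
AMU = 1.66053907e-27  # atomic mass unit, kg

def rms_speed(mass_u, temperature_k):
    """Root-mean-square thermal speed, v = sqrt(3kT/m)."""
    return math.sqrt(3 * K_B * temperature_k / (mass_u * AMU))

T_VENUS = 735.0      # assumed Venus surface temperature, K (462 C)
V_ESCAPE = 10.36e3   # assumed Venus escape velocity, m/s

v_h2o = rms_speed(18.0, T_VENUS)  # water
v_co2 = rms_speed(44.0, T_VENUS)  # carbon dioxide

# The lighter H2O molecule moves faster at the same temperature, so a
# larger fraction of the tail of its Maxwell distribution exceeds the
# escape velocity, favouring the loss of water over CO2.
print(f"H2O rms speed: {v_h2o:.0f} m/s")
print(f"CO2 rms speed: {v_co2:.0f} m/s")
print(f"Escape velocity: {V_ESCAPE:.0f} m/s")
```

Both rms speeds come out around 1 km/s or less, far below escape velocity: it is only the high-speed tail of the distribution (and the lighter molecule has the fatter tail) that leaks away over geological time.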

The other example is the other nearby planet, Mars, the next planet farther from the sun than Earth. Mars is similar to Venus in having a large fraction of its atmosphere composed of CO2: 96% in fact. However, this is quite different from the runaway greenhouse effect of Venus. Mars has a low total surface air pressure, only about 0.64% of Earth’s, and its nitrogen partial pressure is about 5,800 times smaller than Earth’s. In particular, Mars has a mean surface temperature much lower than Earth’s, a chilly −46 °C. This is obviously caused in part by the extra distance from the sun and in part by the low total atmospheric pressure, despite the large percentage of CO2. So even if there is a greenhouse effect on Mars, Mars remains very cold. While Venus is ideal political propaganda for global warming from CO2, Mars is less satisfactory! Even though there is no proved instance of life on Mars, the climate there is still changing, as recorded by NASA’s Mars Global Surveyor.

Unlike certain other planets in the solar system, our planet is uniquely UNLIKE a greenhouse because 70% of its surface is covered in oceans, seas and lakes, so any initial rise in temperature (once the deep water heats up, which takes years due to the high specific heat capacity of water) increases the evaporation rate, forming more cloud which reflects back a larger percentage of the incoming solar radiation, thereby regulating the Earth’s temperature like a thermostat!

Evaporated water sets a limit on temperature rises, because the increase in water droplet cloud cover (reflecting radiation back into space) exceeds the greenhouse-like effects from uncondensed atmospheric water vapor! The initial IPCC model ignored this completely, and the current version is an inaccurate, politically biased ad hoc fiddle which has to be “justified” by falsifying the historical data on past temperature variations by selectively using inaccurate tree ring data, as we shall see later (the “hide the decline” or “climategate” fiddle).

“It is hard for new ideas to become assimilated, and it does not happen immediately, nor would one want to lose some of the inertia that protects the scientific enterprise from being overwhelmed by unsubstantiated results and unproven theories.”

First, consider the basis for the anti-greenhouse effect of evaporated water due to an initial temperature rise from CO2. On average, today:

3% of incoming radiation is absorbed by greenhouse gases excluding water vapor,
15% is absorbed by water vapor,
5% is absorbed by clouds and
47% is absorbed by the earth’s surface;

while:

21% of incoming radiation is reflected back by clouds,
6% is reflected back by the air, and
3% is reflected back by the earth’s surface.

Water in the atmosphere thus absorbs a total of 20% of incoming solar radiation, and reflects back a total of 21%. So the amount of reflection by cloud cover exceeds the amount of absorption due to water vapor absorbing infrared solar radiation. Overall, 70% of incoming solar radiation is currently absorbed, and 30% is reflected back into space.
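The bookkeeping in the percentages above can be tallied in a few lines (a trivial sketch using only the round figures quoted in the list above):

```python
# Shares of incoming solar radiation (percent), as listed above.
absorbed = {"other greenhouse gases": 3, "water vapor": 15,
            "clouds": 5, "surface": 47}
reflected = {"clouds": 21, "air": 6, "surface": 3}

total_absorbed = sum(absorbed.values())    # 70%
total_reflected = sum(reflected.values())  # 30%

# Water's two roles: absorption (vapor + cloud droplets) vs reflection.
water_absorption = absorbed["water vapor"] + absorbed["clouds"]  # 20%
cloud_reflection = reflected["clouds"]                           # 21%

print(f"Total absorbed {total_absorbed}%, reflected {total_reflected}%")
print(f"Water absorbs {water_absorption}%, clouds reflect {cloud_reflection}%")
```

The point of the comparison is simply that water’s reflective role (21%) already slightly outweighs its absorbing role (20%) in today’s budget.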

Now calculate what happens to these numbers when a temperature rise occurs due to increased CO2. You find that doubling today’s water content in the atmosphere – assuming that the vapor mass to cloud droplet mass ratio is constant – reduces the total absorption from 70% to 64% of incoming radiation, while increasing reflection from 30% to 36%. Hence, evaporated water has an anti-greenhouse effect: a “negative feedback”.

However, the situation is even worse than this for the IPCC fraudsters, since the atmospheric greenhouse water vapor (humidity) has allegedly not even been rising in step with the total evaporated water:

This is evidence that if temperature rises have been causing more evaporation of water from the oceans (as they must, once the ocean water has slowly heated up with a time lag due to its high specific heat capacity), the extra water has ended up in increased cloud cover rather than in increased humidity.

This seems to be why the IPCC has had to fiddle the historical evidence using flawed tree-ring data selectively in place of thermometer data (as we will prove below) to make its flawed model appear justified by the (fiddled) data for the last few decades.

Of course, the average global humidity at all altitudes may not be well represented by surface air measurements at a limited number of locations, but the point is that there is evidence here that is being censored out by groupthink quangos including NASA, leading to scientists having to resign without the data even being evaluated. This is exactly how fascist “science” was done in the 1930s, and why Einstein left Germany: it was done by censoring the facts and making scientific life impossible for those who did not appease the mainstream.

The chairman of the UN’s Intergovernmental Panel on Climate Change (IPCC) has used bogus claims that Himalayan glaciers were melting to win grants worth hundreds of thousands of pounds.

Rajendra Pachauri’s Energy and Resources Institute (TERI), based in New Delhi, was awarded up to £310,000 by the Carnegie Corporation of New York and the lion’s share of a £2.5m EU grant funded by European taxpayers. …

The revelation comes just a week after The Sunday Times highlighted serious scientific flaws in the IPCC’s 2007 benchmark report on the likely impacts of global warming.

The IPCC had warned that climate change was likely to melt most of the Himalayan glaciers by 2035 … Last week a humbled IPCC retracted that claim and corrected its report. …

THE world’s leading climate change body was plunged into fresh controversy yesterday for wrongly linking global warming to an increase in hurricanes and floods.

The UN’s Intergovernmental Panel on Climate Change based its claims on an unpublished and unverified report.

But it then ignored warnings from scientific advisers that the evidence supporting the link was too weak, it was claimed.

The report’s own authors later withdrew the claim because they ­acknowledged that the evidence was not strong enough.

The latest criticism comes after the IPCC was forced to retract claims that the Himalayan glaciers would disappear by 2035. It was claimed yesterday that the IPCC’s controversial head, Rajendra Pachauri, used the claims to win taxpayer-funded grants worth hundreds of thousands of pounds.

Critics are furious that the IPCC’s claim that global warming is affecting the severity and frequency of natural disasters is now routinely cited in political and public debate.

It was a key part of discussions at last month’s Copenhagen climate summit, which included a demand by developing countries for £62billion compensation from rich nations blamed for creating the most emissions.

Ed Miliband, energy and climate change minister, has suggested British and overseas floods – such as those in Bangladesh in 2007 – could be linked to global warming. And US President Barack Obama said last autumn: “More powerful storms and floods threaten every continent.”

But the paper on which the IPCC based its claim, written in 2006 by a disaster impacts expert, had not been scientifically scrutinised at the time the body issued its report.

By the time it was published in 2008 it carried a warning: “We find insufficient evidence to claim a statistical relationship between global temperature increase and catastrophe losses.” Despite this change, the IPCC did not issue a clarification ahead of the Copenhagen summit.

The latest claims come amid revelations that former railway engineer Dr Pachauri, who is also chairman of The Energy and Resources Institute in New Delhi, has won the lion’s share of a £2.5million EU research grant.

It means that taxpayers are funding research into a scientific claim about glaciers that climate researchers have now dismissed as bogus.

The revelation follows reports of flaws in the Nobel Prize-winning IPCC benchmark report in 2007 on the likely impacts of global warming.

Dr Pachauri was forced to make an embarrassing apology last week after it emerged there was no scientific evidence to support his claim that climate change was likely to melt most of the Himalayan glaciers by 2035.

But reports have emerged which show how the same claim has been cited in grant applications for his institute.

One application, announced this month just before the “Glaciergate” scandal broke, resulted in a £310,000 grant from the Carnegie Corporation of New York to aid research into “the potential security and humanitarian impact on the region” as the glaciers began to disappear.

Dr Pachauri, who has defied calls for his resignation and denies any wrongdoing, has since conceded that this threat, if it exists at all, will take centuries to have any serious effect.

The second grant, from the EU, totalled £2.5million and was designed to “assess the impact of Himalayan glaciers retreat”. It was part of the EU’s High Noon project, launched last May, to fund research into how India might adapt to loss of glaciers.

The EU grant was split between leading research institutions, including Britain’s Met Office, with The Energy and Resources Institute given a major but unspecified share.

Today, it seems that the false ‘consensus’ around global warming is melting at a faster rate than ever before. Richard North has shown that the head of the IPCC solicited EU grants to research Himalayan glacier melt even though the claims of that melt were totally bogus; David Rose on the Mail on Sunday has evidence that the IPCC deliberately inflated melting ice theories to make the world pay attention; and the Sunday Times has further evidence that other parts of the 2007 IPCC report also deliberately falsified the records about alleged increases in the damage caused by hurricanes. It adds up to an avalanche of fraud that is building up momentum daily – for the first time the MSM is following what blogs have been saying for years.

So where is the BBC on all this? Not very far, to put it mildly. There’s virtually nothing on the BBC website that reflects the turmoil. Harrabin posted on Friday a pessimistic blog containing the old warmist lie that the lobbying firepower from oil corporations was the reason why ‘climate change’ legislation has not been passed; and Richard Black, though admitting that the ‘climate change’ suicide rush to enact globally-binding targets has faltered, still refuses to discuss or even properly mention the catalogue of lies and distortions that are now being exposed. It’s BBC cloud cuckoo land, as usual.

Update: the blogosphere has been buzzing all day with new revelations about Pachauri and his henchmen. Even warmist journalists such as Charles Clover and Geoffrey Lean are calling for Pachauri’s resignation. On the BBC, with its £700m-a-year news budget – zip.

For example, thermometer data showing wide temperature variations was suppressed in favour of data known to be inaccurate (tree-ring extrapolated temperatures are unreliable, since tree growth depends on cloud cover as well as temperature, so temperature cannot be isolated from ring widths alone) in order to suppress the larger directly measured temperature variations prior to 1980. The aim of the propaganda was to take factually measured recent temperature variations and deliberately present them out of context on a false curve based on inaccurate data, to make the recent change seem unprecedented.

Certainly temperature is rising at the moment, but it is not as unprecedented as claimed by graphs in IPCC peer-reviewed reports using inaccurate tree-ring data prior to about 1980. Another deception concerns the so-called “greenhouse effect”: 70% of the planet’s surface is covered by water, which when evaporated acts as an anti-greenhouse gas, so the analogy is deeply flawed. An increase in evaporated water produces both extra water vapor (humidity) and extra condensed droplets in the form of cloud cover. It turns out that the increase in reflectivity from the extra cloud cover has a bigger effect than the increase in absorption of infrared by uncondensed water vapor, so water is an anti-greenhouse gas which sets limits on the possible increase in temperatures from pumping out carbon dioxide: as temperature begins to rise, the oceans heat up and evaporate faster, making more cloud which largely offsets further rises.

Why discuss global warming when the interest of this blog is quantum field theory? Because it is an example of contemporary scientific cover-up – the “it’s peer-reviewed so it must be better than stuff that biased peer-reviewers have deliberately censored from publication” hot air – and, like spin-2 quantum gravity lies, a lot of research grant money depends on it. It is particularly interesting from our point of view to examine the behaviour of critics and the response of the mainstream to those critics. The key question is: how much of the responsibility for the cover-up is due to ineffective critics? Critics of IPCC propaganda cannot debunk the propaganda by simply ignoring it, or by putting forward alternative data which is then ignored or censored out by the peer-reviewer-dominated mainstream. Science is determined solely by the facts themselves, not by a consensus showing that a majority or minority of scientists believe or disbelieve the facts.

Dear Ray, Mike and Malcolm,
Once Tim’s got a diagram here we’ll send that either later today or first thing tomorrow. I’ve just completed Mike’s Nature trick of adding in the real temps to each series for the last 20 years (ie from 1981 onwards) amd from 1961 for Keith’s to hide the decline. Mike’s series got the annual land and marine values while the other two got April-Sept for NH land N of 20N. The latter two are real for 1999, while the estimate for 1999 for NH combined is +0.44C wrt 61-90. The Global estimate for 1999 with data through Oct is +0.35C cf. 0.57 for 1998. Thanks for the comments, Ray.

Above: the part of the curve in red above illustrates the recent apparent decline in temperatures obtained from tree-ring data, which Professor Phil Jones and others were hiding in peer-reviewed Nature journal articles and IPCC reports, while still using older inaccurate tree-ring data to try to justify the “hockey stick curve”, which falsely suggests that the recent temperature rise is unprecedented and thus a great cause for concern to objective scientists, i.e., skeptical critics; for as Feynman states: “Science is the belief in the ignorance of experts”.

Above: there is actually more than one decline to hide: mean ocean temperatures have fallen by 2.4 °C over the past 2.5 million years, as shown by data from Joe Buchdahl, Pleistocene Glacials and Interglacials, in Global Climate Change Student Guide, as cited by Prof. John Baez. The research to “prove” global warming by faking the “hockey stick” graph using data manipulation at best only managed to indicate a global temperature rise of 0.74 °C over the past century (an exaggeration caused by the data fiddling). However, even if this is true, and even if the IPCC projected temperature rises of 2-3 °C over the next century are accurate, the planet will simply be as warm again as it was 2.5 million years ago, when primate ancestors were evolving into humans! (Lying propaganda claims that the only temperature variations in Earth’s history were during the Medieval warm period.)

“The post-1960 data was deleted from the archived version of this reconstruction at NOAA here and not shown in the corresponding figure in Briffa et al 2001. Nor was the decline shown in the IPCC 2001 graph, one that Mann, Jones, Briffa, Folland and Karl were working in the two weeks prior to the “trick” email (or for that matter in the IPCC 2007 graph, an issue that I’ll return to.) …

“Contrary to Gavin Schmidt’s claim that the decline is “hidden in plain sight”, the inconvenient data has simply been deleted.

“The reason, as explained on Sep 22, 1999 by Michael Mann to coauthors in 938018124.txt, was to avoid giving “fodder to the skeptics”. Reasonable people might well disagree with Gavin Schmidt as to whether this is a “a good way to deal with a problem” or simply a trick.”

The original (faked) IPCC “data” claimed a simple “hockey stick” exponential-type temperature rise of 0.74 °C over the past century, using inaccurate tree-ring data where convenient and ignoring it where inconvenient to their politically funded, highly lucrative “research”:

The oceans have a mixed surface layer extending down to typically 60 metres depth in the summer; the first storms of winter mix the warmed top layer down to much greater depths. Water has one of the highest specific heat capacities (the energy required to cause a given temperature rise in a unit mass of material) of any common substance, and water molecules are a very effective broadband absorber of infrared radiation, so water is very slow to heat up and very slow to cool down, retaining a great deal of thermal energy relative to other substances. So the oceans heat up slowly. As they do so, evaporation increases, increasing cloud cover and increasing the amount of sunlight that is reflected back into space. This reduces temperature rises to levels below those based on “greenhouse models”, because Earth isn’t a greenhouse: more clouds form around the Earth when the water heats up a little, reflecting sunlight back and regulating the temperature, unlike the simplicity of a greenhouse, where no clouds form to regulate the temperature!

Why, on balance, atmospheric water cools the world instead of heating it (or, why cloud cover reflection of sunlight exceeds the effect of infrared absorption by water vapor in the atmosphere)

Water molecules absorb infrared radiation very effectively, so in lying propaganda that ignores its anti-greenhouse (cooling) role of reflecting sunlight back via clouds (condensed water in the atmosphere!), it is billed as the most efficient greenhouse gas of all: on average, 15% of the incoming solar radiation is absorbed by water vapor, compared to 5% absorption by clouds and only 3% absorption by other greenhouse gases. However, on balance, water in the atmosphere doesn’t act as a greenhouse gas, because on average 21% of the incoming radiation from the sun is reflected back into space by clouds, keeping the earth COOL!

This isn’t rocket science, just plain simple arithmetic. The total incoming-sunlight absorption by water in the atmosphere is the 15% absorbed by water vapor plus the 5% absorbed by water droplets (clouds), equalling 20%, which is less than the 21% reflected back by clouds. Therefore, the net effect of water in the atmosphere is anti-greenhouse, cooling the planet, not greenhouse, heating it. So, because any initial CO2 heating of the earth increases evaporation from the sea, water in the atmosphere provides a negative, not a positive, feedback: it opposes a further “global warming” effect of increasing CO2 levels by simply reflecting more radiation back into space!

At present, 3% of incoming radiation is absorbed by greenhouse gases excluding water vapor, 15% is absorbed by water vapor, 5% is absorbed by clouds and 47% is absorbed by the earth’s surface; 21% is reflected back by clouds, 6% is reflected back by the air, and 3% is reflected back by the earth’s surface. In total, 70% is currently absorbed and 30% is reflected back into space.
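For what it’s worth, the budget just quoted can be checked in a few lines of Python. The percentages are simply those stated in the text above, not an independent radiative model:

```python
# Incoming solar radiation budget, using the percentages quoted in the text.
absorbed = {
    "greenhouse gases (excl. water vapor)": 3,
    "water vapor": 15,
    "clouds (water droplets)": 5,
    "earth's surface": 47,
}
reflected = {
    "clouds": 21,
    "air": 6,
    "earth's surface": 3,
}

total_absorbed = sum(absorbed.values())    # 70%
total_reflected = sum(reflected.values())  # 30%
print(total_absorbed, total_reflected)     # 70 30: the budget sums to 100%

# Water's own balance: absorption by vapor + droplets versus reflection by clouds.
water_absorption = absorbed["water vapor"] + absorbed["clouds (water droplets)"]  # 20%
cloud_reflection = reflected["clouds"]                                            # 21%
print(water_absorption < cloud_reflection)  # True: water is a net reflector
```

The point of the check is simply that the components sum to 100% and that the 20% absorbed by atmospheric water is indeed smaller than the 21% reflected by clouds.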

For comparison, doubling today’s water content in the atmosphere, for the same ratio of water vapor to cloud droplet water, gives a total of just 64% absorption of incoming radiation and 36% reflection. Hence, as evaporation increases due to an initial CO2-induced temperature rise, the net effect of the additional water in the atmosphere is to reduce the fraction of incoming radiation which is absorbed, and to increase the fraction which is reflected, provided that the distribution of the atmospheric water between vapor and cloud cover remains the same. This opposes the greenhouse effect.

Quantitative links between CO2 and temperature

To quantitatively illustrate a relationship between temperature rise and atmospheric CO2, we can use Vostok ice cores, which indicate the atmospheric CO2 level from air trapped in the ice, as well as the temperature: the ratio of deuterium to hydrogen-1 isotopes in the ice’s water molecules indicates temperature, because molecules containing the lighter isotope, hydrogen-1, had the same mean kinetic energy per molecule as heavy-water molecules at the same temperature (by the equipartition of energy) and thus had higher mean velocities, so they were more likely to break free of bonding forces, evaporating, or rather subliming, from the ice; an increase in temperature therefore left behind a greater concentration of heavy water.

The Vostok data for the past 220,000 years indicate that the absolute (kelvin) temperature was proportional to the 0.061 power of the carbon dioxide concentration in the air, i.e. T ~ A^0.061, where T is temperature (K) and A is the carbon dioxide concentration. Hence a doubling of the carbon dioxide concentration corresponded to an absolute temperature increase by a factor of only 2^0.061 = 1.043. For example, a mean temperature of 15 °C (288 K) would be increased to about 27 °C (300 K) by a doubling of atmospheric CO2.
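As a quick check of this arithmetic, the quoted power law can be evaluated directly; the 288 K starting point and 0.061 exponent are the figures given in the text:

```python
# Quick check of the quoted Vostok power law: T (kelvin) ~ A^0.061,
# where A is the atmospheric CO2 concentration.
def temperature_after_co2_change(T_initial_K, co2_ratio, exponent=0.061):
    """Scale the absolute temperature by (new CO2 / old CO2) ** exponent."""
    return T_initial_K * co2_ratio ** exponent

T0 = 288.0                                  # 15 C expressed in kelvin
T1 = temperature_after_co2_change(T0, 2.0)  # doubling of atmospheric CO2
print(round(T1, 1), round(T1 - 273.15, 1))  # ~300.4 K, i.e. ~27.3 C
```

Note that because the exponent is tiny, even a doubling of CO2 moves the absolute temperature by only about 4%.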

However, even this weak correlation may be an exaggeration, because ice core data generally do not prove a causal mechanism of temperature responding to CO2 levels; e.g., many such ice core data show that temperatures appear to vary ahead of changes in CO2 levels, suggesting that the climatic temperature was not being driven by CO2, and that CO2 was simply responding to other influences. An increase in temperature, for example, will generally cause an increase in the ratio of animal to plant biomass (animals can adapt to temperature changes more easily than rainforests can, simply by changing their location, so the animals migrate while the rainforests die off), which increases the level of CO2 in the atmosphere, since animals emit CO2 while plants bind it up. Hence the assumption that CO2 drove historical temperature changes is generally a mistake. The Quaternary ice age began a million years ago, during which there were half a dozen or so glacial periods in which the English Channel and most of the North Sea dried up as a result of an expansion of the polar ice caps, and northern continents were covered by ice sheets up to 1 km thick. During some interglacial periods, the climate was warmer than it is today.

The original version of GEOCARB suggested that the atmospheric CO2 abundance was over 15 times higher 460 million years ago than it is now, and at that time the mean global temperature was 7 °C higher than now (22 °C, compared to an assumed global mean temperature now of 15 °C). Some 210 million years ago, the CO2 level is estimated to have been 5 times the current level, and the mean global temperature is estimated to have been 5 °C warmer than now (20 °C compared to the 15 °C assumed as today’s global mean). Even just 100 million years ago, there were no continuous ice caps at the poles (just winter snow): all the polar ice melted in the summer, and deciduous rain forests existed within 1,000 km of the poles.

One example of a climatic change caused by normal geological processes is the formation of the Tibetan plateau, which effectively cooled the whole planet by strengthening the monsoon system in southern Asia and forming the Himalayas. Beginning about 50 million years ago (and continuing to the present day), the drift of the continental plates has caused India and Eurasia to collide, pushing up oceanic crust from the bottom of the sea to form the Himalayan mountain chain and the Tibetan plateau. Similarly, the Alps are the result of a collision beginning 120 million years ago between Africa and Eurasia, which also had an effect on global climate.

The temperature changes caused by such natural phenomena can cause CO2 levels to vary by killing off CO2-absorbing rainforests which can’t move, while CO2-emitting animals can migrate to compensate for the climatic change. Hence, there can be a true correlation between temperature and CO2 levels, even where there is no mechanism for CO2 levels to affect temperature: the opposite mechanism has occurred, in that a changing climatic temperature has resulted in a variation of CO2 levels!

Yet another example of a mechanism for natural climatic change is the Earth’s orbit, which undergoes three cycles named after Milutin Milankovitch, the Serbian astronomer who in 1941 worked out how the planets perturb one another’s orbits:

(1) the Earth’s tilt (which creates the opposite seasons in each hemisphere, and is currently 23.4 degrees and decreasing, so summers are becoming cooler and winters are becoming warmer) varies from 22.4 to 24.2 degrees over a 41,000 year cycle,

(2) the precession of the equinoxes, a cycle lasting 22,000 years (in 11,000 years time, the timing of the winter and summer seasons will have exactly reversed in each hemisphere), and

(3) the shape of the Earth’s orbit around the sun, i.e. the eccentricity, changes from a circle to a strong ellipse, over a combination of a very weak 400,000 year cycle and a stronger 100,000 year cycle that combine into an effective cycle lasting around 110,000 years.

A maximum elliptical eccentricity of 0.05 occurred 200,000 years ago, when both the 400,000- and 100,000-year cycles were at a maximum. Some 100,000 years ago, the 400,000-year cycle was no longer at maximum, but the 100,000-year cycle was, and the resulting eccentricity was 0.04. The eccentricity of the Earth’s orbit varies from 0.0034 to 0.058 as a result of gravitational forces between the planets, and it is currently only 0.0167 (we are 147 million km from the sun at the perihelion on 3 January, and 152 million km from the sun at the aphelion on 4 July, Independence Day). Greater eccentricity tends to overwhelm the normal seasons (which are due to the earth’s tilt, and which are opposite in opposite hemispheres), so that the entire earth (both hemispheres) would begin to freeze in winter and thaw in summer. This would cause very severe extremes of temperature during the year simultaneously over most of the planet, and thus prevent effective migration by confining many species to areas much closer to the equator than they can venture seasonally today.
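The perihelion and aphelion distances quoted above follow from the current eccentricity and the semi-major axis of Earth’s orbit. A minimal sketch, assuming the standard value of roughly 149.6 million km for the semi-major axis (a figure not stated in the text):

```python
# Closest and farthest orbital distances from semi-major axis a and
# eccentricity e: r_min = a(1 - e), r_max = a(1 + e).
def orbital_extremes(a_million_km, e):
    """Return (perihelion, aphelion) distances in millions of km."""
    return a_million_km * (1 - e), a_million_km * (1 + e)

# a ~ 149.6 million km (assumed standard value); e = 0.0167 as quoted above.
perihelion, aphelion = orbital_extremes(149.6, 0.0167)
print(round(perihelion), round(aphelion))  # 147 and 152 million km
```

These round to the 147 and 152 million km figures in the text, confirming the quoted eccentricity is consistent with the quoted distances.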

Again, the effect of the temperature variations caused naturally by the Milankovitch cycles will be to vary CO2 levels: as eccentricity increases in the future, the Earth will cease orbiting in a nearly circular orbit and will orbit the sun more elliptically. This will cause the entire Earth to cool simultaneously to a significant extent when far from the sun each year (which doesn’t happen at present: we are nearest to the sun on 3 January and furthest from it on 4 July, but that distance effect is completely overwhelmed by the much larger effect of the 23.4-degree tilt of the earth’s spin axis to the plane of its orbit around the sun, which seasonally varies the mean thickness of atmosphere that sunlight must penetrate in each hemisphere). This change will affect vegetation, so CO2 levels will vary. Over the past 18,000 years, the warming of the Earth, with the consequent melting of ice and expansion of the ocean water, has raised sea levels by 120 metres, inundating low-lying vegetation and destroying it, while animals have migrated. This is a regular effect: over the past 400,000 years, there have been four peaks in sea level, all within plus or minus about 10 metres of the current level, and four minimum sea levels of more than 100 metres below the current level.

A politician or pseudo-scientist could plot the varying temperature and the varying CO2 level, run a correlation significance test, and then falsely claim that statistical significance proves that CO2 levels drive temperature, when in fact the temperature variation can drive the change in CO2 levels! This is the danger of statistical pseudo-science: it is no substitute for understanding the physical mechanism. The fact that data manipulation is required by IPCC researchers to fake evidence for extreme global warming, by suppressing the natural background temperature variation data, shows that the subject has fallen prey to politicians’ (taxpayer-filled) wallets, and has lost rigorous, honest objectivity, sinking into the depths of mere propaganda.

Increasing levels of atmospheric carbon dioxide cause a tiny temperature rise, increasing evaporation so that there is more cloud cover, which in turn reflects more solar radiation back into space, nearly offsetting the “greenhouse” effect of the carbon dioxide, so that global warming’s “extreme evidence” has to be manufactured by fudging the data. In short, the world is not like a greenhouse, because the evaporation of water causes additional cloud cover, which cools the earth by reflecting sunlight back into space. The evaporation of water was ignored altogether in the early IPCC (Intergovernmental Panel on Climate Change) models that supported hyped disaster predictions from the “greenhouse” effect. Furthermore, they fiddled the historical data to make recent changes seem unprecedented, using the false “hockey stick” diagram manufactured by Michael Mann (the illustrations below are from Baez and Wikipedia links here, here and here, and prove that the temperature began rising before the epoch claimed by Mann’s hockey stick diagram, and that the recent temperature variations are not dramatic or significantly larger than those naturally occurring over the past 12,000 years):

The IPCC ignores the increasing future depletion of fossil fuels, and predicts that spending $100 billion will constrain temperature rises by 1.5 °C. In any case, the suggested “countermeasure” of throwing billions upon billions of dollars at building alternative technology such as wind power stations (which shut down in strong winds to prevent damage, and also generate no power in hot, calm periods when there is major power demand for air conditioning) will just supplement fossil fuel use, and therefore will not reduce the eventual CO2 release from the use of fossil fuels but will merely protract its release, so the politics of global warming suffer from inherent problems:

(1) CO2 is not a pollutant but the vital source of carbon for all plant growth on land and in the sea, and rising levels of CO2 therefore promote life rather than destroying it – it is an essential gas for life on Earth. It doesn’t lead to rapid temperature rises on any planet with large quantities of water, since any initial slight temperature rise causes more water to evaporate, forming clouds, thus increasing cloud cover and protecting the planet against further temperature rises from the increasing level of atmospheric CO2. As Dr Lubos Motl points out, CO2 only becomes unpleasant for humans at concentrations of around 10,000 ppm, while the current level is 388 ppm, and with the depletion of fossil fuel reserves it cannot ever exceed 1,000 ppm. CO2 has a net positive impact on life on Earth.

(2) fossil fuels are not inexhaustible and are being depleted anyway; as oil and coal supplies dwindle, the remaining reserves are more expensive to tap, so the price rises and people are pushed naturally away from such fuels towards safe nuclear energy (which doesn’t produce collateral CO2 emissions if nuclear power is used to generate the electricity that powers the trains delivering the fuel, etc.) and renewable biofuels (plants which lock up the same amount of CO2 while growing that they release on subsequent burning, so there is no net increase in global CO2). Hence global warming is not a long-term doomsday problem anyway, unless fossil fuels can be shown to be inexhaustible,

(3) the immense expenditure on trying to reduce CO2 emissions from existing sources and on building wind power stations doesn’t cause a significant reduction in global carbon dioxide. For example, if the total fossil fuel reserve (oil, coal, etc.) is X tons, then supplementing it with wind power will simply mean that the carbon in the X tons of fuel is given out over a longer period of time, say 120 years instead of 100 years. Once all of the fossil fuels have been used up, all of the CO2 will have been released, and “countermeasures” which consist of reducing the rate at which the CO2 is released will not affect the ultimate level of CO2 in the atmosphere. So it is a confidence trick to waste taxpayers’ money under false pretenses.
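The fixed-reserve argument above can be made concrete with a trivial sketch; the reserve size is arbitrary, and the 100- versus 120-year timescales are the illustrative figures from the text:

```python
# Burning a fixed fossil-fuel reserve releases the same total carbon
# regardless of the rate: slowing the burn stretches the timetable
# without changing the endpoint.
X = 1200.0  # total recoverable fossil carbon, arbitrary units

fast_rate = X / 100  # all burned in 100 years (no wind power)
slow_rate = X / 120  # supplemented by wind power: 120 years instead

total_fast = fast_rate * 100  # carbon released in the fast scenario
total_slow = slow_rate * 120  # carbon released in the slow scenario
print(total_fast == total_slow == X)  # True: same carbon either way
```

The only variable the slower scenario changes is the date by which the reserve is exhausted, not the cumulative release.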

Large amounts of atmospheric CO2 were what fuelled the plant growth which produced much of the fossil fuels around 300 million years ago, when the terrific conversion of carbon dioxide into wood released enough oxygen by photosynthesis to make the earth’s atmosphere 35% oxygen (compared to 21% today), fuelling the early, inefficient lungs of the first amphibians when they moved onto the land, and also fuelling giant, now-extinct flying insects which utilized the high oxygen levels. It’s interesting that such high oxygen levels are associated with high ignition probabilities under today’s conditions. E.g., for typical forest fine kindling (dry leaves, etc.) today, there is a 70% increase in the probability of a fire being started by lightning for every 1% rise in the oxygen percentage. However, this fire risk would automatically be compensated for over long time periods by a structuring of the forests by evolution to reduce intense fire risks: regular fires reduce ignition probabilities by clearing away kindling like deadwood and underbrush, trees would on average be spaced further apart than they are now, and so fires would spread less easily and burn less fiercely than you would expect by simply scaling up the oxygen percentage and assuming that primeval forests were similar to those today.
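To see why some evolutionary compensation would have been necessary, it helps to compound the quoted 70%-per-percentage-point figure. Treating it as a multiplicative factor of 1.7 per absolute percentage point of oxygen (my reading of the claim, not a published model) gives the relative ignition risk at the ancient 35% oxygen level versus today’s 21%:

```python
# Compound the quoted 70% rise in lightning-ignition probability per 1%
# (absolute) rise in atmospheric oxygen, treated as a factor of 1.7 per
# percentage point -- an interpretive assumption, not a published model.
def relative_ignition_risk(o2_new, o2_ref, factor_per_point=1.7):
    """Relative ignition risk at o2_new versus the reference level o2_ref."""
    return factor_per_point ** (o2_new - o2_ref)

risk = relative_ignition_risk(35, 21)  # ancient 35% vs today's 21%
print(round(risk))  # roughly 1,700 times today's risk per lightning strike
```

A risk of order a thousand times today’s per-strike ignition probability is why naive scaling would predict primeval forests burning far more fiercely than they evidently did.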

Between 300 and 250 million years ago, the oxygen content of the atmosphere fell from 35% to 21% and then dropped to around 15% about 200 million years ago, before rising to 27% 30 million years ago and falling to 21% now (the current level seems to be on a downward slope).

(4) human beings are not unnatural and have always been changing the world. The world can adapt to changes, as it has done many times before in the history of this planet, which has included long periods with much higher temperatures than are forecast for global warming even under the most pessimistic conditions. There was a period when do-gooders tried to stop forest fires: they extinguished all the fires, and gradually the amount of dead wood and underbrush increased until the forest had become a massive bonfire waiting to be ignited. Eventually a fire started which couldn’t be extinguished, and the forest was destroyed completely, not just to the superficial extent (surface charring of bark) that fires usually caused. Then they realized that the policy of trying to stop fires in the forest had been an error. Interfering with global warming may seem just as “obvious” as trying to stop fires in a forest. But are we sure that such interference is the right thing to do? Could the money be better spent on defenses against sea level rises and extreme weather? Global temperature naturally varies, so it is not even clear exactly what value you are trying to change the global mean temperature to. Never mind; ignorant politicians don’t care about these “technical details”, just about being seen to address a problem by flushing trillions of taxpayers’ money down the drain so that more people will vote for them.

“Ever since writing my TV shows in the Eighties I have been talking to students, teachers and the general public and enthusing about the amazing possibilities for science and technology in the future. But over 30 years I have seen a terrible change in science education. Role models such as Dalton, Faraday and Curie are hardly ever mentioned … Kids are introduced to science as something that is life-threatening and deprived of exploration … They are being brainwashed into believing that science and technology is crippling the Earth and our future when exactly the opposite is true. Science education has been turned upside down by worry merchants and it is already costing us dearly in a widespread lack of understanding – it is ignorance that breeds fear … If we scrapped completely the foolhardy and scientifically unsound chase to reduce carbon, while still aiming for greater efficiency in energy usage, we would have all the money needed to bring the Third World out of poverty, save millions of lives year on year, and create a fairer and far more balanced world …”

– Johnny Ball, “It’s Not the End of the World”, Daily Express, 21 December 2009, p. 13.

The lies of Al Gore’s Oscar winning film, An Inconvenient Truth

1. Gore, who lost the 2000 Presidential election to Bush, claims in An Inconvenient Truth that the injury to his child by a car converted him into a genuine environmentalist. But after he won his Oscar for An Inconvenient Truth, the media revealed that Gore’s household consumed 221,000 kilowatt hours of energy in 2006, which is over 20 times the American average. So Gore was proved to be a traditional “Do as I say, not as I do” lying politician, not an honest environmentalist.

2. Gore falsely claims that the only solution to global carbon dioxide increases is to reduce emissions, which is a lie, for it neglects the fact that proper sea wall defenses in Holland today permit much of the country to operate safely while being 15 feet below sea level! Gore also ignores other countermeasures such as growing crops further north as the earth warms, and instead just lies that the only solution is to reduce emissions.

3. Gore with political expediency avoids the nuclear solution to global warming explained right back in 1958 by Edward Teller and Albert L. Latter in their book Our Nuclear Future: Facts, Dangers, and Opportunities (Criterion Books, New York, 1958), page 167:

‘If we continue to consume [fossil] fuel at an increasing rate, however, it appears probable that the carbon dioxide content of the atmosphere will become high enough to raise the average temperature of the earth by a few degrees. If this were to happen, the ice caps would melt and the general level of the oceans would rise. Coastal cities like New York and Seattle might be inundated. Thus the industrial revolution using ordinary chemical fuel could be forced to end … However, it might still be possible to use nuclear fuel.’

4. Gore lies that sea levels could rise by 20 feet due to global warming causing the Antarctic ice sheet to melt. The report of the Intergovernmental Panel on Climate Change (which probably overestimated the effect greatly) predicted a rise of just over 1 foot by 2100.

5. Gore claims of temperature rise: “in recent years, it is uninterrupted and it is intensifying.” Actually, the “effective temperature” for tree growth (which includes cloud cover effects on sunlight), as measured by tree rings, has been declining, and this has been deliberately covered up by the fraudulent “scientists” assembling the Intergovernmental Panel on Climate Change data, who have had to resort to data manipulation tricks to “hide the decline”.

6. Gore lies by including Hurricane Katrina and its devastation of New Orleans in 2005 as a global warming phenomenon: the effects of the hurricane were the random result of its happening to strike a highly populated coast with poor defenses, and actually imply that better sea defenses are needed for such cities, because cutting CO2 emissions can’t stop hurricanes any more than Gore’s lying hot air can!

7. The film’s images of the abandoned ships on the dried-up bed of the Aral Sea are a massive irrelevancy for global warming, because it is very well known that the Soviet Union actually caused the Aral Sea to dry up by diverting the rivers which fed that sea! The Aral Sea did not dry up due to global warming!

8. Gore claims global warming threats are all real because a peer-reviewed review paper of 928 peer-reviewed articles found that none disagreed with global warming. Professor Feynman warned that such peer-reviewed pseudoscience claims about authority and consensus are political rubbish of no consequence to the natural world around us, and are hence anti-science in their very nature:

“You must here distinguish – especially in teaching – the science from the forms or procedures that are sometimes used in developing science. … great religions are dissipated by following form without remembering the direct content of the teaching of the great leaders. In the same way, it is possible to follow form and call it science, but that is pseudo-science. In this way, we all suffer from the kind of tyranny we have today in the many institutions that have come under the influence of pseudoscientific advisers. … We have many studies in teaching, for example, in which people make observations, make lists, do statistics, and so on … They are merely an imitative form of science … The result of this pseudoscientific imitation is to produce experts, which many of you are. …. As a matter of fact, I can also define science another way: Science is the belief in the ignorance of experts.”

Science is the belief in the ignorance of expert opinion, of political consensus. Science is the rejection of everything except factual evidence. The object of science is not to achieve harmony or consensus but, on the contrary, to find the facts no matter whether the facts agree with expert opinions and expert prejudices, or not!

In case anyone doesn’t grasp Feynman’s point that statistics alone don’t prove causes, remember the example from How to Lie With Statistics of the Dutch researcher who proved a definite correlation between the number of babies in families and the number of storks’ nests on the roofs of their homes! This didn’t prove that storks were the cause, delivering babies as in traditional mythology! There was a simple alternative reason: the bigger families tended to buy larger, older houses, which naturally tended to have more storks’ nests on their roofs because they were both bigger and older!

“It is difficult trying to make this decision from the statistics alone.

“An example of how this might occur is something that was presented by George Bernard Shaw … Statistics were presented to him to show that as immunization increased, various communicable diseases decreased in England. He hired somebody to count up the telegraph poles erected in various years … and it turned out that telegraph poles were being increased in number. He said, ‘Therefore, this is clear evidence that the way to eliminate communicable diseases is to build a lot more telegraph poles’.

“All I would like to say here is that the important point is that if you really want to understand it, you have to look at the mechanism of the occurrence. I think this is where the emphasis should lie.”
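The storks-and-babies correlation discussed above is easy to reproduce as a toy simulation, in which a hidden confounder (house size/age) drives both variables with no direct causal link between them. All the numbers here are invented purely for illustration:

```python
# Toy demonstration that a hidden confounder produces a strong correlation
# between two causally unrelated variables. Pure standard library.
import random

random.seed(1)

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5

# The confounder: house size/age, varying across 500 households.
house_size = [random.uniform(1, 10) for _ in range(500)]
# Bigger families buy bigger houses; bigger/older roofs host more nests.
babies = [0.8 * s + random.gauss(0, 1) for s in house_size]
stork_nests = [0.5 * s + random.gauss(0, 1) for s in house_size]

r = pearson(babies, stork_nests)
print(round(r, 2))  # strongly positive, despite no causal link at all
```

A naive significance test on `r` would “prove” storks deliver babies; only knowledge of the mechanism (the shared dependence on house size) dissolves the illusion, which is exactly Feynman’s point.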

“It’s very clear that he can’t possibly have the slightest clue about physics, geology, and energy flows on the Earth. It’s sad that many politicians lack the basic science education. …

“Don’t get me wrong, I am no foe of geothermal energy. But it currently produces about 0.3% of the global energy demand. Only near the tectonic plate boundaries, the installation is relatively doable today. That’s why geothermal power plants may thrive in Iceland but not in the bulk of Europe or America.

“There’s surely some room for expansion of this source of energy but it doesn’t seem realistic to expect that geothermal energy will replace the fossil fuels in the bulk of their current applications.

“If you want to have a sensible idea about the amount of geothermal energy we can get by sensible tools, it’s excellent to imagine the ‘hot water bubbling up at some places’ (usually in combination with lots of fart-y gases such as methane, ammonia, and hydrogen sulfide, besides innocent carbon dioxide) – exactly the right idea that Al Gore doesn’t like because it cools the irrational hype (or downright lies) surrounding the alternative sources of energy.”

Dr Lubos Motl on global warming, a “fast” comment of his on the blog post:

“some glaciers are advancing, some glaciers are receding. The number of those that were receding in the last 30 or 100 years was almost certainly larger than the number of those that were advancing, and this fact is surely correlated with the fact that the global mean temperature increased during the last 30 or 100 years.

“The higher temperature, the more likely it is for a glacier to recede. That’s because water takes on the form of ice at lower temperature than the liquid form. Statistical statements about the glaciers are surely correlated with the global temperature, the fate of every individual glacier is surely a local effect.

“More importantly, glaciers on Earth have been either receding or advancing roughly for 4 billion years. Every hour, the clouds change their location and shape. Every day, the glaciers melt and freeze. Every year, seasons change. Every 5 years, El Nino dynamics changes the phase. Every 30 years, PDO is doing the same thing. Every 200 years or so, the Sun changes its activity. And so on.

“The Earth is always changing, it has always been changing, and while it would be convenient for a liberal sissy to have air-conditioning all over the Earth that keeps the temperature fixed, it’s not how Nature works.

“I am not saying that temperature or the volume of ice on Earth is not changing. Quite on the contrary, I am saying that it has always been a law of Nature that temperature is changing – at all conceivable timescales. What I am saying is that everyone who is surprised or worried that the temperature is changing, or who thinks that it is a sign from God telling us that the humanity is doing something wrong, or that the change proves that something is wrong with humans, or that any change means that there is something dangerous going on, is a breathtaking imbecile.

“Change is as omnipresent a fact about the Universe as the existence of time – they’re the very same thing, after all. The only thing that can be more or less “worrying” are the numbers – the trends. The graphs show which numbers are realistic and which numbers are de facto impossible physically.”

Epilogue on groupthink, or how Britain’s media backs the Prime Minister Gordon Brown, despite the fact that he has caused massive debt, spread propaganda lies on global warming, and was not even elected by the population to lead the country (there is nothing anyone can do about him, so everyone tries to make the best of the situation instead of facing the unpleasant facts, since there is no widely accepted alternative to him)

Above: a still from the video below showing Britain’s Prime Minister Gordon Brown smirking at the complaints in the European Parliament by Daniel Hannan MEP about the £100 billion of taxpayer debt Brown ran up as Chancellor, before the immense further increase in debt he recently caused through the government bail-out of failing banks like Northern Rock, whose failure he caused by deregulating them (which allowed the banks’ directors to take massive long-term risks, lending money to poor credit-risk debtors in order to maximise short-term profits and hence massive bonus payouts to themselves!).

In 2010, Britain’s government is on course to borrow £178 billion, with the total national debt forecast to double to £1,500 billion (£1,500,000 million) over the next five years. By 2015, each of the 62 million Britons will have an effective debt, due almost entirely to Gordon Brown’s legacy, of £24,000, so the working population (only a fraction of the total population, which includes kids and the retired) will face massive tax rises just to pay the interest on this £1,500,000,000,000! Don’t get me wrong, these hard facts aren’t me making a Party Political attack on Gordon Brown: the other parties are worse in other ways, and would probably have made other errors in office, but that does not excuse Gordon Brown’s legacy at all. He did a very bad job.

The fact that politicians in Britain are now generally scum was proved by their expenses scandal last year: they’re almost all bad. Groupthink means he will profit from what he does, just as Witten will escape justice for lying about string theory predicting gravity:

1. His multibillion pound ‘New Deal’ for the young unemployed has failed just as predictably (he made no effort to make it work; it was just a back-of-the-envelope media spin idea to waste money): in 2009 there were 850,000 young people who were ‘NEET’: Not in Employment, Education, or Training. What a failure!

2. His tax credits system has rewarded single mothers for having as many children by different men as they can, fuelling dependency, juvenile delinquency and family breakdown.

3. He blocked Frank Field’s attempts for welfare reform, allowing alcoholics and drug addicts to live on premium-rate incapacity benefit.

4. He threw tens of billions of pounds into the unreformed National Health Service where it was poured down the drain, while he was reducing the freedoms to be offered to foundation hospitals.

5. He frustrated Tony Blair’s plans to give more freedom to head teachers and more choice to parents while he was Chancellor.

6. He sank the country into debt to make the powerful Labour Party backers (the public sector unions and the left wingers) support his leadership ambitions. It was the personal greed of one man for power and glory as a leading statesman which sank Britain into crisis.

7. “Mr Brown’s expenses claim receipts, part of a batch of ministerial claims obtained by The Daily Telegraph, show that he paid his brother, a senior executive of EDF Energy, £6,577 over 26 months for cleaning services. Downing Street said that the brothers had shared a cleaner for a number of years.” – Philippe Naughton, ‘No 10 releases Gordon Brown’s cleaning contract’, From Times Online, May 8, 2009.

This proves how sinister Prime Minister Gordon Brown is: squandering vast sums of public expenses money from taxpayers on cleaning his flat after adding £100 billion to the British national debt as Chancellor, and far more recently as Prime Minister. Gordon Brown was previously Chancellor, in charge of the nation’s finances, and is responsible for the money-wasting and UK debt over the last 12 years or so (from 1997). During the global economic boom years of 1997-2008, he squandered taxpayers’ money on rubbish nobody wanted, like the Millennium Dome, and funded the squandering by borrowing money, adding £100 billion to the public debt. Recently newspapers have exposed that he personally has been claiming thousands of pounds in expenses for having a small flat cleaned. As Chancellor, a decade ago he deregulated the banks in the UK, enabling them to lend vast amounts to risky debtors and thus cause the recent banking crisis in the UK. Not only that, he sold off the UK’s gold reserves when gold was at its lowest value, just before the value of gold shot up, thus making a fantastic loss for the taxpayer. But he didn’t worry, because he doesn’t use his own elbow grease, let alone pay out of his own pocket, to have his flat cleaned. The taxpayer gets the bill, as always, for his incompetence and failure.

Above: this is the YouTube attack on him to his face in the European Parliament by Daniel Hannan MEP:

‘The truth, Prime Minister, is that you have run out of our money. The country as a whole is now in negative equity. Every British child is born owing around £20,000. Servicing the interest on that debt is going to cost more than educating the child. … it is true that we are all sailing together into the squall – but not every vessel in the convoy is in the same dilapidated condition. Other ships used the good years to caulk their hulls and clear up their rigging – in other words, to pay off debt – but you used the good years to raise borrowing yet further. As a consequence, under your captaincy, our hull is pressed deep into the water line, under the accumulated weight of your debt. We are now running a deficit that touches almost 10% of GDP – an unbelievable figure. More than Pakistan, more than Hungary – countries where the IMF has already been called in.

‘Now, it’s not that you’re not apologising – like everyone else, I’ve long accepted that you’re pathologically incapable of accepting responsibility for these things – it’s that you’re carrying on, wilfully worsening the situation, wantonly spending what little we have left. Last year, in the last twelve months, 125,000 private sector jobs have been lost – and yet you’ve created 30,000 public sector jobs. Prime Minister you cannot go on forever squeezing the productive bit of the economy in order to fund an unprecedented engorging of the unproductive bit.

‘You cannot spend your way out of recession or borrow your way out of debt. And when you repeat, in that wooden and perfunctory way, that our situation is better than others, that we’re well placed to weather the storm, I have to tell you, you sound like a Brezhnev-era apparatchik giving the party line. You know, and we know, and you know that we know that it’s nonsense. Everyone knows that Britain is the worst placed to go into these hard times. The IMF has said so. The European Commission has said so. The markets have said so, which is why our currency has devalued by 30% …’

(Or does it sound as “absurd” to you as Nazi pseudo-scientific evil seemed during the 30s when Hitler was lavishly praised by the British former Prime Minister David Lloyd George and visited twice by the then-Prime Minister Chamberlain? Are you sure that groupthink is always 100% correct and outsiders are always 100% wrong? There is no evidence to support that, but plenty to debunk it.)

Update (4 May 2010): Prime Minister Gordon Brown’s two-faced treatment of his supporter Gillian Duffy (who had asked him questions he felt uncomfortable with) was accidentally picked up by a radio microphone a few days ago, and broadcast. His first pathetic excuse for his double-faced treatment of her, after it was broadcast by Jeremy Vine, was the lie that he called her bigoted because (he claimed) he somehow ‘didn’t have a chance’ to answer her question due to the press around him (he did answer her, so that’s an obvious lie!); then he later claimed that he had misunderstood her, which is a change of story. Either he didn’t have a chance to answer her, or he misunderstood her. So which excuse is true? His perpetual changing of excuses made him the world’s worst liar. He didn’t even bother trying to defend himself by saying honestly that the conversation in his car should have been kept secret, as it would have looked really bad.

Dr Lowell Wood, EMP expert at Lawrence Livermore National Lab and the protégé of Edward Teller, argues that the conventional thinking about how to deal with global warming is wrong, and that instead of costly carbon-trading schemes and international treaties and political gridlock costing trillions of dollars, just $100 million a year (less than the cost of a good-size wind farm) will solve the problem: “volcano eruptions alter the climate for months by loading the skies with tiny particles that act as mini-reflectors, shading out sunlight and cooling the Earth. Why not apply the same principles to saving the Arctic? Getting the particles into the stratosphere wouldn’t be a problem — you could generate them easily enough by burning sulfur, then dumping the particles out of high-flying 747s, spraying them into the sky with long hoses or even shooting them up there with naval artillery. They’d be invisible to the naked eye, Wood argued, and harmless to the environment. Depending on the number of particles you injected, you could not only stabilize Greenland’s polar ice — you could actually grow it. Results would be quick: If you started spraying particles into the stratosphere tomorrow, you’d see changes in the ice within a few months. And if it worked over the Arctic, it would be simple enough to expand the program to encompass the rest of the planet. In effect, you could create a global thermostat, one that people could dial up or down to suit their needs (or the needs of polar bears).” – Jeff Goodell, http://www.chemtrailcentral.com/forum/msg103324.html

The U.K. House of Commons has 650 Member of Parliament “seats”, each covering a constituency of a different size, with a different number of voters. The two biggest political parties, Conservative and Labour, have a smaller average number of voters per constituency won than the third largest party, the Liberal Democrats (Lib Dems for short). In the 6 May 2010 national M.P.s election, 29.65 million votes were cast: 36% for the Conservatives, 29% for Labour, and 23% for the Lib Dems. (The rest of the votes were spoiled or distributed between 18 very small parties, the most successful of which – UKIP – received only 3.1% of the total number of votes cast.)

By contrast, the percentages of M.P.s elected were different, because the average size of constituencies (in terms of numbers of voters) was bigger for the Lib Dems than for Labour or the Conservatives: the Conservatives got 47% (306) of the seats (below the 50+% of seats required for a majority in the House of Commons, which is needed under the constitution to form a stable government), Labour got 40% (258), and the Lib Dems got only 8.8% (57 seats). If there were proper democracy, with the number of votes determining the representation in the parliament, then the Conservatives would have 235 seats, Labour would have 189 seats, and the Lib Dems would have 150 seats. Notice that the relative ratio of Conservative to Labour seats is practically unchanged; the Conservatives would still have a greater number of seats than Labour in almost the same ratio, because the average number of voters per constituency for Conservative and Labour is very similar, unlike the case for the Lib Dems. The Lib Dems have only 57 seats under the present undemocratic system, and would have 150 seats under proportional representation.
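The proportional-seat figures above can be checked with a few lines of arithmetic (a sketch only: the vote shares 36.1%, 29.0% and 23.0% are the approximate 2010 figures, and simple nearest-integer rounding is assumed rather than any official seat-allocation formula):

```python
# Sketch: allocate the 650 Commons seats in proportion to national vote
# share, and compare with the seats actually won in May 2010.
# The decimal vote shares are assumptions (approximate 2010 figures).

TOTAL_SEATS = 650

vote_share = {"Conservative": 0.361, "Labour": 0.290, "Lib Dem": 0.230}
actual_seats = {"Conservative": 306, "Labour": 258, "Lib Dem": 57}

for party, share in vote_share.items():
    # int(x + 0.5) rounds to the nearest whole seat
    proportional = int(share * TOTAL_SEATS + 0.5)
    print(f"{party}: {actual_seats[party]} actual vs {proportional} proportional")
```

Running this reproduces the 235 / 189 / 150 split quoted above, showing the Lib Dems nearly tripling their representation while the Conservative-to-Labour ratio barely moves.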

If there had been proportional representation, the government would be in the same position, with the Conservatives gaining the largest share of the vote and Labour gaining a dismal second place, but the Lib Dems would have a much better 3rd place position, with more say and in a better position to negotiate their terms in a coalition with the biggest winner to form a government. Now Prime Minister Gordon Brown is trying to sneakily send a negotiation team to the Lib Dems to promise immediate proportional representation if they enter a coalition with Labour, so that the second and third place losers can form a government to run Britain, leaving the party in first place (the Conservatives) by the wayside. Well done, Gordon. If you wanted that policy, why didn’t you implement it during the last 13 years you have been in the Cabinet or Prime Minister? Actually I’m not bothered if Gordon Brown remains as Prime Minister in a coalition of the parties which came in second and third place (and in fourth, fifth, sixth, seventh, etc., place; because he can’t form a majority government with the number of M.P.s in Labour and the Lib Dems combined, but would need to include many other parties in a really weak pact). A coalition of losers running the country is just what I expect from the whole way that pseudo-democracy masquerades as fairness today! It will be refreshing to see the usual “democracy” hype exposed in the open for the lie it really is and always has been in England. Gordon Brown is already the unelected Prime Minister. Let him become the national dictator, let’s adoringly call him “My Leader” (Mein Führer), and have done with all the idle hype about this country being run on the principles of fairness and democracy.
The scenes of voters having to queue for hours in many places in England, only to be refused their democratic right to vote at 10 PM when the doors were shut in their faces, prove Gordon Brown’s lie about this being a proper democracy where people fight for the right to vote and to have a say. It’s very refreshing to see it out in the open:

Update (16 May 2010): New Scientist issue on debunking mass delusions

Prime Minister Brown has gone back to Scotland, having resigned with his head held high and his self-pride intact and radiant.

Patient rebuttal. Exactly what the arrogant, egotistical, impatient scientists, engineers and politicians lack when dealing with propaganda claims, both from lobby groups and also from genuine critics. The arrogant, egotistical, and impatient prefer to ignore any criticisms instead of answering them, or they simply adopt the stance Prime Minister Gordon Brown took towards Gillian Duffy, taking the perceived path of least resistance when making decisions, then later trying to sweep concerns under the carpet and dismiss the critics as mere bigots, instead of undertaking the harder work of making the hard choices, listening to the concerns, doing the research to find out the facts, and patiently explaining and discussing the facts in detail:

My answer is this: let them be heard. Examine their evidence. Consider their interpretation. If they have anything of substance to say, then the truth will out.

What do you do, however, with people who, after their claim has been fully discussed and thoroughly debunked, continue to make the claim anyway? This, of course, is where scepticism morphs into denialism. Does there come a point when it is time to move on to other challenges? Sometimes there does.

Case in point: Holocaust denial. In the 1990s, a number of us engaged Holocaust deniers in debate and outlined in exhaustive detail the evidence for the Nazi genocide. It had no effect. They sailed on through into the 2000s making the same discredited arguments. At that point I threw up my hands and moved on to other challenges. By the late 2000s the Holocaust deniers had largely disappeared.

Throwing up your hands is not always an option, though. Holocaust denial has always been on the fringe, but other forms – notably creationism and climate denial – wield considerable influence and show no signs of going away. In such cases, eternal vigilance is the price we must pay for both freedom and truth. Those who are in possession of the facts have a duty to stand up to the deniers with a full-throated debunking repeated often and everywhere until they too go the way of the dinosaurs.

Those in possession of the facts have a duty to stand up to deniers with a full-throated debunking.

We should not, however, cover up, hide, suppress or, worst of all, use the state to quash someone else’s belief system. There are several good arguments for this:

1. They might be right and we would have just squashed a bit of truth.
2. They might be completely wrong, but in the process of examining their claims we discover the truth; we also discover how thinking can go wrong, and in the process improve our thinking skills.
3. In science, it is never possible to know the absolute truth about anything, and so we must always be on the alert for where our ideas need to change.
4. Being tolerant when you are in the believing majority means you have a greater chance of being tolerated when you are in the sceptical minority. Once censorship of ideas is established, it can work against you if and when you find yourself in the minority.

No matter what ideas the human mind generates, they must never be quashed. When evolutionists were in the minority in Tennessee in 1925, powerful fundamentalists were passing laws making it a crime to teach evolution, and the teacher John Scopes was put on trial. I cannot think of a better argument for tolerance and debate than his lawyer Clarence Darrow’s plea in the closing remarks of the trial.

“If today you can take a thing like evolution and make it a crime to teach it in the public schools, tomorrow you can make it a crime to teach it in the private schools, and next year you can make it a crime to teach it in the church. At the next session you can ban books and the newspapers. Ignorance and fanaticism are ever busy… After a while, your honour, it is the setting of man against man, creed against creed, until the flying banners and beating drums are marching backwards to the glorious ages of the 16th century when bigots lighted fagots to burn the man who dared to bring any intelligence and enlightenment and culture to the human mind.”

Hence, you can predict a 50% chance at any random time in history that the temperature will be rising, and a 50% chance that it will be falling. It’s always one or the other: it’s not normally constant. Moreover, the temperature has been almost continuously rising for the 18,000 years since the last ice age started to thaw, so for this period the expectancy of warming is higher than 50%. Over the past 18,000 years global warming has caused sea levels to rise 120 metres, a mean rise of 0.67 cm/year, with even higher rates of rise during part of this time. Over the century 1910-2010, sea levels have risen roughly linearly by a total of 20 cm, a mean rate of rise of 0.2 cm/year.
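The two sea-level rates quoted above follow from simple division, and the comparison is worth making explicit (a sketch using only the figures given in the text):

```python
# Check of the sea-level rates quoted above: the post-glacial rise
# averaged over 18,000 years versus the 1910-2010 rise.

postglacial_rise_cm = 120 * 100   # 120 metres expressed in cm
postglacial_years = 18_000
modern_rise_cm = 20               # total rise over 1910-2010
modern_years = 100

postglacial_rate = postglacial_rise_cm / postglacial_years  # cm/year
modern_rate = modern_rise_cm / modern_years                 # cm/year

print(f"post-glacial mean rate: {postglacial_rate:.2f} cm/year")  # ~0.67
print(f"1910-2010 mean rate:    {modern_rate:.2f} cm/year")       # 0.20
print(f"ratio: {postglacial_rate / modern_rate:.1f}x")
```

In other words, on these figures the long-run natural rate of sea-level rise since the last glacial maximum averages over three times the rate measured during the last century.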

This demonstrates beautifully some of the problems in trying to make people really listen and take you seriously if there is too much difference. Prime Minister Brown was determined not to admit that he had made costly mistakes. He persuaded himself that every decision he made was backed up by “solid” reasoning; for example, he was under pressure from the trade unions which prop up the Labour party to create the many unproductive but expensive state sector jobs at a time when the economy was shrinking and government tax revenues were falling, not increasing. To him, these decisions were inevitable and necessary. In fact, they were the height of folly, and a sign of his weakness and preference for the path of least resistance for his own party politics, rather than a sign of his strength to take the difficult choices in the national interest. Moreover, like antinuclear protestors, he surrounded himself with like-minded “media spin-doctors” to “educate” and “inform” the “ignorant”, a situation slightly analogous to one German Chancellor’s use of spin-doctor Dr Goebbels (propaganda minister) to “explain” the “morality” of racist eugenics and ethnic extermination. They rewarded the spin doctor Mandelson with a Lordship, despite claiming to be critical of the democratic basis of the House of Lords! Even when confronted with examples of his failure which he could not deny, like the sale of Britain’s gold reserves at the minimum value of gold, which cost Britain a massive £7 billion loss, he refused to accept responsibility and selectively tried to blame his predecessors, who were not responsible because they were no longer making the decisions:

MPS SLAM ‘SECRETIVE’ CLIMATEGATE PROBES
Labour MP Graham Stringer said Lord Oxburgh appeared to have a “conflict of interest”

Tuesday January 25, 2011

By John Ingham

TWO inquiries into claims that scientists manipulated data about global warming were yesterday condemned by MPs as ineffective and too secretive.

The row, which became known as Climategate, erupted in 2009 over allegations that researchers had deliberately strengthened evidence suggesting human activity was to blame for rising temperatures.

MPs on the Science and Technology Committee have now concluded that both probes into the scandal had failed to “fully investigate” claims that scientists had deleted embarrassing emails.

The investigations were set up after around 4,000 leaked emails and documents appeared to show that scientists at East Anglia University’s Climate Research Unit had manipulated data to strengthen the case for man-made global warming.

UEA’s Independent Climate Change Emails Review was led by Sir Muir Russell, while the Scientific Appraisal Panel was led by Lord Oxburgh.

But the MPs said they had “reservations” about both inquiries.

They criticised the brevity of the appraisal panel report, at “a mere five pages”, and said both investigations should have been more open to the public.

The committee also said the emails review “did not fully investigate the serious allegation” relating to the deletion of emails and instead relied on a verbal reassurance that the messages still exist.

Though the committee was split over the credibility of the inquiries, an amendment put forward by Labour MP Graham Stringer which said that they had not been independent was voted down by members.

He said Lord Oxburgh appeared to have a “conflict of interest” because of his links to green businesses while the Emails Review panel included a former Climate Research Unit scientist.

Institutionally biased to the Left, politically correct and with a rudderless leadership. This is Peter Sissons’ highly critical view of the BBC in his new memoirs, in which he describes his fascinating career over four decades as a television journalist. Here, in the latest part of our serialisation, he reveals how it was heresy at the BBC to question claims about climate change . . .

My time as a news and current affairs anchor at the BBC was characterised by weak leadership and poor direction from the top, but hand in hand with this went the steady growth of political correctness.

Indeed, it was almost certainly the Corporation’s unchallengeable PC culture that made strong leadership impossible.

Leadership — one person being in charge, trusting his or her own judgment, taking a decision and telling others what to do — was shied away from in favour of endless meetings of a dozen or more people trying to arrive at some sort of consensus. …

For me, though, the most worrying aspect of political correctness was over the story that recurred with increasing frequency during my last ten years at the BBC — global warming (or ‘climate change’, as it became known when temperatures appeared to level off or fall slightly after 1998).

From the beginning I was unhappy at how one-sided the BBC’s coverage of the issue was, and how much more complicated the climate system was than the over-simplified two-minute reports that were the stock-in-trade of the BBC’s environment correspondents.

These, without exception, accepted the UN’s assurance that ‘the science is settled’ and that human emissions of carbon dioxide threatened the world with catastrophic climate change. Environmental pressure groups could be guaranteed that their press releases, usually beginning with the words ‘scientists say . . . ’ would get on air unchallenged.

On one occasion, an MP used BBC airtime to link climate change doubters with perverts and holocaust deniers, and his famous interviewer didn’t bat an eyelid.

On another occasion, after the inauguration of Barack Obama as president in 2009, the science correspondent of Newsnight actually informed viewers ‘scientists calculate that he has just four years to save the world’. What she didn’t tell viewers was that only one alarmist scientist, NASA’s James Hansen, had said that.

My interest in climate change grew out of my concern for the failings of BBC journalism in reporting it. In my early and formative days at ITN, I learned that we have an obligation to report both sides of a story. It is not journalism if you don’t. It is close to propaganda.

The BBC’s editorial policy on climate change, however, was spelled out in a report by the BBC Trust — whose job is to oversee the workings of the BBC in the interests of the public — in 2007. This disclosed that the BBC had held ‘a high-level seminar with some of the best scientific experts and has come to the view that the weight of evidence no longer justifies equal space being given to the opponents of the consensus’.

The error here, of course, was that the BBC never at any stage gave equal space to the opponents of the consensus.

But the Trust continued its pretence that climate change dissenters had been, and still would be, heard on its airwaves. ‘Impartiality,’ it said, ‘always requires a breadth of view, for as long as minority opinions are coherently and honestly expressed, the BBC must give them appropriate space.’

In reality, the ‘appropriate space’ given to minority views on climate change was practically zero.

Moreover, we were allowed to know practically nothing about that top-level seminar mentioned by the BBC Trust at which such momentous conclusions were reached. Despite a Freedom of Information request, they wouldn’t even make the guest list public.

At the end of November 2007 I was on duty on News 24 when the UN panel on climate change produced a report which later turned out to contain significant inaccuracies, many stemming from its reliance on non-peer-reviewed sources and best-guesses by environmental activists.

But the way the BBC’s reporter treated the story was as if it was beyond a vestige of doubt, the last word on the catastrophe awaiting mankind. The most challenging questions addressed to a succession of UN employees and climate activists were ‘How urgent is it?’ and ‘How much danger are we in?’

Back in the studio I suggested that we line up one or two sceptics to react to the report, but received a totally negative response, as if I was some kind of lunatic. I went home and wrote a note to myself: ‘What happened to the journalism? The BBC has completely lost it.’

A damaging episode illustrating the BBC’s supine attitude came in 2008, when the BBC’s ‘environment analyst’, Roger Harrabin, wrote a piece on the BBC website reporting some work by the World Meteorological Organization that questioned whether global warming was going to continue at the rate projected by the UN panel.

A green activist, Jo Abbess, emailed him to complain. Harrabin at first resisted. Then she berated him: ‘It would be better if you did not quote the sceptics’ — something Harrabin had not actually done — ‘Please reserve the main BBC online channel for emerging truth. Otherwise I would have to conclude that you are insufficiently educated to be able to know when you have been psychologically manipulated.’

Did Harrabin tell her to get lost? He tweaked the story — albeit not as radically as she demanded — and emailed back: ‘Have a look and tell me you are happier.’

This exchange went round the world in no time, spread by a jubilant Abbess. Later, Harrabin defended himself, saying they were only minor changes — but the sense of the changes, as specifically sought by Ms Abbess, was plainly to harden the piece against the sceptics.

Many people wouldn’t call that minor, but Harrabin’s BBC bosses accepted his explanation.

The sense of entitlement with which green groups regard the BBC was brought home to me when what was billed as a major climate change rally was held in London on a miserable, wintry, wet day.

I was on duty on News 24 and it had been arranged for me to interview the leader of the Green Party, Caroline Lucas. She clearly expected, as do most environmental activists, what I call a ‘free hit’ — to be allowed to say her piece without challenge.

I began, good naturedly, by observing that the climate didn’t seem to be playing ball at the moment, and that we were having a particularly cold winter while carbon emissions were powering ahead.

Miss Lucas reacted as if I’d physically molested her. She was outraged. It was no job of the BBC — the BBC! — to ask questions like that. Didn’t I realise that there could be no argument over the science?

I persisted with a few simple observations of fact, such as there appeared to have been no warming for ten years, in contradiction of all the alarmist computer models.

A listener from one of the sceptical climate-change websites noted that ‘Lucas was virtually apoplectic and demanding to know how the BBC could be making such comments. Sissons came back that his role as a journalist was always to review all sides. Lucas finished with a veiled warning, to which Sissons replied with an “Ooh!”’

A week after this interview, I went into work and picked up my mail from my pigeon hole. Among the envelopes was a small Jiffy Bag, which I opened. It contained a substantial amount of faeces wrapped in several sheets of toilet paper.

At the time no other interviewers on the BBC — or indeed on ITV News or Channel Four News — had asked questions about climate change which didn’t start from the assumption that the science was settled.

… the story goes on with much more about the climate change cover-up, all terrible for the BBC liars; more can be found at http://www.dailymail.co.uk/news/article-1350206/BBC-propaganda-machine-climate-change-says-Peter-Sissons.html

CONCLUSIONS

We’ve been in a warming period for 18,000 years. The whole Holocene, during which humanity thrived, has been a period of climatic change.

Much of the Sahara desert was a tropical paradise a few thousand years ago, and it wasn’t destroyed by humanity. The last ice age is still receding. What the natural climate change deniers insist, by lying, is that the climate is a delicate equilibrium, critically controlled by CO2 levels. In fact, plain old water vapour, H2O, is a bigger greenhouse gas, with a larger relative influence.

“Studies show that water vapor feedback roughly doubles the amount of warming caused by CO2. So if there is a 1°C change caused by CO2, the water vapor will cause the temperature to go up another 1°C. When other feedback loops are included, the total warming from a potential 1°C change caused by CO2 is, in reality, as much as 3°C.”

This is what the IPCC computer models say, and is precisely why they’re wrong. NASA contractor scientist Dr Ferenc Miskolczi of AS&M Inc resigned on 1 January 2006 with a protest letter about being censored, stating:

“Unfortunately my working relationship with my NASA supervisors eroded to a level that I am not able to tolerate. My idea of the freedom of science can not coexist with the recent NASA practice of handling new climatic change related scientific results. … I presented to NASA a new view of greenhouse theory and pointed out serious errors in the classical approach of assessment of climate sensitivity to greenhouse gas perturbations. Since then my results were not released for publication.”

NASA effectively banned its publication through the peer-reviewed literature, just as it had used groupthink fear to censor out the effects of low temperatures on making the rubber Challenger O-rings brittle, so they leaked during a cold morning launch, causing the 1986 space shuttle explosion. Their computer model claimed a 1/100,000 chance of failure. In fact, as Feynman explained, the chance of failure was up to a thousand times greater (working engineers put it as high as 1 in 100), and was being covered up by obfuscating NASA flowcharts which simply neglected the biggest source of failure: rubber seals in the boosters going brittle at low temperatures in the early morning for a January launch.
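Feynman’s reductio of the official odds (in Appendix F of the Rogers Commission report) is easy to reproduce; the arithmetic below uses the 1-in-100,000 management figure and the roughly 1-in-100 working-engineer estimate he reported:

```python
# Feynman's point about NASA management's 1-in-100,000 failure figure:
# at that rate you could launch a shuttle every single day for centuries
# and expect only one loss, which is absurd for a vehicle this complex.

management_odds = 100_000   # management: 1 failure per 100,000 flights
engineer_odds = 100         # engineers' estimate: roughly 1 in 100

years_of_daily_launches = management_odds / 365
print(f"~{years_of_daily_launches:.0f} years of daily launches per "
      f"expected failure, at management's figure")  # ~274 years

print(f"Engineers' estimate is {management_odds // engineer_odds}x "
      f"more pessimistic than management's")
```

The gap of three orders of magnitude between the two estimates is exactly the groupthink cover-up being described: the flowcharts reported the optimistic number and ignored the engineers’ one.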

If they had been less into groupthink consensus, they could have prevented Challenger exploding by banning its launch when the temperature was too cold, which is precisely what Feynman found the engineers actually checking the rubber O-rings wanted. However, the launch director and the bigwig advisers, who thought they knew better when in fact they knew nothing about the physics, effectively conspired to blow the shuttle up by allowing it to be launched under conditions where the rubber was brittle, so the booster joint leaked hot gas when the boosters vibrated during launch, igniting the fuel tank and blowing the whole thing up.

This was the big cover-up that Feynman famously exposed with a cup of iced water and a rubber O-ring during a televised hearing, as part of the Rogers Commission report into the disaster, which famous “expert” NASA astronaut Neil Armstrong and all his friends failed to spot: http://en.wikipedia.org/wiki/Rogers_Commission_Report . One could suggest that all these experts don’t know, don’t care, or even conspire to try to cause disasters, but that’s pushing it too far: their horrible habit of conspiring together to f*** everything up isn’t a clever plan, but a pathetic, unintended “accident”. They screw things up because they’re arrogant and ignorant.

Basically, Dr Ferenc Miskolczi’s life as a NASA climate research scientist was made hell because he discovered that the extra water vapour being evaporated does not produce a positive feedback (amplifying the CO2 warming effect by absorbing more of the infrared radiation emitted from the Earth’s surface); instead it goes into increased cloud cover, which reflects incoming sunlight back into space. So it has a negative-feedback effect, not a positive-feedback effect. NASA’s climate computer models thus contain not merely a quantitative error in the effect of H2O on climate, but a qualitative one: they have a plus sign where the sign is really negative.

“Since the Earth’s atmosphere is not lacking in greenhouse gases [water vapor], if the system could have increased its surface temperature it would have done so long before our emissions. It need not have waited for us to add CO2: another greenhouse gas, H2O, was already to hand in practically unlimited reservoirs in the oceans.” – Dr. Miklos Zagoni.

“Never fight a battle on terrain of the enemy’s choosing”, advises Sun Tzu in the Art of War. In which case I definitely shouldn’t be on the BBC Any Questions panel tonight in Wrexham, North Wales.

Perhaps this audience will be different. Perhaps – as Douglas Murray was once astonished to find when he went to Leamington Spa – it will be full of achingly sound right-wingers who cheer to the rafters any call for smaller government, lower taxes, and less pussyfooting around with Islamist extremism. But I doubt it. Conservatives, libertarians, classical liberals and freedom-lovers generally have better things to do on a Friday evening than sit in a church hall listening to a panel of MPs and hacks bang on about politics. That’s much more of a left-liberal-ecoloon obsession.

And this isn’t just an Any Questions problem but a BBC problem generally. Every time I’m asked to appear on a BBC programme – be it Radio 4’s Today or Woman’s Hour, or Radio 2’s Jeremy Vine Show, or a documentary like that Horizon stitch-up, or BBC2’s The Daily Politics – I always ask myself the same question: “What is the bloody point?”

It’s not for the money, certainly. BBC fees are very modest and are poor compensation for the time, stress and disruption. (They no longer give you First Class train tickets for Any Questions: sad, because it’s the only occasion in my life I ever got to travel First Class.)

Nor is it for the thrill of being on radio and TV. (Been there, done that: I don’t even bother these days to ring up all my relatives to tell them I’m on).

Nor even, I don’t think, is it for “the Brand.” Not in my case. Not unless “the Brand” I’m trying to promote is “James Delingpole: the right wing **** you just lurve to hate!!!!”, but what would be the use in that? My target audience is the type of person who might seriously want to buy my books, not the sort who wouldn’t read anything I’d written even if you forced them at gunpoint.

So why do I do it? Simple. Because the left-wing broadcast media – and I include the similarly left-wing, similarly publicly funded Channel 4 – is pretty much the only broadcast media we have in this country. Sure, we have Sky, but Sky is way, way to the left of its US equivalent, Fox News. Then we have ITV, which doesn’t really do politics. And Channel 5, which hardly anybody watches.

So, basically, if you want to be on TV and get a political view across, your choices are to appear either on a channel which is grotesquely, hideously, institutionally left (BBC, Channel 4) or really quite surprisingly left given that it’s owned by the same people who own Fox (Sky). In other words, Hobson’s Choice.

This leaves you with only two options: do these programmes when you’re asked to appear on them; or don’t do them. I still haven’t made up my mind what the sensible course is.

To some, like Richard North, you should say “No” on the Sun Tzu principle outlined above.

To others, like Douglas Murray – who doesn’t half enjoy a ruck – you should say “Yes” on the principle that for evil to prevail all that is necessary is that good men do nothing.

I see sense in both these arguments, but reflecting on my experiences of the last few months, I’m beginning to think that North’s cynicism is better founded than Murray’s optimism.

This isn’t to say you can’t go on the BBC and occasionally score points.

On Today, the other morning, I was able to mount a deliciously satisfying assault on BBC1’s achingly right-on new director of programmes and his silly plan to replace all middle-class comedy with echt, grimy working-class comedy.

On the Daily Politics, I was able to get Green MP Caroline Lucas to admit (proudly, she claimed) on camera that she was a watermelon (green on the outside, red on the inside) and to expose the fascistic bent of her party’s manifesto.

But these were essentially lucky breaks in a hostile environment. I only got away with the Today attack because the working-class comic who’d been recruited to disagree with me violently ended up taking my side, agreeing that BBC comedy was way too PC and that the BBC’s only comedy commissioning criterion should be “Is it funny?”. And if the Lucas attack was a victory, it was only a Pyrrhic one. The programme had been framed in such a way – nice Caroline Lucas sits beamingly in a chair and talks caringly about the importance of recycling – that the moment I attacked Lucas, I cast myself as the nasty, aggressive, uncaring person being horrid to a nice, middle-aged lady for no obvious reason. Lucas played this role to the hilt: if she didn’t, no one would be mad enough to vote for her.

But these tiny, partial victories are the exception rather than the rule. Generally, when a right-leaning person goes on the BBC, his job is to act as the token nutter who must then be shafted.

This was my job, for example, on Horizon the other week which you can read more about here (in Barry Woods’s guest post at Watts Up With That) and also here in The Spectator.

And I think I’ve had enough of this. I think this Any Questions in Wrexham tonight may be the last I choose to do. I appear on these programmes because I believe the libertarian and conservative points I try to articulate are ones which need to be heard and simply aren’t made often enough. But it’s precisely because I do it for the cause and not for the self-publicity, the money or the self-aggrandisement that I am now starting seriously to wonder whether it’s worth it. Surely, no one in his right mind wishes to go into battle with one arm tied behind his back – which is essentially what conservatives/libertarians do, every time they appear on the BBC. Surely the sensible thing to do in the Culture Wars is only to fight battles when the fight is fair and you know you can win. Otherwise, you’re not really helping your cause when you heroically, stupidly lay your neck on the line yet again – but doing it a disservice.

“Phoebe: Oh, okay, don’t get me started on gravity.
“Ross: You uh, you don’t believe in gravity?
“Phoebe: Well, it’s not so much that, you know, like I don’t believe in it, you know, it’s just … I don’t know, lately I get the feeling that I’m not so much being pulled down as I am being pushed.”

“… reality must take precedence over public relations, for nature cannot be fooled.”

“If it exists, the graviton must be massless (because the gravitational force has unlimited range) and must have a spin of 2 (because the source of gravity is the stress-energy tensor, which is a second-rank tensor, compared to electromagnetism, the source of which is the four-current, which is a first-rank tensor). To prove the existence of the graviton, physicists must be able to link the particle to the curvature of the space-time continuum and calculate the gravitational force exerted.” – False claim, Wikipedia.

Previous posts explaining why general relativity requires spin-1 gravitons, and rejects spin-2 gravitons, are linked here, here, here, and here. But let’s take the false claim that gravitons must be spin-2 because the stress-energy tensor is rank-2. A rank-1 tensor is a first-order (once-differentiated, e.g. da/db) differential summation, such as the divergence operator (the sum of field gradients) or the curl operator (the sum of all the differences in gradients between field gradients for each pair of mutually orthogonal directions in space). A rank-2 tensor is some defined summation over second-order (twice-differentiated, e.g. d2a/db2) differential equations. The field equation of general relativity has a different structure from Maxwell’s field equations for electromagnetism: as the Wikipedia quotation above states, Maxwell’s equations of classical electromagnetism are vector calculus (rank-1 tensors, or first-order differential equations), while the tensors of general relativity are second-order differential equations, rank-2 tensors.

The lie, however, is that this is physically deep. It’s not. It’s purely a choice of how to express the fields conveniently. For simple electromagnetic fields, where there is no contraction of mass-energy by the field itself, you can do it easily with first-order equations, i.e. gradients. These equations calculate fields as first-order (rank-1) gradients, e.g. electric field strength, which is the gradient of potential with distance, measured in volts/metre. Maxwell’s equations don’t directly represent accelerations (second-order, rank-2 equations would be needed for that). For gravitational fields, you have to work with accelerations, because the gravitational field contracts the source of the gravitational field itself, so gravitation is more complicated than electromagnetism.

The people who promote the lie that because rank-1 tensors apply to spin-1 field quanta in electromagnetism, rank-2 tensors must imply spin-2 gravitons, offer no evidence for this assertion. It’s arm-waving lying. It’s true that you need rank-2 tensors in general relativity, but it is not necessary in principle to use rank-1 tensors in electromagnetism: it’s merely easiest to use the simplest mathematical method available. You could in principle use rank-2 tensors to rebuild electromagnetism, by fitting the equations to observable accelerations instead of unobservable rank-1 electric and magnetic fields. Nobody has ever seen an electric field: only accelerations and forces caused by charges. (Likewise for magnetic fields.)

There is no physical correlation between the rank of the tensor and the spin of the gauge boson. It’s a purely historical accident that rank-1 tensors (vector calculus, first-order differential equations) are used to model fictitious electric and magnetic “fields”. We don’t directly observe electric field lines or electric charges (nobody has seen the charged core of an electron; what we see are the effects of forces and accelerations, which can merely be described in terms of field lines and charges). We observe accelerations and forces. The field lines and charges are not directly observed. The mathematical framework relating the source of a field to the end result depends on how the end result is defined. In Maxwell’s equations, the end result of an electric charge which is not moving relative to the observer is a first-order field, defined in volts/metre. If you convert this first-order differential field into an observable effect, like force or acceleration, you get a second-order differential equation, acceleration a = d2x/dt2. General relativity doesn’t describe gravity in terms of a first-order field like Maxwell’s equations do, but instead describes gravitation in terms of a second-order observable, i.e. the acceleration produced by space curvature, a = d2x/dt2.
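The point that the rank-1 “field” is an optional intermediary between charge and observable acceleration can be checked symbolically. This is a minimal sympy sketch, not from the original post: the symbols k, Q, q, m, r are generic illustrative constants (Coulomb constant, source charge, test charge, test mass, separation), and both routes yield the same observable.

```python
import sympy as sp

k, Q, q, m, r = sp.symbols('k Q q m r', positive=True)

# Rank-1 route: Coulomb potential -> field gradient in volts/metre -> acceleration
phi = k * Q / r              # potential of a point charge Q (k = Coulomb constant)
E = -sp.diff(phi, r)         # electric "field strength", the first-order gradient
a_via_field = q * E / m      # observable acceleration of a test charge q, mass m

# Direct route: skip the field intermediary and write the observable straight away
a_direct = k * q * Q / (m * r**2)

# Both descriptions give the identical observable acceleration
assert sp.simplify(a_via_field - a_direct) == 0
```

Whether one keeps the intermediate first-order field E or writes the second-order acceleration directly is, as argued above, a matter of bookkeeping, not physics.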

So the distinction between rank-1 and rank-2 tensors in electromagnetism and general relativity is not physically deep: it’s a matter of human decisions on how to represent electromagnetism and gravitation.

In Maxwell’s equations we choose to represent not second-order accelerations, but Michael Faraday’s imaginary concept of a pictorial field of radiating, curving “field lines”, expressed as first-order field gradients and curls. In Einstein’s general relativity, by contrast, we don’t represent gravity by such a half-baked, unobservable field concept, but in terms of directly observable accelerations.

Dissimilarities in the tensor ranks used to describe two different fields originate from dissimilarities in the field definitions for those two fields, not from the spin of the field quanta. Any gauge field whose field is written as a second-order differential equation, e.g. an acceleration, can be classically approximated by a rank-2 tensor equation. Comparing Maxwell’s equations, in which fields are expressed in terms of first-order gradients like electric fields (volts/metre), with general relativity, in which fields are accelerations or curvatures, is comparing chalk and cheese. They are not just in different units; they serve different purposes. For a summary of textbook treatments of curvature tensors, see Dr Kevin Aylward’s General Relativity for Teletubbys: “the fundamental point of the Riemann tensor [the Ricci curvature tensor in the field equation of general relativity is simply a cut-down, rank-2 version of the Riemann tensor: the Ricci curvature tensor, Rab = Rxaxb, where Rxaxb is the Riemann tensor], as far as general relativity is concerned, is that it describes the acceleration of geodesics with respect to one another. … I am led to believe that many people don’t have a … clue what’s going on, although they can apply the formulas in a sleepwalking sense. … The Riemann curvature tensor is what tells one what that acceleration between the [particles] will be. This is expressed by

[Beware of errors in the physical understanding on some of these general relativity internet sites, however. E.g., some suggest – following a popular 1950s book on relativity – that the inverse-square law is discredited by general relativity, because the relativistic motion of Mercury around the sun can be approximated within Newton’s framework by slightly increasing the power in the inverse-square law, so that the force falls off as 1/R2+x instead of 1/R2, where x is a small fraction, making the force appear to get stronger nearer the sun. This is fictitious and is just an approximation to roughly accommodate relativistic effects that Newton ignored, e.g. the small increase in planetary mass due to its higher velocity when the planet is nearer the sun on part of its elliptical orbit than when it is moving more slowly far from the sun. This isn’t a physically correct model; it’s just a back-of-the-envelope fudge. A physically correct version of planetary motion in the Newtonian framework would keep the geometric inverse-square law and would then correctly modify the force by making the right changes for the relativistic mass variation with velocity. Ptolemy’s epicycles demonstrated the danger of constructing approximate mathematical models which have no physical validity, but which then become fashionable.]”
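To put a rough number on how small that relativistic mass variation is for Mercury, here is a back-of-the-envelope check in Python. The perihelion and aphelion speeds are approximate published values assumed for illustration, not figures from the original post.

```python
import math

c = 2.998e8              # speed of light, m/s
v_perihelion = 59.0e3    # Mercury's approximate orbital speed at perihelion, m/s
v_aphelion = 38.9e3      # Mercury's approximate orbital speed at aphelion, m/s

def gamma(v):
    """Lorentz factor; relativistic mass is gamma times the rest mass."""
    return 1.0 / math.sqrt(1.0 - (v / c) ** 2)

# Fractional mass increase over rest mass at each extreme of the elliptical orbit
dm_peri = gamma(v_perihelion) - 1.0
dm_aph = gamma(v_aphelion) - 1.0

print(dm_peri, dm_aph)   # both of order 1e-8: a tiny, velocity-dependent correction
```

The planet is of order 10 parts per billion "heavier" at perihelion than at aphelion, which is why the effect can be mimicked, crudely, by a small tweak to the inverse-square exponent.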

Maxwell’s theory, based on Faraday’s field-lines concept, employs only rank-1 equations; for example, the divergence of the electric field strength, E, is directly proportional to the charge density, q (the charge per unit volume): div.E ~ q. The reason this is a rank-1 equation is simply that the divergence operator is the sum of the gradients of the operand in the three perpendicular directions of space. All it says is that a unit charge contributes a fixed number of diverging radial lines of electric field, so the total field is directly proportional to the total charge.
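The “line-counting” content of div.E ~ q can be checked symbolically: away from the charge the divergence of an inverse-square radial field is exactly zero, i.e. no field lines start or stop in empty space. This is a minimal sympy sketch (a unit point charge, with the constant of proportionality set to 1 for illustration):

```python
import sympy as sp
from sympy.vector import CoordSys3D, divergence

N = CoordSys3D('N')
r2 = N.x**2 + N.y**2 + N.z**2   # squared distance from the point charge at the origin

# Inverse-square radial field E = r_vec / r^3 of a unit charge
E = (N.x*N.i + N.y*N.j + N.z*N.k) / r2**sp.Rational(3, 2)

# div.E = 0 everywhere except at the charge itself: the rank-1 equation
# just counts the "lines" diverging from the charge
assert sp.simplify(divergence(E)) == 0
```

The divergence is non-zero only where the charge sits, which is the whole content of the rank-1 (first-order) field equation.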

But this is just Faraday’s way of visualizing the way the electric force operates! Remember that nobody has yet seen or reported detecting an “electric field line” of force! With our electric meters, iron filings, and compasses we only see the results of forces and accelerations, so the number and locations of electric or magnetic field lines depicted in textbook diagrams is due to purely arbitrary conventions. It’s merely an abstract aetherial legacy from the Faraday-Maxwell era, not a physical reality that has any experimental evidence behind it. If you are going to confuse Faraday’s and Maxwell’s imaginary concept of field “lines” with experimentally defensible reality, you might as well write down an equation in which the invisible intermediary between charge and force is an angel, a UFO, a fairy or an elephant in an imaginary extra dimension. Quantum field theory tells us that there are no physical lines. Instead of Maxwell’s “physical lines of force”, we have known since QED was verified that there are field quanta being exchanged between charges.

So if we get rid of our ad hoc prejudices, getting rid of “electric field strength, E” in volts/metre and expressing the result of the electric force purely in terms of what we can actually measure, namely accelerations and forces, we’d have a rank-2 tensor: basically the same field equation as is used in general relativity for gravity. The only differences would be the factor of ~1040 difference between the field strengths of electromagnetism and gravity, the differences in the signs of the curvatures (like charges repel in electromagnetism, but attract in gravity), and the absence of the contraction term which makes the gravitational field contract the source of the field, but which supposedly does not exist in electromagnetism. The tensor rank will be 2 in both cases, thus disproving the arm-waving yet popular idea that the rank number is correlated with the spin of the field quanta.

In other words, the electric field could be modelled by a rank-2 equation if we simply make the electric field consistent with the gravitational field by expressing both fields in terms of accelerations, instead of using the gradient of the Faraday-legacy volts/metre “field strength” for the electric field. This is, however, beyond the understanding of the mainstream, who are deluded by fashion and historical ad hoc conventions. Most of the problems in understanding quantum field theory and unifying Standard Model fields with gravitational fields result from the legacy of the field definitions used in Maxwellian and Yang-Mills fields, which for purely ad hoc historical reasons are different from the field definition in general relativity. If all fields are expressed in the same way, as accelerative curvatures, all field equations become rank-2 and all rank-1 divergences automatically disappear, since they are merely a historical legacy of the Faraday-Maxwell volts/metre field “line” concept, which isn’t consistent with the concept of acceleration due to curvature in general relativity!

However, we’re not advocating the use of any particular differential equations for any quantum fields, because discontinuous quantized fields can’t in principle be correctly modelled by differential equations, which is why you can’t properly represent the source of gravity in general relativity as being a set of discontinuities (particles) in space to predict curvature, but must instead use a physically false averaged distribution such as a “perfect fluid” to represent the source of the field. The rank-2 framework of general relativity has relatively few easily obtainable solutions compared to the simpler rank-1 (vector calculus) framework of electrodynamics. But both classical fields are false in ignoring the random field quanta responsible for quantum chaos (see, for instance, the discussion of first-quantization versus second-quantization in the previous post here, here and here).

Summary:

1. The electric field was defined by Michael Faraday as simply a gradient of potential, measured in volts/metre, which Maxwell correctly models with a first-order differential equation, leading to a rank-1 tensor equation (vector calculus). Hence, electromagnetism with spin-1 field quanta has a rank-1 tensor purely because of the way it is formulated. Nobody has ever seen Faraday’s electric field, only accelerations/forces. There is no physical basis for electromagnetism being intrinsically rank-1; it’s just one way to mathematically model it, by describing it in terms of Faraday’s rank-1 fields rather than the directly observable rank-2 accelerations and forces which we see/feel.

2. The gravitational field has historically never been expressed in terms of a Faraday-type rank-1 field gradient. Due to Newton, who was less pictorial than Faraday, gravity has always been described and modelled directly in terms of the end result, i.e. accelerations/forces we see/feel.

This difference between the human formulations of the electromagnetic and gravitational “fields” is the sole reason why the former is currently expressed with a rank-1 tensor and the latter with a rank-2 tensor. If Newton, rather than aether crackpots like Maxwell, had worked on electromagnetism, we would undoubtedly have a rank-2 mathematical model of electromagnetism, in which electric fields are expressed not in volts/metre but directly as rank-2 accelerations (curvatures), just like general relativity.

Both electromagnetism and gravitation should define fields the same way, with rank-2 curvatures. The discrepancy that electromagnetism instead uses rank-1 tensors is due to the inconsistency that in electromagnetism fields are defined not in terms of curvatures (accelerations) but in terms of Faraday’s imaginary abstraction of field lines. This has nothing whatsoever to do with particle spin. Rank-1 tensors are used in Maxwell’s equations because the electromagnetic fields are defined (inconsistently with gravity) in terms of rank-1 unobservable field gradients, whereas rank-2 tensors are used in general relativity purely because the field in general relativity is defined as an acceleration, which requires a rank-2 tensor to describe it. The difference is purely down to the way the field is described, not the spin of the field quanta.

The real reason why gravitons supposedly “must” be spin-2 is due to the mainstream investment of energy and time in worthless string theory, which is designed to permit the existence of spin-2 gravitons. We know this because whenever the errors in spin-2 gravitons are pointed out, they are ignored. These stringy people aren’t interested in physics, just grandiose fashionable speculations, which is the story of Ptolemy’s epicycles, Maxwell’s aether, Kelvin’s vortex atom, Piltdown Man, S-matrices, UFOs, Marxism, fascism, etc. All were very fashionable with bigots in their day, but:

“… reality must take precedence over public relations, for nature cannot be fooled.” – Feynman’s Appendix F to Rogers’ Commission Report into the Challenger space shuttle explosion of 1986.

‘Some physicists speculate that dark energy could be a repulsive gravitational force that only acts over large scales. “There is precedent for such behaviour in a fundamental force,” Wesson says. “The strong nuclear force is attractive at some distances and repulsive at others.”’

This possibility was ignored by Pauli and Fierz when they first proposed that the quantum of gravitation has spin-2.

(2) gives a push that appears as LeSage “attraction” for small nearby masses, which only have weak mutual graviton exchange due to their small gravitational charges, and therefore on balance get pushed together by the much larger graviton pressure due to implosive focussing of gravitons converging inwards from the exchange with immense, distant masses (the galaxy clusters isotropically distributed across the sky).

It is usually by applying facts and laws to new situations that progress is made in science. If you stick to applying known laws to situations they have already been applied to, you’ll be less likely to observe something new than if you try applying them to a situation which nobody has ever applied them to before. We should apply Newton’s laws to the accelerating cosmos and then focus on the immense forces and what they tell us about graviton exchange.

Above: The mainstream 2-dimensional ‘rubber sheet’ interpretation of general relativity says that mass-energy ‘indents’ spacetime, which responds like a mattress with two large heavy balls placed on it: the mattress distorts more between the balls (where the distortions add up) than on the opposite sides, so the balls are pushed together. ‘Matter tells space how to curve, and space tells matter how to move’ (Professor John A. Wheeler). This illustrates how the mainstream (albeit arm-waving) explanation of general relativity is actually a theory that gravity is produced by space-time distorting to physically push objects together, not to pull them! (When this is pointed out to mainstream crackpot physicists, they naturally freak out and become angry, saying it is just a pointless analogy. But when the checkable predictions of the mechanism are explained, they may perform their always-entertaining “hear no evil, see no evil, speak no evil” act.)

Above: LeSage’s own illustration of quantum gravity in 1758. Like Lamarck’s evolution theory of 1809 (in which characteristics acquired during life are somehow supposed to be passed on genetically, rather than Darwin’s evolution in which genetic change occurs through the inability of inferior individuals to pass on their genes), LeSage’s theory was full of errors and is still derided today. The basic concept – that mass is composed of fundamental particles, with gravity due to a quantum field of gravitons exchanged between these fundamental particles of mass – is now a frontier of quantum field theory research. What is interesting is that quantum gravity theorists today don’t use the arguments once used to “debunk” LeSage: they don’t argue that quantum gravity is impossible because gravitons in the vacuum would “slow down the planets by causing drag”. They recognise that gravitons are not real particles: they don’t obey the energy-momentum relationship or mass shell that applies to the particles of, say, a gas or other fluid. Gravitons are thus off-shell or “virtual” radiations, which cause accelerative forces but don’t cause the continuous gas-type drag or heating that occurs when objects move rapidly through a real fluid. While quantum gravity theorists realize that particle (graviton) mediated gravity is possible, LeSage’s mechanism of quantum gravity is still as derided today as Lamarck’s theory of evolution. Another analogy is the succession from Aristarchus of Samos, who first proposed the solar system in 250 B.C. against the mainstream earth-centred universe, to Copernicus’ inaccurate solar system (circular orbits and epicycles) of 1500 A.D., and then to Kepler’s elliptical-orbit solar system of 1609 A.D. Is there any point in insisting that Aristarchus was the original discoverer of the theory, when he failed to come up with a detailed, convincing and accurate theory?
Similarly, Darwin rather than Lamarck is credited with the theory of evolution, because he made the theory useful and thus scientific.

If someone fails to come up with a detailed, accurate and convincing theory, and merely gets the basic idea right without being able to prove it against the mainstream fashions and groupthink, then the history of science shows that the person is not credited with a big discovery: science is not merely guesswork. Maxwell based his completion of the theory of classical electrodynamics upon an ethereal displacement current of virtual charges in the vacuum, in order to correct Ampere’s law for the case of open circuits such as capacitors using the permittivity of free space (a vacuum) as the dielectric. Maxwell believed, by analogy to the moving ions in a fluid during electrolysis, that current appears to flow through the vacuum between capacitor plates while the capacitor charges and discharges; although in fact the real current just spreads along the plates, and electromagnetic induction (rather than ethereal vacuum currents) produces the current on the opposite plate.

Maxwell nevertheless suggested (in an Encyclopedia Britannica article) an experiment to test whether light is carried at an absolute velocity by a mechanical spacetime fabric. After the Michelson-Morley experiment was done in 1887 to test Maxwell’s conjecture, it was clear that no absolute motion was detectable: suggesting (1) that motion appears relative, not absolute, and (2) that light always appears to go at the same velocity in the vacuum. In 1889, FitzGerald published an explanation of these “relativity” results in Science: he argued that the physical vacuum contracted moving masses like the Michelson-Morley experiment, by analogy to the contraction of anything moving in a fluid due to the force from the head-on fluid pressure (wind drag, or hydrodynamic resistance). This fluid-space based explanation predicted quantitatively the relativistic contraction law, and Lorentz showed that since mass depends inversely on the classical radius of the electron, it predicted a mass increase with velocity. Given the equivalence of space and time via the velocity of light, Lorentz showed that the contraction predicted time-dilation due to motion.

Above: In Science in 1889, FitzGerald used the Michelson-Morley result to argue that objects moving at velocity v must contract in length in the direction of their motion by the factor (1 – v2/c2)1/2, so that there is no difference in the travel times of light moving along the two perpendicular paths. Groupthink crackpots claim that if the lengths of the arms of the instrument are different, FitzGerald’s argument for absolute motion is destroyed, since the travel times still cancel out. Actually, the arms of the Michelson-Morley instrument can never be the same length to within the accuracy of the relative times implied by interference fringes! The instrument does not measure the absolute times taken in two different directions: it merely determines whether there is a change in the relative times (which are always slightly different, since the arms can’t be machined to perfectly identical lengths) when the instrument is rotated by 90 degrees. Another groupthink crackpot argument is that although the FitzGerald theory predicts relativity from length contraction in an absolute-motion universe, other special relativity results like time dilation, mass increase, and E = mc2 can only be obtained from Einstein. Actually, all were obtained by Lorentz and Poincare: Lorentz showed that the evidence for space-time from electromagnetism implies that apparent time dilates like distance when a clock moves, while he argued that since the classical electromagnetic electron radius is inversely proportional to its mass, its mass should thus increase with velocity by a factor equal to the reciprocal of the FitzGerald contraction factor. Likewise, a force F = d(mv)/dt acting on a body moving distance dx imparts kinetic energy dE = F.dx = d(mv).dx/dt = d(mv)v = v.d(mv) = v2dm + mvdv. Comparison of this purely Newtonian result with the derivative of Lorentz’s relativistic mass increase formula mv = m0(1 – v2/c2)-1/2 gives us dm = dE/c2, or E = mc2.
(See, for example, Dr Glasstone’s Sourcebook on Atomic Energy, 3rd ed., 1967.)
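That last algebraic step – comparing the Newtonian increment dE = v2dm + mvdv with the derivative of Lorentz’s mass formula – is easy to verify symbolically. A minimal sympy sketch of the same comparison:

```python
import sympy as sp

v, c, m0 = sp.symbols('v c m0', positive=True)

# Lorentz's relativistic mass increase: m = m0 (1 - v^2/c^2)^(-1/2)
m = m0 / sp.sqrt(1 - v**2 / c**2)

# Newtonian kinetic energy increment dE = v^2 dm + m v dv, so
# dE/dv = v^2 (dm/dv) + m v
dE_dv = v**2 * sp.diff(m, v) + m * v

# The claim dm = dE/c^2 means dE/dv must equal c^2 (dm/dv)
assert sp.simplify(dE_dv - c**2 * sp.diff(m, v)) == 0
```

The identity holds exactly, so the energy gained by the body is c2 times its mass gain, which is the dm = dE/c2 (E = mc2) result quoted above.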

“Many condensed matter systems are such that their collective excitations at low energies can be described by fields satisfying equations of motion formally indistinguishable from those of relativistic field theory. The finite speed of propagation of the disturbances in the effective fields (in the simplest models, the speed of sound) plays here the role of the speed of light in fundamental physics. However, these apparently relativistic fields are immersed in an external Newtonian world (the condensed matter system itself and the laboratory can be considered Newtonian, since all the velocities involved are much smaller than the velocity of light) which provides a privileged coordinate system and therefore seems to destroy the possibility of having a perfectly defined relativistic emergent world. In this essay we ask ourselves the following question: In a homogeneous condensed matter medium, is there a way for internal observers, dealing exclusively with the low-energy collective phenomena, to detect their state of uniform motion with respect to the medium? By proposing a thought experiment based on the construction of a Michelson-Morley interferometer made of quasi-particles, we show that a real Lorentz-FitzGerald contraction takes place, so that internal observers are unable to find out anything about their ‘absolute’ state of motion. Therefore, we also show that an effective but perfectly defined relativistic world can emerge in a fishbowl world situated inside a Newtonian (laboratory) system. This leads us to reflect on the various levels of description in physics, in particular regarding the quest towards a theory of quantum gravity. …

“… Remarkably, all of relativity (at least, all of special relativity) could be taught as an effective theory by using only Newtonian language. …In a way, the model we are discussing here could be seen as a variant of the old ether model. At the end of the 19th century, the ether assumption was so entrenched in the physical community that, even in the light of the null result of the Michelson-Morley experiment, nobody thought immediately about discarding it. Until the acceptance of special relativity, the best candidate to explain this null result was the Lorentz-FitzGerald contraction hypothesis. … we consider our model of a relativistic world in a fishbowl, itself immersed in a Newtonian external world, as a source of reflection, as a Gedankenmodel. By no means are we suggesting that there is a world beyond our relativistic world describable in all its facets in Newtonian terms. Coming back to the contraction hypothesis of Lorentz and FitzGerald, it is generally considered to be ad hoc. However, this might have more to do with the caution of the authors, who themselves presented it as a hypothesis, than with the naturalness or not of the assumption. … The ether theory had not been disproved, it merely became superfluous. Einstein realised that the knowledge of the elementary interactions of matter was not advanced enough to make any claim about the relation between the constitution of matter (the ‘molecular forces’), and a deeper layer of description (the ‘ether’) with certainty. Thus his formulation of special relativity was an advance within the given context, precisely because it avoided making any claim about the fundamental structure of matter, and limited itself to an effective macroscopic description.”

In 1905, Einstein took the two implications of the Michelson-Morley research (that motion appears relative not absolute, and that the observed velocity of light in the vacuum is always constant) and used them as postulates to derive the FitzGerald-Lorentz transformation and Poincare mass-energy equivalence. Einstein’s analysis was preferred by Machian philosophers because it was purely mathematical and did not seek to explain the principle of relativity and constancy of the velocity of light in the vacuum by invoking a physical contraction of instruments. Einstein postulated relativity; FitzGerald explained it. Both predicted a similar contraction quantitatively. Similarly, Newton’s theory of gravitation is the combination of Galileo’s principle that dropped masses all accelerate at the same rate due to the constancy of the Earth’s mass, with Kepler’s laws of planetary motion. Newton postulated his universal gravitational law based on this evidence plus the guess that the gravitational force is directly proportional to the mass producing it, and he checked it by the Moon’s centripetal acceleration; LeSage tried to explain what Newton had postulated and checked.

The previous post links to Peter Woit’s earlier article about string theorist Erik Verlinde’s arXiv preprint On the Origin of Gravity and the Laws of Newton, which claims: “Gravity is explained as an entropic force caused by changes in the information associated with the positions of material bodies.” String theorist Verlinde derives Newton’s laws and other results using only “high-school mathematics” (which brings contempt from mathematical physicist Woit, probably one of the areas of agreement he has with string theorist Jacques Distler), i.e. no tensors, and he derives the Newtonian weak field approximation for gravity, not the relativistic Einsteinian gravity law which also includes contraction. This contraction is physically real but small for weak gravitational fields and non-relativistic velocities: Feynman famously calculated in his published Lectures on Physics that the contraction term in Einstein’s field equation contracts the Earth’s radius by MG/(3c^2) = 1.5 mm. Consider two ways to predict contraction using Einstein’s equivalence principle.

Secondly, Feynman’s way. A more physically intuitive explanation of the modification of Newton’s gravitational law implied by Einstein’s field equation of general relativity is to examine Feynman’s curvature result: space-time is non-Euclidean in the sense that the gravitational field contracts the Earth’s radius by (1/3)MG/c^2, or about 1.5 mm. This is unaccompanied by a transverse contraction, i.e. the Earth’s circumference is unaffected. To mathematically keep “Pi” a constant, therefore, you need to invoke an extra dimension, so that the n – 1 = 3 spatial dimensions we experience are, in string theory terminology, a (mem)brane on an n = 4 dimensional bulk of spacetime. Similarly, if you draw a 2-dimensional circle upon the interior surface of a sphere, you will obtain Pi from the circle only by drawing a straight line through the 3-d bulk of the volume (i.e. a line that does not follow the 2-dimensional curved surface or “brane” of the sphere upon which the circle is supposed to exist). If you measure the diameter upon the curved surface, it will be different, so Pi will appear to vary.
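Feynman’s 1.5 mm figure is straightforward to reproduce numerically; a quick sketch (the constants below are standard handbook values, not taken from the text):

```python
# Feynman's "excess radius" for the Earth: MG/(3c^2).
G = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
M = 5.972e24    # Earth's mass, kg
c = 2.998e8     # speed of light in vacuum, m/s

contraction = M * G / (3 * c**2)  # radial contraction in metres
print(f"{contraction * 1000:.2f} mm")  # about 1.5 mm
```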

A simple physical mechanism of Feynman’s (1/3)MG/c^2 excess radius for symmetric, spherical mass M is that the gravitational field quanta compress a mass radially when being exchanged with distant masses in the universe: the exchange of gravitons pushes against masses. By Einstein’s principle of the equivalence of inertial and gravitational mass, the cause of this excess radius is exactly the same as the cause of the FitzGerald-Lorentz contraction of moving bodies in the direction of their motion, first suggested in Science in 1889 by FitzGerald. FitzGerald explained the apparent constancy of the velocity of light regardless of the relative motion of the observer (indicated by the null-result of the Michelson-Morley experiment of 1887) as the physical effect of the gravitational field. In the fluid analogy to the gravitational field, if you accelerate an underwater submarine, there is a head-on pressure from the inertial resistance of the water which it is colliding with, which causes it to contract slightly in the direction it is going in. This head-on or “dynamic” pressure is equal to half the product of the density of the water and the square of the velocity of the submarine. In addition to this “dynamic” pressure, there is a “static” water pressure acting in all directions, which compresses the submarine slightly in all directions, even if the submarine is not moving. In this analogy, the FitzGerald-Lorentz contraction is the “dynamic” pressure effect of the graviton field, while the Feynman excess radius or radial contraction of masses is the “static” pressure effect of the graviton field. Einstein’s special relativity postulates (1) relativity of motion and (2) constancy of c, and derives the FitzGerald-Lorentz transformation and mass-energy equivalence from these postulates; by contrast, FitzGerald and Lorentz sought to physically explain the mechanism of relativity by postulating contraction. To contrast this difference:

(1) Einstein: postulated relativity and produced contraction.
(2) Lorentz and FitzGerald: postulated contraction to produce “apparent” observed Michelson-Morley relativity as just an instrument contraction effect within an absolute motion universe.
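Returning to the submarine analogy above: the “dynamic” pressure is half the fluid density times the velocity squared, while the “static” pressure acts omnidirectionally. A toy numerical comparison (the density, speed and depth below are purely illustrative assumptions of mine, not figures from the text):

```python
# Toy comparison of "dynamic" vs "static" pressure in the submarine
# analogy; all numbers are illustrative, not from the text.
rho = 1025.0    # seawater density, kg/m^3
v = 10.0        # submarine speed, m/s
g = 9.81        # gravitational acceleration, m/s^2
depth = 100.0   # depth, m

# Head-on "dynamic" pressure resisting the moving submarine:
dynamic_pressure = 0.5 * rho * v**2
# Omnidirectional "static" pressure squeezing it even at rest:
static_pressure = rho * g * depth
```

Even at 10 m/s the dynamic pressure (about 51 kPa) is far smaller than the static pressure at 100 m depth (about 1 MPa), mirroring the claim that the two effects are distinct in kind and scale.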

These two relativistic contractions, the contraction of relativistically moving inertial masses and the contraction of radial space around a gravitating mass, are simply related under Einstein’s principle of the equivalence of inertial and gravitational masses, since Einstein’s other equivalence (that between mass and energy) then applies to both inertial and gravitational masses. In other words, the equivalence of inertial and gravitational mass implies an effective energy equivalence for each of these masses. The FitzGerald-Lorentz contraction factor [1 – (v/c)^2]^(1/2) contains velocity v, which comes from the kinetic energy of the moving object. By analogy, when we consider a mass m at rest in a gravitational field from another much larger mass M (like a person standing on the Earth), it has acquired gravitational potential energy E = mMG/R, equivalent to a kinetic energy of E = (1/2)mv^2, so by Einstein’s equivalence principle of inertial and gravitational field energy it can be considered to have an “effective” velocity of v = (2GM/R)^(1/2). Inserting this velocity into the FitzGerald-Lorentz contraction factor [1 – (v/c)^2]^(1/2) gives [1 – 2GM/(Rc^2)]^(1/2) which, when expanded by the binomial expansion to the first couple of terms as a good approximation, yields 1 – GM/(Rc^2). This result assumes that all of the contraction occurs in one spatial dimension only, which is true for the FitzGerald-Lorentz contraction (where a moving mass is only contracted in the direction of motion, not in the two other spatial dimensions it has), but is not true for radial gravitational contraction around a static spherical, uniform mass, which operates equally in all 3 spatial dimensions. Therefore, the contraction in any one of the three dimensions is by the factor 1 – (1/3)GM/(Rc^2). Hence, when gravitational contraction is included, radius R becomes R[1 – (1/3)GM/(Rc^2)] = R – GM/(3c^2), which is the result Feynman produced in his Lectures on Physics from Einstein’s field equation.
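The binomial expansion step and the factor-of-3 division above can be checked symbolically; a short sketch with sympy (sympy being my choice of tool):

```python
import sympy as sp

x = sp.Symbol('x', positive=True)  # x stands for the small quantity GM/(R c^2)

# Binomial expansion of the contraction factor [1 - 2x]^(1/2):
factor = sp.sqrt(1 - 2 * x)
first_order = factor.series(x, 0, 2).removeO()
assert sp.simplify(first_order - (1 - x)) == 0  # 1 - x to first order, as claimed

# Sharing the contraction equally over the 3 spatial dimensions:
G, M, R, c = sp.symbols('G M R c', positive=True)
R_contracted = R * (1 - G * M / (3 * R * c**2))
assert sp.simplify(R_contracted - (R - G * M / (3 * c**2))) == 0  # Feynman's R - GM/(3c^2)
```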

The point we’re making here is that general relativity isn’t mysterious unless you want to ignore the physical effects due to energy conservation and associated contraction, which produce its departures from Newtonian physics. Physically understanding the mechanism for how general relativity differs from Newtonian physics therefore immediately takes you to the facts of how the quantum gravitational field physically distorts static and moving masses, leading to checkable predictions which you cannot make with general relativity alone. It is therefore helpful if you want to understand physically how quantum gravity must operate in order to be consistent with general relativity within its domain of validity. Obviously general relativity breaks down outside that domain, which is why we need quantum gravity, but within the limits of validity for that classical domain, both theories are consistent. The reason why quantum gravity of the LeSage sort needs to be fully reconciled with general relativity in this way is that one objection to LeSage was by Laplace, who ignored the gravitational and motion contraction mechanisms of quantum gravity for relativity (Laplace was writing long before FitzGerald and Einstein) and tried to use this ignorance to debunk LeSage by arguing that orbital aberration would occur in LeSage’s model due to the finite speed of the gravitons. This objection does not apply to general relativity due to the contractions incorporated into the general relativity theory by Einstein: similarly, Laplace’s objection does not apply to quantum gravity, which inherently includes the contractions as physical results of quantum gravity upon moving masses.

In the past, however, FitzGerald’s physical contraction of moving masses, as a miring by fluid pressure, has been controversial in physics, and Einstein tried to dispose of the fluid. The problem with the fluid was investigated by critics of Fatio and LeSage, who had promoted a shadowing theory of gravity, whereby masses get pushed together by mutually shielding one another from the pressure of the fluid in space. These critics included some of the greatest classical physicists the world has ever known: Newton (Fatio’s friend), Maxwell and Kelvin. Feynman also reviewed the major objection, drag, to the fluid in his broadcast lectures on the Character of Physical Law. The criticism of the fluid is that the force it needs to exert to produce gravity would classically be expected to cause fast-moving objects in the vacuum

(1) to heat up until they glow red hot or ablate at immense temperature,

(2) to slow down and (in the case of planets) thus spiral into the sun,

(3) while the fluid would diffuse in all directions and on large distance scales fill in the “shadows” like a gas, preventing the shadowing mechanism from working (this doesn’t apply to gravitons exchanged between masses, for although they will take all possible paths in a path integral, the resultant, effective graviton motion for force delivery will be along the path of least action, due to the cancellation of the amplitudes of paths which interfere off the path of least action: see Feynman’s 1985 book QED),

(4) the mechanism would not give a force proportional to mass if the fundamental particles have a large gravitational interaction cross-sectional area, which would mean that in a big mass some of the shadows would “overlap” one another, so the net force of gravity from a big mass would be less than directly proportional to the mass, i.e. it would increase not in simple proportion to M but instead statistically in proportion to: 1 – e^(-bM), where b is a gravity cross-section and geometry-dependent coefficient, which allows for the probability of overlap. This 1 – e^(-bM) formula has two asymptotic limits:

(a) for small masses and small cross-sections, bM is much smaller than 1, so: e^(-bM) ~ 1 – bM, so 1 – e^(-bM) ~ bM. I.e., for small masses and small cross-sections, the theory agrees with observations (there is no significant overlap).

(b) for larger masses and large cross-sections, bM might be much larger than 1, so e^(-bM) ~ 0, giving 1 – e^(-bM) ~ 1. I.e., for large masses and large cross-sections, the overlap of shadows would prevent any increase in the mass of a body from increasing the resultant gravitational force: once gravitons are stopped, they can’t be stopped again by another mass.

(5) the LeSage mechanism suggested that the gravitons which cause gravity would be slowed down by the energy loss when imparting a push to a mass, so that they would not be travelling at the velocity of light, contrary to what is known about the velocity of gravitational fields. However, this objection is false: it arises from the real (rather than virtual, “off-shell”) radiation that LeSage assumed. The radiation goes at light velocity and merely shifts in frequency due to energy loss. For static situations, where no acceleration is produced (e.g. an apple hanging stationary on a tree), the graviton exchange results in no energy change; it’s a perfectly elastic scattering interaction. No energy is lost from the gravitons, and no kinetic energy is gained by the apple. Where the apple is accelerated, the kinetic energy it gains is that lost by the shift to lower energy (longer wavelength) of the “reflected” or scattered gravitons. Notice that Dr Lubos Motl has objected to me by falsely claiming that virtual particles don’t appear to have wavelengths; on the contrary, the empirically confirmed Casimir effect is due to the inability of virtual photons with wavelengths longer than the distance between two metal plates to exist and produce pressure between the plates (so the plates are pushed together by the complete spectrum of virtual photon wavelengths in the vacuum surrounding the plates, which is stronger than the cut-off spectrum between the plates). Like the reflection of light by a mirror, the process consists of the absorption of a particle followed by the emission of a new particle.
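The overlap formula in objection (4) does have the two asymptotic limits claimed; a quick numerical check (the values of b and M below are arbitrary illustrations):

```python
import math

def shadow_force_factor(b, M):
    """Net gravity factor with shadow overlap: 1 - exp(-b*M)."""
    return 1 - math.exp(-b * M)

# (a) small bM: factor ~ bM, i.e. force simply proportional to mass
b, M = 1e-10, 1e6  # bM = 1e-4, much smaller than 1
assert abs(shadow_force_factor(b, M) - b * M) / (b * M) < 1e-3

# (b) large bM: factor saturates at 1, so extra mass adds no extra force
assert shadow_force_factor(1.0, 100.0) > 0.999999
```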

However, quantum field theory, which has been very precisely tested for electrodynamics, resurrects a quantum fluid or field in space which consists of gauge boson radiation, i.e. virtual (off-shell) radiation which carries “borrowed” or off-mass shell energy, not real energy. It doesn’t obey the relationship between energy and momentum that applies to real radiation. This is why the radiation can exert pressure without causing objects to heat up or to slow down: they merely accelerate or distort instead.

The virtual radiation is not like a regular fluid. It carries potential energy that can be used to accelerate and contract objects, but it cannot directly heat them or cause continuous drag to non-accelerating objects by carrying away their momentum in a series of impacts in the way that gas or water molecules cause continuous drag on non-accelerating objects. There is only resistance to accelerations (i.e., inertia and momentum) because of these limitations on the energy exchanges possible with the virtual (off-shell) radiations in the vacuum.

“Gerard ‘t Hooft expresses pleasure at seeing a string theorist talking about “real physical concepts like mass and force, not just fancy abstract mathematics”. According to the article, the problem with Einstein’s General Relativity is that its “laws are only mathematical descriptions.” I guess a precise mathematical expression of a theory is somehow undesirable, much better to have a vague description in English about how it’s all due to some mysterious entropy.”

So Dr Woit has finally flipped, giving up on precise mathematical expressions and coming round to the “much better” vague and mysterious ideas of the mainstream string theorists. Well, I think that’s sad, but I suppose it can’t be helped. Newton in 1692 scribbled in his own printed copy of his Principia that Fatio’s 1690 gravity mechanism was “the unique hypothesis by which gravity can be explained”, although Newton did not publish any statement of his interest in the gravitational mechanism (just as he kept his alchemical and religious studies secret).


“No-one is suggesting the existing mathematical models should be abandoned. The point being made is that the entropic approach may give us some physical insight into those mathematical models.”

This is a valid point: finding a way to make predictions with quantum gravity doesn’t mean “abandoning” general relativity, but supplementing it by giving additional physical insight and making quantitative, falsifiable predictions. Although Professor Halton Arp (of the Max-Planck Institut fuer Astrophysik) promotes heretical quasar redshift objections to the big bang which are false, he does make one important theoretical point in his paper The observational impetus for Le Sage Gravity:

‘The first insight came when I realized that the Friedmann solution of 1922 was based on the assumption that the masses of elementary particles were always and forever constant, m = const. He had made an approximation in a differential equation and then solved it. This is an error in mathematical procedure. What Narlikar had done was solve the equations for m= f(x,t). This a more general solution [to general relativity], what Tom Phipps calls a covering theory. Then if it is decided from observations that m can be set constant (e.g. locally) the solution can be used for this special case. What the Friedmann, and following Big Bang evangelists did, was succumb to the typical conceit of humans that the whole of the universe was just like themselves.’

The remainder of his paper is speculative, non-falsifiable or simply wrong, and Arp is totally wrong in dismissing the big bang since his quasar “evidence” has empirically been shown to be completely bogus, while it has also been shown that the redshift evidence definitely does require expansion, since other “explanations” fail. But Arp is right in arguing that the Friedmann et al. solutions to general relativity for cosmological models are all based on the implicit assumption that the source of gravity is not an “emergent” effect of the motion of masses in the surrounding universe. The Lambda-CDM model based on general relativity is typical of the problem, since it can be fitted in ad hoc fashion to virtually any kind of universe by adjusting the values of the dark energy and dark matter parameters to force the theory to fit the observations from cosmology (the opposite of science, which is to make falsifiable predictions and then to check those predictions). That’s a religion based on groupthink politics, not facts.

“But there’s problems, too. There ought to be “air resistance” from the particles as the planets move through space. Then there’s the fact that the force is proportional to surface area hit by the particles, not to the mass. This can be remedied by assuming a tiny interaction cross-section due to the particles, but if this is true they must be moving very fast indeed to produce the required force – many times the speed of light. And in that case the heating due to the “air resistance” of the particles would be impossibly high. Furthermore, if the particle shadows of two planets overlapped, the sun’s gravity on the farther planet should be shielded. No such effect has been observed.

“For these and other reasons Fatio’s theory had to be rejected as unworkable.”

Wikipedia is a bit unreliable on this subject: Fatio assumed on-shell (“real”) particles, not a quantum field of off-shell virtual gauge bosons. If the radiation were real, the exchange of gravitons between masses in the universe would cause the heating, drag, etc., regardless of spin. So the objections would also dismiss spin-2 “attractive” gravitons, since they too would have to be everywhere in the universe between masses, just like Fatio’s particles. But in fact the objections don’t apply to gauge boson radiations, since they’re off-shell. Fatio didn’t know about relativity or quantum field theory.

Thanks anyway, your post is pretty funny and could be spoofed by writing a fictitious attack on “evolution” by ignoring Darwin’s work and just pointing out errors in Lamarck’s theory of evolution (which was wrong)…

“This can be remedied by assuming a tiny interaction cross-section due to the particles, but if this is true they must be moving very fast indeed to produce the required force – many times the speed of light.”

Or just increasing the flux of spin-1 gravitons when you decrease the cross-section …

Pauli’s role in predicting the neutrino by applying energy conservation to beta decay (against Bohr, who falsely claimed that the energy conservation anomaly in beta decay was proof that indeterminacy applies to energy conservation, violating energy conservation to explain the beta decay anomaly without predicting the neutrino to take away energy!), and in declaring Heisenberg’s vacuous (unpredictive) unified field theory to be “not even wrong”, is well known, thanks to Peter Woit.

There is a nice anecdote about Markus Fierz, Pauli’s collaborator in the spin-2 theory of gravitons, given by Freeman Dyson on p. 15 of his 2008 book The Scientist as Rebel:

“Many years ago, when I was in Zürich, I went to see the play The Physicists by the Swiss playwright Friedrich Dürrenmatt. The characters in the play are grotesque caricatures … The action takes place in a lunatic asylum where the physicists are patients. In the first act they entertain themselves by murdering their nurses, and in the second act they are revealed to be secret agents in the pay of rival intelligence services. … I complained about the unreality of the characters to my friend Markus Fierz, a well-known Swiss physicist, who came with me to the play. ‘But don’t you see?’ said Fierz. ‘The whole point of the play is to show us how we look to the rest of the human race’.”