Links

30 September, 2015

It is hard to disagree with the concept of the organisation of a Society for the benefit of its people – the idea of Service. Just as it is similarly obvious that such an organisation should be under the control of the populace – the concept of Democracy.

So, historically, it has been necessary for those in power to in some way subscribe to both of these principles (or at least pretend that they do).

Let us look, critically, at some significant examples.

The fabled properties of Democracy – both as the will of the people, and as their overall control – are, of course, total myths in current so-called “Democratic States”. The evidence for these assertions can be found everywhere, and shows exactly whose wants and needs are serviced by this lauded system of rule. Of course, it is where both wealth and power reside that has to be addressed; yet, in a somewhat distorted way, the nearest thing to what is desired is delivered, if only marginally, by Local Democracy, where known and accessible representatives do things that immediately affect people – that is, in local or District Councils of various types. For one aspect of these organisations did take things out of the control of the oligarchs, and it was in the services owned and run by these elected Local Authorities.

Of course, the composition of these Councils would represent the area which elects them, and in affluent areas the local authority would see its task as serving that constituency and its occupants. Whereas in a working class city, the majority would have very different priorities, and these would be, sometimes at least, evident in the actions of their elected Council.

But, that is Democracy, and a comparison of how such different Councils see their priorities is very interesting and informative, and always distorted by misleading comparisons such as in efficiency and expenditure priorities.

Clearly a prosperous area would not need to allocate large resources to support the poor, nor would they have any sympathy for those Councils that did. They would compare expenditures and condemn the “high-spending” Councils that have large populations of people needing all kinds of essential support.

Now, it is precisely these kinds of criticisms that are used to discredit “Serving Councils”, for the affluent take pride in “paying their own way”, while assuming that those who cannot are lazy or worthless – while, at the same time, lavishing vastly more on their own ill family members than could ever be spent on a poor patient by a social service.

So, let us look objectively at certain social services, which have made great contributions to the good of the populace, and were, and occasionally still are, supplied by Local Democratic Organisations.

Public Transport

Let me start by giving an example from my own experience.

It also should be made clear that I am a working class person from the City of Manchester in England, and was born and brought up in a slum area called West Gorton. I am certain you would get a very different story from someone in Withington or the Stockbroker Belt in North West Cheshire, but their view is available everywhere, whereas the one I will give certainly isn’t.

Oh, and just in case the reader has already pigeonholed me... I finally retired some 20 years ago as a Professor in London University, so what I relate cannot be dismissed as sour grapes from a social failure (as is regularly slapped onto any criticism of society from the working class).

In the 1950s I used to go, every fortnight, to watch my favourite football team, Manchester United, and an average home gate of around 53,000 spectators was brought to the ground from all parts of the enormous conurbation by Manchester Corporation Transport in a large fleet of special buses, which were organised like clockwork. In a very short time literally all of these were delivered to the ground, and then removed, just as efficiently, at the end of the match. It was both cheap and vastly more efficient than private cars could ever be, and being a Local Authority Service NO profit was involved. Each and every double-decker bus was packed, and the flexibility of tailor-made routes (only used for this purpose) was unachievable by any other means. Indeed, such an effective and wide-ranging transport system was largely self-financing and economical for its users.

But now, after 60 years of “progress”, no such system exists. The bus companies are privately owned, and work to a very different imperative: instead of being an efficient and economic service, they now must make a profit as well – without which they simply wouldn’t exist.

NOTE: Imagine how different hospitals would be if they too had to make a profit!

For example, the evident virtues for both passengers and transport workers of the old Driver and Conductor arrangement have finally been completely dispensed with, after privatisation was established as the universal method of provision. Such things as helping old people and mothers with children on and off the bus were, to say the least, “not conducive to making profit”, so they were dispensed with. And the advantages for speed of service made possible by the collection of fares while on the move have been replaced by the driver doing all of that himself, at every single stop and for every passenger – which, it has to be admitted, did wonders for the profit margins now available to the new owners.

Indeed, for a very long while, a significant part of the transport systems was entirely electrically driven, in either Trams or Trolley buses, with vastly superior environmental effects to those of present systems.

The care and maintenance of all these vehicles was undertaken by Council owned and run facilities – again a service with no profit involved. Both my uncle and my brother-in-law worked as body-building specialists in one of the main garages, and were highly trained and well paid, having had apprenticeships, along with Tech-College linked courses. I worked for 10 years in such a College, and the quality of the lecturers and instructors, as well as of the qualified engineers that they produced, was second to none (I know this because I employed such people as technicians, and they were a valuable contribution to the department).

NOTE: By the way, these Colleges were also a service, run by the Council, and, of course, non-profit making.

Funnily enough, all sorts of other, seemingly unconnected things declined too. For example, the Public Service Vehicle (PSV) licence, which all public service vehicle drivers had to gain before they were allowed to drive such vehicles, was then clearly superior to what it is now. I wonder why?

Also behind the scenes in Public Transport mechanics, with similar rigorous training, kept the engines and safety systems up to scratch, while a large army of cleaners kept both the insides and outsides of the vehicles at an acceptable standard.

I’ll leave the reader to consider what has happened to all these aspects too, and for the very same reasons!

Whatever criticisms there were of Public Transport, there is also little doubt that the imperatives involved were for Service rather than Profit, and usually the workers’ unions were given much better access and facilities than are ever provided in most private companies.

Even local and national regulation was vastly more efficient, for one visit of an inspection team to the enormous garage where my relatives worked could cover far more, and far better, than could be achieved in innumerable visits to multiple small transport companies, and their sub-contracted support firms too, as is the case now. Finally, the economies of scale also made the large publicly owned organisations superior to tiny shoestring alternatives: there would be the right kit and an appropriate range of trained operatives, from those with years of experience down to apprentices constantly monitored and instructed in best practice.

All this is indeed a taster of Socialised Services, better 60 years ago than they are now!

Yet, the directing of these services was NOT directly in the hands of the populace, or of their elected representatives in Local Government.

The people did not elect the managers of these vast undertakings. They could vote for their known, local and available councillors, and could change the councillors in office at regular elections. But, such a system wasn’t naively bottom-up controlled and run. It required specialists to do that. But, nevertheless, if truly democratic control was in place, the electorate could act at the ballot box. The job of elected councillors was to establish the Service Ethos in their employees, from bottom to top. And even way back in my youth, there was ample evidence that this was achieved in many such organisations. To judge appropriately you merely have to compare then with now.

Do you really think that modern transport firms are run with the service approach? They wouldn’t last long today!

This brief visit to the past was not meant to define some Golden Age. It was never that. But, it showed here and there how Services should be run and most important of all BY WHOM!

Education

Now, Public Transport may not be considered the most important area that involves services to the people, and I would agree.

In a long career in Education, with posts at every level from Junior Schools to Universities, I can speak authoritatively about these services, as by far the most important.

Now, it is in schools and colleges of all kinds that Local Democracy has a major role. And, once again, the difference is remarkable between how this totally non-profit-making and countrywide service is delivered, and how organisations operate that are dedicated primarily and predominantly to the production of profits – for those who have no other necessary qualification or general knowledge, but who can extract profit, and hence do have the money to invest.

Once again, the quality involved in how such a service as Education was delivered to the Community, is vastly better than in any profit-making concern.

Indeed, there are no bonuses for teachers, and none desired, or expected. The calibre of those who choose such a demanding and worthwhile career is uniformly superior to that in any other organisation, if your criteria are to do with what is delivered to the community served, and for what reasons.

And, for some considerable time now, whenever they got into power, the Tories would make yet another assault upon State Education, while, of course, sending their own children to private, fee-paying schools, where they would receive, primarily, the appropriate social connections and command training for their future ruling roles in society.

For, the mass of the population are, in their eyes, only to be educated in such ways, and to such levels, as will service the current economic system, Capitalism, and its essential role of producing ever-larger profits. Unless what was done in such institutions was limited to such ends, such places would only foster discontent with the Natural Order.

Such totally unproductive educational content must be actively swept away, to produce the ideally prepared workers for this, “the only possible system”.

Indeed, it had been coming to their notice that in certain areas pupils were being educated in such a way that they would have happy and fulfilling lives, and that could certainly only “lead them astray”.

What is clear to these traditional rulers is that educational institutions must exist, primarily, to fit all their products to the needs and wants of their future employers, and to concentrate all learning upon only what they will need in their assigned roles in society. Education that encouraged them in any other prospective futures was both unkind to them, and destructive to an ordered and healthy economic future for Society. Crucially, thinking for themselves and being creative, artistic or maybe politically active would be well beyond the Pale.

And, we must see all their changes in Educational Policy in this light.

Even the current attacks upon Birmingham Council, under the guise of attacking Moslem extremists, are basically yet another attempt to wrest this jewel of real Social Service out of the hands of Local Democracy, and into the hands of people who agree with their pro-capitalist policies.

Indeed, in a recent news programme on TV the ministers in Parliament, and even the newscasters themselves, steadfastly refused to either ask, or answer, the Key Questions, and, in fact, purposely misled ordinary people as to what was actually going on in 21 Birmingham Schools, who was responsible for them, and what their own agendas were for Education in particular, and Local Democracy in general.

Clearly, Education should never be in the hands of those who don’t really care about anything but making a profit, and should demonstrate the most democratically controlled service of all!

27 September, 2015

Well, of course, such a statistical method is used when many things are happening simultaneously, and it therefore becomes impossible to deal with all the various contributions individually: the method is then merely to address their overall summed effect, whereas dealing with each contribution separately is always the pluralist norm with simpler situations. Indeed, it turns out to be significantly better than the usual means in many situations, because it makes for overall measurements and does not concern itself with the multiplicity of different contributions involved.

It becomes a kind of “backstop” for the inadequacies of the usual approach, and the two have delivered a reasonably useful pair for a very long period of time. It is yet another case of having two quite different approaches, and switching between them in a pragmatic manner, when necessary.

But, there are assumptions involved, which are not always applicable, and, as always, such compromises are never the complete solution; for cases will occur which simply don’t fit either method.

These usually occur due to assumptions made about the overall nature of the factors involved – often assuming a total, perfectly Random Mix, with a great deal of cancelling-out of opposing factors, and a resultant set of overall parameters which conform to a simple pattern.

The technique involves overall relations and parameters, which can be effective for the situation as a whole – like Temperature, Pressure or Volume. Indeed, early, historical experiments, and the revealed laws, were those that related such quantities.

So, the usual admonishment to young experimenters to “Stir thoroughly, and wait for equilibrium before measuring!” was a sound piece of advice (though Reality wasn’t always so dutiful)! Clearly, only if the conditions approximated very well to the necessary requirements could the measurements deliver results that could be investigated and used with confidence!

Now, such methods are in fact statistical measurements, but arranged for, and taken physically, to reveal an overall effect. If, for example, individual measurements of the temperature of single atoms were possible, literally NONE would have the measured overall temperature. Yet, nevertheless, if that overall temperature had been taken properly, it would accurately reflect the average temperature of all the atoms involved.

Now, clearly, such measurements were all that were available to us in the early days, but the necessary conditions were also not always possible to arrange for. So, an alternative, when individual elements could indeed be measured, was to measure as many as possible, and take their average to represent that variable for all elements, and for the whole situation.
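The distinction between the physically-taken overall measurement and the individual contributions it averages over can be sketched in a few lines. The per-atom values below are entirely invented, purely for illustration of the averaging point made above:

```python
import random

random.seed(42)

# Invented per-atom energies (arbitrary units) - a stand-in for quantities
# that, historically, could not be measured individually.
atom_energies = [random.gauss(300.0, 40.0) for _ in range(10_000)]

# The "overall" measurement (e.g. a thermometer reading) reflects the average.
overall = sum(atom_energies) / len(atom_energies)

# Literally no individual atom need sit exactly at the measured overall value.
exact_matches = sum(1 for e in atom_energies if e == overall)

print(f"overall (average) value: {overall:.2f}")
print(f"atoms exactly at that value: {exact_matches}")
```

Taken properly, the overall figure accurately reflects the average of all the contributions, even though no single contribution need ever equal it.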

With this method, we move to a more transparent type of statistical measuring.

Now, I will not be spending much time upon either of these types of statistical measurements. They are well understood, have a large number of cases, and an extensive theory concerned with them.

But, I will be attempting to reveal another type of statistical measuring, and consequent theories, which, though they can be made to work, pragmatically, are, in fact, wholly misleading theoretically.

For these stop the possibility of physical explanation entirely, and instead, along with a series of incorrect speculative models, call a complete halt to theory-as-physical-explanation in the areas concerned, replacing that objective with “working equations” – without any explanatory account at all! They even switch their stance to one in which it is solely these fitted-up equations that are said to actually drive the area of Reality under consideration, and totally abandon the essential attempts at ever better physical explanations.

I am, of course, talking about the Copenhagen Interpretation of Quantum Theory in Sub Atomic Physics.

Now, such an approach is both illegitimate and misleading, for it, more or less, terminated what Theory has always been – an attempted explanation of phenomena in terms of the substances involved, and their properties – and replaced that intention with merely useable formal equations.

The attempt to understand is sidelined for a totally pragmatic approach – “If it works, it is right!”

But, of course, that is incorrect!

It amounts to using universal forms or patterns as if they are the driving essences of Reality, and that is not only impossible, it is blatant self-deception!

How can purely formal abstractions DO anything? They are man-devised descriptions, and the very same forms recur in many different and causally unrelated areas. So, how can they be the causes of all of these qualitatively different cases?

Clearly, such forms are merely recurring patterns – useful for prediction, but useless for explanation. They are about common appearances ONLY!

The so-called Revolution of Solvay, in 1927, was, in fact, an ignominious Retreat, abandoning real Theory for pragmatically useable statistical equations, and hence leading Science into an idealist mire!

This new issue is a kind of review of how far we have got in describing and assessing Man’s struggle to understand his world.

It has not been a straightforward history, for Man has had to literally change the game in order to make any progress in understanding both his own context and, indeed, himself. But in making that significant progress, it has undoubtedly been a heroic trajectory!

It is very important at this stage that a difference between Knowledge and Understanding be established. For, the latter was never an automatic development from the former.

To use the common description “Man has had to pull himself up by his own bootlaces” - or to use V. Gordon Childe’s appropriate title Man Makes Himself. Attempting to understand the world has not been at all easy, and perhaps surprisingly, has been predicated upon just how successful Man has been in more everyday tasks of survival and even prosperity. For, his basic general method was initially to grasp whatever was to his advantage, whatever that entailed, and gain himself both a measure of leisure and repose.

The brilliant ideas did not come first! For, it proved almost impossible to solve all the many problems of Mankind’s usual hunter/gatherer existence, including the many seemingly unavoidable and unbridgeable impasses in his contradictory development.

For well over 90% of human history, Homo Sapiens roamed the Earth in small family groups, his most sophisticated tool being a sliver of brilliantly knapped flint. Clearly, significant developments in his mode of life were impossible without large gains in that sphere. And while there were brief interludes during that long “childhood”, when he was able for a time to achieve remarkable things – such as the cave paintings at Lascaux – they were brief and exceptional events. Something permanent in his means of life had to occur, to enable real and persisting gains.

It wasn’t until the invention and spread of agriculture and animal husbandry in the Neolithic Revolution that the developments in human understanding really took off. For instead of constantly living on the edge of survival, Man could then settle and gather in growing aggregations of people.

Even then the trajectory of development was never smooth or incremental. Indeed, it was characterised by a series of “false leads”, which enabled progress to be made, but which always, in the end, ground to a halt in yet another impasse.

So this brief foray attempts to trace out the subsequent paths, dead-ends, and hopefully the way forward, from where we have finally reached.

21 September, 2015

In the first instalment of Jim Al-Khalili’s series on BBC 4 entitled “Let There be Light! The Secrets of Quantum Physics”, he tackles the long-standing argument between the position of the Copenhagenists and that supported by Albert Einstein, on what is termed Quantum Entanglement.

As is usual in this area of Physics, analogies are used to attempt to “solve” (though really only describe) crucial anomalies in Reality.

So, here Al-Khalili uses playing cards to represent what is supposed to happen with Quantum Entanglement at the sub atomic level. And, by a series of modifications, ends up with an experiment, using two simultaneously caused quanta of light, which are, “therefore, entangled”, and he looks at their polarizations, to see if the idea of entanglement is “correct”. But, he defines the test as being a dispute between the two contradictory explanations of a certain case of the phenomenon – one by the Copenhagenists, and the other by Einstein.

So, Al-Khalili asserts one must be right, and the other must be wrong! But, I have to insist, “Why should they be the only considered possibilities?”

The way Al-Khalili puts it, one answer proves that the Copenhagenists are right, and that the phenomenon is totally inexplicable physically, while the other (Einstein’s) proves that the two photons’ properties were fixed when they were created, and no inexplicable link between the two would be necessary.

Al-Khalili uses his described Laser Set Up with photons, but insists that we see it in terms of his analogy with the playing cards, so how might he be misleading us? Can the playing cards change, or are they fixed? Clearly, we are persuaded that they cannot change all by themselves – that would be magic – especially if the change was due to a measurement made elsewhere, upon the other card. But this is also misleading us even more!

Our quantum entities are not playing cards that are fixed forever – they were created (in the more usually used example of Pair Production, modelled here by light split into two photons, and considered to act in exactly the same sort of ways with regard to quantum properties), and the assumption that that creative process is the production of two massive particles from pure energy alone is made without any admission that it could be mistaken.

NOTE: We cannot continue such a discussion without questioning Al-Khalili’s many, quite definitely questionable assumptions. He refers to a photon, which we are to accept as a disembodied quantum of pure energy. Then, also, in the alternative argument, two particles can be created out of just such a high-energy photon. No possible substrate is assumed to be involved in these phenomena, and, finally, in conclusion, separated entities can still be instantaneously linked, no matter how far apart they get. These are not to be questioned. They are assumed to be totally unassailable. What do you think?

But, this theorist (Jim Schofield) sees the area very differently. The phenomenon of Pair Production is due to the dissociating of a known-to-be-physically-existing unit (not pure energy) in a universal substrate, made up of large numbers of these units, each consisting of two mutually orbiting particles – one electron and one positron – which can also hold and transfer internal quanta of energy by the promotion of that orbit. It has been observed in colliders as the Positronium: in its stable state we call it a Neutritron. (By the way, this assumption also solves electromagnetic propagation through space, and all the anomalies of the full set of Double Slit Experiments – a supposed cornerstone of Al-Khalili’s set of assumptions embodied in the Copenhagen Interpretation of Quantum Theory). Finally, this alternative also stands upon very different holistic grounds, which means, “Everything affects everything else!” and also “Nothing is eternal!”

Theoretical particle - the Neutritron

So, if our Pair were linked (synchronised) at their joint point of creation, and, thereafter, were in-step-evolving from there on, then both Einstein’s and the Copenhagenists’ assumptions – both of which are entirely pluralist, and require both eternal laws and eternal entities – must be wrong.

Al-Khalili’s presented alternatives are not intrinsically opposite, so that one or the other must be the truth!

It is, on the contrary, an example of a classical Dichotomous Pair of concepts, due entirely to common, yet wrong, premises on both sides of the argument. The contradictory pair was entirely due to their mistaken common premises (basically, the Plurality embodied by both sides).

And, as the philosopher Hegel clearly demonstrated, a sound critique, and then a necessary replacement of those erroneous premises, would remove the seeming contradiction, and allow the impasse to be transcended, and a consistent and better theory to be possible, while opening the door to further developments too.

Effectively, both sides of the argument were determined by the same errors, and hence no resolution would be possible without those common and false premises being removed and replaced with something closer to the truth.

Now, such alternative reasoning may sound to be something of a circuitous route, but it is far superior to the thing it replaces. Let’s face it; the premises of the Copenhagenists mean that certain things just cannot be explained physically, and we must not even try! And, as long as we have an overall, formal means of getting what we want, in a given situation, then we must be satisfied with that.

NOTE: The final part in the experiment to test Bell’s “thought to be final proof” was that, if there was NO built-in relation between them, then the overall results, in his analysis, would be “more than 2”, whereas, if there was an in-built relation (as Einstein insisted), the overall results would be “less than 2”. But, this is really only testing between the two options proposed by the Copenhagenists and Einstein, and consistent with Formal Logic. Yet, with a non-pluralist, changing situation, that test would not be appropriate. The tenets of Formal Logic would NOT apply! The thinking is entirely pluralist, hence it must have the supposed, totally underlying laws – independent of context – the same happening in all circumstances – in fact they must be FIXED! Whereas, that will certainly not be the case at all – and the holist stance is bound to be much closer to the truth than the pluralist.
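For readers wondering where the “more than 2” and “less than 2” figures come from, the following sketch works Bell’s test in its usual CHSH form. The angle choices and the cos(2(a−b)) polarization correlation are the conventional textbook assumptions, not anything specific to the programme discussed:

```python
import math
from itertools import product

def E(a_deg, b_deg):
    # Conventional quantum-mechanical polarization correlation for an
    # entangled photon pair: E = cos(2(a - b)), angles in degrees.
    return math.cos(math.radians(2 * (a_deg - b_deg)))

# Standard CHSH measurement angles (a textbook choice, assumed here).
a, a2, b, b2 = 0.0, 45.0, 22.5, 67.5

S_quantum = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)

# A "fixed playing card" local model: each photon carries predetermined
# +/-1 answers for both settings, decided at creation. Enumerating every
# such assignment shows the combination can never exceed 2 in magnitude.
S_local_max = max(
    abs(Aa * Bb - Aa * Bb2 + Aa2 * Bb + Aa2 * Bb2)
    for Aa, Aa2, Bb, Bb2 in product([-1, 1], repeat=4)
)

print(f"quantum prediction S = {S_quantum:.3f}")  # 2*sqrt(2), about 2.828
print(f"fixed-card maximum   = {S_local_max}")
```

Note that both branches of this calculation rest on exactly the fixed, context-independent laws that the paragraph above questions: the test discriminates only between those two pluralist options.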

As with all pluralist experiments, they are set up specifically to reveal a given pluralist law, and one, which, to the holist, is anything but that. It is, in fact, totally determined by the context of the experimental set up. It isn’t a fixed Natural Law at all.

Finally, the whole World of Formal Logic, of the Principle of Plurality, and Form as Cause is certainly mistaken. In the end, its laws are those of the World of Pure Form alone – Ideality, and NOT of the real subject of Physics – Reality.

And, to cap it all, the assumptions used were, at the time of their establishment, historically unavoidable.

Mankind didn’t come into the World already ideally equipped for such problems. They have had to develop their means from scratch over millennia, and the posing of the problem in that way was the only thing they could do at the time.

12 September, 2015

Crucially, a whole series of mathematical philosophers, including Russell, Godel and Turing, proved that Mathematics isn't the totally consistent and comprehensive discipline it appears to be. So it is difficult to comprehend how someone like Mochizuki can spend many years and 500 pages upon a wholly new branch of Maths with such weaknesses having long been established.

How could such a work possibly be checked?

And of course, you also have the fact that Mathematics is an idealised discipline, which, though it has deep resonances with reality, actually occupies an entirely different realm - a universe of pure form. It is all about pattern, as if pattern in all its forms is an integrated subject!

It certainly isn't helped by the extension of space to more than three dimensions, which certainly don't exist in reality, but can be very useful in dealing with patterns with more than three variables, by enabling the alternative "graphical view and means" to be fruitfully employed (though by proxy via Algebra).

Also, in addition, Mathematics has been regularly extended, so that it has moved beyond its initial area. The inclusion of Operators as a kind of number is the most dramatic example so far. And it can be confusing. For, using well-established methods originally developed just for numbers, but now also on an operator like "2" - the doubling operator - makes for ambiguity between numbers and operators. And the whole realm of so-called Complex Numbers is actually really concerned with operators, based upon the ubiquitous "i" - the operator "turn anticlockwise through 90 degrees" - which, when applied twice as "i²", becomes the Inversion Operator "-1" - hence the perennial teaser "i = sqrt(-1)". Clearly Mathematics is no longer just about number but about the manipulation of form - including various extensions well beyond its original realm.
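The operator reading of "i" is easy to demonstrate with ordinary complex arithmetic - applying the 90-degree rotation twice really does act as the inversion operator, and the doubling operator mixes freely with it:

```python
# Treating "i" as the operator "rotate anticlockwise through 90 degrees".
i = complex(0, 1)

z = complex(3, 0)      # the number 3, sitting on the real axis
once = i * z           # one application: rotated up to 3i
twice = i * once       # two applications: rotated round to -3, i.e. inverted

print(once)            # 3j
print(twice)           # (-3+0j)

# "i applied twice" is the inversion operator -1:
assert i * i == complex(-1, 0)

# And the doubling operator "2" is manipulated with the same arithmetic -
# the very ambiguity between numbers and operators noted above.
print(2 * i * z)       # doubled and rotated: 6j
```

The same symbols are being used both as quantities and as actions on quantities, which is precisely the confusion the paragraph above describes.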

Now this view demands that formal relations cannot be primary. They certainly cannot be essential drivers of reality as many scientists seem to believe.

It has to be the other way around. The physical substances and their properties (matter) are primary, and the forms are common consequences.

The position of modern Sub Atomic physicists, with their Copenhagen Interpretation of Quantum Theory, must therefore be nonsense - for they always make form primary.

Nevertheless, the very manipulatability of Mathematics makes it a pre-eminent and revealing tool of students of many parts of Reality, and clearly the most powerful handmaiden of the sciences - it just cannot be the essence.

Now, early in this account in New Scientist (3036), a proof in Mathematics is said to be a series of logical steps, leading from an established starting point to an undeniable conclusion. And it must be absolute! The Mathematical proof is supposed to demonstrate this without any doubts whatsoever. The problems involved at the present time are getting impossible, as typified by Mochizuki's work. Confirmation of a 500 page proof is so daunting nobody dare take it on. And, such colossal tasks are becoming more common in the field, as Mathematics stretches ever further away from its origins in Number.

The rest of this article seems to be about attempts to explore the possibilities of using computers to "lighten the load"...

On reading "Beyond Knowledge", an article in New Scientist (3036), about the direction that Mathematics is taking nowadays, I am forced to look in very different directions to the content of this contribution, and for very good reasons.

First, with the abandonment of physical explanation in Sub Atomic physics, replaced by purely formal equations and an inexplicable standpoint, we are presented only with Mathematics, without any actual descriptions of actual phenomena, which does nothing to help us to understand it. Formal equations are hence considered to be the true driving essences of the universe.

Now, these well-established changes become far more important when using super computers to carry out complex mathematical proofs. Indeed this development - now in its 88th year, since the Solvay Conference where Einstein was defeated by the Copenhagenists, Niels Bohr and Werner Heisenberg - has totally undermined any further understanding in this crucial area of Science.

In addition to this dramatic effect, more generally, we have to stress that Mathematics as a discipline severely limits our studies of reality: not only through the abstractions involved, but also by confining us to a further limited set of abstractions, which by no means delivers the whole of this important Science. And, outside of it, it delivers little, indeed, of the crucial abstractions which are Mankind's means of constantly attempting to extend its understanding of reality.

In the Sub Atomic area this researcher has, for some time, defined the Realm of Mathematics as being limited to the World of Pure Form alone - which I have termed Ideality. And, there is literally no chance at all of such a World delivering much of what actually occurs in Reality.

Perhaps the most damning feature of Mathematics is its dependence upon the Principle of Plurality as its most important basis. For, Plurality only deals with things that don't change qualitatively. The assumption of Plurality suffices in stable and local situations, or for limited periods of time, as long as the crucial factors are unchanging. To generalise the assumption to cover all of reality is just not possible, and doing so can only produce major distortions.

Mathematics deals in unchanging things.

Now, before the mathematicians line up to condemn this statement, let me explain how I reached this position.

Things change in quantity all the time, and Mathematics is good at dealing with changes in quantity. Some things, however, cannot be quantified, and the crucial, significant changes are usually qualitative in nature. Mathematics cannot cope with these kinds of changes at all.

Laws which are delivering current predictions can suddenly terminate, delivering either inexplicable zeros or infinities. The mathematical response to this failure is simply to change to a new Law that works. They never attempt to morph from one Law to another, or even to explain the switch.

The trick is to know both Laws, and either to switch between them when an important variable passes a key threshold value, or, alternatively, to use both Laws fused together into the frig of a supposed compound Law. The changeover is still unexplained.
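To make the criticism concrete, the kind of changeover described above can be sketched in a few lines of code. The two "Laws" and the threshold below are entirely hypothetical, chosen purely for illustration - the point is that the switch is triggered by a bare threshold value, with nothing in the formalism explaining why the Law changes:

```python
# Sketch of the "frig" described above: two separate Laws glued
# together at a threshold, with no explanation of the changeover.
# Both Laws and the threshold are hypothetical, for illustration only.

def law_a(x):
    # Law assumed valid below the threshold
    return 2.0 * x

def law_b(x):
    # Law assumed valid at and above the threshold
    return x ** 2 + 5.0

THRESHOLD = 5.0

def compound_law(x):
    # The changeover is triggered purely by the variable crossing
    # the threshold - nothing here explains *why* the Law changes.
    if x < THRESHOLD:
        return law_a(x)
    return law_b(x)

print(compound_law(2.0))   # law_a applies
print(compound_law(10.0))  # law_b applies
```

Pragmatically, such a compound "Law" predicts perfectly well on either side of the threshold; theoretically, the changeover itself remains exactly the unexplained frig described above.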

And, of course, such "Laws" can only be derived retrospectively, integrating what had been previously experienced and measured.

In fact, Mathematics and Pluralist Science generally, are both based upon entirely fixed (eternal) Laws of Nature - the key realisation is that these do not actually exist. In fact, even sets of "Laws" are not independent of one another, but actually modify one another to various greater or lesser extents. The pristine Laws which we extract and manipulate are not what is acting in nature. They are careful approximations, simplifications and idealisations of natural phenomena, which we obtain by very careful farming of the experimental locations investigated.

Not a single "Law of Nature" is eternal. And our banker principle of Plurality is also wrong!

A much closer, real-world conception is Holism, which simply states that "Everything affects everything else".

Now clearly we have moved a very long way from Mathematics delivering real relations in our world. Mathematics can, and indeed does, work very well in those carefully organised and controlled domains, but it is far from the fundamental foundation of the universe it is purported to be.

The crisis in physics since 1927 has shown that even materialist sciences have been severely compromised by these incorrect assumptions about reality - seeing the world through the distorted spectacles of Plurality.

The best of science today is no longer Pluralist, but Holist and Materialist. We must rescue Physics too from this fantasy world.

11 September, 2015

What Bell's Inequalities are about is not what is claimed. They are about formal descriptions of reality, and not the material world itself.

In so-called Quantum Entanglement, the assertion is that measuring one of an entangled pair of particles influences the quantum state of the other, even if the two are a million light years apart, and does so literally instantaneously (obviously much faster than the speed of light, at any rate).

For any ordinary mortals reading this, I must point out that the believers of this "magic" are sub atomic physicists - and this kind of drivel is pretty well par-for-the-course in those quarters.

However, when it comes to actually "confirming" this phenomenon, they must measure one entity and then the other, or even both simultaneously, to prove their case. My concern is: "How do the experimenters know a change has been made to the other one of the pair?" For, if you measured the first, it would immediately influence the other member of the pair, and measuring that partner to check would disturb it in turn. Clearly there is a problem here.

Do they regularly measure them alternately or simultaneously to attempt to establish a pattern? Questions arise even for those who support the theory. How could you ever know what the undisturbed state of either one was?

You can't of course! So what do the "believers" say?

They insist that prior to measurement they are both simultaneously in "all possible states at once" until you actually measure one of them, which then forces it into a particular one of those possible states.

Such ideas recur throughout this theoretical stance: it is the basic myth of superposition once again! This concept states that a particle (before measurement) is simultaneously in all possible positions (like a wave), but with a fixed probability of being in each and every one. And, this remains the case until we measure its position, and by doing so, fix it into a single possible position.

Ryoji Ikeda

Now, though this is totally counter-intuitive (and most probably wrong), it does allow statistics to be used over a number of cases, and the statistically arrived-at answers do indeed match certain observations in reality.

The mathematicians make it all work by taking a Wave equation and associating probabilities to all possible points on the wave, which are interpreted as being probabilities that the particle is in each possible position.
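That procedure - attaching a probability to every point on a wave - can be sketched numerically. The wave used below is an arbitrary illustrative function, not the solution of any particular physical equation; the recipe shown (squared amplitudes, normalised to sum to one) is the standard Born-rule method being described:

```python
import math

# Sketch of attaching probabilities to points on a wave (the Born rule).
# The wave chosen here is an arbitrary illustrative example only.

positions = [i * 0.1 for i in range(100)]

# Amplitude at each position - a simple damped oscillation.
amplitudes = [math.exp(-0.5 * x) * math.cos(3.0 * x) for x in positions]

# The "probability of finding the particle" at each position is the
# squared magnitude of the amplitude, normalised so the total is 1.
weights = [a * a for a in amplitudes]
total = sum(weights)
probabilities = [w / total for w in weights]

# Note: this distribution says nothing about any single particle -
# only statistics over many trials can be compared with it.
assert abs(sum(probabilities) - 1.0) < 1e-12
```

Notice that the code itself makes the limitation plain: it produces a distribution over a whole ensemble of trials, never the position of one particle.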

Notice that this method cannot deal with the position of a single particle, but can give overall estimates of a whole group!

As a physicist myself (and one who was originally a mathematician), I have a name for such methods - I call them frigs! They are mere tricks. Such techniques are often used in Mathematics as clever ways of arriving at hard-to-find solutions to purely abstract equations.

So maybe you can see how they happened.

With this in mind we return to Quantum Entanglement. This totally counter-intuitive standpoint describes the before-measurement particle as existing only "as a fuzzy cloud of probabilities of all its possible states." And this is how they avoid the otherwise necessary infinite regress! Instead of an oscillation with each and every measurement, we are expected to believe that, before measurement, such quantum entities are not in any particular state at all, but that, when measured, an entity will suddenly be in a specific state, and its remote entangled partner will somehow be affected by this intervention too!

In other words, more generally, we can conceive of such things as particles, and yet take something as particulate as position to be a quantum property, as if controlled by a wave. The trick involved, for it can be nothing else, is that every possible position on the wave is represented by the probability of the particle being there. And this is, of course, complete nonsense, both in the way it is presented and in the way it is used by these scientists.

Unless, that is, you consider there to be an actual substrate, filling all of space, which is both affected by, and can in turn itself affect, the enclosed particle.

In previous work undertaken by this researcher all the various anomalies of the infamous Double Slit experiments were completely explained away by the assumption of the presence of such a universal substrate - at the time called Empty Photons.

The idea of a substrate was far from a new supposition: it had, at one time, been the consensus view. But a substrate was never detected, so the prior theoretical idea, known as The Ether, was permanently dumped as insupportable - despite the fact that James Clerk Maxwell, using his theoretical model of The Ether, derived a correct set of Electromagnetic Equations which are still used to this day.

Clearly, all the points made here must be addressed. In fact, this theorist suggests that the whole myth of superposition and multiple simultaneous states, was invented to explain the results of things such as Quantum Entanglement.

Now, the reader might wonder how scientists could be so influenced: for it runs counter to the basic materialist conceptions that are the key premises of Science. The actual reason for this is clear. They have abandoned Physical Explanation for Purely Formal Description. They are no longer physicists, for they have rejected the physical world - they are merely mathematicians!

Einstein's dismissal of Quantum Entanglement is encapsulated perfectly in his phrase:

"The Universe is real - observing it doesn't bring it into existence by crystallising vague probabilities"

For such are, most certainly, idealistic notions.

There can, however, without recourse to idealism, exist a hidden universal substrate with wave-like characteristics, and a sort of symbiotic relation between that substrate and physical particles moving through it.

It is the impossibility of answering my question about the measurement of "entangled particles" that precipitates this monstrosity of a theory! The counter to that position by de Broglie, and later by David Bohm, involving so-called "hidden variables", did not solve it, as these features were never found, no matter how detailed the study of the particles involved.

What was really needed to attempt to explain the sub atomic world was a separate substrate, involving a reciprocal and recursive relationship between a particle and its context. For then, and only then, can we have a passage of time between the initial influence, and then the recursive effect. The assumption of an intrinsic "Pilot Wave" meant simultaneous effects, but the role of a substrate as intermediary allowed this crucial delay.

It is the formal, and even stilted nature of the mathematical physicists' thinking, that draws them inexorably towards the Copenhagen myths, and unfortunately away from reality.

Niels Bohr's insistence that the Quantum States "explained" things that classical physics could not was false in the first part, while true in the latter condemnation. In fact neither approach could explain our observations. Bohr's position was descriptive of certain forms, but not in the least bit explanatory. Forms do not explain! They can describe reality, but they don't even do that perfectly. All equations are descriptions of idealised forms, they are not even accurate descriptions of any natural part of reality, they are always approximations, simplifications. Those forms can then only be applied back on to areas of reality that we have carefully prepared, or farmed into experimental domains. Here lies the role of technology in all our investigations. The form's validity is then confirmed by successful use in these domains.

The battle between the two standpoints embedded in Science was never resolved, because both sides of the argument subscribed to the same belief - that equations represent reality as it is - an obvious fallacy when you stop to think about it. Both the classicists (such as Einstein and de Broglie) and the new school mathematical-physicists (Bohr, Heisenberg et al) were completely wedded to form.

Even Einstein's Relativity, and Space-Time Continuum were crucially formal ideas.

So, in spite of a small section of physicists refusing to embrace the Copenhagen Interpretation of Quantum Theory, these remnants (in the 1960s), after 30 years of argument, required a final means of settling the dispute. And for the Copenhageners John Bell's suggestion was a godsend.

But he did this using only the purest basic forms of Mathematics, to which both sides mistakenly subscribed. Bell used Set Theory, and its embodiment in Venn Diagrams to "do it".

Now, this had to be the most inappropriate "proof" concerning anything in concrete reality, for it dealt only in idealised laws - and yet it was meant to establish what reality really was, and to do it by this means alone!

Bell used this method to construct a set of inequalities which could be clearly demonstrated in Venn diagrams, and as such, he had to be handling fixed things: no qualitative modifications or evolution of those forms could take place, as that is impossible by such means. It would be more accurate to state that such a basis supplies premises for Mathematics and Formal Logic only.

Bell used these as a basis for tests about reality. He used his Inequalities to set limits, and if in concrete reality they were clearly exceeded, then the claims of the opposing realists were "proved to be wrong", and Quantum Entanglement was proved correct.
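For readers who want to see the purely set-theoretic character of such reasoning, here is a sketch of the simplest Bell-type inequality, N(A, not B) + N(B, not C) >= N(A, not C), which holds for any population with three fixed properties - exactly the sort of result a Venn diagram delivers. The random population below is purely illustrative:

```python
import random

# Sketch of the set-theoretic form of a Bell-type inequality:
# for any three fixed properties A, B, C over a population,
#   N(A and not B) + N(B and not C) >= N(A and not C).
# This is pure Set Theory - provable with a Venn diagram - and holds
# for ANY assignment of fixed properties, which is precisely why it
# can only test formal assumptions, never unfettered reality.

def count(members, pred):
    return sum(1 for m in members if pred(m))

random.seed(0)
population = [
    (random.random() < 0.5, random.random() < 0.5, random.random() < 0.5)
    for _ in range(10_000)
]

n_a_not_b = count(population, lambda m: m[0] and not m[1])
n_b_not_c = count(population, lambda m: m[1] and not m[2])
n_a_not_c = count(population, lambda m: m[0] and not m[2])

# Any member in (A, not C) is either in B - hence in (B, not C) -
# or not in B - hence in (A, not B). So the inequality always holds.
assert n_a_not_b + n_b_not_c >= n_a_not_c
```

The point of the sketch is that the inequality is a theorem about fixed, unchanging properties: whatever population you generate, it holds. Its applicability to anything beyond such fixed forms is exactly what is in dispute above.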

Many will have noticed it was a proof which only really convinced the faithful! This kind of "proof" was reality to them, it was their everyday modus operandi. But this gave the Copenhageners the result they required. The vast majority of physicists now claimed Quantum Mechanics victorious, and Realism finally defeated.

Bell had generated his required test using Formal Mathematics, and, as that was the "Essence of Reality" it simply must deliver a valid and correct test. But the actual conclusion of this method should be no, you cannot prove the nature of concrete reality solely by resorting to Formal Logic. Only other forms are provable solely by Mathematics. And only phenomena consistent with Formal Logic are provable by the methods of Formal Logic! Nothing else is possible in either case.

Nevertheless, though all experiments seemed to support the idea that Bell's Inequalities proved the Quantum Mechanical position to be true, doubts that it was correct refused to go away. However, the recent Dutch experiment mentioned in New Scientist was supposed to settle the dispute forever...

The test was run over many productions of Entangled pairs, and it was the overall statistics of the many runs that delivered the required answer to Bell's Inequalities.

So, what had actually been achieved?

It was his formalisms that were proved correct!

He had suggested a test for his Formal reasoning, not for any feature of concrete reality.

Lots of so-called "loopholes" - all put forward by scientists who actually agreed with their opponents on the mathematics involved, turned out to be not only wrong, but entirely inapplicable. But as they came from the same camp in their attitude to the primacy of form, proving things in these loopholes was inappropriate anyway. They merely corrected their formal mistakes - absolutely nothing to do with concrete reality at all! All the Dutch group achieved was the defeat of their opponents on Formal Reasoning only.

Hanson lab at Delft University of Technology

However, it is easily proven that by the means he used, Bell's Inequalities can only be used to address Ideality - the world of purely formal relations. They don't actually mean anything in concrete reality at all!

I concur that this condemnation of the precedence of Form over Content is still not enough to debunk these ideas. The most crucial principle in all such experimental investigations, both classical and Copenhagen school, is the cornerstone of all formalism and all analysis - the Principle of Plurality. This view of the world sees it as composed of many simultaneous natural laws, with different mixes of these happening in each and every observed situation. It stands in distinct contrast to Holism, which sees all things as inter-connected and inter-dependent, where Plurality sees only inherently separable component parts, which can always be deconstructed and analysed. Analysis can be made of any given situation, however complex, through isolation, simplification and control of those components, extracting the laws from the mix. Such methods are the basis of literally all scientific experiments.

This is all fine (and Science certainly wouldn't exist without such analytical methods), until erroneous assumptions are made about what this means - Plurality assumes that any law extracted in this way, is identical to that when acting in totally unfettered reality. And this is not true. In unfettered reality all "laws" are modified by their context. Realising this is the first step towards a Holist stance on Science. The objectivity of this position is confirmed by the fact that any law extracted by the usual method of farming and controlling a context for the experiment, can only be reliably used in that same context. The Pluralist view survives (and indeed thrives and dominates) because we are extremely adept at arranging for it, at controlling our environment, and this makes both prediction and production possible.

But, in theory, all reasoning using such laws as the actual components of reality, is bound to be wrong. Pluralist laws and techniques are pragmatically extremely powerful, but theoretically greatly misleading.

It isn't just specialisation that leads to scientific teams consisting of experimenters, theoreticians and technologists - all of these roles are actually differing standpoints, and all are essential to the Scientific process. But they will contradict one another! Disagreements are unavoidable, and dead ends are likely in many scenarios.

Postscript

This paper concentrates upon the underlying bases of the methods and reasoning used in postulating Quantum Entanglement. Despite the fact that I think this torpedoes Quantum Entanglement from the word go, QE forms the last line of defence for the regressive Copenhagen Interpretation of Quantum Theory, which must be defeated, so a job must be done on it!

I am currently working on a new body of work entitled Quantum Disentanglement, which I hope to publish as an extended issue of the Shape Journal in coming weeks...

10 September, 2015

This is a message to Socialists in the Capitalist West; but it is also relevant to those in the newly Capitalist East too. What has been missing, largely due to the diversion of what we term Stalinism, is real Marxism. We need to equip ourselves to do again what Lenin and his comrades did in 1917.

We have had a massive slump on a worldwide scale, and it isn’t over yet. Capitalism is faltering and it should be our greatest opportunity, yet in spite of the Arab Spring uprisings, and the Revolution in East Ukraine, we are virtually invisible... And, without a socialist alternative, in the recent General Election in the UK the Tories got in, again!

What have you been doing? I’m afraid even 250,000 in London after the event is no good!

The Greeks have been demonstrating all the time, and got an anti-austerity party in, followed by a resounding “NO!” to the austerity merchants. Even, the Egyptians had a Revolution and occupied Freedom Square incessantly, yet ended up with a military dictatorship once again.

The problem, surely, is in understanding the processes involved. It is not enough to condemn Capitalism. You have to know what to do about it!

Do you call yourself a Marxist?

If so, do you think that such a position is merely a political stance, or is it a philosophy? (In fact, the most sophisticated that Mankind has ever produced.) But, do you know what it is and how to use it? I don't mean tactics, and regular day-to-day activities; I mean, do you use it to understand these situations, and, crucially, how things can be changed?

I am certain that literally all activists against Capitalism have no idea of the power of this philosophy! To get some idea of its range and power see SHAPE Journal, on the Web, where it has covered all issues from Politics to Science, Philosophy to Art, and many more.

Did you know that the greatest archaeologist, V. Gordon Childe, was a Marxist?

Did you know that Lenin wrote a book condemning the world famous physicists Poincaré and Mach for their Empirio-Criticist stance, which was the immediate predecessor to the current idealist stance in Modern Physics?

Currently, SHAPE Journal addresses the Philosophy of Marxism in all disciplines, and has published 74 monthly Issues over the last six years, which has included over 450 articles, with another 250 posts on the SHAPE Blog. Have you seen them? They are all available for free from www.e-journal.org.uk

And currently the main theorist on SHAPE is cooperating with others in the USA and India to bring about a replacement for the current, so-called Copenhagen stance in Physics.

Don’t you think, as a socialist, you should be addressing this body of current Marxist works, and even contributing yourself?

05 September, 2015

In Capitalism, the establishment of a company not only needs ideas for a required and viable product, but also, and primarily, the money (or Capital) to establish the organisation, its necessary equipment, accommodation and staff to carry out the whole scheme. For, only with sufficient start-up Capital can all this be assembled and organised. And, there is, thereafter, an ongoing need for extra Capital to make changes and keep the company competitive.

Capitalism, as its name implies, requires constant access to more Capital. Indeed, there must be a regular supply of the resources necessary to make the intended products, long before payments for those, as sales, will be arriving.

The whole process depends upon the availability of such Capital, and the increasing march of Technology, also means that the costs involved soon far outreach the resources of craft manufacture and demand the most sophisticated and expensive machines to stay competitive.

The prior system to Capitalism could not deliver such things, so Capitalism was indeed an advance. But, of course, still the resources of Capital have to come from somewhere. Where would that normally be?

Now, in spite of being a life-long socialist, I am also an educator of long experience at every level of Education, as well as a research scientist and computer expert. At a particular point in our careers, a colleague and I struggled to establish a company producing the most advanced multimedia aids in the world – designed to be used in certain very difficult areas of education. We knew we could do it, for in funded research we had made exceptional products, which won a National award "for excellence" from the British Interactive Video Awards organisation (BIVA) in the United Kingdom, later led to the award of degrees based on our achievements, and, in addition, won the most prestigious award in our field in the USA. Yet we got only meagre financial resources, and often no grants at all, to fund new products and market them worldwide. Now, there we were in a Capitalist System – why didn't we apply to the usual sources of such Capital, for example the Entrepreneurs and the Banks?

Well, there were very good reasons for this.

We were primarily educators, working at this stage in Universities, so our primary job was that, and that came first. And secondly, our experience with several “interested parties” revealed that their interest and main criteria were about how much they could make out of the venture. NOTE: The Dragons’ Den TV programme reveals investors' concerns very clearly.

Our objectives were very different, indeed, from all of theirs. Frankly, the means to access this Capital meant that you had to subscribe to the motivations of that system or you would get NOTHING. We refused such offers, and decided to do two things.

First, to continue to apply for grants, and

Second, to work for NO pay and use what we would have earned from sales as our future Capital.

It was for very good reasons that we had a long uphill struggle amounting to 10 years, before a major breakthrough with our product Wild Child became a worldwide hit.

Now did we do the right thing? Of course we did! All the imperatives were determined by the discipline involved, which was teaching Dance Performance and Choreography, and our approach led in just 6 more years to having products distributed in over 100 countries. We even managed to outflank the establishments in our field by using the web.

So, there are important lessons here, as to what will be necessary in a Socialist Society, where there is no super wealthy class, and the Banks are all publicly owned, with an entirely different remit from what caused the 2008 crash world wide.

So, Capitalism concentrates available Capital into the hands of the class who see it as THE generator of more of the same. Indeed, the practical properties of what is produced are definitely secondary to the involved company's power to make Money. And it has become a self-defining and self-perpetuating system, with Capital as both its means and its purpose.

Interestingly, the efficacy of the products produced is not primary. So, a perfect product will still not persist perpetually for it cannot generate constant replacements, and, therefore, the requisite flows of Capital. Only new products can do this, so the whole system is constantly renewing itself, in order to regularly increase profits (Capital).

Now, it isn’t actually sustainable, because it is regularly disposing of past solutions to replace them with new ones involving some new feature, so everyone is pressed to update, to be at the very forefront of what is currently available. Clearly, such an imperative will definitely maximise profits.

Yet, such a system cannot be said to be reliable: it is always under pressure to renew, and this cannot be said to be about improving products' efficacy. So it pours Capital through the system via credit, which must then be paid back with sufficient interest – all the time! Short-term returns are the accepted measure of success, and NOT how much is owed and owing by the company involved.

Any loss of confidence among the investors, and a recession or even a Slump will ensue!

In fact, no one can actually repay what is owed. Many loans are taken on to repay previous loans, so the system is never-ending.

Also, no bank can return all deposited money back to account holders, and a run on a bank, if it isn’t rescued by getting a loan itself, will very quickly ruin it.

So, to replace Capitalism, you have to change the whole system from bottom-to-top.

At present it is the holders of most wealth who determine what happens. After all, they hold the real purse strings! And, of course, they have their own purposes (usually to get even richer, or at least protect their wealth and status).

Clearly, such a set up cannot continue forever: it is increasingly unstable, and its crises get more and more difficult to address.

Now, it is clear that Capital is necessary, but that doesn’t mean that Capitalism is inevitable. We have to make a clear decision after a revolution, “Who should hold and invest the wealth?”

Let us learn from what happened in the Russian Revolution!

The major institutions were all nationalised, and without a penny compensation to the ex-owners. They started as thieves, and continued as parasites, so they will get absolutely Nothing!

Now, this aspect was crucial. For what they considered to be their Capital was never really theirs in the first place. It had always been generated by subsequent production, but always ended up (primarily) in someone else’s capacious pockets. When addressing this mess, these thieves shouldn’t get a single penny.

They will have to work for a living, like everybody else.

Capital will not be allowed to go to any private individual. So, who, or more accurately, what should hold and allocate all Capital?

There is only one answer! It has to be the democratic organisations of those involved in its creation – the Working People! At first, it will be in their now-worker-owned Production Companies, but then, later, in their Democratic Organisations, such as the Soviets (or elected Councils) at every single level.

And, no individuals should wield total control of such wealth, even with such people’s organisations. For, they would inevitably use that power to their own personal advantage.

Clearly, though the State will play a role, the real question has to be about the actual form of Democracy that will be involved. And to totally prevent the building of organisations against these principles, there will be NO Stock Markets, and NO private Banks! No singularly powerful groups or cliques will be allowed – only democratic organisations, responsible below to their electorate, and above to their next democratic-level organisation. Clearly no such easy solutions can, this time, be allowed.

What will be involved here is a real Revolution, and by its very nature, exactly what will be created in such an Event, cannot be prescribed completely beforehand. We don’t and can’t know what will emerge, except that such an event is the most powerful creative force that can exist!!

But, we must constantly guard against the rising of individuals and/or groups, who will undermine what is being constructed, to their own advantage. This is the risk.

01 September, 2015

One important consideration when dealing with vortices within a substrate must be to include the initial background state before their creation. For, such a background would most certainly affect those vortices, in addition to their clearly obvious cause: some moving material intruder.

And, what is most clearly shown in the turbulent atmospheres both of the Earth, and even of Jupiter, is that the vortices occurring in those circumstances are not simple consequences of the causing limited-swift-flows, or intrusions of some kind alone.

In the case of Earth’s atmospheric disturbances, one of the key causes-plus-results of the overall situation is the high-speed of the clearly determining Jet Stream, which itself has causes, governed by the heat supplied by the sun and the spin of the planet. Whereas on Jupiter, the initial causes, at least, seem to come from processes inside the planet, itself, along with a similar context of planetary spin.

Clearly, the spinning of both of these planetary bodies significantly affects the moveable, cloaking substrate of the atmosphere, and, once again, the results react back to become further affecting causes in themselves.

Now, though all these considerations are vital, particularly in the examples mentioned on the planetary scale, the key question arises, “What will be different when considering a Universe-wide substrate, composed of micro components, inevitably also extending to within the Atom?”

It is just possible that “down there”, we could profitably assume, as a simplifying, first approximation, that NO such external disturbances will be significant. We could locally assume a totally quiescent substrate only disturbed by the extremely close effects of internal components, and especially the orbiting electron (when considering, of course, a Hydrogen atom as the simplest possible case).

And, unlike the majority of disturbances in atmospheres, the situation within that enclosed space, must be affected, from the outset, by not only delivered effects, but also by the recursive effects of the vortices, caused by the orbiting electron acting back upon their original causes. For, with orbits the conditions are constantly being repeated time-after-time with each and every orbital return of the electron to previously affected parts of the substrate.

And, of course, these “same again” effects will either accumulate to dissociate the atom, or, much more likely, settle, somehow, into a stable and persisting situation.

The situation is crucial, because, almost uniquely, this set up produces Quantised Orbits of the electron involved. And, this means that a fixed set of allowed orbits results, of different radii, and hence a consequent set of fixed energy levels. And, it is these, which allow the storing of electromagnetic energy therein, and, in turn, govern precisely both the frequencies and emitted energies of any released quanta.

The processes involved in these energy transactions are brought about by the demotion of a previously promoted electron, from a higher orbit and Energy Level to a lower one.
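For concreteness, the standard Bohr-model arithmetic for hydrogen (an assumption here: the post does not commit to that particular model) shows how fixed levels pin down the emitted quanta. The allowed levels are Eₙ = −13.6 eV / n², and a demotion from a higher level to a lower one releases a quantum of energy E = E_hi − E_lo with frequency f = E / h.

```python
# Hydrogen energy levels and emitted quanta, using the textbook
# Bohr-model formulas (illustrative: the post's substrate picture
# is meant to explain WHY such fixed levels exist).

PLANCK_EV = 4.135667696e-15  # Planck constant in eV*s

def energy_level(n):
    """Energy (eV) of the n-th allowed hydrogen orbit."""
    return -13.6 / (n ** 2)

def emitted_quantum(n_hi, n_lo):
    """Energy (eV) and frequency (Hz) of the quantum released
    when the electron is demoted from level n_hi to level n_lo."""
    e = energy_level(n_hi) - energy_level(n_lo)  # positive: energy released
    return e, e / PLANCK_EV

# The classic red Balmer line: demotion from n=3 to n=2.
e, f = emitted_quantum(3, 2)
print(f"Energy released: {e:.3f} eV, frequency: {f:.3e} Hz")
```

Because the radii, and hence the levels, form a fixed discrete set, every permitted demotion yields one definite energy and one definite frequency, which is exactly the precision the text ascribes to released quanta.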

Indeed, a comparison with Yves Couder’s experiments in a silicone oil substrate, with the establishment of a stable entity (termed The Walker) in that case, strongly suggests that something similar is relevant for the Atom: both seem to indicate a sufficiently undisturbed environment for such stabilities to be established purely internally.

Indeed, the case of Couder’s Walker seems to suggest a very similar conjunction of resonances of various oscillations plus recursive feedback to be the crucial physical, formative causes for the remarkable occurrences in the atom too.

To get a better idea of what this means, let us take a more common case: the production of vortices by a narrow, fast-moving stream entering and passing through a still pond. It seems a relatively simple situation, but it is nothing like as localised, and indeed “locked-in”, as is likely to be the case within the atom. First, the causing stream is unlikely to come close to the usually-idealised “streamline flow”. On the contrary, it is certain to carry with it disturbances from its own forming history.

So, these are also brought into the still pool.

Also, there are no close constraints upon its subsequent passage, so vortices would be created, and then inevitably left behind, throughout that passage, to simply dissociate into almost random disturbances, while the continuing stream generates more vortices elsewhere, thus contributing to its own ultimate demise as a discernible, coherent stream.

But, within the Atom, on the contrary, the causing orbiting electron is a constantly returning entity, repeatedly interacting with its own previously created vortices. And all of this is happening within the close confines of the atom, within a tiny, local area.

So, as with Couder’s Walker, stabilities could, and indeed must, be possible, if all the interactions are appropriately tuned to elicit the observed special effect of stable quantised orbits.

NOTE: Intended here is the suggestion, by this theoretical physicist, that the physical arrangement of the atom is such that an electron NOT in a stable (quantised) orbit will inevitably lose energy to the substrate, and so reduce its orbit until it matches an allowed level, at which point it will become stable, and will stay the same until some external conditions cause it to change.
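The mechanism in the NOTE can be sketched as a toy relaxation, with no pretence at real dynamics. All the numbers here are illustrative assumptions of mine: the ground radius A0, the n²-spaced allowed radii, and the fixed shrinkage step stand in for whatever the substrate interactions actually deliver.

```python
# Toy sketch of the NOTE: an electron at an arbitrary radius steadily
# loses energy to the substrate (orbit shrinks) until the radius matches
# one of the discrete allowed radii, where it locks in and stays stable.
# All quantities are illustrative, not physical.

A0 = 1.0  # ground-state radius, arbitrary units; allowed radii are n^2 * A0

def allowed_radii(n_max=10):
    """The discrete set of allowed orbital radii."""
    return [n * n * A0 for n in range(1, n_max + 1)]

def relax(radius, step=0.01):
    """Shrink an arbitrary orbit until it locks onto an allowed radius."""
    while True:
        for r_n in allowed_radii():
            if abs(radius - r_n) <= step / 2:
                return r_n  # matched an allowed level: stable from here on
        if radius <= A0:
            return A0       # the ground state is the floor
        radius -= step      # energy bleeds into the substrate; orbit shrinks

# An orbit started between the n=2 radius (4.0) and the n=3 radius (9.0)
# decays until it locks onto the n=2 radius.
print(relax(5.3))  # -> 4.0
```

The point of the sketch is only the shape of the process: continuous loss, then a lock-in at the first allowed level reached, after which nothing changes without an external cause.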

Now, clearly, these few inclusions are nowhere near a full explanation. The reason for getting as far as we have with this problem is the concrete evidence of Couder’s Walker, where, with only a substrate and various oscillations, a stable Walker was not only produced, but also maintained for as long as the producing conditions persisted.

That alone was sufficient to begin to assume the possibility of a similar occurrence within the Atom.

http://fuckyeahfluiddynamics.tumblr.com

But, just as the Walker required consideration and explanation of the properties of the substrate and the bouncing drop, and, of course, of the absolutely essential matching of the involved vibrations, the situation in the atom will unavoidably also involve many other considerations.

For, each element’s atom is different, with different quantized levels, and hence the influence of the many different nuclei (which also perform their own small orbits), will have to be included in a final and comprehensive Theory.

Postscript:

The current state of play in these theoretical considerations has now been taken beyond these brief notes in the SHAPE Special entitled The Atom published in SHAPE Journal on the Internet in July 2015. You can read that issue here.

About Me

I am a retired lecturer and full-time writer. As the truth of Science has been my major concern throughout my life, I cannot conceive of teaching it in an uncritical, passive way. Its truth or error is THE question, and its improvement must be my main purpose. Teaching for me is Philosophy, and that means taking a stand on all sorts of issues, not sitting on the fence!