Wikipedia: Decline Or Maturity?

My recent migration to Debian Jessie, which required fixing my local tweaks for web-applications, ended with some examination of Wikipedia, of which I have a snapshot from 2004 or so. I have edited it a lot and customized it for use in the schools where I worked, but lately I have not maintained it well. Through various upgrades some links broke, and I must have restored a backup that messed up the archive of images. Even damaged, it is a huge asset.

“The first few edits of these newcomers indicate that they are trying to contribute productively (i.e. acting in good faith) and, therefore, likely will become valuable contributors if they remain in the community. We show empirically that, while the proportion of desirable newcomers who arrive at Wikipedia has been holding steady in recent years, a decreasing fraction of these newcomers survive past their initial contributions. We demonstrate that the decline has been caused, at least in part, by the Wikipedia community’s reactions to the enormous influx of contributors between 2004 and 2007. In order to maintain quality and efficiency during this period, the community’s views toward the goals of the project changed. These new views were instantiated in a set of policies, and a suite of algorithmic tools were developed for enforcement. Over time, these changes resulted in a new Wikipedia, in which newcomers are rudely greeted by automated quality control systems and are overwhelmed by the complexity of the rule system. Since these changes occurred, newcomers – including the crucial, desirable newcomers – have been leaving Wikipedia in droves.”

Currently, Wikipedia is one of the world’s great websites, full of information and very accessible. It is somewhat mature in that just about everything has some coverage, but more work remains to be done.
The question of decline or maturity is whether Wikipedia is less vigorous because the job has been largely done, or because changes made years ago have become a millstone weighing it down.

I will describe some of my contributions over the years. Where I had particular knowledge and saw a gap in some article I was reading for my own purposes, I offered edits from time to time. At first, that was that, and I went on with life. Then came a time when whatever edits I made were almost certainly rejected by some nameless creatures in the system, who picked my work apart sentence by sentence because I did not provide proof of every assertion, almost every sentence or phrase. I kid you not. A paragraph could not be contributed; it had to be a list of sentences, each with one or more references to the web. Asserting that the sky was blue was unacceptable. One had to prove it. Stating the obvious and how it related to well-known facts and principles, that is, reasoning, was never enough. To me it was as if binary bits were OK if they were copies of stuff elsewhere, but expressing any idea, however modest, was unacceptable.

Apparently I was not alone in this depressing phase of Wikipedia. According to TFA linked below, thousands of editors have dropped in, contributed, and fled. Wikipedia just isn’t a great place to live any longer.

Don’t believe me? Look at some examples.

GNU/Linux Adoption – when I added this edit:

“There is another reason that web counters are unreliable. Some are clearly connected with use of operating systems in business. For example, when 10000 users at Google’s headquarters moved at once to GNU/Linux in the summer of 2010, [[Net Applications]]’ web stats showed a swing from a few percent to 88% for the city of [[Mountain View, California]]{{cite web|url=http://mrpogson.com/2012/02/28/mountain-view-california-penguin-heaven/|title = Mountain View, California, Penguin Heaven|accessdate = 14 March 2012|last = Pogson|first = Robert|year = 2012|month=March}}, a city of 74000 people, and a swing from 1.87% to 18.69% for [[California]]{{cite web|url=http://mrpogson.com/2012/03/01/penguins-seen-over-california//|title = Penguins Seen Over California|accessdate = 14 March 2012|last = Pogson|first = Robert|year = 2012|month=March}}, a state of 37 million people. Clearly, 10000 people is a small change in usage but was over-counted because it was used in business. The fact that Windows is heavily used in business results in low numbers for GNU/Linux. Web-counters can readily select for business usage by counting during hours of business in a location or counting only clients from business Internet domains.”

I ran afoul of the “no original research” rule, and my contribution was tagged with a “conflict of interest” because I provided a link to my website. In the discussion of my contribution, you can see the problem:

“NetApplications does not publish the charts, only values month by month and location by location. My blog collects their data. I am not the source. Do I need to cite the URI for each of dozens of datapoints? The paragraph I added is about the change and how and why it happened, clearly showing bias in Net Applications’ numbers. I have an M.Sc. in Nuclear Physics and know how to analyze data. The wide publication of “1%” is clearly wrong information coming from Net Applications and my collection shows that quantitatively. Assume a world using 90% GNU/Linux and Windows adoption at Google: 10K people showing huge adoption that is not valid. Pogson (talk) 13:29, 14 March 2012 (UTC)

Your own blog is not an acceptable reference as per WP:SPS, regardless of what qualifications in stats you have. You can’t cite data points with your own graphs and interpretations about them either as this would be WP:OR and WP:SYNTHESIS and is specifically not allowed on Wikipedia. You need to find proper reliable independent third party refs to retain this text in the article. – Ahunt (talk) 13:45, 14 March 2012 (UTC)

The refs you added do not support the claims you have made in the text, so I restored the “citation needed” tags. Since you removed those again the only choice remaining is to remove the challenged text as per WP:V, which I have done. Please don’t add it back in without proper, reliable refs that actually support the text this time. – Ahunt (talk) 16:30, 14 March 2012 (UTC)

Sounds good, maybe we should archive these older threads as well to stop people adding to discussions from some time ago (hard to follow). IRWolfie- (talk) 09:48, 15 March 2012 (UTC)”

So, how does a contribution to human knowledge make it into Wikipedia? Politics. Popularity. Stuff like that. It’s not enough to be correct or useful; information has to be acceptable to some elite in the organization. Wikipedia has lost its way.

The words that greet me on Wikipedia, “Hello, Pogson! Welcome to Wikipedia! Thank you for your contributions to this free encyclopedia.”, are hollow and a sham. If Wikipedia were really free, the powers that be would be all over my contributions, improving them rather than deleting them. What does their page say about the reliability of web stats these days? Well, that’s water under the bridge. The Linux Adoption page is quite different now: just a line or two on the matter, though they still cite others’ original research on the topic.

Really, how vital is an organization of house-builders that insists on bulldozing each other’s work?

About Robert Pogson

I am a retired teacher in Canada. For almost forty years I taught in the subject areas where I have worked: maths, physics, chemistry and computers. I love hunting, fishing, and picking berries and mushrooms, too.

85 Responses to Wikipedia: Decline Or Maturity?

System restore points work if you are just bringing the system back to some set of installed packages. They have nothing to do with what malware or users have done, and they certainly would have been useless for what I had to face: systems on XP SP1 long after SP3 was out. They were also using FAT… I found a unit that functioned more or less correctly, updated it for many hours, and distributed an image of its hard drive to all similar units. There were four different types of systems I had to do that for, a heck of a lot of work, and still the XP systems quit on us. With GNU/Linux there were literally no software problems for more than a year before I left. So, just shut up, twit. You weren’t there. M$ had no solutions we could afford. I did the best I could. No one in the organization had proper paperwork for licensing the software, and stickers were gone on several machines. I used them as servers to hold the images. All I had were 40 GB hard drives, one on each machine. You bet we were short of storage for holding XP together.

To be accurate, System Restore was changed to volume snapshots in Windows Vista, but, typically for Microsoft, the interface between the GUI and the volume management was stuffed up, leading to magically disappearing disc space caused by unreported snapshots. It is only in Windows 7 that System Restore works completely without causing strange issues. Vista’s rollback itself is perfect, but its tracking of restore points is not: the Vista bug is that the restore-point list is stored in a file on the very partition that gets rolled back. Of all the ways you can build a volume-restore system and shoot yourself in the foot, Microsoft found them all.

DrLoser, System Restore points don’t work correctly in anything prior to Windows 7. XP restore points are made by a file-system operation intercept that is not perfect. The result is that alterations System Restore should undo when you go back to a point in time get left behind. As the crud of incorrectly rolled-back changes stacks up, you are in more and more trouble. This is why, the more you use System Restore on XP, the more likely you are to end up reinstalling the thing from scratch. You are better off using Clonezilla, FOG or Norton Ghost on XP, as these take full system snapshots. Windows 7 and later switched to volume snapshots for System Restore, which truly catch every single change. That change was made because of the multiple failures that had been reported with XP.

System Restore points are another example of Microsoft implementing a feature the wrong way and needing multiple versions to get it right, even though the correct option already existed in Windows.

DrLoser, since you focus only on Linux issues, you make stupid suggestions about what people could have done with Microsoft products. In schools here in Australia, imaging XP machines with Ghost was standard operating procedure.

The rules for running XP are quite simple. The problem is that if you don’t know one of the rules, your actions will cause disaster.
1) XP Home: never network it. You will regret it every single time.
2) Don’t use System Restore. Instead use a drive-cloning product, open or closed source.
3) Have anti-virus products installed.
4) Keep updates installed. Remember that the Windows Update service sometimes stuffs itself and will report to WSUS that updates are installed when they are not. This issue, which afflicts XP and 2000, is fixed in Windows 7: Windows Update may still fail to install updates, but the failure is reported to WSUS.

Pretty much anything prior to Windows 7 is a big pain to administer perfectly. Not knowing all of the XP issues will leave a bad taste after doing admin work with it, because Microsoft’s documentation tricks you into thinking things are fine when they are extremely bad.

It was not clear to anyone whether or not there was a proper licence for anything

This “anyone” being you, I assume, Robert.

Now, I’ll accept that you were parachuted into a bit of a legal pickle here. What I don’t understand is why you didn’t write a position paper of about a page or so and forward it to the school administration. That is what they are there for.

In the meantime you are covered, and you can make the assumption that what you are left with is “legal.” Paper trails and good intentions are the way to go, I think.

It was not clear whether or not backups had to be per machine or per type of machines. Anyway we did not have the storage for per machine. We just moved to GNU/Linux and everything worked as it should.

We had a mixture of machines where I last worked. I do recall some were XP Home but I can’t remember the others. I think XP Pro was the other choice. I don’t think we had those kinds of problems because students didn’t do much sharing, teachers did and their PCs were standard XP Pro as I recall. It was a chaotic system that was installed years before I arrived and was in chaos when I arrived. It was barely functional and I could not afford the effort to fix it if that was even possible/legal. It was not clear to anyone whether or not there was a proper licence for anything… It was not clear whether or not backups had to be per machine or per type of machines. Anyway we did not have the storage for per machine. We just moved to GNU/Linux and everything worked as it should.

This thread has really derailed. Interesting how it became Microsoft shills insulting users and developers of other systems. What a public relations strategy!

Just as a reminder, my original post to the article said:

“Wikipedia is normally so wrong as to be useless. Certain plants that are growing in my backyard, they claim, are extinct, and nothing can convince them otherwise. Their history of audio and video CODECs is complete nonsense and disagrees with (older) standard reference books on the topic. The examples are endless.”

Nothing was said about any operating systems, although I can understand how Microsoft might be a little upset they were conned for really big bucks by an organization claiming to have “invented” a certain CODEC.

Kevin Sorbo wrote, “This is why traditionally, buying a PC sans Windows didn’t result in any significant savings, or even cost more in some cases.”

I bought many PCs without that other OS and saved a bundle. Around the time ATX desktop PCs were selling for ~$700, I was getting beauties for ~$500. I certainly saved ~$100 on the OS and ~$100 on the CPU because I didn’t take what Wintel offered.

You guys are naive if you think crapware pays OEMs that much. How does the crapwarist get paid at all? Most crapware I’ve seen on PCs was from ISPs and anti-malware vendors. I don’t think their combined payments would be anywhere near the cost of a licence. At the height of its power, M$ banned all that anyway. ISPs mostly get paid for the “last mile”, which is not over in Taiwan. Many consumers just use the freebie anti-malware they get on the web.

You can find lots of examples on the web where a PC with that other OS costs more, and only a few where it costs less. E.g., here’s a comparison of a no-OS machine and the same machine with that other OS. TOOS costs more.

If DrLoser’s nonsensical proposition were true, M$ could just give the software away and charge the makers of crapware directly: “Want to sell a copy of your crapware? Pay us $X first!”.

Your understanding of this process is naive at best.

When it comes to Operating Systems, unless you buy a box copy, Microsoft isn’t in the business of selling directly to consumers. The OEM buys the Windows license, integrates it into their system, and sells it to the consumer. This has been MS’ business model since the time of XENIX.

Now, the OEM has two choices: it can pass the cost of the license along to the consumer, which long ago, before Microsoft established itself as hegemon, was likely the case, or it can bundle crapware to offset the cost of the license. You see, crapware vendors _PAY_ OEMs to bundle their software. This is why, traditionally, buying a PC sans Windows didn’t result in any significant savings, or even cost more in some cases.

DrLoser, things move on: iptables has been replaced by nftables. nftables addresses a large number of the complaints, like having to create IPv6 and IPv4 rules independently of each other; it supports a single rule to block or allow traffic over IPv6 or IPv4. Even so, most desktop setups of Linux today use firewalld. Firewalld is very much like the firewall in Windows 8, with zones and all the other nice things. It even has graphical parts to report when the firewall is blocking something and to let the user allow it, and it can be network-managed with very little work.
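The dual-stack point can be sketched with a minimal nftables ruleset. The `inet` family applies each rule to IPv4 and IPv6 at once, where iptables needed parallel iptables and ip6tables rule sets; the port numbers here are purely illustrative:

```
# /etc/nftables.conf (sketch): one inet-family table covers both protocols
table inet filter {
    chain input {
        type filter hook input priority 0; policy drop;
        iif lo accept                         # allow loopback
        ct state established,related accept   # allow replies to our own traffic
        tcp dport { 22, 80, 443 } accept      # one rule, both IPv4 and IPv6
    }
}
```

A ruleset like this is loaded with `nft -f /etc/nftables.conf`; the equivalent under iptables would require every `tcp dport` rule to be written twice, once per protocol family.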

DrLoser, time has moved on. Most of the issues you are complaining about don’t exist any more. Six years ago, yes. Today, no.

DrLoser, I have rebuilt from source thousands of times. In fact my results from doing so are in Bugzilla, where I was testing applications against Wine versions that were not shipped as binaries for the distribution I was using. I can tell you today it is different, because the number of things I have to rebuild is shrinking. Scribus, Blender and a long list of other applications: if I need a newer or older version for some reason, I just download the developer-provided binaries and use those, bypassing the distribution’s supply. So more and more Linux users have never built a binary.

But there are issues to be aware of. Say I wish to set up a local repository for a Debian/Red Hat/… distribution, for hosting and controlling updates locally. All that is required is an HTTP or FTP server, so a Linux update server can even be a Windows machine; the software to manage the repositories is available for Windows too. Windows, on the other hand, mandates that its update server run on a server edition of Windows. There is a security problem here: since the update server’s OS type is predictable, attackers can go after it to infect the complete network. A fun feature of group policies under Windows is that you can push a new signing key out across the network, allowing custom packages on the Windows update server to install without requiring authentication.
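A minimal sketch of the point that any plain web server will do; the paths and hostname here are hypothetical:

```
# On the update server: serve the package tree over plain HTTP.
# Any web server works; Python's built-in one stands in here.
python3 -m http.server 80 --directory /srv/repo

# On each client, one line in /etc/apt/sources.list points apt
# at the local server instead of the public mirrors:
deb http://repo.lan/ jessie main
```

After that, `apt-get update && apt-get upgrade` on the clients pulls everything from the local server, and nothing about the server end is specific to Linux.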

Windows XP had a lot of issues in schools and businesses. The issue is the volume of machines sold with XP Home. One feature Microsoft removed from XP Home was byte-range file locking, and the problem is the way they removed it. The locking functions did not fail outright: they reported to programs that they held the lock, and reported this to every program that asked, even though no lock had been applied, thereby triggering read/write ordering stuff-ups. With XP Home machines mixed into a stack of XP Pro machines, nasty things happened. With a network of all XP Home machines, nasty things happened. XP was reasonable if you had Pro. XP Home was 100 percent not good.

One of the Microsoft issues is the intermittent release of lemon OSs that we have to deal with. You never asked Robert whether any of those five XP machines were XP Home, because if even one of them was, that would explain why Robert got a bad taste in his mouth about how Windows networking performs. Windows 9x had the same issue of failing to honour locking over the network. Sometimes maintaining backwards compatibility in form but not in function causes massive pain; XP Home is one example of this.

DrLoser, you will say the network should have been all XP Pro. Explain how you are going to handle your boss turning up with an XP Home machine and demanding you connect it as-is. Remember, in schools your bosses are not IT-smart. They see “Windows XP” on the box, so they think the machines will function identically. Now you are in hell, and Microsoft caused this hell by their actions.

DrLoser wrote, “I am going to state that the cost to the typical (yes, ram, by “typical” I mean at least 50%) home user of a PC for the Windows OS and a browser and a starter edition of an Office suite that interoperates with everybody all over the world and sundry other things is precisely zero.”

This totally ignores the fact that it is the consumer paying for everything in the supply-chain of PCs: the retailer, the OEM, the maker of crapware, all of them. If DrLoser’s nonsensical proposition were true, M$ could just give the software away and charge the makers of crapware directly: “Want to sell a copy of your crapware? Pay us $X first!”. There’s a reason M$ gets paid by the consumer who doesn’t see the price as a line-item. M$ gets more that way. If the makers of crapware knew what tax they had to pay, they would write for some other OS, like GNU/Linux.

Clarence: “He said $130 for software, Mr. Pogson. That doesn’t match up with OS prices for Windows at all. Plus it was a Wintel notebook to begin with, so why would it need a new OS? 99.44% of the laptops in the world are trashed with the same OS as they were purchased with, I am sure.”

There’s one, denying that re-installing that other OS is a huge burden on society. I’ve met with many people who thought it was “normal” to spend $hundreds re-installing that other OS to “fix” it. I’ve never re-installed GNU/Linux except for testing or the occasional radical change.

oldman: “My reality is that I have no problems with windows. From my brand new “fire breather” portable running windows 7 x64 to my 8 year old dell running windows XP SP3, all run the applications that I need them to run without effort or event. That is my reality, whether you like it or not.”

Then there’s blaming the user for M$’s vulnerabilities: “You FAILED TO DISCLOSE whether you have done your updates and whether you installed software from untrustworthy sources. Is it because such disclosure would not favor your position, because I KNOW you either didn’t do your upgrades or you installed stuff from untrustworthy sources, and THAT’S the reason you got what you got.”

This despite a history of “drive by” malware, worms, attacks by LAN and USB drive and FLOPPY…

Oh, and have you come to terms with the obvious fact that Wikipedia, or indeed any global site, is hardly going to listen to you if you redefine the terms in question (Linux => Gnu/Linux, how pointless is that?) and insist on using your own personal and unverifiable calculations as cites for the three major claims you made?

I would, if I were you. It’s the peril of posting something like this. Remember, this all started off as Wikipedia, Decline or Maturity?

I am the same guy that organized 100+ computers to work smoothly together in several schools with far less trouble than those five PCs caused me.

This is known in the scientific community as “anecdotal evidence.”

Now, to be absolutely fair, we could divide your experience into two halves:

1) A Gnu/Linux network of 100+ clients. As you may remember, I pointed out that neither you nor I am an expert in XP networks. This is what we call “works for me” in the IT business.

And there’s nothing wrong with that.

2) A trivial number of non-networked machines, in this case five, running some form of XP on some unspecified form of hardware.

I’m terribly sorry to hear you had a problem with that, Robert. But, you know what? Anybody with any understanding whatsoever of how to install a commodity OS (XP SP3 at the time) would have coped with no problem.

I mentioned NT network installations of 100,000+ instances across the USA and Canada. They seem to work pretty well.

And you couldn’t have sorted out five XP SP3 desktops, presumably with a separate Admin/root and User(s) account?

Doesn’t that simply suggest that, highly qualified as you were and are, you were simply not up to the job?

Jeez, even an ignoramus like me could have slip-streamed an image onto a DVD in half a day, and used the other half of the day to distribute that image.

DrLoser blathered on, “You were clearly incompetent when it came to organising a small school network of Win95 or Win98 machines.”

What organizing? It was a consumer OS and it would crash just sitting there, or with students running a browser or word-processor. The OS was supposed to be managing resources, not me. I am the same guy who organized 100+ computers to work smoothly together in several schools with far less trouble than those five PCs caused me. If my incompetence caused problems, why did all the problems disappear once I installed GNU/Linux, an OS I had only seen running once and had never installed before? Just lucky, eh?

Yes. They claim M$’s product works for everyone of any level of capability, and then attack anyone who doesn’t get it to work as incompetent. It’s quite dishonest to do that.

I don’t expect the onerous level of Wikipedia citations here, Robert.

I don’t even expect the rigour of a scientific journal, peer-reviewed.

But “they” (ram’s completely theoretical “Microsoft shills”) have not done either of the following, have they?

1) They claim M$’s product works for everyone of any level of capability.

Citation, please. And in this case you are welcome to use a self-reference.

2) … and then attack anyone who doesn’t get it to work for them as incompetent.

That’s a fair accusation, Robert. You were clearly incompetent when it came to organising a small school network of Win95 or Win98 machines. To be frank, I would have been equally incompetent, pre-NT days. Your Tales of the Frozen North suggest that you, admirably, carried the same level of incompetence through to an XP network.

Now, I wouldn’t attack you in any way whatsoever, just because you were and presumably still are incompetent at setting an XP network up. I’ve fiddled around with those things, and I’m only twice as competent as you … still worthless, in other words, and I put a lot more effort into it.

But the world doesn’t revolve around the likes of you and I, does it, Robert?

There are probably tens of thousands of Microsoft Network Professionals out there in the USA and Canada alone … and I imagine they are all better trained and more knowledgeable than either of us.

As proof, I offer the inarguable fact that there are hundreds of thousands of NT-based networks out there, all working to a high degree of operational efficiency, practically none of them requiring a trained experimental physicist with an MSc to run around carrying an assorted collection of CDs or DVDs.

It’s what you or I would do, Robert, because in that environment, we are both hopelessly incompetent idiots.

But it’s a little tough to have a go at Microsoft just because despite their best efforts, they can’t convert either of us from being a hopeless incompetent idiot to a fully-fledged network professional.

Not a single one of you people have ever used configure/make/make install, have you?

Well, maybe two or three. And almost certainly in cases at the back end of the long tail, say a specific tgz for instrumentation or something.

Why do I suspect this? Because I have done it dozens of times.

And I have dealt with the dependencies. And hand-editing the make scripts (or their antecedents). And fiddling with gcc flags. And making big choices, such as whether to use static or dynamic libraries. And searching the Web for all the bits and pieces I need.

Unlike you lot, I, the Microsoft Shill, have done this, because my job required it. And I didn’t complain about the cost. (Which was with 100% certainty, nota bene ram, more than ~$20).
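For anyone who hasn’t, the ritual being described is the classic three-step source build. A sketch, with a hypothetical package foo-1.0 standing in for the real thing:

```
# Unpack a hypothetical source release and build it the classic way.
tar xzf foo-1.0.tar.gz
cd foo-1.0
./configure --prefix=/usr/local   # probe the toolchain, generate Makefiles
make                              # compile; gcc flags and static-vs-dynamic
                                  # library choices bite at this step
sudo make install                 # copy the results under the prefix
```

Each missing dependency typically surfaces as a `configure` or link error, to be hunted down and installed before re-running the step that failed; that hunt is where the hours (and the cost) go.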

And yet I still failed to address your “~$20” claim properly, didn’t I? My apologies. Here we go again. On a single installation, there are only two possibilities:

1) You get a highly trained technical expert, who has kept up with the latest Debian developments, to do this for you. Well, Robert, you were a scarce resource at the time. $20 sounds fine for half an hour of your time, but you’ve forgotten the cost of flying you in from the sub-arctic.
2) You do it yourself. Let’s assume you rate your hourly time at minimum wage, say $5 per hour. Let’s assume you are technically literate (this is a big assumption and invalidates your case for all but the 5% elite), but have not spent your life immersed in gooey Gnu goodness, and struggle a bit over the CLI and daemon startup scripts and what-not.

I’m guessing half-a-day, or $20. And, anecdotally, this is what it typically takes me, and I don’t suffer from any of those lacks of skill. And I, like you, Robert, am not paid minimum wage.

And then there’s tweaks to X. And tweaks to the audio. And maybe finding the proper driver for the network card … did I mention configuring firewalls and so on, which are a serious pain with iptables … and various other contingencies.

And then there’s Wiping Baby’s Bottom when the next set of updates comes out.

All told, and ignoring the sheer pointless inconvenience of it all, I would guess that the time/money thing is closer to $100 or $200 than it is to $20.

But, let’s say it’s only $20. The M$ bulk license for a single machine is, as I say, $30 – $50. Let’s take the top end of that. And let’s completely ignore my point that the cost is subsumed into the product as a whole.

Do you really think that people care that much about $30?

Because, although you personally might be enraged about this onerous imposition … nobody else gives a flying monkey’s.

Actually, to be fair, I got that claim wrong, didn’t I? I was distracted by oiaohm’s wall of gibberish. I apologise. So, we’re talking about this claim that a Gnu/Linux installation costs ~$20 in time/money:

Installations of Debian GNU/Linux from my local repository (on the LAN) took about 15 minutes if I had to interact with them. Imaging-type installations were several minutes faster. Add a bit of customization and this could stretch to half an hour, $20 worth of my time. It’s the time it takes to copy the software under a licence that permits any number of copies by any means.

Let’s leave aside the possibility offered here that the ~$20 is a one-off cost, and that after that all one needs is a disk image at effectively $0. I’m dubious about that proposition, too, but it wasn’t your original intention and we can discuss that separately.

I am going to state that the cost to the typical (yes, ram, by “typical” I mean at least 50%) home user of a PC for the Windows OS and a browser and a starter edition of an Office suite that interoperates with everybody all over the world and sundry other things is precisely zero.

Do I mean, because that cost is a hidden cost absorbed by the OEM? No, I don’t. I mean zero. Why?

1) The OEM offsets the bulk license ($30 – $50, I don’t know exactly) of the license via crapware. Feel free to have a go at crapware. I do not like crapware any more than you do. But it helps offset the cost of the license.
2) OEMs do not build your hardware in a garage any more. They source, integrate, and test the bejeezus out of the various parts to make sure that the end user has a reliable, compatible system. (And part of that M$ license fee, btw, is to ensure backward compatibility in the vast majority of cases — ram, take note, I am talking about at least 95%, almost certainly 99.9% or more.)

“But I don’t need that!” you say. “I want a barebones PC without all of that, and I want to install Linux on it!”

Even if Linux desktop adoption was 5% (it isn’t), that means that the OEM would have to maintain two product lines: one for the ignorant masses, with all of that nasty QA and compatibility and so on, and one for the elite 5%.

You’re fond of inappropriate analogies, Robert, so you should be able to see the fundamental problem here.

Large-scale factories do not do that, ever.

Well, almost never. I mean, if the elite (you) want to pay a premium price for non-commodity produce, then the elite will find a marketplace for their needs. Apple springs to mind.

DrLoser wrote, “How, precisely, did you come to this rather interesting figure?”

When I was a teacher I was paid >$40/h.

Which was either incredibly good value for a fully-trained professional network architect, Robert, or a complete waste of money on somebody who would have been better employed doing his actual job, ie teaching.

Noun 1. state of the art – the highest degree of development of an art or technique at a particular time; “the state of the art in space travel”

Somehow I don’t believe that either you, Robert, or the large but dwindling number of XP users, think of Windows XP as “state of the art.”

You can state that “for all practical purposes, an apple is a hippopotamus,” but all you’re doing is confirming the logical fallacy of your premise. People don’t use XP because it is “state of the art.” They use it because, for some reason, it suits their needs.

Ever notice how the Microsoft shills always attack the person and not the issues?

It’s possible to do both, ram. For example, whilst claiming that you would have to be an ignorant buffoon (at best) to make the claims you did, I was at least prepared to detail the issues.

One thing I have noticed is that Linux zealots, when presented with a robust counter-argument based on their spurious and inaccurate claims, almost always fall back on personal insults, bizarre accusations of M$ bribery/shilliery — for the record, I worked for M$ for 2 1/2 years and was paid on an hourly basis for actual programming; I have never once received so much as a free or even discounted phone or tablet, let alone a bribe — and the trusty old “Waaah! Mommy! The nasty man is being nasty to me!”

Therefore you win. Not.

Obviously it’s too difficult for you either to admit that your statements were factually incorrect, or even to attempt an argument in their defence.

But it’s all right to feel sorry for yourself, ram. I feel sorry for you, too.

And while we’re on this “ever notice that…” claim, have you actually counted up the personal attacks and/or insults on this site? I’m prepared to bet that the totals for a single week, let’s say 13th – 19th October, will demonstrate beyond doubt that the majority of “personal attacks” come from those claiming to represent FOSS.

Not from the non-existent “Microsoft shills,” although for clarity and the sake of argument I am prepared to lump in all non-FOSS advocates (including Kurks) in that latter camp.

“So, the dishonesty is with olderman who puts words in others’ mouths/keyboards.”

Really? Let me take that one step at a time.

“So, in a practical sense XP is state of the art for many people.” But in reality it is NOT state of the art. And no amount of verbal handwaving can change the fact that it is not state of the art. All of your efforts to criticize the current running version of Windows based on your experiences remain dishonest.

“I have no idea what “there are no differences in the behavior of the linux versions of FOSS and their more popular windows/OS X versions” means in this context. I doubt I have ever said or written that.” When confronted with the reality that most of the popular FOSS runs on Windows and OS X as well, so there is no reason to change to Linux, you have responded by touting Linux as a “superior” platform, without being honest about the sometimes jarring differences in behavior that one can encounter when moving from FOSS like Firefox on Windows to Firefox on Linux.

Of course, given the fact that you refuse to actually use a modern version of Windows, even to the extent of being able to bolster your own arguments, this is not surprising.

“Unlike olderman, I have worked with a variety of students and teachers from diverse backgrounds and I do know what people do with their PCs. ”

Well, Olderman has worked with an even wider group of people than you could dream of, Robert Pogson. That experience included helping a family friend who was a high school teacher build an introduction-to-computer-science course around a computer lab filled with obsolete, ill-maintained computers. With my help he did it without throwing out the baby with the bathwater as you did. Your claims to know what I do are about as pure dishonesty as you can get.

“Many professionals from a wide variety of disciplines use it, in particular, teachers. They are not hobbyists. ”

More dishonesty. Many of those professionals you speak of are, however, experimental scientists running custom programs on SERVERS running Linux. Servers, BTW, that aren’t even running a GUI. Interestingly enough, most of those same scientists also do the bulk of their non-experimental work on desktops running Windows or OS X, not Linux. And in that I, unlike Robert Pogson, speak from experience with the real world, not ancient history filtered through biases.

olderman preached, “It is dishonest to propose a hobbyist/tinkerer OS as the primary desktop OS for a would be computer user who isn’t interested in tinkering or hacking around to solve a problem. It is dishonest to make believe that there are no differences in the behavior of the linux versions of FOSS and their more popular windows/OS X versions. It is dishonest to make blanket statements about what a computer user does or does not need to have as part of the function and feature of their computer. And it is especially dishonest to talk about window XP as if it is still the state of the art for microsoft OS’s.”

Working my way backwards through that pile of …:

M$’s XP is still widely used in the world, probably on half the desktop/notebook PCs in China, 9% of PCs in Canada, 12% of PCs in USA etc. So, in a practical sense XP is state of the art for many people. M$ wanted it that way and took steps legal and illegal to make that happen far beyond the characteristics of the product and its price. When XP was created, M$ was a very profitable and huge corporation and if it didn’t create a product that was state of the art, that was their choice, their fault, not mine.

Unlike olderman, I have worked with a variety of students and teachers from diverse backgrounds and I do know what people do with their PCs. I was in one place where a family had a desktop PC and no connection to the Internet, not even a phone. M$’s OS shut them down because of that and I liberated their PC by installing GNU/Linux. Why did they have a PC? Only to play CDs… Similarly, I’ve met people who only browse Facebook and nothing else. I also know what various schools do with IT and GNU/Linux could easily meet all their needs. If other people are locked in to Wintel, that’s their problem, not mine. IBM, RedHat, Google, Munich and many others find that by and large 80% of users of PCs can drop in GNU/Linux and carry on. Others may have to hunt for a substitute for some application(s) or run an instance of that other OS in a virtual machine or terminal-server but there are very few who must have M$’s OS.

I have no idea what “there are no differences in the behavior of the linux versions of FOSS and their more popular windows/OS X versions” means in this context. I doubt I have ever said or written that. I’ve seen many applications such as browsers and media players that worked very well under several operating systems, as they should. An operating system is supposed to manage resources, not interfere with applications. Indeed, trolls here have argued that because FLOSS works the same on that other OS and GNU/Linux, there’s no need for GNU/Linux. ilia: “‘and other Software’ is the key, Linux has very few of it and the best of it can work on Windows, when you buy Windows you can use almost all FLOSS and Windows-specific software, with Linux you can use only FLOSS and a small subset of Windows software which can run under Wine. And many and many of Windows proprietary software is superior to FLOSS. Thus buying Windows you get access to much more software of high quality.”

GNU/Linux is not a hobbyist OS despite it being readily available to hobbyists. Many professionals from a wide variety of disciplines use it, in particular, teachers. They are not hobbyists. Linus Torvalds is not a hobbyist. He was a skilled programmer trained in the computer arts at post-secondary school. Same for RMS. The folks at Munich, Google and IBM that use it widely are not hobbyists.

So, the dishonesty is with olderman who puts words in others’ mouths/keyboards.

It is dishonest to propose a hobbyist/tinkerer OS as the primary desktop OS for a would-be computer user who isn’t interested in tinkering or hacking around to solve a problem. It is dishonest to make believe that there are no differences in the behavior of the linux versions of FOSS and their more popular windows/OS X versions. It is dishonest to make blanket statements about what a computer user does or does not need to have as part of the function and feature of their computer.

And it is especially dishonest to talk about Windows XP as if it is still the state of the art for Microsoft OSs.

DrLoser, as per LSB standard requirements, all LSB distributions must be able to install LSB RPM packages. Debian and most other distributions that do not use RPM internally have a program called alien for the job. Even so, a lot of upstream developers decide to go for the old classic tar.gz file that you just extract into a directory somewhere and run from that location, avoiding packaging completely. Firefox and Blender are two big examples of software packaged in a tar.gz to be extracted by the user into any directory they like. There are in fact a lot of programs for Linux packaged in the tar.gz extract format.
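The extract-and-run tar.gz style described above can be sketched end to end. Here “myapp” and its paths are invented stand-ins (Firefox and Blender ship real tarballs of the same shape); for the RPM-on-Debian case, the conversion step would be roughly `alien --to-deb package.rpm`.

```shell
# Build a stand-in tarball so the sketch is self-contained
# ("myapp-1.0" is hypothetical).
mkdir -p /tmp/demo/myapp-1.0/bin
printf '#!/bin/sh\necho "myapp 1.0 running"\n' > /tmp/demo/myapp-1.0/bin/myapp
chmod +x /tmp/demo/myapp-1.0/bin/myapp
tar -czf /tmp/demo/myapp-1.0.tar.gz -C /tmp/demo myapp-1.0

# The user side: extract into any directory and run in place,
# no packaging system involved.
mkdir -p "$HOME/apps"
tar -xzf /tmp/demo/myapp-1.0.tar.gz -C "$HOME/apps"
"$HOME/apps/myapp-1.0/bin/myapp"   # prints: myapp 1.0 running
```

Nothing here touches dpkg or rpm; removal is just deleting the directory.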

What about the entire OS ecosystem? Forget the kernel and whatever broken version of glibc this wonder app depends upon. How about the dozens of other shared object libraries?

LibreOffice, Blender and Firefox ship with almost all their shared libraries bundled, so this argument is mostly a joke. It’s not dozens; compatibility comes down to about 5 parts: glibc being the biggest, the kernel being the next, the OpenGL interfaces, and finally GStreamer and PulseAudio. glibc and the kernel are mostly “version X or newer”. Most of these became ABI-stable in 2010. It’s just taken developers a little time to wake up to the fact that they can produce distribution-independent packages that work.

DrLoser the Linux world changed in 2010 and trolls like you have missed the event.

More and more Linux programs are provided by developer direct as well as by Distributions. Most of the developer direct packages are distribution neutral.

Scribus provides the same files for many different distributions, in each distribution’s update-system format. Please note that Debian-based and Red Hat-based distributions both support developers registering their own repositories.
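Registering a developer repository on a Debian-based system, as described above, is one line of configuration. A hypothetical sketch; the URL is invented for illustration, and a real project documents its own:

```text
# /etc/apt/sources.list.d/scribus.list  (hypothetical URL, for illustration)
deb https://downloads.example.org/scribus/debian stable main
```

After an `apt-get update`, upgrades to the application then flow through the distribution’s normal update system alongside everything else.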

DrLoser, if you are running into an application that does not provide a distribution-independent binary, that’s a bug by that upstream.

Steam by Valve has been great at forcing developers to look at how they have been treating Linux and to wake up to the fact that releasing universal packages is not impossible.

The myth that you cannot release universal binaries for Linux is dying. Ship your own run-times: no problems. With some of the programs, past and present, that give users trouble, it’s kind of in-your-face that there will be problems. A giveaway: when the Windows installer is 40MB and the Linux installer is 2MB, you basically know the developer has been a prick and has not bundled the runtime in the Linux installer. Windows and distribution-independent Linux installers of the same application should be fairly close to the same size. The reason Windows applications work across Windows versions is the bundled runtime.

Distributions save bandwidth by having a universal runtime shared between all applications inside the distribution. The problem with a universal runtime is forced updating to get access to new stuff. As proven by many open source and closed source applications for Linux, there is no requirement for application developers to use the majority of the distribution-provided runtime.

Sorry, DrLoser, but if all the problems you described were real, closed source Steam would not work on many distributions, and neither would many other closed source programs. Yes, closed source applications may be the minority on Linux, but there are still quite a few of them, including high-end CAD software. Almost all the high-end CAD software supporting Linux releases only one binary package for all distributions.

ram wrote, “Ever notice how the Microsoft shills always attack the person and not the issues?”

Yes. They claim M$’s product works for everyone of any level of capability and then attack anyone who doesn’t get it to work for them as incompetent. It’s quite dishonest to do that. I’ll never forget the poor lady who tolerated a system that had slowed to ~5 minutes per click because she was afraid a change to the system would risk her files… That was XP. I switched to GNU/Linux because (no other reason at all) Lose ’95 would not run without interruption in my classroom. That software was installed at the factory by experts and I did nothing to modify it. It was always crashing/freezing. Munich switched because M$ would no longer support NT4(?) and they wanted off the Wintel Treadmill. They were capable enough to replace their whole IT system with something based on GNU/Linux yet the trolls call them incompetents even though they saved $millions in the process. Sick.

DrLoser being way off-base, wrote, “Forget the kernel and whatever broken version of glibc this wonder app depends upon. How about the dozens of other shared object libraries?”

Because FLOSS licences don’t expire and copying is allowed, if one can find the distro on which an application ran way back when, one can almost always install it. If newer hardware won’t work with an old distro, one can just run it in a virtual machine, many of which have ancient interfaces to cover this case. There’s still source code as a final backup. I have, just for fun, installed an old distro and StarOffice 5.2 from the old days and it worked just as it did on my first installation. The last time I did such a thing I used a RedHat archive on a 15-year-old PC. Installation took forever because the machine was so slow and had no USB or CD drive, but the software ran perfectly. If I recall correctly, I found a floppy disc in the back of a dusty drawer, copied RedHat’s installer onto it and after a few tries got it to boot and install. There was no PXE mode, just Ethernet and that floppy. If that floppy drive had not worked I could have tried a serial link but I doubt that would have been as much fun. Virtual machines, OTOH, are about as fast as their host.

DrLoser wrote, “How, precisely, did you come to this rather interesting figure?”

When I was a teacher I was paid >$40/h. Installations of Debian GNU/Linux from my local repository (on the LAN) took about 15 minutes if I had to interact with them. Imaging-type installations were several minutes faster. Add a bit of customization and this could stretch to half an hour, $20 worth of my time. It’s the time it takes to copy the software under a licence that permits any number of copies by any means. For some thin clients I set up, it was even faster. I could boot them over PXE and load software that would copy a PXE loader to the MBR in about a minute. Reboot from the hard drive and a session opened up to the local terminal server. A dedicated thin-client OS may need only a few seconds to transfer over the LAN. The largest installation I did required as little as 3 minutes to unpack from the box, install a base, record the NIC’s MAC and stack it on a trolley. Another few seconds to record the data on the DHCP server and we were done all the per-client work, so ~$20 was an upper limit. There is also a means for unattended installation from a list of packages, so a single person could set up as many machines as could be connected to the LAN at once and let her rip, bringing the per-PC time down further.

I once worked with PCs that came set up for PXE out of the box. It was plug and play if they were on a sheltered LAN in a lab. A fairly good terminal server could be installed with software in an hour or so, including setting up accounts, auto-logins, a bunch of applications and some monitoring/control software for teachers. I once planned to do a whole lab in an hour but it took 3h because many of the old clients needed custom configurations. That was still an average of 180 minutes/24 ≈ 8 minutes per PC, just a few dollars’ worth of my time. I didn’t need to do all that extra work of recording stickers, preserving them, phoning home, authenticating, etc. that M$ requires.

I have spent 3-4h just installing on one machine from media included with the machine, e.g. one machine I moved to “7” from GNU/Linux at the insistence of a teacher. The teacher got less capability too. There was no driver for her desktop printer with “7”, and “7” would not interface with our shares for reports that worked fine with XP and GNU/Linux. Then there’s the EULA, for which one should have a lawyer working alongside to make sure no copyright liabilities are accumulated. We’ve all read the Ernie Ball story.
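The unattended-install-from-a-package-list mechanism mentioned above is, on Debian, dpkg selections. A minimal sketch, where the file name and the package choices are invented for illustration:

```shell
# Hypothetical package list for a lab image: "<package> install" per line.
cat > /tmp/lab-packages.txt <<'EOF'
libreoffice	install
firefox-esr	install
vlc	install
EOF

# On each freshly installed client one would replay the list
# (not run here, as it needs root and a configured apt source):
#   dpkg --set-selections < /tmp/lab-packages.txt
#   apt-get -y dselect-upgrade

grep -c 'install' /tmp/lab-packages.txt   # prints: 3
```

Combined with PXE boot and a preseed file, the whole process can run with no one at the keyboard.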

What is this nonsense with people talking about Linux apps not having backward compatibility? For the vast majority of applications, if you can’t find a binary for the version of Linux you are using, you can download the source and compile it.

I hate to burst your bubble, ram, but in reverse order:

1) Unless you put a reasonable figure on “the vast majority,” we are still talking between 0% and 100% here. Quite possibly nearer to 0%.
2) No sane human being wants to do this. Sane human beings interpret “backwards compatibility” as “I plug it in, and it works out of the box.”

It would be entirely wrong of me to question your honesty, ram. Which means, unfortunately, that I am going to have to question your sanity.

4) But let’s assume you can find that “elusive” binary. A few little fiddles, and with luck you can convert it from format X (say RPM) into format Y (say DEB).

What about the entire OS ecosystem? Forget the kernel and whatever broken version of glibc this wonder app depends upon. How about the dozens of other shared object libraries?

5) Oh well, failing everything else, you can always recompile the source. Assuming that you can find the source. And configure/make/make install is such fun, isn’t it? Darn, I just need to twiddle the odd flag here and there. And convert floats to doubles where necessary (I deliberately use a very low-level example here). And … I think I’d rather spend $50 on a software package that just works.

Out of the box.

I’m not questioning your personal dedication to this lifestyle, ram. I’m just wondering how you can pluck up the nerve, every day, to try to force it on normal people.

I don’t suppose that you’ve noticed, by now, that normal people cannot be faffed with this broken model of “backward compatibility?”

What is this nonsense with people talking about Linux apps not having backward compatibility? For the vast majority of applications, if you can’t find a binary for the version of Linux you are using, you can download the source and compile it. If someone doesn’t have that technical skill they can pay a local computer shop or consultant to do that for them. Virtually always fees for a custom compile are extremely modest.

I’ve paid for backports of audio applications and libraries going back to Linux versions almost ten years old. It took a few days, but then everything worked. I had the new features I wanted, such as netjack, and legacy applications worked even smoother since the new libraries were more efficient.

kurkosdr, pardon me, but a lot of Android applications exist only for version 4.4. Yes, the latest.

Android suffers from most of the same problems normal Linux does. The difference here is that most Android applications are built against Android’s equivalent of the Linux Standard Base. The latest LibreOffice for Debian from libreoffice.org doesn’t depend on you running the newest Debian or the latest LTS/stable. “Linux apps for some reason require the latest LTS/stable” is nothing more than an application developer’s choice. “Linux apps” in this case can be Android/Linux as well; it makes no difference. There are a lot of applications for Android that require the latest version and don’t work on anything older.

Download http://www.xonotic.org some time and notice how many different Linux distributions it runs on. Blender (http://www.blender.org/download/), for example, requires only glibc 2.11 or newer for its latest version, so distributions from 2010 and newer work. Most Linux applications do not require the latest LTS or stable; most require something from the last 5 years. kurkosdr, if you are submitting a bug to the main Blender project, you are not allowed to use a distribution-built binary.
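A quick way to check a glibc floor like the one above, with `/bin/sh` standing in for a downloaded application. This sketch assumes a glibc-based system, and `objdump` (part of binutils) may need installing:

```shell
# What glibc does this system provide? (glibc-specific; musl's ldd differs)
ldd --version | head -n1

# Highest GLIBC symbol version the binary itself demands; if it is at or
# below the system's glibc, the binary should load.
if command -v objdump >/dev/null 2>&1; then
  objdump -T /bin/sh | grep -o 'GLIBC_[0-9.]*' | sort -uV | tail -n1
fi
```

Comparing the two numbers tells you before you install whether the “2010 distributions and newer” claim holds for your machine.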

If you want to extend a Linux install even longer, use a chroot or have applications ship with their own run-times. Please also beware that glibc older than 2.11 introduces security issues.

The Dellbuntu netbook debacle was caused by Dell making their own sub-version of Ubuntu. Since Dell has been working with mainline, the issue of broken drivers has not happened again.

Kurkosdr, Linux reduces its need for anti-virus by keeping its core parts up to date. If the defects a virus/malware depends on are removed, that type of virus/malware will die out. There is a price for leaving a 10-year-old OS in active usage.

You notice a lot of people complain about Android’s slow update cycle because they are unable to run the latest Android applications. Google is introducing the Android One program to address this. I am sorry, Kurkosdr, but desktop Linux and Android suffer from the same issue of having perfect backwards compatibility and basically non-existent forwards compatibility. The Android One program means the OEM/ODM has to do more work certifying new versions.

The real difference between desktop Linux and Android, when it comes to these problems, is that Android managed to cross critical mass; that is it. All the issues with forwards compatibility exist on Android too.

“investments in hardware supported by proprietary drivers.”
This issue is also common to desktop Linux and Android, and it has resulted in a lot of Android devices being stuck on old versions. The difference is that most people with a phone are willing to throw it away after 1 to 2 years and buy a new one, so Android’s hardware issue has not blocked it from reaching critical mass, given the market it has been operating in.

kurkosdr, the process of moving to Wayland sees a lot of these problems disappear: http://www.phoronix.com/scan.php?page=news_item&px=MTgxMDE
The issue of X11 causing video card driver trouble is really a lack of generic interfaces for operating video cards, so video card drivers were bound to X11-server-dependent cruft. AMD will have the same open source kernel driver under both their closed source video stack and their open source stack. Be aware that the Linux world has had a problem child, X11, forced on it by closed source drivers. The X11 problem child created a multi-way dependency because the kernel driver was processing information generated by the X11 server. With KMS and DMA-BUF, the kernel-side data is based completely on kernel-provided structures, so the kernel driver for the video card now depends only on kernel interfaces. EGLStream is an official standard from 2011. This also gives a solid ABI between graphical compositors (X11, Wayland, Mir, SurfaceFlinger…) and the video driver interface libraries.

At long last Nvidia has learnt their lesson. Don’t just replace everything in the core and hope it works.

OpenGL has been a pig, with GLX, EGL, WGL… all doing things differently. EGLStream is what Android video drivers use. Mesa GBM is not bad either. Neither EGLStream nor Mesa GBM cares whether you are X11, Mir, Wayland, or some other new invention. Heck, DirectFB will be able to have accelerated graphics again. One of the more interesting results of this is Chrome/Chromium getting a full-screen version that has no compositor or X11 behind it. Yes, browser straight to screen.

At least this time around the communication standards between compositor/application and graphical drivers will be defined standards.

olderman wrote, “Microsoft continues to make money because it’s products are judged worth the price by those who require and run the applications that require Microsoft OS as a prerequisite.”

There’s nothing wrong with that, but M$ also makes a ton of money from consumers who don’t even know they are paying for an OS and who mostly run a web browser and media player which don’t require M$’s OS. That’s a lot like theft, taking money without permission. Why can’t M$, a business, tell the consumer how much their product costs? Are they afraid the consumer will wake up and demand choice?

New paradigm – “My SSD based system boots in 10 seconds.” <–Sounds like a Chromebook…LOL.

Imagine, if you would, having to pay a user license to drive your vehicle or perhaps your tractor, both of which are plagued with continual issues: bad gas, faulty hydraulic lines, oil leaks, etc. The engine slows down and in as little as three years you are forced to upgrade to the latest model. People would be suing left and right; with M$, however, one cannot, even when upgrades trash your system.

BUT, they’d still have hardware supported by proprietary drivers inside the laptop/desktop. Which means that when they upgrade to the next version, it will break, be it because of the kernel or an X.org version bump.

Which is EXACTLY what happened with the Dellbuntu netbook debacle. Hidden cost.

And users can’t just stay with the old version like they do with Windows, because Linux apps for some reason require the latest LTS/stable. And dealing with unsupported backports sucks.

Have you ever thought there is a reason OEMs will ship devices with Android[1] and offer Linux on servers but won’t touch Desktop Linux? (Dell touched it, and…)

[1] Even though Android is a junk OS. For example, during an intercity bus ride today, my TETRIS app was sending notifications, and notifications cause music playback to stutter in Android even if you have turned down notifications for an app. I kid you not, search for “android music playback stutters”. I had to open Tetris and play a bit to shut up the notification calling me to play, then continue listening to music on my SGSIII. It still stuttered a bit, but better than before. But OEMs will sell Android, because it offers some backward compatibility and apps don’t require the latest version.

There are no hidden costs. It’s FLOSS. It’s not usually bundled with some hardware and anyone can install it in a few minutes. That’s one reason Munich, Google, Ernie Ball and millions of others migrated to GNU/Linux. People who live in glass houses should not throw stones. M$’s OS has all kinds of hidden costs starting with the hidden cost of the licence, infinite malware, re-re-reboots, slowing down, forced migrations etc.

olderman wrote, “The price of anything in the market is what the market will bear, nothing more and nothing less. That is the way it works, period.”

Nonsense. If the market is bearing $100 and a competitive product is offered for $20 the market will change. That’s why M$ is now giving away its OS on the low end just as it did for the netbook. It’s the same OS…

olderman wrote, “Bizarre is the cult that assumes that they get to dictate the worth of someone else s efforts.”

That is bizarre but that’s not what I’m writing about. The price of anything in the market should be the lesser of its cost of replacement or the price of competing products. One can obtain GNU/Linux for ~$20 worth of time/money when that other OS costs ~$100+ (except where M$ gives it away). Now you can argue that somehow M$’s stuff is worth more but it isn’t if what you want an OS to do is manage resources and get out of the way. Lots of people want that. Why should they pay for other bloatware, restrictions on what can be done with the software and all kinds of lock-in that M$ shoves at them? There’s a reason that 85% of active websites run on GNU/Linux. It’s an OS and it works and the folks who run websites know they have a choice. That many consumers don’t know they have a choice should not cost them 5X as much.

Deaf Spy wrote, “Whether you decide (are forced) to allow cooperation by giving your work for $0 or for > $0, is a question of philosophy. “

It could be but not always. I’ve offered to clear the neighbour’s driveway this winter because the husband is sick and the wife is a tiny thing. That has nothing to do with philosophy. That’s what neighbours do. It’s the same in Munich where the government decided using that other OS was not in their best interests. It wasn’t at all about costs but about being free of monopoly. A consequence of that is lower cost. These are real practical decisions made rationally for good reason. Google adopted GNU/Linux. They are hyper-rational. Ernie Ball adopted GNU/Linux because M$ was out to get them. That was a defensive measure. IBM adopted GNU/Linux because it’s what their customers wanted. My last employer adopted GNU/Linux because XP was too unreliable.

Developers have some of the same considerations. A startup may well choose FLOSS for a lower cost of entry. That’s nothing to do with philosophy but economics. FLOSS costs less to make as well as to deploy. One of my employers, who knew nothing of FLOSS and probably still doesn’t, adopted FLOSS because they got twice as much IT for the same money. The folks who wrote Linux and VLC and a bunch of others chose FLOSS because it allowed them to work the way they wanted to work. No one is forced to use FLOSS at all. It is optional yet many millions create FLOSS and many more millions use it.

Cooperation is a great way to do IT.
Cooperation comes in many forms, Mr. Pogson. But it always comes for a price. It may cost $100, it may cost 2 hours of your time, it may cost you $50 and 1 hour of your time.

Whether you decide (are forced) to allow cooperation by giving your work for $0 or for > $0, is a question of philosophy.

You should try it sometime.
Actually, I have. And I have been trying it for the last, hm, 17 years. As part of a team. A team that cooperates with other teams by using various tools. Have you, Mr. Pogson? What have you given the others to cooperate with you?

What about the cult of greed that monopoly is good and M$ can’t have enough money and if you can charge 5X what software costs you should? That’s absolutely bizarre. Being able to copy, modify, distribute and use a work is not. It’s the way things should be. Copyright was intended to encourage creative work, not to make corporations filthy rich. The four freedoms simply amend the defaults of copyright and it is OK for people to want to do that. After the first few million copies, why should anyone need to be paid for more? M$ is a very inefficient company if they require $billions just to release their next version of an OS. The entire cost of FLOSS over decades comes to a similar amount. Why should the world support an inefficient corporation? Oh yeah, they had a monopoly. That’s not a good answer economically, philosophically or morally. We should pay what software costs, no more no less, just like food, shelter and clothing…

Deaf Spy wrote, “It is just a philosophy of what to do with the fruits of your labor, and the fruits of other labors. Just like communism is a philosophy, not a technology.”

That’s false. The author of a work gets to choose the licence. Many people create FLOSS because they like to share but also because they are paid to do it. Cooperation is a great way to do IT. You should try it sometime. Individuals can’t do a good job of creating all the software they need but millions of programmers working cooperatively can and do that very well. There’s nothing wrong with that at all. That’s what churches, corporations, governments and families do all the time.

When I was a teacher, I had a great principal. She would get on the PA and announce that she was going to do X and it would be great if people went to Y to help. We did. We could make short work of tidying up a storage room or picking up trash on the grounds or whatever. We weren’t paid extra to do that. It was the right thing to do. That could be considered a philosophy but it’s also very practical and efficient. It is good technology.

Purely academically, Pogson, you did indeed fail to provide a good citation. What you did is called “self-citing”, which is considered appropriate only if the work is already peer-reviewed and published. Your blog is definitely not peer-reviewed. Unless you decide to consider the likes of dougie and Peter “peers”, but I would never, ever put you so low on the intellectual level as to compare you with these two bio-masses.

kurkosdr wrote, “Not only they use weird terminology specific to their cult, like “GNU/Linux””

dict cult: “A system of intense religious veneration of a particular person, idea, or object, especially one considered spurious or irrational by traditional religious bodies; as, the Moonie cult.”

There’s nothing cult-like in FLOSS. FLOSS is about a better way to do IT/produce software. It’s technology. The four freedoms are preserved in the licences, permitting copying/modification/distribution of works.

Now the GNU/Linux haters, that’s another matter. Without any reason they attack a person’s very existence based on simple choices of words. Ad hominem attacks seem to be a requirement of their faith.

Also, I like how cult-specific terminology was designed to restrict thought to the maximum extent possible.

This is why you should never use such terminology, not even in a quick one-sentence reply, because by doing so you’ve already lost ground.

For example, by saying “free software”, you’ve indirectly admitted that the concept of “software freedom” exists and that Stallman’s “Four Freedoms” definition is the only definition.

By saying “GNU/Linux” you’ve recognized GNU as a special project that deserves credit when incorporated into other projects (for example Linus’ Linux project) while other projects don’t have that privilege.

“Editing “Linux” to “Gnu/Linux” is not going to win you any friends at Wikipedia. ”

Basically, FSF fanatics are like scientologists. Not only do they use weird terminology specific to their cult, like “GNU/Linux”, the concept of inanimate objects being “free as in freedom”/”non-free”, and “SaaSS” (Service as a Software Substitute).

They also think the world should use their cult terminology, and anyone who doesn’t is corrected mid-sentence or edited.

When they end up being ignored or banned, cult members deeply affected by the cult can’t even identify the reason they got banned.

Some time ago, I created (‘started’) a few pages that were work-related, and would go back and edit them from time to time. Reading through the history I would find competitors and suppliers adding stuff; then came people wanting to use Wikipedia for advertising, which is a no-no.

After a few years came this total lock-down on new edits and such; one Nazi-type person went so far as to call me out as a hacker, due to my nickname, and removed everything I ever contributed.

That’s absolute crap and you know it. My link was to the analysis of the data, mathematical proof of my claim. Should I have copy-and-pasted it into the discussion?

Your link was to a graph drawn on your own personal resources, Robert. 100% accurate, no doubt. But not actually worth academic spit as a reference.

Let’s say you made a trivial error in scaling. Who would be able to tell?

This is not an independent data source. It is madness.

If you don’t have an independent citation, then you have no basis for submitting your “evidence” to Wikipedia. Why? Because they have no way to determine whether or not you are a lunatic.

Should Wikipedia pump revenue into NetApplications?

Not the point at issue. All they require is a verifiable citation. You did not provide one.

As far as I know my site is the only place on the web with that analysis and no one has pointed out a flaw in my reasoning.

“I alone amongst humans have this vital information.”

Good for starting your own religion, Robert. Rubbish for contributing to a general-purpose online encyclopedia.

And as usual, you helpfully elided several pertinent points:

1) Editing “Linux” to “Gnu/Linux” is not going to win you any friends at Wikipedia. One battle at a time.
2) That really wasn’t “one citation required per sentence,” was it? One citation required per extraordinary claim, more like it.
3) Linking to your own blog is not just “publicity.” It is hubris, pure and simple.

I certainly do. It’s a great idea being smothered with bureaucracy. You seem to think it’s mature and smoothly functioning but it has barely started to encompass all of human knowledge and yet it bans new contributions on arbitrary terms having nothing to do with human knowledge.

See another’s views on this matter: “Wikipedia and its stated ambition to “compile the sum of all human knowledge” are in trouble. The volunteer workforce that built the project’s flagship, the English-language Wikipedia—and must defend it against vandalism, hoaxes, and manipulation—has shrunk by more than a third since 2007 and is still shrinking. Those participants left seem incapable of fixing the flaws that keep Wikipedia from becoming a high-quality encyclopedia by any standard, including the project’s own. Among the significant problems that aren’t getting resolved is the site’s skewed coverage: its entries on Pokemon and female porn stars are comprehensive, but its pages on female novelists or places in sub-Saharan Africa are sketchy. Authoritative entries remain elusive. Of the 1,000 articles that the project’s own volunteers have tagged as forming the core of a good encyclopedia, most don’t earn even Wikipedia’s own middle-ranking quality scores.
…
those tougher rules and the more suspicious atmosphere that came along with them had an unintended consequence. Newcomers to Wikipedia making their first, tentative edits—and the inevitable mistakes—became less likely to stick around. Being steamrollered by the newly efficient, impersonal editing machine was no fun. The number of active editors on the English-language Wikipedia peaked in 2007 at more than 51,000 and has been declining ever since as the supply of new ones got choked off. This past summer only 31,000 people could be considered active editors.”
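The arithmetic behind the quoted claim is easy to check: using only the two editor counts given in the excerpt, the decline works out to roughly 39%, which does indeed exceed “more than a third”. A quick sketch (figures taken directly from the quote above):

```python
# Editor counts as quoted in the excerpt above
peak_2007 = 51_000       # active English-Wikipedia editors at the 2007 peak
summer_recent = 31_000   # active editors "this past summer"

decline = (peak_2007 - summer_recent) / peak_2007
print(f"Decline since 2007: {decline:.1%}")  # prints "Decline since 2007: 39.2%"
assert decline > 1/3     # consistent with "shrunk by more than a third"
```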

Do the maths. Human knowledge and its flow are increasing exponentially with the web and number of users. Wikipedia should be multiplying editors, not cutting them. There are 500K bloggers on WordPress alone. Wikipedia should have a similar number. A declining number of editors means Wikipedia can never catch up. A limited scope of editors means Wikipedia will never be a real encyclopaedia. The software and hardware to make Wikipedia thrive is there but these “rules” are strangling it. I have long used Wikipedia in my work and blog but I may just add “-site:wikipedia.org” to my search-strings and get more diverse results.

No, Robert Pogson, you ask that Wikipedia set aside their standards and requirements and simply allow you to publish whatever you wish, as if you are someone special. When it is made clear to you that you are not going to get your way, you whinge about censorship when the solution is simple. Find a way to document your citation in the manner that is required by the rules of Wikipedia and you will have your say.

Refuse to conform to their publication standards, and you will get nowhere, AND get yourself tagged in the process as one of the problem people that the Wikipedia rules were set up to deal with.

That’s absolute crap and you know it. My link was to the analysis of the data, mathematical proof of my claim. Should I have copy-and-pasted it into the discussion?

DrLoser also wrote, “It would have been trivially easy to supply direct citations to Netstat.”. As you know NetApplications no longer publishes that information except for $. Should Wikipedia pump revenue into NetApplications? As far as I know my site is the only place on the web with that analysis and no one has pointed out a flaw in my reasoning. That’s all I ask, constructive criticism, not deletion/censorship.

I have no love for Wikipedia, but I can accept that it has rules and, if necessary, I would follow them, Robert. In this case, you didn’t. And in fact if you had, there’s every chance that all but one of your edits would have stood.

I have written in peer-reviewed rigorous scholarly journals and no one has ever required a reference per sentence. That would require a bibliography larger than the text… In Wikipedia every sentence is challenged as lacking sources when one could read a single reference and see the correctness of whole volumes of text.

Looking at TEG’s cite, which is a little more specific than your original plaint (in that it actually referred directly to the diff in question), three things stand out:

1) For no good reason at all, you chose to substitute “Gnu/Linux” for “Linux.” Now, this may well be your preferred terminology. It may even be the correct terminology (it isn’t). But it doesn’t fit too well with the title of the article in question, which refers only to Linux, does it?

Drop that small point, or move the discussion to a more appropriate article, and you would sound much less like a swivel-eyed maniac with a hidden personal agenda. Not that you are a swivel-eyed maniac with a hidden personal agenda, of course. But sounding like a swivel-eyed maniac with a hidden personal agenda, when editing a Wikipedia article, is not, I suggest, either good manners or good strategy.

2) I see no evidence of an onerous “line by line” demand for citations. I see three fundamental claims (yours): that 2010 Netcraft stats are specifically dubious for Mountain View in particular, California in general, and business-hours browsing in general.

These are all quite substantial claims. And they may very well be 100% correct. (I happen to believe that they are nonsense, but that doesn’t matter.) Either way, however, all three clearly need to be backed by citations, as per normal Wikipedia behaviour.

This is not a “sentence by sentence” witch-hunt, and you do yourself no credit by misrepresenting it as such.

3) When asked for citations, whether in a scholarly periodical or otherwise, I don’t think you’ll find any reputable organisation that will accept one that starts “http://mrpogson.com”. I know that you are scrupulously honest, even painstakingly so, about these things. But how is a Wikipedia monitor supposed to know? It would have been trivially easy to supply direct citations to Netstat. Did this not occur to you?

And not only is this peculiar choice likely to stand out when a Wikipedia monitor reviews it (strike one), but it also clearly breaks the “no publicity” rule (strike two).

What were you thinking, Robert? And why are you whinging? Has this insignificant episode really been stewing at the back of your mind for two whole years?

You should get out more and shoot a few deer. I recommend GEBC. It’s a handy little ballistics tool …

robert, really, you are not listening, and you don’t understand wikipedia.

i’ll try one more time. in the US we have “freedom of speech” but that doesn’t mean you can say whatever you want – there are limits. (“fire” in a crowded theater; off-label marketing of drugs, etc etc). editing at wikipedia is a privilege (one freely granted, but a privilege nonetheless). it comes with the responsibility to follow policies and guidelines (right there in the terms of use). there are limits to the freedom to edit at wikipedia. flout the policies and guidelines, and at best your edits will be reverted, and at worst, if you persist, you will get blocked or banned. it is not a wild west… there is a “rule of law” there. and just like in the real world, where ignorance of the law does not get you off the hook for breaking it, in wikipedia your edits will not “stick” if you don’t edit per policies and guidelines. and if you take the time to understand the policies and guidelines, you will see that there is some real wisdom there… something beautiful is happening every day in wikipedia. ugly, ugly stuff goes on too, usually when people (like you) refuse to deal with the context in which they are operating.

but as i said before. it is very very clear to me and to others who understand what wikipedia is all about who have responded to you, that you do not understand what you are critiquing. the mistakes you are making are typical “newbie” mistakes, compounded by the (sadly, but very understandably human) typical mistakes that experts make. i’ve tried to explain, and you don’t want to listen, and you haven’t asked a single authentic question. calling “intolerance” or “censorship” is also typical for people who don’t understand that there is no “free speech” on wikipedia – who essentially want to cry “fire” in that theater… which is not on fire. not at all. it is just fine.

“We love to share knowledge and think it should be free but we are effectively banned/shunned on Wikipedia because we don’t play by your rules, causing Wikipedia to struggle to maintain a few thousand contributors. There’s a word for that. Intolerance: ”

Nope. There are actually two words: standards and requirements. Wikipedia now requires that editors meet a simple standard of veracity. The fact that you cannot back up your claims in the way that meets Wikipedia’s requirements is your problem, not Wikipedia’s.

jytdog wrote, “experts have a special problem, in that they are often too arrogant with regard to their area of specialization to even see that they are ignorant with regard to the “rules” of Wikipedia”

Think what this means… The best and the brightest can’t contribute their wonderful knowledge to the world through Wikipedia. Why cripple such a brilliant platform? I have several times implemented Wikipedia/Wikimedia on LANs where I worked. I added the software but not those rigid policies, and human knowledge accumulated just fine.

People who know me call me a walking encyclopaedia, the human equivalent of Google… There’s a reason for that. I’ve been reading encyclopaedias since I was a child and I learn something new every day. There are millions of people just like me in the world. We love to share knowledge and think it should be free but we are effectively banned/shunned on Wikipedia because we don’t play by your rules, causing Wikipedia to struggle to maintain a few thousand contributors. There’s a word for that. Intolerance: ” The quality of being intolerant; refusal to allow to others the enjoyment of their opinions, chosen modes of worship, and the like; lack of patience and forbearance; illiberality; bigotry; as, intolerance shown toward a religious sect.
[1913 Webster]”

Done here. I get it that you are frustrated, and I am sorry about that. Sounds like this blog is a good and happy place for you to write.

There are lots of people who don’t step back and take the time to understand Wikipedia before they start getting into arguments, and as a result they get more and more frustrated, more and more closed to learning, and (usually) more and more firm in their ideas about what they ~think~ Wikipedia is. Which in those cases is generally wrong.

Wikipedia is mature and the body of policies and guidelines is substantial. There is a learning curve to understanding and working within the policies and guidelines. There are lots of people willing to help and teach. But as I said, experts have a special problem, in that they are often too arrogant with regard to their area of specialization to even see that they are ignorant with regard to the “rules” of Wikipedia. It is not a bad thing to be ignorant – I am very ignorant with regard to software, compared with you, for example, and I have no shame in that.

Anyway, enough of this. You are frustrated and dug in, and I don’t see much point in going forward. I really don’t think you understand WP enough to make a valid critique. But good luck to you!

wrote, “b/c wikipedia allows, values, and even protects anonymous contributors, it really doesn’t matter – really! – who you say you are. WP has no way of knowing, and doesn’t care.”

That is a real pity, a hole in the soul of Wikipedia, because other global organizations can and do establish webs of trust that work. See, for example, kernel.org. People are real. They exist. They contribute to Wikipedia and it is wrong for Wikipedia to trash their contributions based on some rule. If Wikipedia really wanted to document human knowledge, they would verify facts/data rather than delete them. That’s what Wikipedia does with images, for instance. If the copyright of an image is uncertain, they obtain another image and move on; they don’t do away with the image, which might be quite informative. Images can be faked, yet Wikipedia is not demanding multiple citations per image or any such nonsense. Wikipedia accepts contributors’ claims of authorship of images. Wikipedia trusts contributors of images, not text. That’s a double standard and one that short-changes the world. One picture may be worth 1000 words, but not on Wikipedia. There it’s a thousand words and 200 citations… or it’s trashed.

jytdog wrote, of Wikipedia, “It does exist online and uses lots of web-resources, but it aims to be scholarly and rigorous.”

I have written in peer-reviewed rigorous scholarly journals and no one has ever required a reference per sentence. That would require a bibliography larger than the text… In Wikipedia every sentence is challenged as lacking sources when one could read a single reference and see the correctness of whole volumes of text. Wikipedia also seems to deny the existence or correctness of logic/reason/mathematics. If A is true and B is true then I should be able to state A AND B is true but not on Wikipedia. A citation is demanded. It’s as if Wikipedia denies the ability of the reader to think. There’s also nothing not scholarly or not rigorous about a knowledgeable person having an opinion. Opinions are knowledge of a kind that inspires people and motivates them to do more. Wikipedia stifles people.

opinion:“That which is opined; a notion or conviction founded on probable evidence; belief stronger than impression, less strong than positive knowledge; settled judgment in regard to any point of knowledge or action.
[1913 Webster]
Opinion is when the assent of the understanding is so far gained by evidence of probability, that it rather inclines to one persuasion than to another, yet not without a mixture of incertainty or doubting. –Sir M. Hale.
[1913 Webster]

oh, i also wanted to say — you wrote “the web goes back only a few years.” this is a strange thing to write. newspaper archives online (e.g. the NY Times) go back years and years. Most “reliable sources” (as we call them) do. And it is perfectly fine to cite any legitimate publication, even if it only exists on paper. {libraries do still exist 🙂 }

WP really is a scholarly project. It is NOT a blog…. not by any means.

You really do seem to think of WP as something bloggy/webby. It does exist online and uses lots of web-resources, but it aims to be scholarly and rigorous.

hi robert, thanks for replying. That Exploit Guy was unnecessarily harsh and mean, but the thing he articulated is important. again, being an encyclopedia that anyone can edit, if we didn’t have our “rules” about sourcing, WP would quickly have become a garbage dump. So our policy on verification says that if something is not sourced, you have to bring a source, or it goes. and b/c wikipedia allows, values, and even protects anonymous contributors, it really doesn’t matter – really! – who you say you are. WP has no way of knowing, and doesn’t care. everything has to be verifiable. it is robust and, to me, deeply wise — maximizing freedom and accuracy at the same time.

the other thing to note, is that there are different schools of thought about how… tolerant to be. “inclusionists” leave unsourced material, but tag it with a “citation needed” tag to allow the contribution to stay, but warn readers that it might be bullshit. “deletionists” will just go ahead and delete it, and say that it needs a source. I’m one of the latter, by character. I am busy and only have slivers of time to work on WP, and if something comes in that is questionable and i just tag it, i have no idea how long it might take me to come back and find a source and i might just forget. we have content with “citation needed” tags hanging around for years. And to be honest, most content like that, that i have tried to verify, turned out to be wrong and i ended up deleting it anyway. Most people are not careful, and people strongly believe all kinds of things that are not true or are only partly true … but anyone can edit. the verification policy is a good thing!

anyway, again i am sorry you had a negative experience. i am glad you found a forum where you can say whatever you like!

TEG wrote, “nothing more than you failing to produce a proper source of reference to support what you have written.”

I have no objection at all to producing references, except that the web only goes back a few years and there are millennia of human history… and a citation per sentence is ridiculous style. That’s just not readable. No one in the scientific/professional community has such “standards”. They seem to be applied variably, too. They pull them out to attack people, not the value of the information. Wikipedia has documented that it has a problem acquiring new blood. The old guys, the turf-protectors, just hammer outsiders and drive them off. The Britannica never required such standards from Einstein and other contributors. Wikipedia does not value authority in the least. Finding a reference to something on the web does not make it right. Not finding a reference on the web does not make it wrong.

Pogson, being self-unaware, wrote, “The problem I had is that the end-point or the snapshot I encountered are completely hostile to knowledgeable people contributing.”

Since it’s Wikipedia you are talking about, I have done the necessary steps to dig up the diff in question that you are reluctant to cite. Even as far as the talk-page convo is concerned, the so-called “hostility”, really, was nothing more than you failing to produce a proper source of reference to support what you had written. Seriously, did the University of Manitoba not teach you to cite sources properly?

I am not sure whether jytdog was a glassy-eyed 15-year-old or just being sarcastic, but this is not mrpogson.com we are talking about. Here, we only need to deal with you and the occasional Peter Dolding pretending to be experts on everything known to man. Over at Wikipedia, however, only God knows how many people claim to be experts in order to get away with things every day. The rules that Ahunt cited, which you interpret as hostility, are exactly what prevents the place from devolving into a collection of links to personal blogs and irrelevant statistics. Moreover, people far more qualified and reputed than you adhere to those rules, even if it’s only for the benefit of the readers. Do you honestly think that anyone should give you special treatment just because you arrogantly refuse to make your contributions up to standard? Pul-lease.

Wikipedia is an encyclopedia, not a vanity press, or forum for advertising or self-promotion. As such it should contain only material that complies with its content policies, and Wikipedians must place the interests of the encyclopedia first. Any editor who gives priority to outside interests may be subject to a conflict of interest. Adding material that appears to advance the interests or promote the visibility of an article’s author, the author’s family, employer, clients, associates or business, places the author in a conflict of interest.

The edits you complained about were in 2012. You made your first edit in 2007. One would assume that, given almost five years, you would have become fairly familiar with the rules of the site. But no, instead of learning them, you went on to whine about why no one was sucking up to your shoddy M.Sc.

jytdog wrote, ” over the years, the WP community developed policies and guidelines to create a kind of “rule of law” that provides a foundation for the work and importantly, the relationships among editors.”

That’s certainly a welcome and necessary process. The problem I had is that the end-point, or the snapshot I encountered, is completely hostile to knowledgeable people contributing. If I broke the rules, that should be fixable. Deleting contributions is destructive, like capital punishment. There can be no growth of participation in this case. It’s just not an environment I choose to visit, like certain dark alleys downtown after midnight. Should I be mugged for trying to make the world and Wikipedia a better place? Where I come from, neighbours help neighbours. They don’t beat up people for offering to help. I take personal offence at my contributions being deleted. That’s why I started this blog, in large part. I’ve written 4330 posts, some good, some OK, but all heartfelt. That’s who I am. If that’s not OK with Wikipedia, that’s too bad.

hey robert, i am sorry that you had a negative experience on Wikipedia. We love expert editors — we have invaluable contributions every day from experts who have taken the time to become familiar with WP’s policies and guidelines. Those policies & guidelines are invaluable, but they can drive experts crazy. I don’t know if anybody ever pointed you to this helpful essay (https://en.wikipedia.org/wiki/Wikipedia:Expert_editors) but it lays out the special challenges WP poses for experts like you.

I just want to add to that….. being an “encyclopedia that anyone can edit” could mean that WP is an ugly wild west where people just slug it out. And I imagine it was that way in the early days. But over the years, the WP community developed policies and guidelines to create a kind of “rule of law” that provides a foundation for the work and importantly, the relationships among editors. Many people don’t realize that foundation even exists, or don’t take time to understand both its letter and its spirit (which remains informed by the key policy “there are no rules”(!) and the high value placed on working toward consensus (which is also a policy)). When folks work with that foundation in mind, WP can be actually beautiful. There is a way to work things out, rationally and with civility, among people who have very different ideas about the world.

Your experience sounds like you are one of those folks who never “got it”, and instead got frustrated and left. WP lost out, and so did you. I am sorry about that, and for your frustrating experience, and I hope you consider coming back, but this time taking the time to really understand how WP works. Best regards!

I think you are right, and a fork of Wikipedia with reasonable policies would work. Wikis are a good concept, but the official Wikipedia site has gone “off the tracks” with propaganda and political correctness.

I should point out that when I referred to students, those were actually post-graduate college students, who really should know better and are given advance warning.

“Then came a time when whatever edits I made were almost certainly rejected by some nameless creatures in the system, rejecting my work sentence by sentence because I did not provide proof of every assertion, almost every sentence or phrase. I kid you not.”

Agreed, I have seen that sh1t in action too. Trying to correct something and being asked, “Where is your proof?” I was like, “Uhhh… I created the damn thing, idiot!”

Wikipedia is a great start, but should not be relied upon for salient knowledge. I even started a few entries, but watched them devolve into something less worthy.

This is why I like private wikis for companies: a great way of sharing information internally.

usually a good starting point for information on something if you’re only looking for a definition. not so hot for comparison, since everyone contributing an opinion favours what they say and gives a biased picture.

ram wrote, “If I catch a student using Wikipedia, except as an example of disinformation, they fail.”

That’s silly. I taught students to evaluate and reflect on whatever sources they had, even Wikipedia. For schools with limited Internet access, my snapshot of Wikipedia was bigger than the library of dead trees. Wikipedia is editable and should improve with time. Instead folks are prevented from editing the thing… It’s like the guys who have climbed the tree cutting it off at ground-level so no one else can climb. I think it’s time for a fork, but who would have the servers? Google? I’d bet they could set up a fork in a week and draft a reasonable policy and mechanisms to deal with bots and other trash. Would a Wikipedia with ads work? I think so.

Wikipedia is normally so wrong as to be useless. Certain plants that are growing in my backyard they claim are extinct – and nothing can convince them otherwise. Their history of audio and video codecs is complete nonsense and disagrees with (older) standard reference books on the topic.
The examples are endless.

If I catch a student using Wikipedia, except as an example of disinformation, they fail.
