31 March 2009

As you may have heard, there's been a bit of a to-do over a new “Open Cloud Manifesto.” Here's the central idea:

The industry needs an objective, straightforward conversation about how this new computing paradigm will impact organizations, how it can be used with existing technologies, and the potential pitfalls of proprietary technologies that can lead to lock-in and limited choice....

Asterisk, a PBX, telephony engine, and telephony applications toolkit, is one of open source's best-kept secrets. As with many open source projects, a company, Digium, has been set up to provide support. Here's its latest press release....

The new MAA Documentation System combines open-source technologies with deep social computing principles to create a truly innovative approach to museum documentation. The new MAA Documentation System shifts the age-old documentation principles of standardized description and information accumulation to multi-vocal and multi-source accounts and distributed documentation.

For the past few years, the MAA has been developing an open-source Documentation System. With over 20 years' experience of developing its own Documentation Systems and Collections Management Systems, the MAA is just about to finish one of the most ambitious upgrades in its history. In fact, this system is the result of a complete re-think of its documentation practices. Though the new system takes account of documentation standards, such as SPECTRUM, and newer developments such as CollectionSpace, it differs from the traditional approaches in several key respects.

And if that isn't wonderful enough, this new project comes from Cambridge's Museum of Archaeology & Anthropology - known to its friends as Arch and Anth. Its old, Victorian(?) building was one of the most atmospheric places in Cambridge.

Last week I was urging you to write to a particular set of MEPs about proposed changes to the Telecoms Package, which is wending its slow way through the European Union's legislative system. Now it's time to write to *all* your MEPs, since a crucially important vote in a couple of committees is to take place tomorrow. You can read more about what's been happening and why that's a problem on the La Quadrature du Net site, which also offers a detailed analysis of the Telecoms Package and the proposed amendments.

Here's what I've just sent to all my MEPs using WriteToThem:

I am writing to ask you as my representative to contact your colleagues on the IMCO and ITRE committees about crucial votes on the Telecoms Package, taking place on 31 March. At stake is nothing less than the future of the Internet in Europe. If amendments being supported by AT&T and others go through, the main driver of the Internet – and with it, online innovation – will be nullified.

This would be deeply ironic, since it was in Europe that the most important online innovation of all – the Web – was invented. In fact, no less a person than Sir Tim Berners-Lee, its inventor, has warned (at http://dig.csail.mit.edu/breadcrumbs/node/144) that the loss of net neutrality – which is what some of the proposed amendments would lead to – would have made it impossible for him to have carried out his revolutionary work. If we wish Europe to remain in the forefront of digital innovation, it is vital that the net neutrality of the Internet be preserved.

This is a complex issue – I personally find it very difficult to navigate through the many conflicting options before the committees. Fortunately, others have already done the hard work, and boiled down the recommendations to the following.

For your colleagues on the IMCO committee, please urge them to:

Vote against the amendments authorizing “net discrimination” and guarantee it is not put in place, by:

Rejecting the notion of “lawful content” in amendment 45, for it is a major breach of the technical neutrality of the network, would turn operators into private judges, and would open the door to “graduated response” (or “three strikes”) schemes of corporate policing.

If you or your colleagues are interested in seeing the detailed analysis of all the amendments, it can be found here:

This is a critical series of votes for the Internet in Europe. At a time of great economic turmoil, the last thing we can afford is to throttle Europe's entrepreneurial spirit; for this reason, I hope that you will be able to convince your colleagues on the committees to vote as suggested above.

Sadly, this is really important and really urgent. Please add your voice if you can, or the Internet as we know it may cease to exist in Europe soon, to be replaced with something closer to a cable TV service. You have been warned.

29 March 2009

What was Richard Stallman's greatest achievement? Some might say it's Emacs, one of the most powerful and adaptable pieces of software ever written. Others might plump for gcc, an indispensable tool used by probably millions of hackers to write yet more free software. And then there is the entire GNU project, astonishing in its ambition to create a Unix-like operating system from scratch. But for me, his single most important hack was the creation of the GNU General Public Licence....

28 March 2009

Not content with destroying the world’s economies, the banking industry is also bent on ruining us individually, it seems. Take a look at Verified By Visa. Allegedly this protects cardholders - by training them to expect a process in which there’s absolutely no way to know whether you are being phished or not. Even more astonishing is that this is seen as a benefit!

...

Craziness. But it gets better - obviously not everyone is pre-enrolled in this stupid scheme, so they also allow for enrolment using the same inline scheme. Now the phishers have the opportunity to also get information that will allow them to identify themselves to the bank as you. Yes, Visa have provided a very nicely tailored and packaged identity theft scheme. But, best of all, rather like Chip and PIN, they push all blame for their failures on to the customer.

I've instinctively hated these "Verified by Visa" schemes ever since they came out, and have tried not to use them. The fact that they are not just inherently insecure, but encourage merchants to use them in the most insecure way possible, is astonishing even for an industry as rank and rotten as banking.

The one consolation has to be that Verified by Visa is so demonstrably insecure that it should be easy to challenge in court any attempts to make customers pay for the banks' own stupidity.

27 March 2009

Another classic post from Mike Masnick about the absurdities our current copyright regime visits upon us:

PRS has now threatened a woman who plays classical music to her horses in her stable to keep them calm. She had been turning on the local classical music station, saying that it helped keep the horse calm -- but PRS is demanding £99 if she wants to keep providing such a "public performance." And it's not just a one-off. Apparently a bunch of stables have been receiving such calls.

The group seems to believe that playing music in almost any situation now constitutes a public performance and requires a licensing fee. You just know they're salivating over the opportunity to go after people playing music in their cars with the windows down.

Because you know what? I bet the PRS is really considering how to do this.

26 March 2009

As I wrote earlier today, things are looking bad for the Internet in Europe. But the European Parliament continues to do its bit protecting you and me. Here's the latest from the excellent Quadrature du Net site:

The European Parliament, endorsing the Lambrinidis report and turning its back on all the amendments supported by the French government and defended by Jacques Toubon and Jean-Marie Cavada, has just rejected "graduated response" for the third time. France is definitely alone in the world with its kafkaesque administrative machinery, an expensive mechanism for arbitrary punishment.

The report of Eurodeputy Stavros Lambrinidis concerning the protection of individual liberties on the Internet has just been confirmed by the European parliament by an overwhelming vote of 481 to 252.

It stands in clear opposition to the French HADOPI law in "holding that illiteracy with computers will be the illiteracy of the 21st century; holding that guaranteeing Internet access to all citizens is the same as guaranteeing all citizens access to education and holding that such access must not be refused in punishment by governments or private organizations; holding that this access should not be used abusively for illegal activities; holding that attention must be paid to emerging questions such as network neutrality, interoperability, the global accessibility of all Internet nodes, and the use of open formats and standards."

The approval of the Lambrinidis report and the rejection of the French amendments is the third consecutive time that the European Parliament has rejected the French "graduated response", since the approval of the Bono amendment to the report on cultural industries and the well-known Bono/Cohn-Bendit/Roithova Amendment 138.

Furthermore, all the amendments supported by the French government, notably those proposed by Eurodeputies Jacques Toubon and Jean-Marie Cavada, have been rejected. They were trying specifically to prevent measures related to graduated response, showing that the French government realizes that Europe is about to render the HADOPI law obsolete before it even comes to a vote.

Alas, this is by no means the end. The same wretched clause will come bounding back, along with all kinds of other stupidities. The fight goes on....

WIPO has just published a study entitled Dissemination of Patent Information. I've not read it, but here's someone who has, with an interesting observation:

In the first 71 paragraphs of the study, theoretical availability of patent information is confused with dissemination of patent information. Indeed, the study itself, belatedly, recognises the distinction between the theory of patent law and disclosure and the reality of accessing useful patent information in paragraph 72. Here the study states that availability of information does not always mean it is accessible in practical terms. Based on the figures provided in the study, in practical terms, accessibility of patent information is quite poor.

In other words, the one thing that patents *must* do - to disclose and make patent - they generally do badly. The net effect is that patents take away from the knowledge commons, without giving back even the paltry payment they owe. Add it to the (long) list of why patents fail. (Via Open Access News.)

If open knowledge is your thing, London is the place, Saturday the time:

The Open Knowledge Conference (OKCon) is back for its fourth installment, bringing together individuals and groups from across the open knowledge spectrum for a day of discussions and workshops.

This year the event will feature dedicated sessions on open knowledge and development and on the semantic web and open data. Plus there's the usual substantial allocation of 'Open Space' -- sessions, workshops and discussions proposed either via the CFP or on the day.

Things seem to be going from bad to worse with the EU's Telecoms Package. Now, not only do we have to contend with French attempts to push through its “three strikes and you're out” approach again, which the European Parliament threw out, but there are several other amendments that are being proposed that will effectively gut the Internet in Europe.

One of the most controversial issues is that of the three-strikes approach strongly and continuously pushed by France in the EU Council. Although most of the provisions introducing the graduated response system were rejected in the first reading of the Telecoms Package, some alarming ones persist. France is trying hard to get rid of Amendment 138, which seeks to protect users’ rights against three-strikes sanctions and which, until now, has stopped the EU from applying the three-strikes policy. Also, some new amendments reintroduce the notion of lawful content, which would impose an obligation on ISPs to monitor content going through their networks.

The UK government is pushing for the “wikipedia amendments” (so-called because one of them has been created by cutting and pasting a text out of the wikipedia) in order to allow ISPs to make limited content offers. The UK amendments eliminate the text that gives users rights to access and distribute content, services and applications, replacing it with a text that says “there should be transparency of conditions under which services are provided, including information on the conditions to and/or use of applications and services, and of any traffic management policies.”

To these, we must now add at least one more, which the indispensable IPtegrity site has spotted:

Six MEPs have taken text supplied by the American telecoms multi-national, AT&T, and pasted it directly into amendments tabled to the Universal Services directive in the Telecoms Package. The six are Syed Kamall, Erika Mann, Edit Herczog, Zita Pleštinská, Andreas Schwab, and Jacques Toubon.

AT&T and its partner Verizon want the regulators in Europe to keep their hands off new network technologies which will give broadband providers the capability to restrict or limit users' access to the Internet. They have got together with a group of other telecoms companies to lobby on this issue. Their demands pose a threat to the neutrality of the network and, at another level, to millions of web businesses in Europe.

As you can read, this is a grave danger for the Internet in Europe, because it would allow telecom companies to impose restrictions on the services they provide. That is, at will, they can discriminate against new services that threaten their existing offerings – and hence throttle online innovation. The Internet has grown so quickly, and become so useful, precisely because it is an end-to-end service: it does not take note of or discriminate between packets, it simply delivers them.

What is particularly surprising is that one of the MEPs putting forward this amendment is the UK's Syed Kamall, who has a technical background, and in the past has shown himself aware of the larger technological issues. I'm really not sure why he is involved in this blatant attempt by the telecoms companies to subvert the Internet in Europe.

Since he is one of my MEPs (he represents London), I've used the WriteToThem service to send him the following letter:

I was surprised and greatly disappointed to learn that you are proposing an amendment to the Telecoms Package that would have the consequence of destroying the network neutrality of the Internet – in many ways, its defining feature.

Your amendment 105, which requires network providers to inform users of restrictions and/or limitations on their communications services, would allow companies to impose arbitrary blocks on Internet services; instead, we need to ensure that no such arbitrary restrictions are possible.

As the inventor of the Web, Sir Tim Berners-Lee, has pointed out when net neutrality was being debated in the US (http://dig.csail.mit.edu/breadcrumbs/node/144):

“When I invented the Web, I didn't have to ask anyone's permission. Now, hundreds of millions of people are using it freely. I am worried that that is going to end in the USA.

I blogged on net neutrality before, and so did a lot of other people. ... Since then, some telecommunications companies spent a lot of money on public relations and TV ads, and the US House seems to have wavered from the path of preserving net neutrality. There has been some misinformation spread about. So here are some clarifications.

Net neutrality is this:

If I pay to connect to the Net with a certain quality of service, and you pay to connect with that or greater quality of service, then we can communicate at that level.

That's all. It's up to the ISPs to make sure they interoperate so that that happens.

Net Neutrality is NOT asking for the internet for free.

Net Neutrality is NOT saying that one shouldn't pay more money for high quality of service. We always have, and we always will.

There have been suggestions that we don't need legislation because we haven't had it. These are nonsense, because in fact we have had net neutrality in the past -- it is only recently that real explicit threats have occurred.”

He concludes:

“Yes, regulation to keep the Internet open is regulation. And mostly, the Internet thrives on lack of regulation. But some basic values have to be preserved. For example, the market system depends on the rule that you can't photocopy money. Democracy depends on freedom of speech. Freedom of connection, with any application, to any party, is the fundamental social basis of the Internet, and, now, the society based on it.”

I'm afraid that what your amendment will do is to destroy that freedom. I am therefore asking you to withdraw your amendment, to preserve the freedom of the connection that allows new services to evolve, and innovations to be made without needing to ask permission of the companies providing the connection. Instead, the Internet needs net neutrality to be enshrined in law, and if possible, I would further request you and your colleagues to work towards this end.

If you are also based in London - or in a constituency represented by one of the five other MEPs mentioned in the IPtegrity story - I urge you to write a similar (but *not* identical) letter to them. It is vitally important that these amendments be withdrawn, since most MEPs will be unaware of the damage they can do, and might well wave them through. Further letters to all MEPs will also be needed in due course, but I think it's best to concentrate on these particular amendments for the moment, since they are a new and disturbing development.

25 March 2009

With apologies for returning to the theme of patents, I'd like to direct your attention to a long and interesting piece that has appeared on the Digital Majority site asking a very important question: “Did Red Hat lobby for, or against software patents in Europe?”

Another impressive line-up of mega-academics denouncing the lack of logic for the proposed copyright extension currently being considered in the EU (I'll be writing about this again soon). Here's Rufus Pollock's intro, setting this open letter in a historical context:

The letter, of which I was a signatory, is focused on the change in the UK government’s position (from one of opposition to a term extension to, it appears, one of allowing an extension “perhaps to 70 years”). However, it is noteworthy that this is only one in a long line of well-nigh universal opposition among scholars to this proposal to extend copyright term.

For example, last April a joint letter was sent to the Commission signed by more than 30 of the most eminent European (and a few US) economists who have worked on intellectual property issues (including several Nobel prize winners, the Presidents of the EEA and RES, etc). The letter made very clear that term extension was considered to be a serious mistake (you can find a cached copy of this letter online here). More recently — only two weeks ago — the main European centres of IP law issued a statement (addendum) reiterating their concerns and calling for a rejection of the current proposal.

Despite this well-nigh universal opposition from IP experts the Commission put forward a proposal last July to extend term from 50 to 95 years (retrospectively as well as prospectively). That proposal is now in the final stages of its consideration by the European Parliament and Council. We can only hope that they will understand the basic point that an extension of the form proposed must inevitably do more harm than good to the welfare of the EU and should therefore be opposed.

Do read the letter too: the intellectual anger at this stupidity is palpable.

First, the government wants children to use social networking sites like Twitter:

Children will no longer have to study the Victorians or the second world war under proposals to overhaul the primary school curriculum, the Guardian has learned.

However, the draft plans will require children to master Twitter and Wikipedia and give teachers far more freedom to decide what youngsters should be concentrating on in classes.

Second, the government wants to monitor social networking sites like Twitter:

Social networking sites like Facebook could be monitored by the UK government under proposals to make them keep details of users' contacts.

Putting these together, we can deduce that the UK government has decided that passively monitoring people isn't enough: now it's time actively to train future generations in the fine art of being spied upon.

24 March 2009

As well as being a great coder, RMS is a fine writer (he made a number of excellent suggestions when I sent him rough drafts of the relevant chapter of Rebel Code). So it's a pity that he doesn't write much these days.

And it's also a red-letter day when he does, as with his latest missive: "The Javascript Trap". This describes a problem he has spotted: non-free Javascript.

It is possible to release a Javascript program as free software, by distributing the source code under a free software license. But even if the program's source is available, there is no easy way to run your modified version instead of the original. Current free browsers do not offer a facility to run your own modified version instead of the one delivered in the page. The effect is comparable to tivoization, although not quite so hard to overcome.

...


He comes up with some interesting solutions:

we need to change free browsers to support freedom for users of pages with Javascript. First of all, browsers should be able to tell the user about nontrivial non-free Javascript programs, rather than running them. Perhaps NoScript could be adapted to do this.

Browser users also need a convenient facility to specify Javascript code to use instead of the Javascript in a certain page.

As I've written elsewhere today, there's a lot of activity happening around software patents at the moment. One forum where they're being considered is WIPO.

The FSFE has put together a suitably diplomatic submission to that one of its committees about why software should not be patentable; here's the key section:

the economic rationale for patents is based on providing incentives in cases of market failure, disclosure of knowledge in the public domain, as well as technology transfer, commercialisation, and diffusion of knowledge. The “three step test for inclusion in the patent system” should therefore be based on demonstrated market failure to provide innovation, demonstrated positive disclosure from patenting, and effectiveness of the patent system in the area to disseminate knowledge. Software fails all three tests, for instance, as innovation in the IT industry has been dramatic before the introduction of patents, there is no disclosure value in software patents, and patents play no role in the diffusion of knowledge about software development.

I think this is one of the best summaries on the subject. One to cut out and keep.

Yesterday I was writing about the latest moves in the TomTom saga, and its involvement with the Open Invention Network patent commons. But beyond that specific case, patents – particularly software patents – really seem to be in the air at the moment....

as civilization collapses, we're going to see horrific scarcities, creating massive personal and collective stresses that will break both individuals (to the point of suicide, terrorism and murder) and nations (to the point of insurrection, civil war, and anarchy -- a hundred Afghanistans). We're going to see dreadful pandemic diseases and poverty and famine that will be utterly shattering, like the abject horror the world witnessed during the Irish potato famine where millions simply sat around, hopeless and increasingly gaunt, until they died an agonizing death alongside those they loved and couldn't save. We're going to see the kind of spiritual vacuum and decay that is eating Russia and the former Soviet republics alive today, with population and life expectancy plummeting, drug addiction at epidemic levels, and crime and gang violence out of control. It is nature's last and most reluctant way of restoring to sustainable populations species whose numbers and voraciousness have run amok.

Or, as an alternative, we could be sensible and tackle the problems facing us - climate change, deforestation, overfishing, overpopulation, peak oil, peak water, poverty - seriously, not with political posturing and soundbites, and maybe come out the other side.

Ada Lovelace Day is an international day of blogging to draw attention to women excelling in technology.

Women’s contributions often go unacknowledged, their innovations seldom mentioned, their faces rarely recognised. We want you to tell the world about these unsung heroines. Entrepreneurs, innovators, sysadmins, programmers, designers, games developers, hardware experts, tech journalists, tech consultants. The list of tech-related careers is endless.

Recent research by psychologist Penelope Lockwood discovered that women need to see female role models more than men need to see male ones. That’s a relatively simple problem to begin to address. If women need female role models, let’s come together to highlight the women in technology that we look up to. Let’s create new role models and make sure that whenever the question “Who are the leading women in tech?” is asked, that we all have a list of candidates on the tips of our tongues.

Not surprisingly, my first thought was: who have we got in the world of free software? There are certainly some big names, like Mitchell Baker, Chief Lizard Wrangler of Mozilla, and Stormy Peters, Executive Director of the GNOME Foundation.

But notice that both of these occupy executive positions: they hack business/legal/social systems. And while there are plenty of female coders contributing to free software projects, I can't think of any high-profile ones that might stand alongside the obvious alpha males in the coding world.

Now, this is probably due to my ignorance as much as anything. So I'd like to put out a call for names that I ought to know in this context - women who code at a high level, and whose names I should be mentioning more often. And as a pendant I'd also be interested in people's thoughts as to how we can nurture more top-flight female hackers.

23 March 2009

The United States has unveiled an unlikely weapon in its battle against drugs gangs and illegal immigrants at the Texas-Mexico border - pub-goers in Australia.

The drinkers are the most far-flung of a sizeable army of hi-tech foot soldiers recruited to assist the border protection effort.

Anyone with an internet connection can now help to patrol the 1,254-mile frontier through a network of webcams set up to allow the public to monitor suspicious activity. Once logged in, the volunteers spend hours studying the landscape and are encouraged to email authorities when they see anyone on foot, in vehicles or aboard boats heading towards US territory from Mexico.

But the important point here is not just the quaint locale: it is the fact that the observers are completely disconnected from the observed. There is no human connection, so there would be no compunction in reporting anything required.

This is the perfect surveillance system: not where your neighbours keep an eye on you, but where total strangers the other side of the world do. (Via The Reg.)

Major media companies are increasingly lobbying Google to elevate their expensive professional content within the search engine's undifferentiated slush of results.

Many publishers resent the criteria Google uses to pick top results, starting with the original PageRank formula that depended on how many links a page got. But crumbling ad revenue is lending their push more urgency; this is no time to show up on the third page of Google search results. And as publishers renew efforts to sell some content online, moreover, they're newly upset that Google's algorithm penalizes paid content.

Let's just get this right. The publishers resent the fact that the stuff other than "professional content" is rising to the top of Google searches, because of the PageRank algorithm. But wait, doesn't the algorithm pick out the stuff that has most links - that is, those sources that people for some reason find, you know, more relevant?

So doesn't this mean that the "professional content" isn't, well, so relevant? Which means that the publishers are essentially getting what they deserve, because their "professional content" isn't actually good enough to attract people's attention and link love?

And the idea that Google's PageRank is somehow "penalising" paid content by not ignoring the fact that people are reading it less than other stuff is just priceless. Maybe publishers might want to consider *why* their "professional content" is sinking like a stone, and why people aren't linking to it? You know, little things like the fact that it tends to regard itself as above the law - or the algorithm, in this case? (Via MicroPersuasion.)
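For anyone unfamiliar with how "link love" translates into ranking, the core idea behind the original PageRank formula is simple enough to sketch in a few lines. This is an illustrative toy on a made-up four-page graph (Google's production ranking is, of course, vastly more elaborate); the 0.85 damping factor follows the original Brin and Page paper:

```python
def pagerank(links, damping=0.85, iterations=50):
    """Toy power-iteration PageRank.

    links: dict mapping each page to the list of pages it links to.
    Returns a dict of page -> rank, summing to 1.
    """
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start with equal rank everywhere
    for _ in range(iterations):
        new = {p: (1 - damping) / n for p in pages}  # "random surfer" base share
        for p, outs in links.items():
            if outs:
                # a page passes its rank on, split equally among its outlinks
                share = damping * rank[p] / len(outs)
                for q in outs:
                    new[q] += share
            else:
                # dangling page: spread its rank evenly over all pages
                for q in pages:
                    new[q] += damping * rank[p] / n
        rank = new
    return rank

# Hypothetical graph: C has the most inbound links, so C ranks highest -
# exactly the "most linked wins" property the publishers are complaining about.
graph = {"A": ["B", "C"], "B": ["C"], "C": ["A"], "D": ["C"]}
ranks = pagerank(graph)
```

The point the publishers miss is visible even in this toy: rank flows along links, so content nobody links to sinks regardless of how "professional" it is.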

News that TomTom is joining the Open Invention Network (OIN) reminded me that the latter is an example of a patent commons, where patents are shared on a like-for-like basis:

OIN grants patent license to licensee

– All OIN patents and applications for all products

Licensee grants patent license to OIN

– All licensee patents and applications for the Linux System

Licensee grants license to other current and future licensees

– All licensee patents and applications for the Linux System
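Those reciprocal grants can be modelled as a small data structure, which makes the "commons" property obvious: by joining, a member licenses its Linux System patents to everyone, and in return gains a licence to OIN's patents and every other member's. This is purely illustrative - the class, methods, company and patent names below are all invented, and the real OIN licence terms are far more detailed:

```python
class PatentCommons:
    """Toy model of an OIN-style patent commons."""

    def __init__(self, oin_patents):
        self.oin_patents = set(oin_patents)  # licensed to members for all products
        self.member_patents = {}             # company -> its Linux System patents

    def join(self, company, linux_system_patents):
        """Joining contributes a member's Linux System patents to the pool."""
        self.member_patents[company] = set(linux_system_patents)

    def licences_held_by(self, company):
        """Patents a member may practise: OIN's plus every member's contribution.

        Non-members get nothing - the grants are reciprocal, not public domain.
        """
        if company not in self.member_patents:
            return set()
        held = set(self.oin_patents)
        for patents in self.member_patents.values():
            held |= patents
        return held

commons = PatentCommons(oin_patents={"OIN-1", "OIN-2"})
commons.join("TomTom", {"TT-nav"})
commons.join("OtherCo", {"OC-fs"})
```

After joining, TomTom holds licences not only to OIN's patents but to OtherCo's Linux System patents too, and vice versa - which is why the pool becomes more valuable with each new member.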

It's an interesting approach, and one that's gradually gaining adherents. For example, IBM set up something called the Eco-Patents Commons:

The Eco-Patent Commons is an initiative to create a collection of patents on technology that directly or indirectly protects the environment. The patents will be pledged by companies and other intellectual property rights holders and made available to anyone free of charge.

What's interesting here is that one commons - that of eco-patents - is being used to protect another - the environment. There's more information about the idea in this post by someone who works for IBM and is involved in the project.

The 2003 SCO lawsuit, for those of you too young to recall, began as a modest request for $1 billion from IBM for allegedly “misusing and misappropriating SCO’s proprietary software” amongst other things....

A recurrent theme in these posts – and throughout Computerworld UK – has been the rise of vast, unnecessary and ultimately doomed databases in the UK.

But those stories have been largely sporadic and anecdotal; what has been lacking has been a consolidated, coherent and compelling analysis of what is going on in this area – what is wrong, and how we can fix it.

That analysis has just arrived in the form of the Database State report, commissioned by the Joseph Rowntree Foundation from the Foundation for Information Policy Research (FIPR).

22 March 2009

The little brouhaha concerning the Guardian and Barclays Bank is a wonderful object lesson in how the Internet changes everything. Once those super-secret documents were put up for even a few seconds, the game was over: taking them down from the Guardian afterwards really is the proverbial closing of the stable door after the horse has bolted.

Inevitably, a copy has made its way to Wikileaks; inevitably that link is being exposed all over the place, which has led to the site being overloaded (do make a donation if you can: I've given my widow's mite). Barclays Bank can apply for as many injunctions as they like, the judge can - and probably will - huff and puff as much as he/she likes, but the game's over: this stuff is out.

And quite right too: these documents either show the bank engaged in something dodgy, in which case they should be published, or they don't, in which case there's no problem in them being public anyway, since the bank is asking for serious scads of public dosh, and is effectively being part-nationalised.

But even if it weren't, it would be folly to try to keep them secret now: it would only ensure that even more people write about them, and point to them, and maybe even read them. The rules have changed.

Terrorism threatens the rights that all in this country should hold dear, including the most fundamental human right of all - the right to life. We know that terrorists will keep on trying to strike and that protecting Britain against this threat remains our most important job.

That tired old Blairite trope: the "right to life" as the "the most fundamental human right of all". Except that it's not a *right*: do I have a right to life when I'm suffering from a terminal disease? Do I have a right to life when I'm 123 years old? Do I have a right to life when the Sun explodes? "Right to life": an idiotic meme, which certainly has no "right to life".

What he should have said is this:

This government threatens the rights that all in this country should hold dear, including the most fundamental human right of all - freedom. We know that this government will keep on trying to strike and that protecting Britain against this threat remains your most important job.

[Via Google Translate: Richard Stollman. In 1990, he announced a crusade against Microsoft and several other whale computer business. He cracks the sites where the proposed purchase new software. And then handed out to the people free.]

Not quite sure why the newspaper has the word "Pravda" - truth - in its title given the utter incorrectness of this from just about every viewpoint. (Via Stargrave's blog.)

In the wake of news that Australia's blacklist has been leaked, I came across these interesting comments:

"Because this is a secret that has been leaked, everyone will be after it.”

“Every Australian will want to know what it was they were considered too irresponsible to be left alone with.”

He also said the leak is proof that the list will be continually leaked if Internet content filters are enforced, which will completely undermine their effectiveness.

Of course, this is the classic Streisand effect applied to blacklists: as soon as you create one, everyone wants to know what's on it - and some will manage to find out despite the blacklist, thus ensuring that those sites get far more traffic than if no blacklist had been created.

So blacklists are actually one of the most foolish ways of trying to censor: I wonder how long it will take the authorities to work this one out?

I'd not paid much attention to the Japanese "fashion" robot, until I came across this:

Japan's National Institute of Advanced Industrial Science and Technology (AIST) has demonstrated a Linux-based humanoid robot that will perform in a fashion show next week. The HRP-4C runs the robotics-focused hard real-time ART-Linux distro, which was released this week for Linux 2.6.x under the GPL.

The HRP-4C robot and the open-source ART-Linux distro (see more farther below) were developed by AIST's Human Robotics Group (HRG). ART (Advanced Real-Time) Linux has been used in a variety of humanoid robot prototypes from the Japanese government-backed HRG/AIST, says the group. The newest HRP-4C model announced earlier this week has been a hit on YouTube (see below). Designed to look like a young Japanese woman, the robot stands (and walks) about five feet, two inches (158 centimeters), and weighs about 95 pounds (43 kilograms).

20 March 2009

"The economy is helpful. Paying an extra $500 for a computer in this environment -- same piece of hardware -- paying $500 more to get a logo on it? I think that's a more challenging proposition for the average person than it used to be."

I think this is a very frank analysis of the problem for Microsoft: after all, who's going to pay extra money just to get the Windows logo on a netbook, when they can get the same features for less with free software...?

One of the signs of a healthy ecosystem is that it is constantly expanding into new niches. Here's a new angle on openness I hadn't come across before - a site devoted to the *teaching* of open source coding skills:

Open Source is becoming a dominant development model in the software industry. The next generation of software developers, computer scientists, system administrators, analysts, and build engineers need to understand Open Source and must be able to work efficiently within Open Source communities.

This is a neutral collaboration point for professors, institutions, communities, and companies to come together and make the teaching of Open Source a global success.

It already has its own planet of associated blogs, alongside a host of other useful content.

One of the many disgraceful aspects about the disgraceful ID card programme is the reluctance of the UK government to make key documents available. For such a momentous change in the relationship of government to governed, it is critically important that a full debate about all the issues be conducted; but without key details of the scheme, that is made more difficult – which is presumably why the UK government has resisted the publication of the so-called “Gateway reviews” so long.

Finally, though, we have gained the right to see these somewhat outdated documents. Despite their age, and the unnecessary redactions, some useful new information has come to light, which more than justifies the long battle to gain access.

There's some fine writing coming out of the current newspaper crisis. Here's some more, from one of my favourite thinkers, Yochai Benkler. He's replying here to an earlier article in The New Republic; two paragraphs in particular caught my attention:

Critics of online media raise concerns about the ease with which gossip and unsubstantiated claims can be propagated on the Net. However, on the Net we have all learned to read with a grain of salt between our teeth, like Russians drinking tea through a sugar cube. The traditional media, to the contrary, commanded respect and imposed authority. It was precisely this respect and authority that made The New York Times' reporting on weapons of mass destruction in Iraq so instrumental in legitimating the lies that the Bush administration used to lead this country to war. Two weeks ago and then last Friday, The Washington Post was still allowing George Will to make false claims about the analysis of a scientific study of global sea ice levels without batting an eyelid, reflecting the long-standing obfuscation of the scientific consensus on the causes of climate change by newspapers that, in the name of balanced reporting, reported the controversy rather than the actual scientific consensus. On some of these, the greatest challenges of our time, newspapers have failed us. The question then, on the background of this mixed record is whether the system that will replace the mass mediated public sphere can do at least as well.

Absolutely: newspapers have their virtues, but as Benkler says, they certainly have their vices too. So criticising potential weaknesses in nascent news forms is perilously close to the pot calling the kettle black.

This other point also struck a chord (well, it would do, wouldn't it?):

Like other information goods, the production model of news is shifting from an industrial model--be it the monopoly city paper, IBM in its monopoly heyday, or Microsoft, or Britannica--to a networked model that integrates a wider range of practices into the production system: market and nonmarket, large scale and small, for profit and nonprofit, organized and individual. We already see the early elements of how news reporting and opinion will be provided in the networked public sphere.

19 March 2009

As part of a global conspiracy of Glyns, Glyn Wintle has kindly pointed me to this very interesting decision from those fun-loving German judges in Wiesbaden:

As the first German court, the Administrative Court of Wiesbaden has found the blanket recording of the entire population's telephone, mobile phone, e-mail and Internet usage (known as data retention) disproportionate.

The decision published today by the Working Group on Data Retention (decision of 27.02.2009, file 6 K 1045/08.WI) reads: "The court is of the opinion that data retention violates the fundamental right to privacy. It is not necessary in a democratic society. The individual does not provoke the interference but can be intimidated by the risks of abuse and the feeling of being under surveillance [...] The directive [on data retention] does not respect the principle of proportionality guaranteed in Article 8 ECHR, which is why it is invalid."

Now, IANAL, and certainly not a German one, but it seems likely to me that the Administrative Court is not the highest authority in the land (which would be something like the Federal Constitutional Court), so there's probably lots of to-ing and fro-ing still to come on this before a definitive decision is reached. But it's certainly a good start, since that judgment is in tune with common sense: data retention is disproportionate and violates privacy.

Last week, I wondered whether I'd gone back in time. Everywhere I went online – on news sites, blogs and Twitter – people were celebrating the 15th birthday of Linux, it seemed. “How is this possible?” I asked myself. “Since Linux was started in 1991, that must mean we are in 2006: have I fallen through a wormhole into the past?”

18 March 2009

The Guardian has a nice story about the unexpected success of the "Keep Calm and Carry On" poster. But what struck me was the following:

This was the third in a series. The first, designed to stiffen public resolve ahead of likely gas attacks and bombing raids, was printed in a run of more than a million and read: Your Courage, Your Cheerfulness, Your Resolution Will Bring Us Victory. The second, identically styled, stated: Freedom Is In Peril.

Maybe this is the way to get the huddled masses roused up against the insane and disproportionate Interception Modernisation Programme:

The U.K. government is considering the mass surveillance and retention of all user communications on social-networking sites, including Facebook, MySpace, and Bebo.

Vernon Coaker, the U.K. Home Office security minister, on Monday said the EU Data Retention Directive, under which Internet service providers must store communications data for 12 months, does not go far enough. Communications such as those on social-networking sites and via instant-messaging services could also be monitored, he said.

"Social-networking sites such as MySpace or Bebo are not covered by the directive," said Coaker, speaking at a meeting of the House of Commons Fourth Delegated Legislation Committee. "That is one reason why the government (is) looking at what we should do about the Intercept(ion) Modernisation Programme, because there are certain aspects of communications which are not covered by the directive."

Monitoring Facebook, MySpace, Bebo: just think of the embarrassing/illegal things you've mentioned there. Now, do you really want a bunch of control freaks sifting through *that* little lot?

The Home Office has admitted that it has been trying to force ISPs to subscribe to the Internet Watch Foundation's (IWF) blacklist, even though it doesn't know what the organisation does.

Speaking exclusively to Computer Shopper, a Home Office spokesman said he thought the IWF deletes illegal websites, and doesn't look at the content it rates.

He also revealed that the government's measures to ensure that the IWF is blocking illegal content only consist of "meeting with the IWF fairly regularly for updates on how they're doing."

Against the background of countries like Australia secretly blocking Wikileaks, this use of unappointed censors that are never questioned or even checked by any kind of review body is really getting dire. When will these politicians come to their senses?

[Via Google Translate: The company ALT Linux and OpenGO (Ventox Boundless Brasil) announce the opening of an ALT Linux office in Brazil.

This move reflects ALT Linux's desire to gather the widest possible range of developers around its Sisyphus repository, and is part of a strategy to expand into other countries' markets.

ALT Linux products and services are now available in Latin America.

ALT Linux has been attracting attention from Latin America for some time, driven by interest in its products and implementation experience on the one hand, and the attractiveness of its open architecture on the other.

Together with the Portuguese version of the site, ALT Linux Brasil is opening an online store selling ALT Linux distributions localised for Latin America, along with support services for those distributions. The office will focus on working with government agencies and educational institutions, as well as with businesses, providing technical support, implementation and training services.]

Yesterday, I wrote about the launch of the open source company Cloudera. It's always hard to tell whether startups will flourish, but among the most critical factors for survival are the skills of the management team. The fact that less than three hours after I sent out some questions about Cloudera to Mike Olson, one of the company's founders, I had the answers back would seem to augur well in this respect.

Olson explains the background to the company, and to Hadoop, the software it is based on: what it does, and why business might want to use it; he talks about his company's services and business model, and why he thinks cloud computing is neither a threat nor an opportunity for open source.

Interesting initiative from the European Telecommunications Network Operators' Association (ETNO):

ETNO is launching a new online content web site today, to raise awareness of attractive online offers put on the market by its members throughout Europe to download music, films or watch TV. ETNO members believe that offering a wide choice of online services is the best way to promote a legitimate use of the Internet and fight against illicit file-sharing.

The new ETNO web site gives a non-exhaustive overview of services available including IP TV, video on demand or music downloads, offered by ETNO members through different platforms and devices to meet user’s demands.

"User-demand for content is the basis of our actions. ETNO members develop and promote business models for content online offers, including music, films and TV. This list will of course need to be continuously updated,” says Patrik Hiselius, TeliaSonera, Chair of ETNO’s Content Working Group.

Increasing choice of legitimate content online and raising awareness among users are the best instruments to fight against illicit file sharing.

“Illicit file sharing represents a major burden for all stakeholders, including internet service providers. Education is key. Users should not be unreasonably criminalised or stigmatised. Through this new web site, ETNO members show their commitment to play their part and cooperate with rightsholders under the existing legal framework, in a scenario where choice and availability for the consumer, and rights and privacy for the citizen are all fully guaranteed”, added Bartholomew.

ETNO calls on policy makers and stakeholders to work together in order to ensure the wide availability of legitimate content offerings and to enable new creative market-driven business models to emerge.

This isn't perfect - I have problems with this "illicit file sharing", and the phrasing of "users should not be unreasonably criminalised or stigmatised", but what's interesting is that it shows an awareness of the broader issues, and of the fact that customers have rights as well as holders of intellectual monopolies. It suggests to me that the telecoms companies are beginning to understand that things are changing, and are beginning to change their own stance in response too.

Yesterday I wrote about India's opposition backing open source more fully, and here's a good update on what's happening in South Africa:

“The move to open source software has not been as fast as we would have liked, but we are now entering a new era. In the past, open source deployments were mostly spontaneous and ad-hoc. We now have a more systematic approach.” In years past many government departments pursued their own open source migrations, usually in isolation from one another, and with varying degrees of success.

...

Now, says Webb, the State IT Agency (Sita) is assuming the role of paving the way for OSS migration by finalising standards and conducting pilot projects to make it easier for all to implement open source software successfully. ... Webb also says that Sita expects all government department websites to be running on open source software “very soon”.

There's clearly a pattern emerging here around the world, as governments and their loyal oppositions first experiment with open source, and then commit to it wholeheartedly.

16 March 2009

One of the most exciting experiences in blogging is when a post catches fire - metaphorically, of course. Often it happens when you least expect it, as is the case with my rant about Science Commons working with Microsoft, which was thrown off in a fit of pique, without any hope that anybody would pay much attention to it.

Fortunately, it *was* picked up by Bill Hooker, who somehow managed to agree and disagree with me in a long and thoughtful post. That formed a bridge for the idea into the scientific community, where Peter Murray-Rust begged to differ with its thesis.

Given all this healthy scepticism, I was delighted to find that Peter Sefton is not only on my side, but has strengthened my general point by fleshing it out with some details:

Looking at the example here and reading Pablo’s Blog I share Glyn Moody’s concern. They show a chunk of custom XML which gets embedded in a word document. This custom XML is an insidious trick in my opinion as it makes documents non-interoperable. As soon as you use custom XML via Word 2007 you are guaranteeing that information will be lost when you share documents with OpenOffice.org users and potentially users of earlier versions of Word.

He also makes some practical suggestions about how the open world can work with Microsoft:

In conclusion I offer this: I would consider getting our team working with Microsoft (actually I’m actively courting them as they are doing some good work in the eResearch space) but it would be on the basis that:

* The product (eg a document) of the code must be interoperable with open software. In our case this means Word must produce stuff that can be used in and round tripped with OpenOffice.org and with earlier versions, and Mac versions of Microsoft’s products. (This is not as simple as it could be when we have to deal with stuff like Sun refusing to implement import and preservation for data stored in Word fields as used by applications like EndNote.)

The NLM add-in is an odd one here, as on one level it does qualify in that it spits out XML, but the intent is to create Word-only authoring so that rules it out – not that we have been asked to work on that project other than to comment, I am merely using it as an example.

* The code must be open source and as portable as possible. Of course if it is interface code it will only work with Microsoft’s toll-access software but at least others can read the code and re-implement elsewhere. If it’s not interface code then it must be written in a portable language and/or framework.

As I've noted before, you can tell open source has entered the mainstream when political parties try to outbid each other in establishing their open credentials. Further evidence of this trend now comes from India, where the Bharatiya Janata Party (BJP), the largest opposition party, has released its “IT Vision” document, which includes a healthy chunk of openness....

One of the favourite tropes in the music industry is that they'd all be rolling in it like the good old days if it weren't for those nasty people downloading music for free. Here's a perceptive analysis that explains why that isn't so:

the newspaper industry is in the same death spiral as the recording industry, without the lawbreaking that’s commonly blamed for the recording industry’s troubles. And it seems to me that this poses a philosophical challenge to DeLong’s theory that the problem is a lack of respect for “property rights.” The decline of the newspapers is clearly a story of technological progress producing increased competition and entrepreneurship—precisely the sort of thing libertarians normally celebrate. The news business has gotten far more competitive over the last decade, and we’re now seeing a normal shake-out where the least efficient firms go out of business.

I think the fact that this is happening in an industry without a piracy problem should give us second thoughts about blaming the decline of other copyright industries on BitTorrent. The newspaper example suggests that even if we could completely shut down peer-to-peer networks, we should still expect the recording industry to decline over time as consumers gravitate toward more efficient and convenient sources of music. Piracy obviously accelerates the process, but the underlying problem is simply this: the recording industry’s core competence, pressing 1s and 0s on plastic disks and shipping them to retail stores, is rapidly becoming pointless, just as the newspaper industry’s core competence of pressing ink on newsprint and dropping them on doorsteps is becoming obsolete. Not surprisingly, when a technology becomes obsolete, firms who specialize in exploiting that technology go out of business.

Beautifully written piece by Roger Lancefield, providing a lucid explanation of why free culture is both profound and inevitable. Here's the peroration:

So finally, free culture is nothing more than, and nothing less than, mankind’s natural propensity to communicate, collaborate and share. It is not a fad, it goes much deeper. Characterising it in narrow terms as a politically motivated cult, or as a commercially damaging movement is missing the big picture, for these things are not of its essence. It first and foremost is a technology-facilitated extension of our normal modes of behaviour — and this is why it is inevitable, profound and unstoppable.

Creating a business around free software is hardly a new idea: Cygnus Solutions, based around Stallman's GCC, was set up in 1989. But here's one with a trendy twist: a company based on the open source *cloud computing* app Hadoop, an Apache Project...

14 March 2009

Despite the good-natured ding-dong he and I are currently engaged in on another matter, Peter Murray-Rust is without doubt one of the key individuals in the open world. He's pretty much the godfather of the term "open data", as he writes:

Open Data has come a long way in the last 2-3 years. In 2006 the term was rarely used - I badgered SPARC and they generously set up a mailing list. I also started a page on Wikipedia in 2006, so it's 2-and-a-half years old.

The same post gives perhaps the best explanation of why open data is important; it's nominally about open data in science, but its points are valid elsewhere too:

* Scientists are funded to do research and to make the results available to everyone. This includes the data. Funders expect this. So does the world.

* The means of dissemination of data are cheap and universal. There is no technical reason why all the data in all the chemistry research in the world should not be published into the cloud. It’s small compared with movies…

* Data needs cleaning, filtering, repurposing, re-using. The more people who have access to this, the better the data and the better the science.

Open data is still something of a Cinderella in the open world, but as Peter's comments make clear, that's likely to change as more people realise its centrality to the entire open endeavour.

13 March 2009

I was fortunate enough to spend last Thursday with a group of LAMP engineers who have some experience with Windows Server and IIS, and who are based in Japan.

The three - Kimio Tanaka, the president of Museum IN Cloud; Junpei Hosoda, the president of Yokohama System Development; and Hajime Taira, with Hewlett-Packard Japan - won a competition organized by impress IT and designed to get competitive LAMP engineers to increase the volume of technical information around PHP/IIS and application compatibility. The competition was titled "Install Maniax 2008".

A total of 100 engineers were chosen to compete and seeded with Dell server hardware and the Windows Web Server 2008 operating system. They were then required to deploy Windows Server/IIS and make the Web Server accessible from the Internet. They also had to run popular PHP/Perl applications on IIS and publish technical documentation on how to configure those applications to run on IIS.

The three winners were chosen based on the number of ported applications on IIS, with the prize being a trip to Redmond. A total of 71 applications out of the targeted 75 were ported onto IIS, of which 47 were newly ported to IIS, and related new "how to" documents were published to the Internet. Some 24 applications were also ported onto IIS based on existing "how to" documents.

So let's just deconstruct that, shall we?

A competition was held in Japan "to get competitive LAMP engineers to increase the volume of technical information around PHP/IIS and application compatibility"; they were given the challenge of getting "popular PHP/Perl applications on IIS", complete with documentation. They "succeeded" to such an extent that "71 applications out of the targeted 75 were ported onto IIS, of which 47 were newly ported to IIS".

But that wasn't the real achievement: the real result was that a further 47 PHP/Perl apps were ported *from* GNU/Linux (LAMP) *to* Windows - in effect, extracting the open source solutions from the bottom of the stack, and substituting Microsoft's own software.

This has been going on for a while, and is part of a larger move by Microsoft to weaken the foundations of open source - especially GNU/Linux - on the pretext that they are simply porting some of the top layers to its own stack. But the net result is that it diminishes the support for GNU/Linux, and makes those upper-level apps more dependent on Microsoft's good graces. The plan is clearly to sort out GNU/Linux first, before moving on up the stack.

It's clever, and exactly the sort of thing I would expect from the cunning people at Microsoft. That I understand; what I don't get is why these LAMP hackers are happy to cut off the branch they sit on by aiding and abetting Microsoft in its plans. Can't they see what's being done to their LAMP?

Although Tim Berners-Lee made his “Information Management” proposal back in March 1989, the key moment for what became the World Wide Web was October 1994, when the start-up Mosaic Communications – later known as Netscape – released its browser, optimised for PC users and dial-up modems....

12 March 2009

Russia is rapidly turning into open source's best-kept secret. A little while back I wrote about plans to roll out free software to all schools; more recently, there has been talk about creating a Russian operating system based on Fedora. And now there's this:

[Via Google Translate: The site of Russia's Ministry of Communications (Minkomsvyazi) has published a draft document on the transition of state authorities to free software. The document, «Guidelines for the development and acquisition of software for use in public authorities and budget organizations», recommends that state authorities and budgetary institutions give preference to free software when selecting software, except where free software does not have the necessary functionality.

...

It has also published a draft plan for the use of free software by government bodies and budget-funded agencies. The plan includes a number of actions required for the phased introduction of free software in the Russian government, including the training of public servants, a pilot introduction project, and support for the development of free software in Russia.]

Aside from the scale of these plans, which foresee all Russian government departments using free software, and civil servants being trained in its use (a shrewd move), what's particularly interesting is the formulation that open source will be the default except where it does not have the necessary functionality. This approach has been adopted elsewhere, and is reasonable enough, although it's important not to allow lock-in to proprietary formats to lock out open source solutions based on open standards.

Whatever happens in detail, Russia's announcement is not only important in itself, but also provides a useful addition to the roster of governments making the switch to free software. As the latter grows, so will the pressure on other countries to follow suit.

11 March 2009

One of the things that disappoints me is the lack of understanding of what's at stake with open source among some of the other open communities. For example, some in the world of open science seem to think it's OK to work with Microsoft, provided it furthers their own specific agenda. Here's a case in point:

John Wilbanks, VP of Science for Creative Commons, gave O'Reilly Media an exclusive sneak preview of a joint announcement that they will be making with Microsoft later today at the O'Reilly Emerging Technology Conference.

According to John, who talked to us shortly after getting off a plane from Brazil, Microsoft will be releasing, under an open source license, Word plugins that will allow scientists to mark up their papers with scientific entities directly.

"The scientific culture is not one, traditionally, where you have hyperlinks," Wilbanks told us. "You have citations. And you don't want to do cross-references of hyperlinks between papers, you want to do links directly to the gene sequences in the database."

Wilbanks says that Science Commons has been working for several years to build up a library of these scientific entities. "What Microsoft has done is to build plugins that work essentially the same way you'd use spell check, they can check for the words in their paper that have hyperlinks in our open knowledge base, and then mark them up."
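The plugin behaviour Wilbanks describes - scan the words of a paper against an open knowledge base of scientific entities, much as a spell-checker scans against a dictionary, and mark up the matches as links - can be sketched in a few lines. This is purely an illustrative toy, not the actual plugin code; the entity dictionary and URLs below are invented for the example:

```python
import re

# A toy "open knowledge base": entity term -> canonical database URL.
# These entries and URLs are invented for illustration only.
KNOWLEDGE_BASE = {
    "BRCA1": "https://example.org/gene/BRCA1",
    "p53": "https://example.org/gene/p53",
}

def mark_up(text):
    """Link each known entity term to its database entry, working like a
    spell-check pass: compare the document's words against the knowledge
    base and mark up the ones that match."""
    for term, url in KNOWLEDGE_BASE.items():
        # \b matches whole words only, so "p53" won't match inside "p530".
        pattern = re.compile(r"\b%s\b" % re.escape(term))
        text = pattern.sub('<a href="%s">%s</a>' % (url, term), text)
    return text

print(mark_up("Mutations in BRCA1 interact with p53 pathways."))
```

The real plugins presumably emit Word's own markup rather than HTML, but the principle - dictionary lookup against a shared, openly licensed entity database - is the same.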

That might sound fine - after all, the plugins are open source, right? But no. Here's the problem:

Wilbanks said that Word is, in his experience, the dominant publishing system used in the life sciences, although tools like LaTeX are popular in disciplines such as chemistry or physics. And even then, he says, it's probably the place where most people prepare drafts: "almost everything I see when I have to peer review is in a .doc format."

In other words, he doesn't see any problem with perpetuating Microsoft's stranglehold on word processing. But it has consistently abused that monopoly by using its proprietary data formats to lock out commercial rivals or free alternatives, and push through pseudo-standards like OOXML that aren't truly open, and which have essentially destroyed ISO as a legitimate forum for open standards.

Working with Microsoft on open source plugins might seem innocent enough, but it's really just entrenching Microsoft's power yet further in the scientific community, weakening openness in general - which means, ultimately, undermining all the other excellent work of the Science Commons.

It would have been far better to work with OpenOffice.org to produce similar plugins, making the free office suite even more attractive, and thus giving scientists yet another reason to go truly open, with all the attendant benefits, rather than making do with a hobbled, faux-openness, as here.

Not surprisingly, given the nature of this blog, I'm pretty favourably disposed towards Google's Linux-based Android platform, even though I don't possess the only phone currently using it, T-Mobile's G1. But it's hard to tell just how well it's doing against the iPhone, say. If anyone knows, it's T-Mobile, so I was interested to receive this morning some tantalising tidbits from Richard Warmsley, head of Internet and Entertainment at T-Mobile UK.

10 March 2009

I've written much about open access on this blog; but generally that's been about open access to articles in academic journals. Another huge class of material paid for by the taxpayer is academic books. So, applying the same logic as for articles, shouldn't we all have free access to digital copies? That's what Free Our Books thinks:

Public funds pay for the vast majority of academic research; the results should therefore be public. Inexpensive electronic publishing should make this possible. But private publishing companies still own these results, and restrict access to them by charging exorbitant fees. In the case of academic journals, publishing companies are making huge profits by requiring publicly funded universities to pay very high subscription fees on behalf of students and academics.

We, the citizens, through the state, pay for the production of academic books and research papers twice, first through salaries and research grants, and second through the purchase of books and journal subscriptions. This is how the most fundamental principles of academia, to study and to share its findings, are obstructed, and its operation is made far more expensive and cumbersome. The good news is that this has been partially recognised, and Research Councils UK (RCUK) has pushed hard (2005) in the direction of both mandatory self-archiving (2006) of all research outputs and open access in general.

When it comes to books, however, the argument isn't as simple and straightforward as in the case of the Guardian's Free Our Data campaign - whose name we're reusing. Nor has it been problematised as widely as it has in the case of journals and the RCUK recommendations. The significant contribution of editors, subeditors, proofreaders and others working on the texts being produced (wages), and the personal gain of authors of best-selling works (share of sales), complicate the issue. In short, open access and self-archiving of publicly funded books, whose importance for the social sciences and humanities is enormous (unlike in physics and maths), is yet to be widely discussed, and there are no immediately obvious solutions. That is, unless we treat books, as we think we should, as just another form of research output - whether funded directly by one of the RCUK councils or by individual universities.

The "O" word has been much on the lips of the UK government recently, what with all the nice things it's been saying about open source, and now this:

The independent Power of Information Task Force published its report on 2 March. The report contained 25 challenging recommendations to government aimed at improving the use of information in this new world. The Task Force's work has been recognised internationally as providing a cutting-edge vision, with examples of what modern public service delivery might be.

The Government welcomes the task force’s vision, accepts its overall messages and will be responding on the detailed recommendations shortly. We are already taking steps to implement this vision and in 2009 we will seek to deliver the following:

Open information. To have an effective voice, people need to be able to understand what is going on in their public services. Government will publish information about public services in ways that are easy to find, easy to use, and easy to re-use, and will unlock data, where appropriate, through the work of the Office of Public Sector Information.

Open innovation. We will promote innovation in online public services to respond to changing expectations. The Government will seek to build on the early success of innovate.direct.gov.uk by building such innovation into the culture of public services and public sector websites.

Open discussion. We will promote greater engagement with the public through more interactive online consultation and collaboration. We will also empower professionals to be active on online peer-support networks in their area of work.

Open feedback. Most importantly, the public should be able to have a fair say about their services. The Government will publish best practice in engaging with the public in large numbers online, drawing on the experience of the www.showusabetterway.com competition and the www.londonsummit.gov.uk site, as well as leading private sector examples like www.ideastorm.com.

Open information, open innovation, open discussion, open feedback: well, that's just super-duper and fab and all that, but why not allow a little openness about what the UK government is doing? How about getting rid of the absurd Official Secrets Act, the very antithesis of openness? How about putting the teeth back in the Freedom of Information Act? How about not refusing to publish documents about the Iraqi war? How about letting us see details of MPs' expenses? How about letting us know where our MPs live? How about letting the public openly rate the government itself - the one group that seems excluded from the wonderful plans to "ebay-ise" UK public life?

Because, strange as it may seem, openness does not have hard lines: if you're going to be open, you have to be *really* open, everywhere. Otherwise, it just further debases an increasingly fashionable concept, takes our cynicism up a notch or three, and alienates those of us fighting for *real* openness.

For years, the content industries have been trying to get laws passed that would stop people sharing files. For years they failed. And then they came up with the "three strikes and you're out" idea - and it is starting to be adopted around the world. First we had France, then countries like Italy and Ireland - and now South Korea:

On March 3, 2009, the National Assembly's Committee on Culture, Sports, Tourism, Broadcasting & Communications (CCSTB&C) passed a bill to revise the Copyright Law. The bill includes the so-called "three strikes out" or "graduated response" provision.

...

The provision gives the Ministry the authority to order ISPs to send warning letters to users, to delete or stop the transmission of illegal reproductions, to suspend or terminate users' accounts, or to close bulletin boards. It also gives it the power to order information and telecommunication service providers to block connections to the networks of such ISPs.

...

The modified bill will be up for a vote in April, and it is most likely that it will pass the National Assembly and come into force in April.

What's the secret? Why has the "three strikes" idea caught on where others have failed? And what is the best way to stop it spreading further?

I would like to introduce Mark Stone, who will be a regular contributor to Port 25 going forward. Mark has a long association with open source.

He did his first Linux install in 1994 and, in the fifteen years since, has served as O'Reilly's executive editor for open source, editor-in-chief of the Journal of Linux Technology, publisher for the web arm of SourceForge's open source evangelism efforts, and later Director of Developer Relations for SourceForge.

During that time he helped Microsoft launch its first two open source projects on SourceForge.net. He has also co-edited two of the foundational books on open source: Open Sources and Open Sources 2.0.

At SourceForge, and as an independent consultant, he has worked with technology companies large and small to help them formulate their community engagement strategy around open source.

He has most recently been working at Microsoft to help identify and support community projects that advance open source on the Windows platform.

Alas for the well-intentioned souls in Redmond, such snuggling up to the open source community is rather vitiated by this kind of stuff.

At a time when most newspapers are talking doom and gloom, the Guardian is instead *doing* something - and thriving (maybe there's a correlation?). Here's its latest shrewd move:

The Guardian today launched Open Platform, a service that will allow partners to reuse guardian.co.uk content and data for free and weave it "into the fabric of the internet".

Open Platform launched with two separate content-sharing services, which will allow users to build their own applications in return for carrying Guardian advertising.

A content application programming interface (API) will smooth the way for web developers to build applications and services using Guardian content, while a Data Store will contain datasets curated by Guardian editors and open for others to use.
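To give a flavour of what such a content API makes possible, here is a minimal sketch of how a developer might build a search request against it. The endpoint, parameter names and API key here are assumptions for illustration only - they are not taken from the Guardian's actual 2009 documentation.

```python
# Hedged sketch: building a search query against a hypothetical
# content API. The endpoint and parameters below are assumed, not
# the Guardian's documented interface.
from urllib.parse import urlencode

def build_search_url(query: str, api_key: str) -> str:
    """Construct a search request URL for a content API."""
    base = "https://content.guardianapis.com/search"  # assumed endpoint
    params = {"q": query, "api-key": api_key}
    return f"{base}?{urlencode(params)}"

print(build_search_url("open source", "test"))
```

The point is the simplicity: any web developer who can fetch a URL and parse the response can weave the newspaper's content into their own application.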

So far, so conventional. Here's the important bit:

The Guardian is positioning its Open Platform as a commercial venture, requiring partners to carry its advertising as part of its terms and conditions, while BBC Backstage states clearly that its proposition is for individual developers and designers and not for "big corporates".

This is the future of content, which will be made available freely, but revenue-generating features will be bolted on to it as above. (Disclosure: I occasionally write for the Guardian; but not much.)

09 March 2009

If you're a fan of free data flow into and out of the government, Vivek Kundra seems like an ally. But we can't rest on our laurels. Now is exactly the time when lobbying for particular data and documents to be made accessible could be most effective.

We've established this wiki to help focus attention on valuable data resources that need to be made more accessible or usable. Do you know of a legacy dataset in danger of being lost? How about a set of Excel (or — shudder — Lotus 1-2-3) spreadsheets that would work better in another format? Data locked up in PDFs?

This is your place to report where government data is locked up by design, neglect or misapplication of technology. We want you to point out the government data that you need or would like to have. Get involved!

Based on what you contribute here, we'll follow up with government agencies to see what their plans are for that data — and track the results of the emerging era of Data.gov.

With your help, we can combine the best of new social media and old-school journalism to get more of the data we've already paid for in our hands.

We could do with something similar here: Free Our Data, are you listening?

About Me

I have been a technology journalist and consultant for 30 years, covering
the Internet since March 1994, and the free software world since 1995.

One early feature I wrote was for Wired in 1997:
The Greatest OS that (N)ever Was.
My most recent books are Rebel Code: Linux and the Open Source Revolution, and Digital Code of Life: How Bioinformatics is Revolutionizing Science, Medicine and Business.