29 August 2009

Two quotations from James Murdoch's speech at the Edinburgh International Television Festival:

So talking about a coming digital future, or a digital transformation, is to ignore the evidence that it has already happened. Why do I think we are getting this wrong? Why do I believe we need to change direction as a matter of urgency? It’s quite simple. Because we have analogue attitudes in a digital age.

Got that? "Analogue attitudes in a digital age." Now try this:

We don’t even have the basics in place to protect creative work. Whether it’s shoplifting at HMV or pirating the same movie online, theft is theft.

Er, what was that about analogue attitudes in a digital age, James....?

28 August 2009

The European Commission has published a review of the Europeana digital library (remember that?). There's one critically important section, which touches on the hot issue of digitising public domain content:

Much of the material accessible in digital format through Europeana is in the public domain; this means it is not or no longer covered by copyright and can in principle be accessed and used by all. This material is an important source of re-use by citizens and companies alike and a driver of creativity in the internet age. For this reason, the Commission has underlined the need to keep "public domain works accessible after a format shift. In other words, works in the public domain should stay there once digitised and be made accessible through the internet".

In practice this is not always the case. While some of the cultural institutions explicitly indicate that the material they bring into Europeana is in the public domain, others claim rights on the digitised copies and/or charge for downloads. A few institutions apply watermarks and, in one case, viewing the material in a reasonable size is subject to payment. The different practices reflect the wide range of approaches across the EU, which are sometimes dictated by increasing pressure on cultural institutions to raise direct income from the assets they hold. Requiring payment for digitised public domain works also reflects the fact that digitisation has a cost. At the same time it seriously limits the cultural and economic potential of the material.

From a legal point of view the question is whether digitisation in itself creates new rights. Normally this would not be the case. However, the level of originality needed for the creation of copyright is not harmonised at European level, so the answer to the question may differ from one Member State to another.8 It may also vary for different types of digitisation (for example the scanning of books is not the same as costly 3D rendering of objects).

The issue of principle is whether it is acceptable to lock up public domain material that has been digitised with public money by public institutions instead of turning it into a pervasive asset for the information society. The latter approach is in line with the Community policy on the re-use of public sector information, as well as the OECD Ministerial Recommendation on Enhanced Access and More Effective Use of Public Sector Information.9 This issue is essential for the functioning of Europeana, since in its conditions of use the site follows the policies of the contributing institutions.

Similar issues arise when public institutions grant exclusive arrangements to private firms for the digitisation and exploitation of their unique public domain assets in exchange for material advantages. Such arrangements risk locking up public domain content, but in some cases they may be the only way to finance digitisation. This dilemma was expressed by the High Level Group on Digital Libraries in its report on public private partnerships for digitisation. The Group recommended that "public domain content in the analogue world should remain in the public domain in the digital environment. If restrictions to users’ access and use are necessary in order to make the digital content available at all, these restrictions should only apply for a time-limited period."

This is a crucially important issue. At the moment, some publishers are trying to create a new copyright in public domain materials just because they have been digitised. This is not only absurd, but threatens to nullify much of the huge potential of turning analogue knowledge into digital form. The European Commission deserves praise for highlighting this danger: now it needs to do something about it by passing legislation that settles the issue once and for all. (Via At last ... the 1709 Copyright Blog.)

BT’s wifi network has reached half a million hotspots. Fon has made a major contribution toward its growth, since about 90% of the BT hotspots are BT Fon. The rate of growth is such that, together with BT, we are on the way to one million hotspots. This is the goal for February 2010.

Suppose, now, that people use some of those million hotspots to download copyright material: how easy is it going to be (a) establishing exactly who downloaded it and (b) cutting off that person?

Symbolics probably doesn't mean much to you, but it should. It was the main reason that Richard Stallman started the GNU project.

You can read the full story in Rebel Code - or, if by some mischance, you don't have the book to hand, in this speech by RMS. But to summarise an extremely complex tale, at first, Stallman fought Symbolics directly by matching their (proprietary) code with his own, which he gave to a rival; but later he realised that this was not really a sensible way of helping people to use and share software freely:

Once I stopped punishing Symbolics, I had to figure out what to do next. I had to make a free operating system, that was clear — the only way that people could work together and share was with a free operating system.

At first, I thought of making a Lisp-based system, but I realized that wouldn't be a good idea technically. To have something like the Lisp machine system, you needed special purpose microcode. That's what made it possible to run programs as fast as other computers would run their programs and still get the benefit of typechecking. Without that, you would be reduced to something like the Lisp compilers for other machines. The programs would be faster, but unstable. Now that's okay if you're running one program on a timesharing system — if one program crashes, that's not a disaster, that's something your program occasionally does. But that didn't make it good for writing the operating system in, so I rejected the idea of making a system like the Lisp machine.

I decided instead to make a Unix-like operating system that would have Lisp implementations to run as user programs. The kernel wouldn't be written in Lisp, but we'd have Lisp.

As well as provoking the creation of the free software movement, Symbolics has another claim to fame: it was the first registered domain name. Amazingly, only now is that name leaving its original owner:

Did you know the first .com domain name that was ever registered was Symbolics.com, on the 15th of March 1985 by the now defunct Massachusetts-based computer manufacturer Symbolics?

While the first that was created in January of that same year was Nordu.net (used to serve as the identifier of the first root server, nic.nordu.net), symbolics.com was the first domain name to actually be registered through the appropriate DNS process a few months later. This was of course long before there was a WWW, but you already had ‘the Internet’. In fact, the first TCP/IP-based wide-area network had already been operational for two years when nordu.net was created, right around the time the United States’ National Science Foundation (NSF) commissioned the construction of the legendary NSFNET, a university 56 kilobit/second network backbone. Only six companies thought it’d be a good idea to reserve the domain name on the root servers in 1985 (the others were bbn.com, think.com, mcc.com, dec.com and northrop.com). But Symbolics was first to make the move.

Remarkably, Symbolics.com hasn’t changed ownership once during the nearly 25 years that followed its initial registration. Marking an end to that era, domain name investment company XF.com Investments has just purchased the domain name for an undisclosed sum.

It's pretty extraordinary how all these trailblazing events were tied up together back then; pretty strange, too, how distant they all seem. And, of course, good for the world that ultimately it was RMS who won.

27 August 2009

This is so rich. The Criminal Records Bureau (CRB) is becoming less and less useful as it produces more and more errors; these arise in part because the CRB is getting far too big to be manageable as the insanely authoritarian UK government tries to get as many people as possible on it (currently 11 million and counting).

So what's the solution to having a broken system of surveillance? Use another one even more intrusive and even less useful:

Millions could be asked to provide ID card and fingerprint data to get a job under new systems being developed by the Home Office following a collapse in the accuracy of background checks.

News of the plans emerged in the response to a Register Freedom of Information Act request to the Criminal Records Bureau (CRB). Today campaigners warned it could be used to help impose ID cards through the back door.

The way that one failure is used to justify an even bigger one would be funny if it weren't so serious. Roll on the General Election...

Yesterday I wrote a quick analysis of the insane U-turn effected by the UK government over "three strikes and you're out". Below I've posted the corresponding letter that I've sent to my MP on the subject. I urge you to do the same if you're a Brit, since it's the only way we have of influencing the situation. I'm not holding my breath waiting for a result, but I feel it's my duty....

I am writing to express my deep disquiet at the UK government's U-turn over disconnecting those accused of sharing copyright materials on the Internet.

For the eminently sane and well-balanced conclusions of Lord Carter and his Digital Britain team, based on many months of hard work, to be thrown away in this manner is extraordinary. In the place of a carefully-considered view that access to the Internet is a right not to be removed lightly, and that doing so on the say-so of media companies would be an inappropriate response to alleged copyright infringement, we now have a diktat from on high that proposes precisely this punishment.

As the indecent haste clearly demonstrates, this has not been thought through.

First, it is completely disproportionate. Cutting off people's Internet connection for allegedly swapping copyright materials is not just, any more than cutting someone's electricity supply would be for watching the TV without a licence, or cutting someone's water supply off would be for brewing illegal spirits.

Secondly, it represents a fundamental assault on due process in this country. If people can be cut off from the most important communication medium of the 21st century on the whim of media companies, who do not even need to prove their accusations in court, then things have reached a pretty sorry state.

Thirdly, the approach won't work from a technical viewpoint. All it means is that the more tech-savvy will start encrypting their traffic; those who can't take this route will simply buy a few huge external hard discs – ones able to hold a quarter of a million songs cost around £50 these days – and swap files personally when they visit their friends.

Fourthly, the idea is at odds with European legislation. Amendment 138 of the Telecoms Package currently being finalised in Europe forbids the cutting off of users without judicial oversight. And that's even before the ISPs start taking legal advice on other ways in which it breaks relevant laws. Moreover, the European Court of Human Rights would probably have something to say about legislation that allows what Viviane Reding has explicitly called a “fundamental human right” (http://opendotdotdot.blogspot.com/2009/05/internet-access-is-fundamental-fight.html) to be taken away so easily.

What's particularly bizarre about this move is that those who will suffer the most are likely to be traditional Labour supporters. For it is the poor who cannot afford to pay for high-priced digital downloads, and may therefore look for material on P2P networks. It is the poor who may well share an Internet across several families using a wifi connection in a block of flats, for example. If one user is accused of swapping copyright materials, several families will be severely disadvantaged – hardly something that fits with Labour's historical mission to help precisely these people.

For all these reasons – assuming this truly is a consultation and not just another rubber-stamping – I urge you to join your colleague, Tom Watson (http://www.tom-watson.co.uk/2009/08/filesharing-revised-consultation/), in passing on to Lord Mandelson and Stephen Timms the comments of myself and others who may write to you on this subject.
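As an aside, the hard-disc arithmetic in the third point of that letter is easy to verify. The figure of roughly 4 MB per compressed song is my own assumption, not one from the letter:

```python
# Back-of-the-envelope check: how much storage do 250,000 songs need?
# Assumes an average compressed song of ~4 MB (my assumption, not a
# figure from the letter).
songs = 250_000
mb_per_song = 4
total_tb = songs * mb_per_song / 1_000_000  # decimal MB -> TB
print(f"{total_tb} TB")
```

That comes to 1 TB, which is indeed the capacity of an external drive selling for around £50 in 2009.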

26 August 2009

Yet again, Cameron Neylon is daring to ask the unasked questions that *should* be asked:

Many of us have one or two papers in journals that are essentially inaccessible, local society journals or just journals that were never online, and never widely enough distributed for anyone to find. I have a paper in Complex Systems (volume 17, issue 4 since you ask) that is not indexed in Pubmed, only available in a preprint archive and has effectively no citations. Probably because it isn’t in an index and no-one has ever found it. But it describes a nice piece of work that we went through hell to publish because we hoped someone might find it useful.

Now everyone agreed, and this is what the PLoS ONE submission policy says quite clearly, that such a paper cannot be submitted for publication. This is essentially a restatement of the Ingelfinger Rule. But being the contrary person I am I started wondering why. For a commercial publisher with a subscripton business model it is clear that you don’t want to take on content that you can’t exert a copyright over, but for a non-profit with a mission to bring science to wider audience does this really make sense? If the science is currently unaccessible and is of appropriate quality for a given journal and the authors are willing to pay the costs to bring it to a wider public, why is this not allowed?

Why not, indeed? For as Neylon points out:

If an author feels strongly enough that a paper will get to a wider audience in a new journal, if they feel strongly enough that it will benefit from that journal’s peer review process, and they are prepared to pay a fee for that publication, why should they be prevented from doing so? If that publication does bring that science to a wider audience, is not a public service publisher discharging their mission through that publication?

Which is only possible, of course, in open access journals adopting a funder-pays approach, since traditional publishers need to be able to point to the uniqueness of their content if they are trying to sell it - after all, why would you want to buy it twice? Open access journals have no such imperative, since they are giving it away, so readers have no expectation that the material is unique and never seen before.

An analogy I and others frequently use in discussing the media industries' refusal to consider new business models is that of the transition to the motor car, particularly with reference to obsolete accoutrements for horse carriages. But I've just read an article from TorrentFreak that links to a rather interesting Wikipedia page about the Locomotive Act - the one that required a person to walk in front of a car with a red flag. It had this interesting paragraph:

Under pressure from motor car enthusiasts, including Coventry manufacturer Harry J. Lawson, the government introduced the Locomotives on Highways Act 1896, which became known as The Emancipation Act, which defined a new category of vehicle light locomotives, which were vehicles under 3 tons unladen weight. These vehicles were exempt from the 3 crew member rule, and were subject to the higher 14 mph (22 km/h) speed limit.[5] In celebration of the Emancipation Act Lawson organised the first London to Brighton run.

The relaxation of usage restrictions eased the way for the development of the British motor industry.

Nearly one and a half centuries later the motoring journalist and author L. J. K. Setright speculated that the Locomotive Acts were put in place to suppress motor car development in the United Kingdom, because of the financial interests that some members of government and other establishment personalities had in the development and viability of the railway industry.

Although the newest oil rigs, which cost upward of $1 billion apiece, might be loaded with cutting-edge robotics technology, the software that controls a rig's basic functions is anything but. Most rely on the decades-old supervisory control and data acquisition (SCADA) software, written in an era when the "open source" tag was more important than security, said Jeff Vail, a former counterterrorism and intelligence analyst with the U.S. Interior Department. "It's underappreciated how vulnerable some of these systems are," he said. "It is possible, if you really understood them, to cause catastrophic damage by causing safety systems to fail."

Sorry, old chap, but "open source" and "security" are orthogonal, independent axes. And this, from the same article:

"The worst-case scenario, of course, is that a hacker will break in and take over control of the whole platform," Jaatun said. That hasn't happened yet, but computer viruses have caused personnel injuries and production losses on North Sea platforms, he noted.

25 August 2009

I'm struck by the almost unanimous chorus of indifference that has greeted the news that a court has reversed one part of an earlier ruling regarding who owns the Unix copyright:

For the foregoing reasons, we AFFIRM the district court’s judgment with regards to the royalties due Novell under the 2003 Sun-SCO Agreement, but REVERSE the district court’s entry of summary judgment on (1) the ownership of the UNIX and UnixWare copyrights; (2) SCO’s claim seeking specific performance; (3) the scope of Novell’s rights under Section 4.16 of the APA; (4) the application of the covenant of good faith and fair dealing to Novell’s rights under Section 4.16 of the APA. On these issues, we REMAND for trial.

Add me to that list: SCO still has everything to prove, and very little money to prove it with. And even if it *did* prove anything, all it would gain would be the right to be ground into very fine particles of dust by IBM's legal department....

It's really striking how the idea of open government has gone from nowhere a few months ago to hot meme of the moment. Here's the latest convert - Sweden:

Opengov.se is an initiative to highlight available public datasets in Sweden. It contains a commentable catalog of government datasets, their formats and usage restrictions. The percent figure on the start page indicates the share of datasets that are available with an open license and in at least one open format.

The goal is to highlight the benefits of open access to government data and explain how this is done in practice.

It's interesting that the site links to the US Open Government Working Group, which wrote:

8 December 2007 - This weekend, 30 open government advocates gathered to develop a set of principles of open government data. The meeting, held in Sebastopol, California, was designed to develop a more robust understanding of why open government data is essential to democracy.

The Internet is the public space of the modern world, and through it governments now have the opportunity to better understand the needs of their citizens and citizens may participate more fully in their government. Information becomes more valuable as it is shared, less valuable as it is hoarded. Open data promotes increased civil discourse, improved public welfare, and a more efficient use of public resources.

The group is offering a set of fundamental principles for open government data. By embracing the eight principles, governments of the world can become more effective, transparent, and relevant to our lives.

Since Sweden currently holds the presidency of the EU, it would be good if it spread a little of that openness there, too.

22 August 2009

As long-suffering readers of this blog will know, I insist on calling patents and copyrights "intellectual monopolies". That's mainly because it is a better description of what they are; but there's another reason, which becomes clear from this post by a pro-monopolist who is conducting a revealing exchange with William Patry on his new copyright blog:

Bill --

You want to redirect the conversation to the question "why do copyright owners insist on describing copyright as a property right, rather than say as a regulatory privilege or a tort?" Fair enough. If I took a bit more time for research, I could probably come up with a very sophisticated answer, digging up 18th Century texts to support my position. But instead I'll give you a much simpler one, one that might not satisfy the philosophers, historians, and economists among your readers, but one that happens to be accurate (and will probably work for most lawyers): Because pretty much everyone refers to copyright as a form of property. Contrary to the premise of your post, it's not just "copyright owners" who use the term "property" in this context; it's exceedingly common for those on all points of the copyright spectrum.

There we have it: the more opponents collude by using the "eye-pea" term, the more the monopolists can point to this as "proof" that copyright and patents are property, not monopolies.

As someone who has been writing about open access for some years, I find myself returning again and again to the Public Library of Science. That's because, not content with pioneering open access, PLoS has time and again re-invented the broader world of scientific publishing. Now, it's done it again:

Today, after several months of work, I’m delighted to announce that PLoS is launching PLoS Currents (Beta) – a new and experimental website for the rapid communication of research results and ideas. In response to the recent worldwide H1N1 influenza outbreak, the first PLoS Currents research theme is influenza.

Note the emphasis on "rapid": this is absolutely crucial, as I've noted before. The current system of publishing papers is simply too slow to deal with pandemics, where speed is of the essence if we're to have a chance of nipping them in the bud. It's good to see PLoS stepping in to help address this major problem.

It's doing it in a very interesting way:

PLoS Currents: Influenza, which we are launching today, is built on three key components: a small expert research community that PLoS is working with to run the website; Google Knol with new features that allow content to be gathered together in collections after being vetted by expert moderators; and a new, independent database at the National Center for Biotechnology Information (NCBI) called Rapid Research Notes, where research targeted for rapid communication, such as the content in PLoS Currents: Influenza will be freely and permanently accessible. To ensure that researchers are properly credited for their work, PLoS Currents content will also be given a unique identifier by the NCBI so that it is citable.

...

The key goal of PLoS Currents is to accelerate scientific discovery by allowing researchers to share their latest findings and ideas immediately with the world’s scientific and medical communities. Google Knol’s features for community interaction, comment and discussion will enable commentary and conversations to develop around these findings. Given that the contributions to PLoS Currents are not peer-reviewed in detail, however, the results and conclusions must be regarded as preliminary. In time, it is therefore likely that PLoS Currents contributors will submit their work for publication in a formal journal, and the PLoS Journals will welcome these submissions.

PLoS Currents: Influenza is an experiment and a prototype for further PLoS Currents sites. It reflects our commitment to using online tools to the fullest extent possible for the open sharing of research results. As with any new project, we will be listening carefully to the reactions within and beyond the scientific and medical communities and welcoming suggestions for improvements.

This is really exciting from many viewpoints. It's pushing the ideas behind open access even further; it's reshaping publishing; and it may even save humanity. (Via James Boyle.)

So my blog turns seven today. On August 20, 2002, while hiding north of San Francisco working on the Eldred appeal, I penned my first (wildly and embarrassingly defensive) missive to Dave. Some 1753 entries later, I'm letting the blog rest. This will be the last post in this frame. Who knows what the future will bring, but in the near term, it won't bring more in lessig.org/blog.

The main reason is that he's too damn busy with other projects, although I suspect the imminent arrival of his third child also was a big factor.

Lessig surprised me before by moving from CC work to his transparency gig. I thought he was bonkers then...and I was wrong, he was just - as usual - prescient. Maybe his move away from blogging is the same: but I hope not.... (Via John Naughton.)

One of the perennial reasons people give for not using free software is that it is lacking some key piece of software. High on that list is personal finance management, a sector dominated by Intuit in the closed source world. So the following is great news:

The KMyMoney development team is pleased to announce a major step forward for what has been described as "the BEST personal finance manager for FREE users". KMyMoney 1.0 has been released. With over 3 years of development, this new stable release has many new features and a refreshed user interface.

...

Since our latest stable release, 0.8.9, a lot of effort has been put in by the developers and the community to add new features and test them to ensure a rock-solid release. Over 2 years of development have resulted in the addition of budgets, a forecast feature, many new reports, report charts, a complete redesign of the import feature, which allows for a much easier migration from other application and a swifter synchronization with online banking. The support of PGP encryption for the KMyMoney files has been improved too, including the option to have multiple keys for a single file, so no one can access your financial records. The summary view has been revamped to show more and more useful information, allowing you to have an overview of your financial situation at a glance. Also, there are now translations for 22 languages, though not all of them are as complete as we would like. We have users wherever KDE3 is installed. That results not only in a greater quality application, but also in one that can be customized to fit the needs of a wide range of users. In between all that work, we have fixed a lot of bugs and little annoyances to make this the best KMyMoney release ever.

Let's hope that the word gets out about KMyMoney, and that more people realise that free software really can cover all their needs.

What's interesting is how tightly focussed the Pirate Party is. I think that's wise: otherwise it would just become another Monster Raving Loony Party. By restricting its message to an area that it understands, and which is crying out for reform, I'm sure it will benefit in the long run. It will also, usefully, force the other parties to frame their own responses in this domain.

18 August 2009

In January 2010, Spain will take over the Presidency of the European Community. The Spanish Government has already announced that one of its flagship initiatives will be reinforcing the control of the Internet and criminalizing the sharing culture in the digital environment. The consequences of those decisions will be noticed in the rest of the world.

This is the first I've heard of this: bad news if true. Anyone have any more details?

This is something I've been saying (without proof, admittedly) for a while: the UK's insane DNA database is doomed not because it doesn't work well, but because it works *too* well in a sense - in that it lets you frame anybody with perfect efficiency:

Scientists in Israel have demonstrated that it is possible to fabricate DNA evidence, undermining the credibility of what has been considered the gold standard of proof in criminal cases.

The scientists fabricated blood and saliva samples containing DNA from a person other than the donor of the blood and saliva. They also showed that if they had access to a DNA profile in a database, they could construct a sample of DNA to match that profile without obtaining any tissue from that person.

“You can just engineer a crime scene,” said Dan Frumkin, lead author of the paper, which has been published online by the journal Forensic Science International: Genetics. “Any biology undergraduate could perform this.”

This is actually an argument against expanding the database: what you want are just the real criminals, not all those who might possibly one day be one. The bigger the database, the more likely you will get a match with fake DNA.

Needless to say, our great and glorious government will completely ignore this inconvenient truth, and go on stuffing its database with DNA - the reason being that this isn't about crime, but about control.

Still, looking on the bright side, it will be trivially easy to spread Gordon Brown's DNA at any crime scene in the future - all we need is a discarded coffee cup....

12 August 2009

One of the perennial teasers in the world of computing concerns IBM. On the one hand, you have a company that has embraced open source widely across its product line, and made major donations of code (to Eclipse, for example); on the other, it is a massive supporter of software patents, and also sells large amounts of proprietary software. So which is it: Big Bounteous Blue, or Big Bad Blue?

I think this submission [.pdf] to the court concerning the important Bilski case answers that question definitively:

In the months since the Federal Circuit issued its opinion, and to IBM’s great concern, a number of administrative and judicial decisions have rigidly applied the “machine or transformation” test to question—in some cases explicitly—the patentability of software per se. Software technology is vital in addressing society’s most pressing challenges. IBM is committed to ensuring that such technology is and remains patentable.

There we have it: "IBM is committed to ensuring that such technology is and remains patentable" - no two ways about it.

But wait - IBM goes even further, claiming that software patents are so desirable in part because they actually *powered* the rise of free software:

Given the reality that software source code is human readable, and object code can be reverse engineered, it is difficult for software developers to resort to secrecy. Thus, without patent protection, the incentives to innovate in the field of software are significantly reduced. Patent protection has promoted the free sharing of source code on a patentee’s terms—which has fueled the explosive growth of open source software development.

Well, actually, free software is produced *despite* software patents, in the teeth of their deleterious effects, as every one of your highly-paid engineers and lawyers understands full well.

So, no, IBM, that's a load of cobblers, and it's disgraceful that you should even try to pass off this apology of an argument - that patents are somehow precursors of true "free sharing" - in a submission to a court considering such an important matter, for self-proclaimed selfish reasons. This is clearly an attempt to head off the criticism that software patents harm free software, the most vibrant sector of computing today, and should therefore be scaled back by the US Supreme Court. Cynical ain't in it.

At least we know where we stand, now, with Big Bad Blue... (Via @zoobab.)

In a decision that shows how ridiculously unclear the situation around copying DVDs is:

A federal judge ruled here late Tuesday that it was unlawful to traffic in goods to copy DVDs.

U.S. District Judge Marilyn Hall Patel’s ruling came in a decision in which she declared RealNetworks’ DVD copying software was illegal. She barred it from being distributed.

Patel said the RealDVD software violates the Digital Millennium Copyright Act of 1998 that prohibits the circumvention of encryption technology. DVDs are encrypted with what is known as the Content Scramble System, and DVD players must secure a license to play discs. RealDVD, she ruled, circumvents technology designed to prevent copying.

But the decision, although mixed, left open the door that copying DVD’s for personal use “may well be” lawful under the fair use doctrine of the Copyright Act, although trafficking in such goods was illegal.

Making backup copies and the like is so eminently reasonable that it ought to be spelled out by the courts. Paradoxically, not spelling it out is worse for the film industry, since the boundaries of what is and is not (morally) acceptable are ill-defined, letting people make it up as they go along.

10 August 2009

Remember Karoo? They were the strange ISP in Hull that was going to put in place a *one*-strike-and-you're-out scheme for *alleged* copyright infringement. Then they changed their minds. Now it looks like they've thought about this some more:

“We will no longer suspend a customer’s service unless we receive a court order from a copyright owner taking legal action. As a result it is the responsibility of the legal system, not Karoo, to ensure the accuracy of the information provided by the copyright owners.”

I predict that this will happen increasingly, as ISPs realise the implications of what the content industries are demanding with their "three strikes and you're out" insanity. They would clearly be on very dodgy legal ground if they carried out this threat based on mere accusations. Yahoo for Karoo.

09 August 2009

it is not open access per se that is threatening Elsevier (High Energy Physics has long had almost 100% open access uptake, so critical mass was reached long ago in this special field, quite different from other physics areas), but the fact that they are losing the battle for authors, possibly due to their reluctance to support SCOAP3. As I wrote, they have lost between 30% and 50% of submissions from authors over the last 4 years for their HEP journals. With such a massive reduction in size, prices also had to come down. In the new open access scholarly publishing market, journals will compete for authors even more than now. SCOAP3 certainly raised awareness both of the scientific community's expectation that these journals fully convert to OA and of the unsustainable subscription prices, which had risen to absurd record levels. It is clear that subscriptions are now under even more pressure because of the global economic crisis, which has hit American libraries especially hard.

High energy physics (my old discipline) is certainly in the vanguard, but is probably just the first of many to follow this path. Go, open access.

08 August 2009

Talking of DNA, another brilliant use of it - and brilliantly obvious like all great ideas - is DNA Barcoding:

DNA barcoding is a new technique that uses a short DNA sequence from a standardized and agreed-upon position in the genome as a molecular diagnostic for species-level identification. DNA barcode sequences are very short relative to the entire genome and they can be obtained reasonably quickly and cheaply. The "Folmer region" at the 5' end of the cytochrome c oxidase subunit 1 mitochondrial region (COI) is emerging as the standard barcode region for almost all groups of higher animals. This region is 648 nucleotide base pairs long in most groups and is flanked by regions of conserved sequences, making it relatively easy to isolate and analyze. A growing number of studies have shown that COI sequence variability within species is very low (generally less than 1-2%) and that the COI sequences of even closely related species differ by several percent, making it possible to identify species with high confidence.
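To make the quoted thresholds concrete, here is a minimal sketch of the decision logic (my own illustration, not the consortium's code): treat two aligned sequences as belonging to the same species when their pairwise divergence falls below roughly the 2% figure cited above. The `divergence` and `same_species` helpers, and the toy 20-base sequences, are invented for illustration.

```python
def divergence(seq_a: str, seq_b: str) -> float:
    """Fraction of mismatched positions between two aligned sequences."""
    if len(seq_a) != len(seq_b):
        raise ValueError("sequences must be aligned to the same length")
    mismatches = sum(a != b for a, b in zip(seq_a, seq_b))
    return mismatches / len(seq_a)

def same_species(seq_a: str, seq_b: str, threshold: float = 0.02) -> bool:
    """Crude barcode call: below ~2% divergence, treat as conspecific."""
    return divergence(seq_a, seq_b) < threshold

# Toy 20-base "barcodes" (real COI barcodes run to ~648 bp):
ref   = "ATGCGTACGTTAGCATCGGA"
close = "ATGCGTACGTTAGCATCGGA"   # identical: same species
far   = "ATGCGAACGTAAGCTTCGGC"   # 4 mismatches (20% divergence)

print(same_species(ref, close))  # True
print(same_species(ref, far))    # False
```

Real barcoding pipelines work on full-length COI alignments and use distance measures that correct for multiple substitutions at the same site; this toy Hamming-distance version just shows why the gap between intra-species and inter-species variability makes the technique work.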

However, readers of this will probably have guessed the fly in the ointment here: DNA barcoding is such a powerful idea that the parasites have moved in, and started trying to *patent* bits of the idea:

Systematics and phylogenetics, indeed much of evolutionary science, have a long and great tradition of making resources and knowledge freely available to other researchers. Instead of cash, all an author asks for is a citation or a credit. It therefore seemed incredible to me that one researcher was trying to patent a DNA barcode snippet for a plant gene that had been worked on over several years by a large group of researchers.

It's a classic situation: not only are scientific techniques being patented, they are techniques that are well established and have been used for years - something that is explicitly excluded even in the most deranged patent regimes. And people say the system is working just fine... (Via Jonathan Eisen.)

Nanotechnology is one of those subjects that seem to veer between hope and hype. DNA-based solutions look among the most promising, because the material has evolved to solve many of the same problems as nanotechnology; more subtly, it is inherently digital, which makes it much easier to manipulate - and promises structures of almost infinite complexity under computer control.

To do that, of course, you need software, so it's great to see that there is already free software that lets you create DNA-based nanostructures:

caDNAno is open-source software based on the Adobe AIR platform for design of three-dimensional DNA origami nanostructures. It was written with the goal of providing a fast and intuitive means to create and modify DNA origami designs. You can learn how to use it, download a copy of the program and some example designs, or even modify the source code.

The software makes heavy use of several fantastic open-source libraries and resources, especially Papervision3D for 3D rendering, Michael Baczynski's AS3 data structures and tutorials, the Tango Desktop Project for icons, and the Blueprint CSS framework for this website. Additional people and resources are acknowledged on the links page.

As you can see from this, there's already quite a rich ecosystem of free code in this area, which augurs well for the future. The last thing we need is for nanotechnology to turn into the smallest black box ever made.

07 August 2009

Well, not quite, but it's clear somebody really dislikes the Twitter user @cyxymu: it seems that the coordinated attacks on Twitter, Facebook and LiveJournal were intended to silence him:

A Georgian blogger with accounts on Twitter, Facebook, LiveJournal and Google's Blogger and YouTube was targeted in a denial of service attack that led to the site-wide outage at Twitter and problems at the other sites on Thursday, according to a Facebook executive.

The blogger, who uses the account name "Cyxymu," (the name of a town in the Republic of Georgia) had accounts on all of the different sites that were attacked at the same time, Max Kelly, chief security officer at Facebook, told CNET News.

"It was a simultaneous attack across a number of properties targeting him to keep his voice from being heard," Kelly said. "We're actively investigating the source of the attacks and we hope to be able to find out the individuals involved in the back end and to take action against them if we can."

Sounds pretty incredible, but the chap himself confirms it on his Twitter account:

да, меня ДДоСили

which roughly means "yup, I was DDoS'd", and he also opines:

this hackers was from Russian KGB

Supporting this view is the fact that his LiveJournal blog is still unreachable.

Fascinating, of course, to see how events in the Caucasus - today's the first anniversary of the ill-advised attack by Georgia on South Ossetia, and Russia's gleeful counter-attack on Georgia - reach and affect even global online worlds like Twitter and Facebook. Interesting times.

06 August 2009

Embedded inside the card for foreigners is a microchip with the details of its bearer held in electronic form: name, date of birth, physical characteristics, fingerprints and so on, together with other information such as immigration status and whether the holder is entitled to State benefits.

This chip is the vital security measure that, so the Government believes, will make identity cards 'unforgeable'.

But as I watch, Laurie picks up a mobile phone and, using just the handset and a laptop computer, electronically copies the ID card microchip and all its information in a matter of minutes.

He then creates a cloned card, and with a little help from another technology expert, he changes all the information the card contains - the physical details of the bearer, name, fingerprints and so on. And he doesn't stop there.

With a few more keystrokes on his computer, Laurie changes the cloned card so that whereas the original card holder was not entitled to benefits, the cloned chip now reads 'Entitled to benefits'.

No surprises there, of course; but what's significant is that it's the Daily Mail that's pushing this jolly news out to its assembled readers. This means the message is going out to groups beyond the obvious Guardian greeny-lefties and Telegraph Tories.

The UK government will presumably just carry on blithely ignoring these inconvenient demonstrations of the deep lack of security at the heart of this lunatic project. Worryingly, it comes in a week in which the UK High Court ruled that the government is not liable for the consequences of its errors, which means that when your ID card is cloned and abused, *you* will have to foot the bill... (Via Ray Corrigan.)

First, he blew it: he ignored the Net, declaring it of no interest. Then he hit the jackpot, buying MySpace for what seemed an incredibly low price: just $580 million, when Facebook was being valued at billions. That's looking expensive today:

News Corp specifically blames MySpace for a loss of $363 million to the company’s bottom line

"Quality journalism is not cheap," said Murdoch. "The digital revolution has opened many new and inexpensive distribution channels but it has not made content free. We intend to charge for all our news websites."

Good luck with that, Rupe.

I think it's interesting that I almost never quote from or link to News International titles: there's simply too little there of interest. By contrast, I *do* link quite often to New York Times and Guardian stories, both of which offer stuff not covered elsewhere. So I don't think I'm going to miss Mr Murdoch's titles when they suddenly fall off the digital radar...

05 August 2009

As a growing number of worldwide learners log on, free of charge, to video and podcast lectures and events at the University of California, Berkeley, the campus is leading an international effort to build a communal Webcasting platform to more easily record and distribute its popular educational content.

Dubbed "Opencast Matterhorn" and funded with grants from the Andrew W. Mellon and William and Flora Hewlett foundations totaling $1.5 million, the project will bring together programmers and educational technology experts from an international consortium of higher education institutions, including ETH Zürich in Switzerland, University of Osnabrück in Germany, Cambridge University in the United Kingdom and Canada's University of Saskatchewan.

...

The software will support the scheduling, capture, encoding and delivery of educational content to video-and-audio sharing sites such as YouTube and iTunes, so that learners can access lectures when and where they need it. With additional funding, expertise and labor from other members of the consortium, the Opencast Matterhorn platform is scheduled to be up and running by summer 2010.

They've got a new word for it (well, new to me):

Coursecasting is a growing trend in educational technology, enabling students and the general public to download audio and video recordings of class lectures to their computers and portable media devices.

Daft names aside, it's great to see institutions working together on a common platform like this; it should give a real boost to opencourseware - and maybe even coursecasting. (Via Open Education News.)

04 August 2009

Yesterday, I wrote elsewhere about open standards, and how they sought to produce a level playing field for all. Similar thoughts have occurred to Stuart Shieber in this post about open access:

In summary, publishers see an unlevel playing field in choosing between the business models for their journals exactly because authors see an unlevel playing field in choosing between journals using the business models.

He has an interesting solution:

To mitigate this problem—to place open-access processing-fee journals on a more equal competitive footing with subscription-fee journals—requires those underwriting the publisher's services for subscription-fee journals to commit to a simple “compact” guaranteeing their willingness to underwrite them for processing-fee journals as well.

He concludes:

If all schools and funders committed to the compact, a publisher could more safely move a journal to an open-access processing-fee business model without fear that authors would desert the journal for pecuniary reasons. Support for the compact would also send a signal to publishers and scholarly societies that research universities and funders appreciate and value their contributions and that universities and funders promoting self-archiving have every intention of continuing to fund publication, albeit within a different model. Publishers willing to take a risk will be met by universities and funding agencies willing to support their bold move.

The new US administration could implement such a system through simple FRPAA-like legislation requiring funding agencies to commit to this open-access compact in a cost-neutral manner. Perhaps reimbursement would be limited to authors at universities and research institutions that themselves commit to a similar compact. As funding agencies and universities take on this commitment, we might transition to an efficient, sustainable journal publishing system in which publishers choose freely among business models on an equal footing, to the benefit of all.

03 August 2009

Remember Wolfram Alpha? That super-duper search engine - sorry, "computational knowledge engine" - that was going to change the way we looked for and found information, and also cure the common cold (OK, I made that last one up)? Seems to have disappeared without trace, no? I'm not surprised, if it misunderstands copyright as badly as this post suggests:

Try cutting and pasting from the results page. You can't, and with good reason. According to Wolfram Alpha's terms of use, its knowledge engine is "an authoritative source of information," because "in many cases the data you are shown never existed before in exactly that way until you asked for it." Therefore, "failure to properly attribute results from Wolfram Alpha is not only a violation of [its license terms], but may also constitute academic plagiarism or a violation of copyright law."

Copyright, as Wolfram seems not to understand, is a bargain between creators and their public. As an *incentive* to create, the former are given a time-limited monopoly by governments. Note that it is *not* a reward for having created: it is an incentive to create again.

Now consider Wolfram Alpha. This is essentially a computational process - remember, it's a "computational knowledge engine". So, it is simply a bunch of algorithms acting on data. Algorithms don't need incentives to create: outputting is what they do if they're useful. So copyright is completely inappropriate, just as it would be for the output of any other program processing information on its own (obviously, if that information is words fed in by a human, copyright would exist in those words because they were created by someone).

Wolfram's ridiculous claim to copyright in its results does have the virtue of providing a nice illustration of the real limits of this intellectual monopoly. For the rest, it might try finding out a bit more about copyright so that it can amend its licence accordingly - I suggest using a good search engine like Google.

02 August 2009

Er, shouldn't this utter insanity be sounding one or two alarm bells...please?

Thousands of the worst families in England are to be put in “sin bins” in a bid to change their bad behaviour, Ed Balls announced yesterday.

The Children’s Secretary set out £400million plans to put 20,000 problem families under 24-hour CCTV supervision in their own homes.

They will be monitored to ensure that children attend school, go to bed on time and eat proper meals.

Private security guards will also be sent round to carry out home checks, while parents will be given help to combat drug and alcohol addiction.

Despite certain protestations to the contrary, isn't this rather clearly a total surveillance society, complete with jack-booted "security" guards? Why not just call them "Security Services" - "SS" for short - and be done with it?

01 August 2009

Many potential buyers of laptops priced under $300 in the U.S. had an unpleasant surprise over the weekend: The machines would not be eligible for a free upgrade to Microsoft's upcoming Windows 7 operating system.

Wal-Mart and Best Buy attracted plenty of buyers during a promotional offering of laptops priced under $300. Some of those laptops sold out just one day after the offers began. The prices were respectable considering the generous features, including large screens, better graphics and DVD drives, which are not typically found in most low-cost netbooks.

However, the laptops came preloaded with the Windows Vista Home Basic operating system, which does not include a free upgrade to Windows 7 in the U.S. Instead, consumers will have to shell out about $120 to upgrade the operating system.

About Me

I have been a technology journalist and consultant for 30 years, covering
the Internet since March 1994, and the free software world since 1995.

One early feature I wrote was for Wired in 1997:
The Greatest OS that (N)ever Was.
My most recent books are Rebel Code: Linux and the Open Source Revolution, and Digital Code of Life: How Bioinformatics is Revolutionizing Science, Medicine and Business.