Category Archives: History

Adobe is dropping Linux support for their Adobe AIR development platform. To be honest, I don’t really care. Why? Because I’ve been careful enough to not tie my efforts to a proprietary platform.

I’ve had several groups offer to write applications/activities for OLPC Australia using proprietary tools like AIR. I’ve discouraged them every time. Had we gone with the ‘convenient’ route and acquiesced, we would have been in quite a spot of bother right now. My precious resources would have to be spent on porting or rewriting all of that work, or just leaving it to bit-rot.

A beauty of Sugar and Linux is that they are not dependent on a single entity. We can develop with the confidence of knowing that our code will continue to work, or at least can be made to continue to work in the face of underlying platform changes. This embodies our Core Principle #5, Free and Open.

Free and Open means that children can be content creators. The television age relegated children (and everyone, for that matter) to just being consumers of content. I have very fond childhood memories of attempts to counter that, but those efforts pale in comparison to the possibilities afforded to us today by modern digital technologies. We now have the opportunity to properly enable children to be in charge of their learning. Education becomes active, not passive. There’s a reason why we refer to Sugar applications as activities.

Growing up in the 80s, my recollections are of a dynamic computing market. Machines like the ZX Spectrum and the early Commodore models inspired a generation of kids into learning about how computers work. By extension, that sparked interest in the sciences: mathematics, physics, engineering, and so on. Those machines were affordable and quite open to the tinkerer. My first computer (which from vague recollection was a Dick Smith VZ200) had only a BASIC interpreter and 4K of memory. We didn’t purchase the optional tape drive, so I had to type my programs in manually from the supplied book. Along the way, I taught myself how to make my own customisations to the code. I didn’t need to learn that skill, but I chose to take the opportunity presented to me.

Likewise, I remember (and still have in my possession, sadly without the machine) the detailed technical binders supplied with my IBM PC. I think I recognised early on that I was more interested in software, because I didn’t spend as much time on the supplied hardware schematics and documentation. However, the option was there, and I could have made the choice to get more into hardware.

Those experiences were very defining parts of my life, helping to shape me into the Free Software, open standards loving person I am. Being able to get involved in technical development, at whatever level of my choosing, is something I was able to experience from a very early age. I was able to be active, not just consume. As I have written about before, even the king of proprietary software and vendor lock-in himself, Bill Gates, has acknowledged a similar experience as a tipping point in his life.

With this in mind, I worry about the superficial solutions being promoted in the education space. A recent article on the BBC’s Click laments that children are becoming “digitally illiterate”. Most of the solutions proposed in the article (and attached video) are highly proprietary, being based on platforms such as Microsoft’s Windows and Xbox. The lone standout appears to be the wonderful-looking Raspberry Pi device, which is based on Linux and Free Software.

It is disappointing that the same organisation that had the foresight to give us the BBC Computer Literacy Project (with the BBC Micro as its centrepiece) now appears to have disregarded a key benefit of that programme. By providing the most advanced BASIC interpreter of the time, the BBC Micro was well suited to education. Sophisticated applications could be written in an interpreted language that could be inspected and modified by anyone.

Code is like any other form of work, whether it be a document, artwork, music or something else. From a personal perspective, I want to be able to access (read and modify) my work at any time. From an ethical perspective, we owe it to our children to ensure that they continue to have this right. From a societal perspective, we need to ensure that our culture can persevere through the ages. I have previously demonstrated how digitisation can dramatically reduce the longevity of information, comparing a still-legible thousand-year-old book against its ‘modern’ laserdisc counterpart that became virtually undecipherable after only sixteen years. I have also explained how this problem presents a real and present danger to the freedoms (at least in democratic countries) that we take for granted.

Back in the world of code, at least, things are looking up. The Internet is heading towards HTML5/JavaScript, and even Microsoft and Adobe are following suit. This raises some interesting considerations for Sugar. Maybe we need to be thinking of writing educational activities in HTML5, like those at tinygames? Going even further, perhaps we should be thinking about integrating HTML5 more closely into the Sugar framework?
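As a rough sketch of what the core of such a web-native activity might look like, here is the logic for a hypothetical arithmetic drill in plain JavaScript. The function names are illustrative only (not part of any Sugar or tinygames API), and the code is deliberately kept free of DOM calls so the same logic could be wired to any HTML5 front end:

```javascript
// Core logic for a hypothetical arithmetic-drill activity.
// In a real HTML5 activity this would be connected to DOM elements;
// keeping it DOM-free keeps the logic portable and easy to inspect.

// Generate a random addition question with operands from 0 to `max`.
function makeQuestion(max) {
  const a = Math.floor(Math.random() * (max + 1));
  const b = Math.floor(Math.random() * (max + 1));
  return { prompt: `${a} + ${b} = ?`, answer: a + b };
}

// Check a pupil's answer against the expected result.
function checkAnswer(question, given) {
  return Number(given) === question.answer;
}

// Example round:
const q = makeQuestion(10);
console.log(q.prompt);
console.log(checkAnswer(q, q.answer)); // true
```

Because the source is just text served to the browser, a curious child can open it, read it and change it — much as we could with BASIC listings on the machines of the 80s.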

We’re working to make sure every school has a 21st-century curriculum like you do. And in the same way that we invested in the science and research that led to the breakthroughs like the Internet, I’m calling for investments in educational technology that will help create digital tutors that are as effective as personal tutors, and educational software that’s as compelling as the best video game. I want you guys to be stuck on a video game that’s teaching you something other than just blowing something up.

Of equal importance, how can we be sure that we can actually read those archives in the future? Literacy in Egyptian hieroglyphs was long gone by the 18th century, and it took the discovery of the Rosetta Stone for them to start making sense again.

I’ve written about this in the past, contrasting the thousand-year-old Domesday Book (which is still legible) with the BBC Domesday Project (which was rendered virtually unreadable a mere sixteen years after production).

The key to preserving our culture in digital form is to use open standards. If the means for ‘reading’ the information is widely documented and understood, without any encumbrances, we stand a much greater chance of being able to interpret it in a couple of hundred years.

I’ve got essays from school written only ten years ago, and I can’t read them any more as they’re stored in a proprietary file format that is no longer supported.

Imagine you ran a company that had important and valuable written records stretching back for decades. Storing vast libraries of paper is expensive and inefficient, so you decide to digitise them all. That’s great — you now have a system that is easy to manage and search. Ten years later, you want to migrate your now-ageing data management system to something more modern. Only, you can’t — it’s all stored in a proprietary format that cannot be accessed by anything else.

If you had kept those paper records, you would have still had access to that information. Your choices now are to continue with your old, obsolete system for all eternity, or hire some clever hacker to decipher the file format. With no equivalent of a Rosetta Stone, that’s no mean task. After spending buckets of money on this avoidable problem, and losing even more due to inefficiencies and competitive disadvantage from the old system, you’d be wise to make sure it cannot happen again.

This is a very common kind of scenario. If our information can’t even last ten years, how can it last a thousand?

From a business perspective, open standards protect the independence of a company. It means no vendor lock-in, so you are not stuck paying monopoly prices. Through the creation of a free market surrounding a method/technology, open standards give you the freedom to select the vendors, products, methods and technologies that suit your requirements best, or you can even create your own. They are the ultimate in risk mitigation, and through their flexibility can also open avenues for competitive advantage. They just make good business sense.

These movies deal with incredibly disturbing subject matter: the effects of war on a civilian population. Each movie took its own approach to the topic, but they all masterfully captured the despair and suffering that people go through. What I also like about these films is that they have dealt with incidents which were either ignored or forgotten by people in other countries. Hotel Rwanda covers the Rwandan genocide of 1994, The Killing Fields is set in the Khmer Rouge-dominated Cambodia of the 1970s, and Grave of the Fireflies is about Japan during World War II.

Hotel Rwanda and The Killing Fields both deal with civil war. Who cares about that? After all, it’s not in my backyard. Most of the countries in Africa are in some sort of war, yet the West currently seems more concerned with Pope John Paul II’s funeral or Prince Charles’s wedding. In the case of Cambodia, Vietnam (with diplomatic support from the USSR) turned out to be the Good Guys (funnily enough), invading the country and deposing the Khmer Rouge with popular support (despite Cambodians’ misgivings about the Vietnamese). The USA, Thailand and China actively worked to support the Khmer Rouge. Did we hear about any of this on television? Is it in any school history books? Nope, it’s as (self) censored as the Japanese occupation of Korea is in Japan.

The Rwandan genocide was yet another shameful event in world history. The United Nations and economically developed countries had the power to intervene and halt the bloodshed, yet they didn’t. The US had been in Somalia only a couple of years prior, but I guess Rwanda wasn’t important since it didn’t lie on any major shipping lanes. The UN itself, France and other countries also deserve much of the blame.

Grave of the Fireflies is somewhat different, yet the same. Firstly, it is animated. This is no children’s movie, however, even if the two protagonists are children. I don’t think more impact could have been achieved if it were a live-action film. Grave of the Fireflies covers yet another ignored event in world history: the effects of World War II on the Japanese population. It is natural to ignore the aggressors (or even applaud their suffering), particularly ones as brutal as the Japanese in WWII, but it is important to remember that they are just as human as everyone else. Many Germans consider the Allied firebombing of Dresden a war crime, but did you know that the firebombing of Tokyo caused more damage and loss of life than the atomic bombs on Hiroshima and Nagasaki (which, by the way, were dropped on non-industrial residential areas)? I won’t get into the debate over whether such attacks were truly necessary (it was a war, after all), but we shouldn’t forget the human suffering which took place as a result, regardless of whose it was.

The British people have acquired some notable information about the Falklands war in 2002 that they were denied 20 years ago, when the war itself took place behind a blanket of censorship. In the 1982 authorised Thatcherite version of events, Britain set out to recapture the Falkland Islands with strong but tacit American support, in the face of French duplicity, and won a brilliant victory against a demoralised Argentine enemy. Twenty years on, thanks to the memoirs of the then defence secretary, Sir John Nott, and an interview with the task force commander, Admiral Sandy Woodward, we are learning a very different version. Far from being an ally, Ronald Reagan’s US stands revealed by Sir John as persistently unreliable. Meanwhile under François Mitterrand, a willing France turns out to have supplied Britain with priceless technical details about the Exocet missile. Admiral Woodward has now revealed that the fighting in the south Atlantic was "a lot closer run" than we were told at the time. "We were on our last legs," the admiral says. If the Argentines had held out for another week, they would have defeated an exhausted Britain. Think how different our recent political history might have been then.

In other words, the USA stood aside while the territory of its closest ally was invaded by its belligerent neighbour. Maybe the British should boycott everything American? Even funnier was the revelation that the UK was aided by France!

The above-quoted article highlights the impact of censorship during times of war, not only on the part of government but also on the part of the media. Over the past few days on my television I have seen images of "Coalition" POWs held by the Iraqis, often followed by a statement claiming that these images were taken by Iraqis in violation of international law. And indeed they were. Yet nobody complains when the US does it! They did it in Afghanistan, Guantanamo Bay and, yes, even in Iraq! I’ve lost track of how many international laws the US has broken, not only in this war but also in previous wars. These include the use of chemical and biological weapons (I thought Saddam was the one using those?!), cluster bombs and depleted uranium, and the targeting of civilian facilities. What makes me sad is that my own government is an accomplice to this. There are (were?) Australian citizens being illegally and indefinitely detained in Guantanamo Bay like animals, and the Australian government doesn’t care.

Another thing I cannot understand is the ‘logic’ that some people seem to hold that since the USA helped France in World War II, France should help the USA invade Iraq. Why should France help the US when it is the aggressor? Note that I’m not trying to defend France, because I don’t like them much either. However, this doesn’t make any sense to me at all. If I wanted to use such ‘logic’ (which it isn’t), then I could mention that the French government practically bankrupted itself helping the American colonists achieve independence. Louis XVI basically gave his life for the American people, since the French Revolution might not have happened had he not been forced to pay for his war debts by raising taxes. I could also mention that although World War II began in 1939, and France was invaded in June 1940, it wasn’t until December 1941 that the United States entered the war. Even then, it was Germany that declared war, not the USA. Some ‘friends’ they were! Of course, using such arguments would be excessively facile, so I include them only to show their idiocy.

Update: I just came across this hypothetical discussion between a warmonger and a peacenik. I found it quite amusing.

Update [2003-04-06]: Britain’s Channel 4 screened a great comedy/documentary on 5 January called "Between Iraq and a Hard Place". You can watch the whole thing over the Internet (streaming, requires RealPlayer) here.