
By definition, a standard assumes a level of commonality that enables multiple implementations which are totally conversant with one another. The basic requirement of a standard data format for office suites is that it preserves the integrity and neutrality of the data. Governments and other organisations have a vested interest in the implementation of open standards because they want to ensure that the documents of today will be readable tomorrow. Vendors and developers want open standards because they allow the opportunity to develop alternative ways to edit, interpret and view the data.
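That openness is easy to demonstrate in practice. The sketch below assumes an OpenDocument-style file (the widely deployed `.odt` format, which is simply a zip archive containing plain XML); it builds a minimal, simplified archive in memory and then recovers the text using only standard zip and XML tooling, with no vendor application involved. Real `.odt` files carry additional parts (styles, a manifest), so this is an illustration of the principle rather than a full implementation.

```python
import io
import zipfile
import xml.etree.ElementTree as ET

# Build a minimal ODF-style archive in memory: a zip whose payload
# is plain, documented XML. (Simplified -- a real .odt also contains
# styles.xml, meta.xml and a manifest.)
content = (
    '<?xml version="1.0"?>'
    '<office:document-content '
    'xmlns:office="urn:oasis:names:tc:opendocument:xmlns:office:1.0" '
    'xmlns:text="urn:oasis:names:tc:opendocument:xmlns:text:1.0">'
    '<office:body><office:text>'
    '<text:p>Open formats outlive applications.</text:p>'
    '</office:text></office:body></office:document-content>'
)
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as odt:
    odt.writestr("content.xml", content)

# Any generic zip/XML tooling can now read the document back --
# no proprietary application required.
with zipfile.ZipFile(buf) as odt:
    root = ET.fromstring(odt.read("content.xml"))

ns = "{urn:oasis:names:tc:opendocument:xmlns:text:1.0}"
paragraphs = [p.text for p in root.iter(ns + "p")]
print(paragraphs)  # → ['Open formats outlive applications.']
```

Because the container and the markup are both openly specified, a ten-line script in any language can do what the example above does; that is precisely the diversity of implementation that a proprietary binary format forecloses.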

Too many users save their documents in binary formats that are both proprietary and transitory. The justification is that the proprietary formats are de facto standards. A de facto standard effectively confers ownership of documents on the ‘owner’ of the standard. Currently, Microsoft owns the de facto standard for office documents. Not so long ago the de facto standards for office computing were owned by WordPerfect or WordStar or Lotus. A current monopoly does not ensure a future monopoly. A de facto standard has a limited lifespan and confers no guarantees.

Proprietary data formats offer little long-term data security, and are an unreliable choice for transferring data to prospective clients. The purpose of open standards is to promote interoperability between different applications on different operating systems. The effect of proprietary data formats is to encourage reliance on single-vendor applications and to discourage the implementation of competitive products.

Proprietary data formats give us no assurance of permanence or diversity, force dependence on the continuing popularity of a particular product, and are liable to alteration between different versions of the software. The user is locked into an involuntary upgrade cycle with an individual vendor, with few guarantees of consistency, and has little long-term control over the viability of the data.

In a world where people exchange information in many different languages and dialects, it is important that there are common reference points that make interaction possible. Once the basic rules are followed, everything else becomes possible. Open standards give us the means to talk to one another whatever applications, operating systems or computer languages we use. “If I can’t talk the language of your proprietary format, I can’t hear what you say”, and conversation becomes impossible.

Much has been made of the myth of the paperless office – the computer as a miniaturised filing cabinet, offering rapid access to all kinds of data that can be retrieved in an instant. So it is surprising how little attention is paid to the digital hardware and software formats used to store and transfer this data.
Computer data has little permanence. In most offices, vast stores of vital company information, email transactions and copies of documents sit on distributed PC hard drives and storage devices that are, by their nature, vulnerable to the vagaries of time, crashes and the individual user. A few years from now, the contents of today’s word processor documents will be hidden from view, messy and expensive to retrieve, and we will rely on emulators and reverse engineering, on computer archaeologists and digital preservationists, to disinter information that may still be precious.

Fortunately, there are projects such as the Open Planets Foundation, dedicated to the excavation and preservation of data – and tools such as Dioscuri, an open source all-purpose emulator, or the Sleuth Kit, an open source forensic analysis toolkit used to inspect, map and recover lost or deleted data. Such tools may save us from the ‘digital black hole’ caused by the transience of proprietary data formats, but the best insurance against data obsolescence is the perpetuation of open standards across all platforms.