Thinking sustainably

One of the biggest levers businesses have for promoting sustainability is lobbying their suppliers. If a supplier is unable or unwilling to provide evidence of sustainable practices, or of improving sustainability within their organisation, then a business should find a supplier who can. As businesses add sustainability credentials to their purchasing criteria, the market will favour suppliers who can provide those credentials.

The consumer segment looking for sustainable goods is known to marketing professionals (who love a good acronym) as “LOHAS”, which stands for “Lifestyles of Health and Sustainability”. Of course, once an acronym exists, you can bet there will be companies making a grab for the dollars. In 2014, the LOHAS segment was estimated to be worth $500 billion globally, growing at 10% annually.
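To put that growth rate in perspective, here is a quick compound-growth projection using the figures quoted above ($500 billion in 2014, 10% per year). The projection years are illustrative only - the source gives no forecast beyond the growth rate itself.

```python
def project_market_value(base_value_bn: float, annual_growth: float, years: int) -> float:
    """Compound-growth projection: base * (1 + growth)^years, in billions."""
    return base_value_bn * (1 + annual_growth) ** years

# Starting from the 2014 estimate of $500bn at 10% annual growth:
for year in (2014, 2019, 2024):
    value = project_market_value(500, 0.10, year - 2014)
    print(f"{year}: ${value:,.0f} billion")
```

At that rate the segment would roughly double every seven years, which is why marketers pay attention to it.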

Sadly, there doesn’t appear to be a similar term (or estimated market value) for business-to-business transactions. Which is a shame, because while consumer preference is a major driver for retailers, it has less direct influence on other business behaviour (though there will be some secondary effects). This market must exist, but I couldn’t find figures on its size.

It is usual for businesses that adopt sustainability practices to start by tackling the issues inside their own organisations. An excellent next step is to review their supply chain, and to select suppliers who are willing to take the sustainability journey. This is the so-called “greening of the supply chain”.

We can see this starting to influence the global IT companies such as Apple, Google, Xerox and Toshiba. Naturally, each company has its own interpretation of what sustainability means, and this is reflected in the approach each takes.

Who remembers using an early word processor, like WordStar? What about VisiCalc? In those dark old days, many computers didn't come with a hard drive - we stored all of our documents on removable storage media, like 8-inch or 5¼-inch floppy disks. How many of the documents and images that we saved then are still accessible now? Even if the storage format can be read, what about the machinery to read the media? Before floppy disks, everything was stored on reels of magnetic tape. Early in my career, a company I worked for had geodetic survey data stored on banks of these tape reels. The information had cost a small fortune to collect - but even if those tapes still exist today, it's unlikely that the means to retrieve the information is still around.

This is a real problem that still exists today. While the storage media has changed, the issue of long-term information storage and retrieval remains problematic. How do you ensure that the thing you store today will be retrievable using tomorrow's technology?

Most countries require businesses to keep financial records for an extended period - in New Zealand, for example, it's seven years. Most businesses currently keep this information as paper records. However, as cloud accounting services like Xero become the new normal, it will be interesting to observe whether business owners have sufficient trust in the supplier to dispense with the paper. For younger business owners, the convenience of cloud-based access to financial records will almost certainly trump the "old fashioned" paper. Whether that's a sustainable model for their financial information will only become apparent over time. What would happen if Xero were to suffer a catastrophe? It seems unthinkable today - but today's unthinkable becomes tomorrow's normality.

How will you ensure that your critical business information will be available when you need it?

Energy is always going to be a problem when it comes to a discussion about sustainable information technology. I recently read an article on Dan Luu’s blog about the siting of power stations. The story was about the disparity between the cost of computing hardware (servers) and the cost of powering and cooling said hardware. But it also highlights how the changing economy (from industrial to knowledge) has an impact on energy generation. It’s a well-written article, worth a read if you have any interest in the topic.

The fact that data centres are power intensive shouldn’t come as a surprise to anyone, even outside the industry. The largest cloud-oriented companies have clearly recognised that supply (and the cost of power) is likely to become a problem in the future, as shown by their public commitment to renewable sources.

It also shows that an energy-saving justification for moving to cloud-based infrastructure is not as transparent as it seems at first sight.

We should be concerned about power consumption on user devices, whether directly connected to the mains or not. We should also be concerned about servers, both their powering and their cooling. But as more of us move our essential IT infrastructure to the cloud, we should also be concerned about the energy consumption of those services. Cloud-based services should be more energy efficient, as they are built on scaled architecture, and should be reaping the economic benefits of that scale. However, this is all effectively hidden from the users of the service - not deliberately, for sure - but energy consumption is not disclosed (and probably not even considered by the service providers). Is this something we should be concerned with? Or must we trust the energy economies of scale that we believe are in place? I think that there should be a way to determine the net energy gain (or loss) when changing platform (cloud-based or otherwise).
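A back-of-the-envelope version of that net-energy comparison is sketched below. Every figure here is a hypothetical placeholder - real numbers would have to come from your own metering and from the provider - but the shape of the calculation (IT power, multiplied by the facility's PUE overhead, multiplied by hours of operation) is standard.

```python
def annual_energy_kwh(avg_power_watts: float, pue: float, hours: float = 8760) -> float:
    """Annual facility energy for a workload: IT power * PUE * hours, in kWh.

    PUE (Power Usage Effectiveness) is total facility power divided by IT
    power; it captures cooling and distribution overhead.
    """
    return avg_power_watts * pue * hours / 1000

# Assumed figures for illustration only:
on_prem = annual_energy_kwh(avg_power_watts=400, pue=2.0)  # small server room, poor PUE
cloud = annual_energy_kwh(avg_power_watts=150, pue=1.2)    # shared, efficient data centre

print(f"on-premises: {on_prem:,.0f} kWh/yr")
print(f"cloud:       {cloud:,.0f} kWh/yr")
print(f"net saving:  {on_prem - cloud:,.0f} kWh/yr")
```

The hard part isn't the arithmetic - it's getting honest values for the cloud side, which is exactly the disclosure gap described above.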

We really want to know the impact of this change on an entity’s carbon footprint, but energy consumption is a good proxy for that in the meantime. Of course, in a world with a globally managed carbon market, a service price would reflect its embedded carbon, but we don't live in that world - yet.
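Using energy as a carbon proxy is, mechanically, a single multiplication by a grid emission factor. The factor below is a made-up placeholder; real factors vary widely by country, and by year as grids decarbonise.

```python
def carbon_kg(energy_kwh: float, grid_factor_kg_per_kwh: float) -> float:
    """Estimated emissions: energy consumed times the grid's emission factor."""
    return energy_kwh * grid_factor_kg_per_kwh

# Hypothetical example: 7,000 kWh/yr on a grid emitting 0.1 kg CO2e per kWh.
print(f"{carbon_kg(7000, 0.1):,.0f} kg CO2e per year")
```

A globally managed carbon market would, in effect, fold this multiplication into the service price.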

"Deafening silence", "economic forecast", "military intelligence" - these phrases are familiar oxymorons, a term for a figure of speech in which seemingly contradictory terms appear together. You can find long lists of similar terms very quickly using your favourite search engine. "Sustainable information technology" doesn't seem to appear in any of them - but I think that it should - at least for the time-being…

As the threat of climate change continues to loom large, all sectors are starting to examine their working models to determine how to reduce their impact on the environment, most specifically the emissions that affect the atmosphere. It seems as if everything we humans do has a negative impact, whether it's generating energy, making things, growing food, or moving stuff around. And of course, almost all of these activities require information, and particularly information technology, to make them work.

The systems that move information around the world use electronic components, which are often composed of scarce elements, and use many oil-derived parts, particularly plastics. To make things worse, these systems are also massive users of energy. So how can information technology be sustainable? Is it possible? What are the challenges, where are the opportunities?

I’m going to use this blog to explore these issues further, and ideally, to figure out what businesses should do to influence their suppliers to create tools that are sustainable. So, while I think that sustainable information technology is an oxymoron today, my fervent hope is that this won’t be the case in the future.

I went to a networking/information sharing event in Wellington yesterday. The presenter was Myles Ward, the CTO of Inland Revenue. He spoke about the challenges of moving Inland Revenue's ICT department to a services approach. It gave me pause for thought. While it was good to hear that one of the larger IT departments in the country was moving to this model, it seems odd that it isn't more common now. After all, this "technology" has been talked about for at least 15 years, maybe longer.

I asked the other attendees (as a group) to comment on whether they were considering service orientation, or had already implemented it. To my surprise, virtually no-one had anything to say.

This seems bizarre. As a group, we want ICT to move beyond being the plumbing of business to being a business enabler, and even to providing ways to disrupt existing models (newspapers, music and video are the prevalent examples).

Most C-level executives find it hard to understand and be able to articulate the value that IT delivers to their business. And to be fair, most IT people do too.

When you orient your IT service delivery around business outcomes, it quickly becomes obvious to the consumers of those outcomes where the value proposition lies. Which is why it is perplexing that so few IT teams deliver services in this way.

If we want IT to be considered strategically critical for business delivery, we must be able to demonstrate value. Service orientation is an excellent way to do this.

Nicholas Carr famously wrote that "IT Doesn't Matter" (HBR, 2003). It seems to me that if you're not considering how IT delivers value to your business or enterprise, you've already accepted his proposition.

Why do organisations need to have an information technology strategy?

After all, they have a business strategy, right?

Perhaps a better starting question is: why have a strategy at all? Let's start by defining the key term (dry, I know, but bear with me). Dictionary.com defines strategy as "A plan or policy designed to achieve a major or overall aim".

As a business leader, your focus is on achieving the goals set down for you, which are ideally aligned with the goals of the organisation as a whole. Goals tell us what an outcome looks like, but not how the goal will be achieved. A goal is a destination. Strategy is the roadmap or directions that tell us how to get there.

Information is a critical resource. It must be managed alongside other, more traditional resources (such as labour, materials and money) in order to get the most from it. So an information (technology) strategy is about how to use technology to achieve the information outcomes the business needs.