All posts by jonny

Web 2.0 has had a massive impact for good on the lives of modern humans. Web 2.0 has also been complicit in ushering in the most advanced, pervasive and Orwellian surveillance state ever witnessed by humanity. You could say that Web 2.0 created Surveillance 1.984.

How might we retain the benefits of a hyper-connected and computer-augmented society without being constantly watched by people whose interests may not always directly align with ours? How can we use technology to fashion a future that we actually want to inhabit?

The full details of the monitoring apparatus that the NSA, CIA and other “security” agencies have constructed are still trickling out from the cache of documents released into the wild by Edward Snowden. What has become clear is that every action performed in the digital arena, whether it be sending an email, making a phone call, browsing a website, tweeting an opinion, buying an item, taking a photo or just moving around with a phone in your pocket, can be, and usually is, intercepted, stored and mined for information. The technologies and services that allow us to be constantly connected to information, colleagues, friends and loved ones also allow the government to snoop on private citizens in an unprecedented, unrequested and effectively unregulated manner.

Here’s a video of a recent talk I gave with my colleague Stew Gleadow in Sydney and Melbourne, Australia, at our ThoughtWorks Live event in May.

It looks at strategies for successfully evolving mobile services and applications over time across a range of screens and platforms. We delve into some case studies on an Australian broadcaster’s second-screen application and a cross-platform approach for a major airline.

This article was originally published by InformIT and can be viewed on their site. It is reproduced here with kind permission.

Part 1 of this series examined the explosion of mobile and embedded devices that characterize our future, explored the challenges posed by these changes, and considered a methodology for reliable innovation in this environment and the technology enablers required to support that approach. In part 2, we look at what types of strategies are likely to be effective in this new world.

Visionary Strategies

Once you have a reliable methodology in place for fostering innovation and engaging the market, supported by the technology enablers mentioned in part 1, you are finally ready to start growing and developing visionary strategies to help you capitalize on the emerging world of ambient computing.

The big question becomes, “What should our vision and strategy be?” Unfortunately, there’s no stock answer I can prescribe (though I’ll be happy to help you figure it out), but I do have some pointers toward directions you should be considering.

The growing ubiquity of computing and omnipresent interfaces points to opportunities such as “any customer, anywhere,” and the explosion of profiling data opens up services based on the idea that “we know what you’re about to think.” The key is not what your exact vision is, but how you validate it and course-correct based on that feedback. This in itself is the strategy of rapid product evolution for which part 1 of this article attempted to lay out the foundations.

This article was originally published by InformIT and can be viewed on their site. It is reproduced here with kind permission.

The world is changing, and we all need to prepare for it. The proliferation of mobile devices we are witnessing right now, and the associated challenges of creating applications that work across those devices, are just the thin end of the wedge of what the future holds. Cisco predicts that by 2020 each of us will own an average of 6.58 connected devices. People are interacting with organizations and services with an ever more diverse set of technologies, they are doing this in a growing number of contexts, and the data being created is growing exponentially. In this two-part series, we’ll look at strategies for not just surviving (part 1), but thriving in and capitalizing on the opportunities provided by our hyper-connected future (part 2).

A Shattered Future

If we look closely at the technology trends, of which mobile is just one part, it becomes clear that we are witnessing a shattering of input and output mechanisms. In the past, interactions with computers have been through fairly narrow channels. The vast majority of inputs have historically been via keyboard, and outputs were predominantly through a single fixed screen. That simple past and the strategies we developed to operate in that world are no longer useful guides to the future. We are witnessing an explosion of channels for interacting with computers. Those channels are no longer tightly coupled to each other, and even the concept of “a computer” is being blown away.

As we talk with clients and prospects in the market we’re seeing a steady growth in interest around BYOD (Bring Your Own Device). This trend to allow employees to bring their own hardware (predominantly mobile phones) is putting new stresses and strains on existing IT infrastructure, operations and development practices. There are many pitfalls to watch out for, but if executed successfully, embracing the consumerization of enterprise IT can pay dividends by re-engaging a jaded workforce, simplifying cumbersome workflows and offering a launchpad to a next generation of more supple, usable and maintainable software.

The wave is inevitable

The days when organizations could mandate a limited set of issued (or supported) devices and provide access to services designed more around the constraints of existing IT than the users’ needs are ending abruptly. Organizations that hesitate to overhaul their approaches are finding that employees quickly find ways to circumvent existing procedures and systems. It used to be the case that the software and hardware enterprises offered their employees tended to be superior to what those employees encountered at home. With the advent of services like GMail, Dropbox and Skype, and of hardware like the iPhone and iPad, those days are well and truly behind us. Organizations that don’t respond swiftly to embrace this trend are finding themselves saddled with a disgruntled and unproductive workforce and a growing security attack surface as their employees find work-arounds to shoe-horn their favorite tools into their work lives.

BYOD introduces many challenges – security of services and data is high on the list, as is distribution and provisioning, along with exposing key systems, like email and calendar, to a range of native applications. However, this article focuses on the challenges involved in building or migrating applications to work on a variety of devices and in a range of contexts.

The landscape is changing

Since the dawn of the software era, systems have generally followed a lifecycle of develop/operate/replace. For the types of systems our company, ThoughtWorks, specializes in (typically built over the past 10-15 years), organizations expect 5-10 years between significant investments in modernization. And some of the oldest core systems have now reached 40+ years – far longer than the average lifespan of most companies today!

IT assets are relatively long-lived largely because modernization often represents a significant investment that doesn’t deliver new business value in a form that is very visible to managers or customers. Therefore organizations put off that investment until the case for change becomes overwhelming. Instead, they extend and modify their increasingly creaky platforms by adding features and making updates to (more or less) meet business needs.

For decades, this tension between investing in modernization versus making incremental enhancements has played out across technology-enabled businesses. Every year some companies take the plunge and modernize a core system or two, while others opt to put yet another layer of lipstick on the pig.

I was reminded today of a presentation I’d put together to help project managers who are new to Agile understand how to use the ubiquitous “burn-up” or “burn-down” chart. Since some people seemed to like it I thought I’d share it with a wider audience.
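For anyone who hasn’t built one, the data behind a burn-up chart is refreshingly simple: plot total scope as one line and the cumulative points completed as another, and the gap between them tells the story. A minimal sketch, with purely illustrative numbers:

```python
# Data behind a burn-up chart (illustrative numbers).
scope = 120  # total story points currently in scope

completed_per_iteration = [8, 12, 10, 14, 11]

# The "burn-up" line: cumulative points completed after each iteration.
burn_up = []
total = 0
for points in completed_per_iteration:
    total += points
    burn_up.append(total)

# Average velocity gives a rough forecast of iterations remaining.
velocity = total / len(completed_per_iteration)
remaining_iterations = (scope - total) / velocity

print(burn_up)                        # [8, 20, 30, 44, 55]
print(round(remaining_iterations, 1))  # 5.9
```

A burn-down chart is just the mirror image (plot `scope - total` instead), but the burn-up variant has the advantage of making scope changes visible as movement in the scope line rather than hiding them in the remaining-work line.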

Adopting a new development methodology is less about process change and more about attitude change. The binder is useful, but the mindset is vital.

Much of my work over the last few years has involved helping organizations “adopt” Agile. It is, after all, a poor, unloved orphan and needs to find a good home. Whether the new approach sticks doesn’t seem to depend on how many checklists, process maps or charts of roles and responsibilities we provide; what matters is whether an organization can adjust its collective and individual attitudes.

I always struggled to see how what Lean teaches us about pull systems can be applied to software development processes. That was until I had an “Aha!” moment a little while ago helping a client apply lean and agile principles to their delivery process.

The big fat lie

I understand how queuing theory can help identify and reduce bottlenecks in processes, and I have used finger charts and kanban boards to do this for a while, but I still find calling this a “pull system” slightly disingenuous. All that’s happening is that more “stuff” is being pushed based on a trigger when certain buckets get too low. This reminds me of my annoyance with early web technologies that were touted as “push” but were really just “repetitive pull” (and not in a good way). I’ve never seen a software organization where the developers have said to the business or product people, “we’ve got nothing to do, can you think up some new projects or features for us please?”.
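The “trigger when certain buckets get too low” mechanism is easy to make concrete. Here’s a toy sketch (class and item names are my own invention, not any particular tool’s API) of a WIP-limited column that signals for replenishment once its backlog drops below a threshold:

```python
# Toy model of a kanban column with a WIP limit and a replenishment
# trigger - the mechanism often described as a "pull system".
from collections import deque


class KanbanColumn:
    def __init__(self, wip_limit, replenish_below):
        self.items = deque()
        self.wip_limit = wip_limit
        self.replenish_below = replenish_below

    def needs_replenishment(self):
        # The "pull" signal: the bucket has run too low.
        return len(self.items) < self.replenish_below

    def add(self, item):
        if len(self.items) >= self.wip_limit:
            raise RuntimeError("WIP limit reached - stop starting, start finishing")
        self.items.append(item)

    def pull_next(self):
        return self.items.popleft() if self.items else None


todo = KanbanColumn(wip_limit=5, replenish_below=2)
todo.add("feature A")
todo.add("feature B")
todo.pull_next()                    # downstream "pulls" a work item...
print(todo.needs_replenishment())   # ...leaving the bucket low: True
```

Which rather proves the point above: the replenishment signal doesn’t stop work arriving, it just schedules when the next batch gets pushed in.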

Have you ever opened up a file of source code and flinched at the complexity that comes screaming out at you from the screen? Well, I’m imagining an IDE plugin that could do the screaming for you.

There are many measures of code that are objective and metrics-driven, but there are others that are more subjective and taste-based. I can’t tell you from a quick glance how many afferent couplings there are in a given piece of code, but I do have an almost immediate sense of how elegant the code is. In my mind elegance is all about solving complex problems with simple solutions. That’s the art rather than the science of computer programming.
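The objective, metrics-driven side really can be computed mechanically. Afferent coupling (Ca) is just a count of how many other modules depend on a given module. A toy sketch, using an invented dependency map rather than a real codebase:

```python
# Afferent coupling (Ca): for each module, count how many other
# modules depend on it. Module names and imports are illustrative.
from collections import defaultdict

imports = {
    "billing":   ["database", "logging"],
    "reporting": ["database", "logging"],
    "database":  ["logging"],
    "logging":   [],
}

afferent = defaultdict(int)  # Ca per module
for module, deps in imports.items():
    for dep in deps:
        afferent[dep] += 1

print(dict(afferent))  # {'database': 2, 'logging': 3}
```

A high Ca (like `logging` here) means many modules would be hurt by a change, which is exactly the kind of number a tool can report. The flinch-inducing elegance judgment is the part no counter captures.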