Usability or security? The pendulum keeps swinging for end-user computing

Usability or security? Security or usability? Which way should the pendulum swing for the best result? I guess it depends on whether you're an end user, a business owner, or in IT. What interests me more, however, is which way the pendulum is swinging nowadays, and why. To glean some insight, I've asked expert James Rankin to help bring clarity to the subject. James works as a senior architect for HTG, a solutions provider in the UK, where he develops innovative technology solutions for enterprise customers. He is a current CTA, ACA, and vExpert, and his expertise can be found on his blog and in his presentations at industry events and user groups.

In end-user computing (EUC), there have traditionally been "pendulums" around particular schools of thought: outsourcing vs. insourcing, thin client vs. thick client, virtual vs. physical, and so on. Each has benefits and drawbacks, and end-user computing strategy often swings between them depending on various external factors. But a newer addition to this list of incompatible bedfellows is security vs. usability, and it is one that IT departments need to consider very, very carefully. Let's see what James has to say.

The trend toward user experience

Over the last 10 years, there has been an explosion in IT's focus on user experience: the overall usability and responsiveness of a system. Traditionally, 15 to 20 years ago, computing devices were delivered to users from a monolithic single image, locked down by policy, with a pre-defined and pre-installed set of applications. Users had little to compare their systems to in terms of personal devices, and access to the Internet was very limited. In these situations, systems administrators simply delivered the experience required to accommodate a specific set of business applications.

But this was never going to remain the same. The explosion of the Internet, the boom in personal computing devices such as mobile phones and tablets, the availability of SaaS applications and cloud platforms: all of these contributed heavily toward much greater expectations around user experience. Some have termed it the "Apple effect," and it is certainly true that the responsiveness, reliability, and slickness of the immensely popular iPhone and iPad have added to this. Instead of simply accepting the hand they were dealt in terms of devices and experience, over the last five to 10 years users have begun to demand flexible, dynamic, and intuitive workspaces that match what they have on their personal devices.

Pass the painkillers

All of which has certainly given IT departments a headache. First, measuring and quantifying that intangible metric known as "user experience" is not a straightforward task. Second, making old or incompatible legacy applications function in a reliable and responsive way can be very difficult. Most of the focus of the last few years has been on improving user experience while still retaining access to the old applications, and it has proved tricky, often involving a good deal of investment in expensive monitoring tools and a lot of work in application remediation.

But improving the responsiveness of end-user computing solutions often involves saving system resources or cutting down on the overall footprint, particularly in hosted or VDI environments. And this can sometimes be at odds with the next issue to crop up on the horizon.

Security also became a hot potato back in 2003/2004 with the Blaster and Sasser attacks. Network worms spread around the world very quickly, openly exploiting unpatched vulnerabilities to wreak havoc in computer systems. IT departments, without tools to help them, had to manually secure their systems against this.

Fast-forward to the present, and even armed to the teeth with patching and security software, security is still stuck in the same place. Recently, a network worm called WannaCry spread rapidly through enterprise computer systems, exploiting a Windows vulnerability for which a patch had been available for months. Credit agency Equifax was brutally hacked when a failure to apply an Apache Struts patch resulted in the loss of the data of over 100 million subscribers (author included). Personal data has never been more valuable than it is now: some people even refer to data as "the new oil."

Fourteen years since Blaster fired a shot across our bows, it has become awfully obvious that we haven't learned anything. We are still failing to patch and secure our systems, and as a result, the security of our users and our enterprise intellectual property comes under threat. Is the reason we haven't learned the lessons of security because we are so focused on providing a great user experience? Certainly, security has not enhanced user experience. Security software such as antivirus, IPS, and web filters all needs agents that slow down the user's endpoint. Patching systems involves downtime and interruptions to user productivity. Securing an environment often comes at the expense of the overall user experience.

The pendulum swings again

But in the current climate, enterprises are swinging toward security. As well as the public hacking of large corporations such as Equifax, which can negatively affect share prices and company reputation, there is the looming arrival of the EU's General Data Protection Regulation (GDPR) in May 2018. Contrary to what many think, this regulation does not only affect companies in the EU; it applies to any company that processes or handles the data of EU citizens. GDPR is a radical overhaul of the existing European data protection laws, and it comes with one thing in particular that the previous laws lacked: a huge set of teeth.

Take, as an example, the case of British ISP TalkTalk, which was fined a total of £400,000 under the existing laws for a security incident that exposed the data of hundreds of thousands of users. Had that happened under GDPR, which can levy fines of up to €20 million or 4 percent of worldwide annual turnover (whichever is greater), rather than a figure based on profits, TalkTalk could potentially have been looking at a total fine of around £73 million. This is the sort of huge number that finally gets the attention of C-level execs when it comes to talking about security: if the fine is so small that it is cheaper to pay it than to adopt proper security measures, then nothing will change. The cost of lax security needs to be made so high that companies have no choice but to adequately secure their systems, and with GDPR, it finally looks like that level has been reached.
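To make the arithmetic concrete, GDPR's upper fine tier is the greater of a flat cap of €20 million or 4 percent of worldwide annual turnover. A minimal sketch of that calculation follows; the roughly £1.8 billion turnover figure for TalkTalk is an assumption used purely for illustration:

```python
def gdpr_max_fine(annual_turnover, flat_cap=20_000_000):
    """Upper GDPR fine tier (Article 83(5)): the greater of the flat cap
    or 4% of worldwide annual turnover, in the same currency units."""
    return max(flat_cap, 0.04 * annual_turnover)

# With an assumed turnover of ~1.8 billion, the 4% tier dominates:
# 0.04 * 1_800_000_000 = 72_000_000, in line with the ~73M figure above.
# A smaller firm with 100 million turnover would instead face the flat cap.
```

This is why "turnover, not profits" matters: a company can be loss-making and still face a fine proportional to its revenue.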

So with security now back in focus, does that mean we are going to sacrifice user experience to achieve compliance? Are our users going to have to put up with unresponsive and frustrating systems because we've realized we have no choice but to secure our endpoints? Well, that entirely depends on how you approach the goal of security. If you do it in the traditional manner, then the answer is quite possibly yes. The traditional approach to IT security has always been agent-based and multilayered. Antivirus agents, IPS agents, web filtering agents, encryption agents, policy lockdown: all of these add up to overhead on the machine and disruption to user workflows.

But GDPR doesn’t specifically say any particular technical method has to be used to achieve compliance. It doesn’t even mandate encryption. There is no particular technical solution, or class of solution, that must be implemented. So there is no need for an enterprise looking to improve their security to line up with GDPR to deploy any of the traditional technologies that they would have associated with security.

On the other hand, you can't simply throw everything into a cloud solution and expect that to absolve you of GDPR responsibility. A cloud provider is classified as a data processor, rather than a data controller (the company that owns the data). Responsibility under GDPR is shared: the controller is also on the hook should a processor fail to comply with GDPR rules, and it is the controller's responsibility to ensure their processor of choice is handling data correctly.

So what’s the answer?

As with everything, it is difficult to provide a "magic bullet" solution to this. Security is not just a technical solution; it is a state of mind that permeates every business process. Properly securing an enterprise for GDPR compliance takes more than just software or hardware.

But when it does come down to technical solutions, we often need to rethink our approach. Take antivirus as an example. Antivirus is reactive, carries a lot of resource overhead, needs constant updating, and often causes problems through false positives and the like. Application whitelisting technologies such as AppLocker can achieve much better coverage without much of that overhead. Indeed, the only major drawback is keeping the whitelists updated so that all required applications remain available.
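AppLocker itself is configured through Group Policy or PowerShell rules (by publisher, path, or file hash), but the underlying default-deny idea can be sketched in a few lines. This is a toy illustration of hash-based whitelisting in Python, not AppLocker's actual mechanism:

```python
import hashlib

def sha256_of(path):
    """Hash a binary in chunks so large files don't exhaust memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()

def is_allowed(path, whitelist):
    """Default-deny: a binary may run only if its hash is on the approved list."""
    return sha256_of(path) in whitelist
```

The contrast with antivirus is the direction of the default: a blacklist blocks what is known to be bad, while a whitelist refuses anything not explicitly approved, which is why the maintenance burden shifts to keeping the approved list current.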

There are other technologies (an honorable mention goes to Bromium vSentry) that change the way security is handled. Bromium uses "microvirtualization" to prevent exploits in one application from compromising the security of other applications or the operating system. With tech like this in place, there is no longer a need for a huge layer of resource-hungry agents to protect the endpoint, because the potential for exploitation has already been seriously reduced. Other vendors such as Cylance, Carbon Black, and many more are challenging our traditional views of what "security" technology is, and evaluating these solutions may help you find a combination of tech that can secure your environment without throwing user experience firmly under the bus.

Where I work we believe that every enterprise is different, but that with a fresh approach, there is the possibility of making security and usability meet in the middle. A secured system will never be the fastest around, but it doesn’t have to be sluggish and frustrating to the end user. With the right blend of new technologies, and the right mindset throughout your enterprise, it just might be possible to have your cake and eat it as well.

Photo credit: Freerange Stock



Mitch Tulloch

Mitch Tulloch is a widely recognized expert on Windows Server and cloud technologies who has written more than a thousand articles and has authored or been series editor for over 50 books for Microsoft Press. He is a twelve-time recipient of the Microsoft Most Valuable Professional (MVP) award in the technical category of Cloud and Datacenter Management.

