Technology is changing everything, and the pace of change is rapid. It’s so transformative that sectors are blurring, and its power is harnessed to drive anything from productivity and efficiency to wellbeing and sustainability. From fashion to banking, manufacturing to publishing, accountancy to agriculture, its impact is everywhere and gives much pause for thought. What are the costs of implementing new technology; are they worth it; and who’s responsible for ROI? Do I have people with the right skills to understand this technology, let alone implement it? What are the myths, what’s the reality, who do I believe? Will new technology such as artificial intelligence take away too many jobs or make roles more meaningful? And what are the risks? How can I protect my organisation? What do I prioritise? We put these questions to a group of people with technology embedded in their organisations. Here’s a summary of what they had to say.

Investment and responsibility

Education is key, explains Chris Sykes: “Every customer has a different understanding of
what artificial intelligence (AI) is, so we get key stakeholders in a room and re-educate them – tell them what we know about AI they won’t find on Wikipedia. We’ll also help them assess how ready they are.” That’s the challenge, agrees Terry Walby – technology is advancing at pace but is relatively unproven at scale in more significant, consequential areas. “I don’t think it’s wise to have a really directional AI and IT strategy; you have to have an innovation mindset,” he advises. “Be experimental, create a culture in your team that gives them permission to fail.” When he’s convincing businesses to invest, his counsel is to treat it as a journey: “It’s not a big-bang decision; you can do it iteratively.

We’re trying to democratise the use of AI technology; to put it in the hands of people who can deliver a real difference.” This point resonates with Michael Vroobel, who works for a company where technology is in its DNA. As he explains, a good in-house tech team is expensive to run, so you need to make sure they’re utilised properly. “Think about the business case, plan quarters ahead, build a roadmap. Transactions over the past few years have increased by thousands of per cent, and our operations team has increased by one or two heads, so technology has a massive impact,” he explains.

Talent/skills

There’s an enormous demand for data science skills and machine learning engineers, confirms Rob McCargow; it’s a competitive landscape. His firm is improving its longer-term talent pipeline by creating technology degree apprenticeship programmes. “But the interesting thing here
is we’ve got a huge untapped population that’s not evidently showing up in the workforce, particularly around gender. We’ve got to have a socially representative AI workforce,” he says.

Technology is definitely changing recruitment, says Michael O’Brien. His practice is hiring people from more diverse backgrounds than it traditionally would have. “We have relationships with schools in disadvantaged areas that wouldn’t have been on our radar three or four years ago. We take on people with a different understanding of what technology is and how it’s used. They may not have an A* in maths but are incredibly in tune with AI.”

Walby had to take the initiative when it was either too expensive or impossible to find the skills needed to grow his “fairly early-stage business”. If you can’t hire, he says, you train: “We developed a programme to bring first-jobbers into the business and make them
a core part of what we do; to learn as we were building.”

A great team has a mix of skills, says Vroobel, but that can sometimes be stymied by geography. “We’re business finance so we’ve got tech but we also need core finance skills. The [tech] engineers want to be in Shoreditch, it’s all set up for that demographic. We had a beautiful office in Richmond and it was hard to attract talent there.”

Your employees must be open to learning new skills, adds Sykes. In his business, they have to react just in time. “We’re working with natural language data models so our copywriters are being reskilled to become dialogue authors. What’s the tone of voice, how do they respond to off-topic questions, inject a bit of humour? We train people to interpret those data models.” And Kirstin Gillon says: “Our members aren’t going to be data scientists but they still need to have a higher level of skill in technology and AI and be intelligent users; to have meaningful conversations with the data scientists and a common language.”

Myths and fear

Citing PwC research, McCargow says 30% of existing jobs could be highly susceptible to automation by the early 2030s. On the other side, technology will lead to substantial productivity gains, possibly adding 10% to UK GDP (about £230bn). So what was the feeling around the table about job displacement and an innovative workforce? “If you don’t evolve and adapt, you will die,” says Sykes. Most professions do that anyway, suggests O’Brien. “If you look at accountancy, it was quill and ledgers to Excel spreadsheets to Xero and cloud computing, there is a continuous evolution. This is just the next step. It does depend on the job,” he clarifies. It’s not about jobs but about tasks within jobs, points out McCargow. “The heart of professional accountancy is about trust and assurance, the concept of that will never go away, the way it’s delivered will change immeasurably. You will need people that aren’t just classical auditors but people that can audit AI.”

The core value of what the profession does in terms of providing trust, supporting business, will always be there, agrees Gillon. How that translates into tasks will look radically different. “There are great opportunities to help businesses add value, to make better decisions, but you have to recognise that for an awful lot of individual accountants there’s a big process of change,” she says. People must get this idea out of their heads that AI is a scary robot that will take over the world, insists Walby. “My fundamental belief is that most businesses are constrained by their access to skilled resources, so if you can liberate time from the people who work for you then you’ll be more productive.”

It tends to be extreme views, observes O’Brien – Doomsday or the next industrial revolution. “I don’t think we can be navel-gazing, worrying about a Doomsday scenario when the technology can be harnessed to extraordinary business benefits as well,” says McCargow.

Attitude/culture

There are definite differences in the way people adopt technology, and the speed at which they adopt, says Walby. “Generally, the UK population is not the most ready to adopt things compared to other geographies like the Nordics.” O’Brien agrees – markets such as the Nordics, Asia and New Zealand don’t tend to lag behind. And yet perversely, a lot of new technology is developed in the UK, adds Walby. Let’s not write off brand Britain too soon, says Sykes. “Half of our revenue, around £5m of business, comes from Silicon Valley. Britain is trusted and seen as delivering.” Attitudes are changing, continues Vroobel.

“Once you get past the initial stumbling block, traditional family businesses love it,” he says. It comes back to customer education, he adds. Anxiety and worry are not confined to the UK, says Gillon. “I spent a lot of time in China where their [tech] adoption is enormous. But every presentation I did for students I was asked: ‘Am I going to have a job in five years?’ I think that core worry is everywhere, I don’t think it’s unique to the UK. They may be less risk-averse than the UK in terms of rapid adoption, but there are still underlying concerns about the long-term impact of AI in China.”

Governance

At the World Economic Forum this year there was a huge focus on the UK setting out its stall
as a world leader in ethical AI, says McCargow. “There’s some sense behind this. We can’t compete with China, which has 800 million people online, a lot of data to train algorithms on.” But the UK is strong on “trust, maturity of regulations, and good business”. It should focus on this. From an accounting perspective, he adds, even people building the most sophisticated AI sometimes can’t explain how an optimal outcome has been reached. That’s an opportunity for accountants: “The tough, technical challenge of provability and transparency.” The key challenge for government and regulators is that they’ve got to move quickly, believes Vroobel. “Otherwise big tech companies will, and it will be too late to do much about it.”

Another challenge is definition: what constitutes AI, asks Walby. “Where does it start and stop? There is no universally agreed definition. Is any algorithm in any computer a piece of AI that needs auditing?” There’s also privacy, says Sykes. “How much do you want people to know about your information, how much are you willing to give away?” It’s a hot topic in his industry, says Vroobel: “We’re going to learn that really soon with Open Banking.” On the subject of definition, Gillon says we shouldn’t compartmentalise: “It worries me that we put AI in a box; it’s actually just how you run your business, what your business model is, how you integrate and embed AI into your decision-making. It’s not a space in the corner where the AI specialists sit and think deeply. Everything is about data.” And AI lives off data; without it, it’s nothing, says Walby. “It can’t be a bolt-on,” agrees O’Brien.

Risk

Cybercrime is the obvious threat, and therefore you must invest in the best infrastructure, says Vroobel (especially with GDPR coming up). What also plays on his mind is customers. “If you automate everything and go all AI or all tech, do you alienate your customers by not having people on your customer service? We’re people plus technology; it’s hard to know how people are going to behave and react when you introduce all of this new technology.” Gillon thinks it’s about attitude and being ready to respond. “However good you are, if there are bad people who want to get into your systems they probably can, so there is a need to be ready, to have the communications ready, make sure you know what you’ll need to do.” McCargow has already turned his thoughts to how AI can be used in a harmful way. “When you start augmenting traditional cyber attacks with AI-powered attack vectors, that could be substantially more damaging. On the other side we’ve got great companies which are augmenting this into cyber defence. The future of cyber security could look quite different in a few years.”

How AI is being used today

A key priority for most large businesses is to strip out cost, says Sykes. He’s seeing early adoption of what he calls automating the first touch, to “push the humans to high-value interactions”. He gives the example of a 24/7 concierge, a conversational platform at a law firm that deals with an enquiry at any time and responds. He says businesses are investing in technology, such as virtual reality, for training – “putting people into scenarios that are otherwise dangerous or expensive to do”. Walby’s technology can learn over time, for example, what an invoice looks like, what the important information on it is, and how to extract that to execute the process. “It doesn’t need to be taught, it’s using AI and pattern recognition over time to understand what it’s looking for.”

At the other end of the scale he has technology that is becoming self-aware: “It’s understanding how long it takes to execute a process and if it finds it’s easier to do something at 3am because it takes less time, it will refine the process to deliver a better outcome.” McCargow has seen a combination of technologies applied together to great effect: a drone, image recognition, a deep learning neural network, a geospatial app and virtual reality.

“It’s not just looking at AI in isolation but how it’s harnessed with other emerging technologies,” he says. In Vroobel’s business, technology is used to deliver better customer service. “It helps with tasks where people tend to make errors, so if you’re doing a repetitive task over and over again, you’re likely to send money to the wrong place. That doesn’t really happen anymore.”

Conclusion

The overall sentiment in the room is largely optimistic. McCargow describes himself as “a responsible optimist. This can be great if we get full participation from the institutes, the start-ups, big business and the public.” It comes back to communication, he says: “How we break some of these myths and demonstrate how it is going to enhance lives. This comes through data privacy, trust, ethics, jobs, purposeful companies behaving well. If we get all that right then we can look forward to tremendous gains from the technology.”

Walby agrees: “Our biggest enemy is lack of awareness: people are largely uninformed about what technology could do for them, what it is, what it means, how they could leverage value from it.” Let’s stop using images of humanoid robots whenever we refer to AI for a start, pleads Sykes. “We work with humanoid robots and they’re not artificially intelligent in any way, they’re just beautifully engineered devices that we can train with a very narrow dialogue. Humans are controlling these things, they’re not wandering around autonomously.”

The biggest challenge in Gillon’s opinion is pace of change. Her plea is that readers (and regulators) keep up to speed so they’re not left behind. The beauty of all this technology, adds O’Brien, is that in three or four years’ time everyone will be using it without realising it. “We won’t call it AI anymore. It’s part of life now, and it will become the norm. It’s about how we get there, because it’s taking the profession and the rest of the world with it.” Vroobel agrees, it’s about implementation. “AI is a bit of a buzzword in a lot of boardrooms. Actually, where do you get the most benefit? You’ve got to build the best you can where you’re going to get the ROI.”