Robots in public service: onslaught or opportunity?

Artificial intelligence (AI) and machine learning are rarely out of the headlines. Autonomous vehicles are expected to be on British roads by the end of the year. AI has been found to be better than doctors at diagnosing conditions such as skin cancer. And just this month, IBM unveiled a machine that could effectively argue with a human on a subject it had not prepared for in advance.

It’s inconceivable that this technology will not dramatically change the way the country’s public services are planned and delivered, agreed the panel speaking at a Guardian event to debate how soon robots will run public services, supported by 2020 Delivery. Chair Jane Dudman, Guardian public leaders editor, pointed out that it’s happening already – automation, chatbots and voice-recognition systems are in use across many national and local government departments. At the Department for Work and Pensions (DWP), for example, algorithms are used to identify and prevent mass fraud in the welfare system.

But it is no easy task to predict when AI might take a leading role in the UK’s public services sector. Stephen Metcalfe, chair of the all-party parliamentary group on AI and a member since 2012, pointed out that predicting the future of technology is “almost as difficult as predicting what the weather will be like tomorrow”.

The panel discussed concerns including the potential loss of jobs, bias in algorithms, the need for greater understanding of AI among civil servants, and the practicalities of data sharing. Amid the debate around these issues, there was also real enthusiasm about the benefits AI could bring and the role of the public sector in realising that potential. “The public sector needs to lead this revolution … [that] will bring a lot of optimism,” said Birgitte Andersen, chief executive and co-creator of the Big Innovation Centre.

But to make the state purposeful, it needs to have intelligence, she added. Smart cities, data collection and data sharing between organisations are underutilised by the public sector in the UK. “We still have Companies House using their data, and the Office for National Statistics using theirs, and the Bank of England using theirs … you really need to open up people’s data, business data, so services can become more efficient,” she said.

That’s something that will only be possible if the government builds trust with the public, agreed Metcalfe and Mike Clancy, general secretary of the Prospect trade union. A pilot to share health data via the NHS in 2013 proved unpopular, largely because the benefits weren’t adequately explained. “People felt like this was state intrusion in their private lives,” Metcalfe said.

“We’ve got massive issues with trust in this country,” added Clancy. Good communication between central and local government, the public and other stakeholders will be essential to move this debate forward and ease anxieties about the changing nature of work and potential job losses. A report found almost 250,000 public sector workers could lose their jobs to robots in the next 15 years. It will be down to public service leaders to effectively manage that change, he said, and inspire confidence in the need for the technology. “We’ve got a lot of examples in the public and private sectors of technology insertion in hope, rather than certainty of delivery,” he added.

Baroness Olly Grender, a member of the Lords select committee on AI, which published a report in April 2018 on the UK’s readiness to embrace this technology, agreed there is a need to improve understanding of AI within the civil service, and pointed to the GovTech Catalyst challenge, which is investing £20m over three years to encourage tech firms to help fix public sector challenges. “Just like politicians, this is a real learning curve for [civil servants],” she said. But she said she thought people, especially young people, were less concerned about “the onslaught of AI” and data sharing than some might believe. “My 12-year-old would happily give up every bit of data he has for a free voucher to McDonald’s,” she said.

One issue that Grender acknowledged does need to be addressed is that bias can exist in algorithms. Research has already shown that AI programs are picking up deeply ingrained race and gender prejudices. Antonio Weiss, director and digital practice lead at 2020 Delivery, mentioned a controversial program used in the US justice system called Compas which, when used to measure an offender’s risk to the community, was found to produce disproportionately high risk scores for some demographic groups. “That’s obviously worrying but we’re learning from that,” he said.

For now, AI can really help public servants improve efficiency for mundane tasks, said Weiss – such as the 8bn calls handled by UK call centres every year. But these developments must keep the end user in mind, rather than introduce AI for the sake of it. “[We need to] focus on the problem we’re trying to help our citizens solve and how technology can help that,” he commented.

Audience member Michael Mousdale, partner at the law firm Browne Jacobson, wanted to know where the money would come from to pay to introduce AI to public services, particularly among local councils, which still face austerity and are struggling to pay for adult social care. Metcalfe replied that money isn’t the only consideration, and there’s a need to look long term, but added that technology does have the potential to help councils intervene earlier and divert users away from more expensive forms of care.

Clancy reaffirmed the need to have a serious discussion about the role of the state, and to address the core problem for public services in the UK: people want excellent provision but don’t want to pay for it. Any discussion about AI, he said, should drive a reconsideration of the value of public services. “If we can do that, we might stand some chance of garnering public sympathy for the money that needs to go into them.”