AI could negatively impact society without regulation, report warns

By Jack Loughran

Published Friday, May 11, 2018

Policymakers need to ensure that artificial intelligence (AI) technology is developed in a socially responsible way in order to minimise disruption to the job market and its wider impact on society, according to a new report by the University of Manchester.

“Ensuring social justice in AI development is essential. AI technologies rely on big data and the use of algorithms, which influence decision-making in public life and on matters such as social welfare, public safety and urban planning,” said Dr Barbara Ribeiro, of the Manchester Institute of Innovation Research at The University of Manchester.

She believes that investment in AI will essentially be paid for by taxpayers in the long term, and therefore the benefits should be fairly distributed throughout society.

“In these ‘data-driven’ decision-making processes some social groups may be excluded, either because they lack access to devices necessary to participate or because the selected datasets do not consider the needs, preferences and interests of marginalised and disadvantaged people,” she added.

The global market for industrial robots was estimated at over $40bn in 2017 and is predicted to grow to over $70bn by 2023, according to a recent report by Mordor Intelligence.

In combination with AI, robotics is predicted to replace many jobs in sectors where roles revolve around repetitive or simple tasks. In the UK, for example, a recent report by the Centre for Cities (an independent think tank) predicts that jobs made up of routine tasks are those most exposed to automation. This is expected to disproportionately affect cities across the north and in the Midlands, potentially exacerbating regional divisions.

The report states that, overall, one in five workers across the UK is in an occupation likely to shrink in the near future. However, it also predicts that all cities are likely to experience jobs growth by 2030, with approximately half of this coming in publicly funded institutions.

“Although the challenges that companies and policymakers are facing with respect to AI and robotic systems are similar in many ways, these are two entirely separate technologies, something which is often misunderstood not just by the general public, but by policymakers and employers too. This is something that has to be addressed,” said Professor Anna Scaife, co-director of the University’s Policy@Manchester team.

One particular area where the report highlights that robotics can have a positive impact is in hazardous working environments, such as nuclear decommissioning and clean-up.

Professor Barry Lennox, Professor of Applied Control and Head of the UOM Robotics Group, adds: “The transfer of robotics technology into industry, and in particular the nuclear industry, requires cultural and societal changes as well as technological advances.

“It is really important that regulators are aware of what robotic technology is and is not capable of doing today, as well as understanding what the technology might be capable of doing over the next five years.”

The new report is not the first time that concerns have been raised about the impact that AI will have on the jobs market.

Last year, accounting firm PwC warned that the technology could lead to a “cliff-edge” scenario in which huge swathes of the working population suddenly lose their jobs as AI reaches financial viability.

Meanwhile, the Trump administration has announced that it will not stand in the way of the development of AI in the United States, despite its likely impact on jobs. At a White House summit that included companies such as Google, Facebook and Amazon, technology policy adviser Michael Kratsios said the administration of President Donald Trump did not want to dictate “what is researched and developed”.

“To the greatest degree possible, we will allow scientists and technologists to freely develop their next great inventions right here in the United States,” he said.