The software, which is made by a Tokyo startup called Vaak, differs from similar products that work by matching faces to criminal records. Instead, VaakEye uses behavior to predict criminal action.

Company founder Ryo Tanaka said his team fed the algorithm 100,000 hours' worth of surveillance data to train it to monitor everything from the facial expressions of shoppers to their movements and clothing.

Since VaakEye launched last month, it has been rolled out in 50 stores across Japan. Vaak claims that shoplifting losses dropped by 77% during a test period in local convenience stores. That could help reduce global retail costs from shoplifting, which hit $34 billion in 2017 according to the Global Shrink Index.

Using predictive AI in this way to flag potential criminal behaviour raises a number of ethical questions:

- Is it legal or even moral to prevent someone from entering a store based on this software?
- Do commercial facial-analysis programs have inbuilt skin-type and gender biases?
- Are the public aware of what is happening? Do they consent? Is it meaningful consent?
- What happens to the data? How is it protected? Might it be shared?

Christopher Eastham, a specialist in AI at the law firm Fieldfisher, said the framework for regulating the technology is not yet in place.

"There is a need for clarity from lawmakers and guidance from regulators, who will ultimately need to decide in what circumstances the use of this technology will be appropriate or desirable as a matter of public policy," he said.