Data and context are key for robust A.I. in health

In evidence to the House of Lords Select Committee on Artificial Intelligence (A.I.), PHG Foundation’s Head of Science, Dr Sobia Raza, highlighted the potential value of an NHS-wide strategy on using health data for algorithm development, both to realise the potential of A.I. for patients and to ensure data sets are sufficiently representative of the UK population. The committee is considering the economic, ethical and social implications of advances in artificial intelligence.

Following its written evidence to the inquiry, PHG Foundation was invited to give oral evidence on the implications of A.I. for healthcare to the Select Committee on 21 November.

Responding to a question on how to ensure the robust evaluation of machine learning tools before they are used on patients, Sobia drew an analogy with driverless cars, which may have been trained and successfully tested on broad open highways in California but face a very different set of decisions on narrow British country lanes.

She said ‘It’s the same in healthcare – the A.I. algorithms have to be tested under the conditions and the population in which they will be used.’

Discussing the development of policy around A.I., Sobia emphasised the need for appropriate digital infrastructure for storing, collecting, processing and transferring data, and the importance of retaining a focus on building patient and public trust in sharing health data. Gaining access to the large, representative data sets needed for A.I. development will depend on greater engagement with the public, both on the potential benefits and, more widely, on why data is vital to improving health, care and services.

Also giving evidence at this session were Dr Julian Huppert, Chair of the Independent Review Panel for DeepMind Health, and Nicola Perrin, Head of Understanding Patient Data at the Wellcome Trust. The committee will report by 31 March 2018.