
Sinead O'Brien, Technology Strategy & Architecture's Lead Project Manager for Transformation Delivery, shares her insights from this month's BBC Machine Learning Fireside Chat.

As more and more of our intimate data is collected, does convenience come at a price? If so, what is it, and is it worth paying? The decidedly thought-provoking discussion at last week's sold-out 'BBC Machine Learning Fireside Chats presents: The Price of Convenience' was hosted by Ahmed Razek of BBC Blue Room.

The provocation…

There is an increasingly fine line between personalised services and invasive services. Do people understand that they’re trading their personal data for these services? Are they aware of the risks? Do they care?

On the panel…

Maxine Mackintosh, PhD student at The Alan Turing Institute. Maxine’s PhD involves mining medical records for new predictors of dementia. She is passionate about understanding how we might make better use of routinely collected data to improve our cognitive health.

Also on the stellar line-up was Josh Cowls, Research Associate in Data Ethics at The Alan Turing Institute, and a doctoral researcher at the Digital Ethics Lab, Oxford Internet Institute. Josh's research agenda centres on decision-making in the digital era, with a particular focus on the social and ethical impact of big data and AI and its intersection with public opinion and policy-making.

The third guest speaker for the evening, Martin Goodson, is Chief Scientist and CEO of Evolution AI. Martin is a specialist in natural language processing, the computational understanding of human language.

The discourse…

Maxine kicked off the conversation with a rather hard-hitting statement: we misunderstand what “health data” really means. When we discuss health data, it is presumed that we mean the data collected when we interact with the health system – our medical records. Incorrect. That is “sickness data”. Health data is search data: the information captured when we Google things such as travel, which indicates how healthy we really are.

Maxine is a member of DeepMind Health's independent review panel, a board that looks to build trust through radical transparency. She argued that we cannot expect the NHS or academia to afford the computational power required to get things right, so we have to work alongside the large corporations. Corporations can play an innovative role, but they, and not just DeepMind, should not be able to profit from our data. We, the citizens, own the data. The government has a regulatory role to play in protecting society.

Martin spoke further to the tension between privacy and innovation: if we are too private with our data, there will be less innovation. He argued that the privileged in society are more likely to benefit from AI in terms of convenience. Data needs to work for people and for society. Misuses of machine learning systems that have led to unjust outcomes were cited as examples of the negative impact on the less privileged in society.

The panel then moved on to the topic of ethics, an area attracting sudden interest. Ahmed asked whether there is a risk of “ethics-washing”: using an ethical defence to side-step issues such as privacy, autonomy, and agency. The general consensus amongst the panel was that the UK is in a good place to be setting the agenda, and Europe has a long tradition of establishing human liberties. But we need to be ethical and enable innovation at the same time.

The panel argued that unless citizens are personally affected by data breaches, they don't really understand the repercussions. The public perspective is as much about when and how you ask as whom you ask. We don't need to teach kids to code; we need to teach young people to think about the impact of coding and why control of their data matters.

Maxine highlighted that NHS users are automatically opted in to having their depersonalised confidential patient information used for research and planning by the NHS, as well as by commercial and academic partners. NHS data isn't great, but it does have scale, and there are huge benefits for population-level research. Health data was likened to taxes: a societal contract. Informed decision-making is important. I am happy to share my data in this scenario. Would you opt out of giving your health data?

The discussion closed with one last thought-provoking question: "Can we put data solely in the hands of non-profits?" The panel argued that our health and justice systems need to be able to engage with organisations commercially, and sufficient profit is needed to run those organisations. The panel concluded that we need to define what we mean by “reasonable profit” in this context.

For more details about upcoming events, visit BBC Machine Learning Fireside Chat.
