I must admit that I found Leslie Scism and Mark Maremont’s article on the approaches used by insurance companies to assess health risk quite shocking. Although much of the #LAK12 mooc has focused on the accumulation and exploitation of observed data to predict future behaviours, the approach apparently being trialled by many insurers seemed to me to border on the subversive. Many of the mooc’s discussions of ethical issues and analytics have touched on aspects of privacy, though almost all in the context of sharing data generated by an individual’s use of a system to enhance or augment their future engagement with that same system (e.g. Amazon’s use of past choices to suggest future purchases).

What was being proposed here, though, was a wholesale trawl of seemingly unrelated data unknowingly left behind by customers to create profiles which could be used by insurance companies in place of their usual, expensive health checks. It seems that everything we do (or appear to do) can suggest that we are healthy and active, TV-watching couch potatoes or high-risk extreme sport fanatics, and affect our insurance risk accordingly. Scism and Maremont conclude that industry standards may well kick in to prevent abuse of these data checks, and that insurers haven’t quite made the final leap to relying wholly on such models of behaviour, but I’d guess that it’s not so far off. Time to start subscribing online to Jogging Monthly and to post a few spurious record times to mapmyrun.com before they work out how much time is really spent sitting stationary at my laptop…


About sharonslade

Dr Sharon Slade is a senior lecturer in the Faculty of Business and Law at the Open University in the UK, working to support both tutors and students on Open University distance learning modules and programmes. Her research interests encompass online learning and tutoring, online learning communities and ethical issues in learning analytics. Her project work includes the development of a student support framework to improve retention and progression, and the development of a university-wide tool for tracking students and triggering relevant and targeted interventions. She is leading the development of new policy around the ethical use of learning analytics within the Open University, UK.

2 Responses to Profiling risky clients #LAK12

I had a similar reaction, which is why I feel the most important part of a manifesto is to define its purpose, spirit and “fair use”. By being explicit about these, participants can understand the intent and agreements and make informed decisions about participation, or, if no opt-out is offered, feel confident that their data are being used responsibly.

Agree – although the issue for me here was that data were collected from a range of activities and sites in a way that most folk would find very difficult to anticipate (and so prevent). What might the ‘penalty’ be if an insurance customer, say, opted out of data collection which would contribute to their insurance costs? I’m guessing that such folk might be assumed to have something to hide and would be hit with a blanket fine of some sort. It’s all a bit worrying, methinks.