By John Lloyd

  • Doctor, doctor, there's a hole in my privacy…
  • WhatsApp?
  • How did you know?

I was on my way to work the other day (I know, how quaint, travelling to work, but it is still a thing) and was struck by the enthusiasm with which I was being exhorted not only to pop pills to become a wellman (or wellwoman or wellkid – making no assumptions) but also to download an app or two to support my mental health. Whether it is living well, wellness or wellbeing, the world of tech now seems to have more wells than Mohammed bin Salman. If data is the new oil, then health data is rocket fuel. It is a wonder Elon Musk has not launched a health app instead of wittering on about Twitter. Besides, if ever anyone needed a wellness app, it is surely a man with an unaccountable quantity of children...

The pandemic has provided a massive booster to the health tech data rockets, with a rapid rise in remote consultation joining the wealth of wellness apps in the frontline of patient care. Or even one’s own self-care, as there seems to be an unhealthy degree of ambiguity over what constitutes health data in these situations. When does a wellness check-in become a mental health record? It is enough to make me lose sleep at night.

People often ask me, can you recommend a wellness app? [Actually, nobody has ever asked me that; this is just a rhetorical device.] The answer, of course, is as negative as a North Korean PCR test. The problem with all these apps is that they want to have their sensitive data cake and eat it. There is a lot to applaud in attempts to empower people to take control of their own health information. But call me cynical: are the hypervaluations of these apps really based only on your one-to-one consultations with your virtual counsellor? Have they declared all the uses of the data they are collecting from you? Is it all right for your sleep score, diet, alcohol intake and other similar information to be used to enrich the other data they have slurped from your Apple Watch (other wrist-worn sensitive data scrapers are available) and everyone else’s, so that they can use that information to tempt researchers (and boost their IPO)? How do you feel about that?
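To make that “enrichment” point concrete, here is a minimal sketch in Python of the kind of join a wellness platform could perform behind the scenes. Everything in it is hypothetical: the field names, the values and the build_research_dataset function are illustrations for this column, not any particular app’s actual pipeline.

```python
# Hypothetical illustration: joining self-reported wellness entries with
# wearable telemetry produces a record far more revealing (and saleable)
# than either source on its own.

self_reported = {
    # user_id -> what you typed into the app's daily check-in
    "u1": {"sleep_score": 62, "alcohol_units": 4, "mood": "low"},
    "u2": {"sleep_score": 88, "alcohol_units": 0, "mood": "fine"},
}

wearable = {
    # user_id -> what the watch quietly recorded
    "u1": {"resting_hr": 74, "steps": 2100},
    "u2": {"resting_hr": 58, "steps": 12400},
}

def build_research_dataset(reported, sensed):
    """Merge both sources per user: the 'enriched' record."""
    return {uid: {**reported[uid], **sensed.get(uid, {})} for uid in reported}

# One-to-one check-in data in, pooled research product out.
print(build_research_dataset(self_reported, wearable))
```

The design point is the merge itself: once the two sources share a user identifier, combining them is a one-line operation, which is precisely why the undeclared uses matter.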

If you are not concerned about the pooling of global health information to benefit mankind/the NHS/Big Pharma/healthtech [no need to delete as appropriate – they will all get a slice one way or another], what about your employer? In the breakneck race to the bottom of the privacy well among 21st-century corporations, nothing appears to say more about how much a company cares about its workforce than a little wellness initiative here, a wellbeing app there. Of course, there are excellent ways to support staff through difficult times and to offer something distinctive that demonstrates a commitment to corporate values, but do those values include a respect for privacy?

Even if you are happy to share your weekend binge drinking, dirty burgers or bedroom activities with your boss, spare a thought also for what (else) people do in the shadows and who those shadowy people are. Two relatively recent cases of large volumes of patient data being uploaded to GitHub illustrate the gap between professional conduct as practised by the technology community and what the GDPR describes as “the responsibility of a professional subject to the obligation of professional secrecy” (not that all doctors are paragons of professional virtue, of course, as anyone who actually knows any medics will attest).

If some of your best friends are doctors, consider your conversations with those friends, and the question of virtual consultation, which brings us back to that opening WhatsApp joke… in fact, WhatsApp is a serious matter when it comes to data privacy. Putting the Meta into metadata, remember that while your messages may be encrypted, the details of the people involved are open to our friends in the metaverse to harvest and sell to their friends. The recent US Supreme Court decision in Dobbs v. Jackson Women’s Health Organization has had the almost miraculous side effect of making people notice that it is not only overt health data that reveals our healthcare activities but also information about whom we are contacting, when, where and how.
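To see how much that metadata gives away, here is a minimal sketch of a simplified message envelope. The structure and values are hypothetical, not WhatsApp’s actual wire format: the point is simply that the body can be ciphertext while everything needed to route the message stays readable to the platform.

```python
# Hypothetical sketch: end-to-end encryption protects the message body,
# but the routing metadata must remain readable for delivery to work.
from dataclasses import dataclass

@dataclass
class MessageEnvelope:
    sender: str        # visible to the platform
    recipient: str     # visible to the platform (e.g. a clinic's helpline)
    timestamp: float   # visible to the platform (when you got in touch)
    ciphertext: bytes  # only the endpoints can decrypt this

envelope = MessageEnvelope(
    sender="+44 7700 900123",        # fictional numbers throughout
    recipient="+44 7700 900456",
    timestamp=1656633600.0,
    ciphertext=b"\x8f\x1a...",       # what you said: unreadable
)

# What the platform can log without breaking the encryption --
# exactly the "whom, when, where and how" described above:
visible = {k: v for k, v in vars(envelope).items() if k != "ciphertext"}
print(visible)
```

No decryption is needed to learn that you contacted a health provider; the envelope alone tells that story.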

As is so often the case, the tech wheezes seem to be running ahead of the ethical considerations (there is a lot more to be written on that another time, no doubt). Of course, there are great opportunities to use technology to improve people’s lives, but there are also ways to do so without doing any damage. First, do no harm… and the next time you are invited to talk to ‘someone’ who is not an actual person, maybe just have a word with yourself instead.