Intimacy and trust
Suzie is a 27-year-old administrative employee of a brewing company. She wants to get pregnant someday, but not now. Suzie is in a relationship with Ian, a 24-year-old physiotherapist. They are now discussing moving in together, although it is hard to find a suitable apartment. Suzie uses a period tracker, called Intim, to keep track of her somewhat irregular cycle. The tracker purports to log ‘everything’ related to ‘female health’ and produces health reports and ovulation and period predictions.
One Sunday morning, while eating a croissant, having a cup of coffee and reading the news on her tablet, she finds a short article on Intim. It turns out that the Intim developers use tools developed by social media companies. These tools apparently pass on personal information, including information on sexual activity, periods and use of contraception, in addition to identifying information, such as name, location and age. According to the CEO of Intim, the period tracker is free and so “customers should expect their information to be monetised”.
Suzie feels betrayed, and immediately deletes the app.
She decides that it is time to clean up the digital traces she leaves everywhere. She picks up her phone and starts by adjusting its settings. The job turns out to absorb the rest of the beautiful morning. The more she searches, the more default settings she discovers that she had no idea were there. She discovers that her phone has been feeding data to the developer of the operating system, a large American tech company, on which apps she uses and for how long – which would seem to include Intim. She reads blogs and discussion threads and, with the help of some of the comments there, manages to block some of the default information-sharing settings step by step. She also finds an app called “Digital Wellbeing”, which cannot be uninstalled, but which apparently monitors a variety of data – although she does not discover which data exactly. What personal information is being transferred, and to which companies, Suzie does not know. Do ‘third parties’ now also have the information that she has been sharing with Intim? Suzie does not know that either, and she does not succeed in finding out. What information have all the other apps been collecting about her? Who has access to those data? She does not know. The experience leaves her tired, frustrated and angry.
Discussion – privacy, surveillance, asymmetry
Some people are not bothered at all by these kinds of discoveries. Many people are, however, and many feel that their privacy interests are insufficiently protected at the moment. But even if nobody objected to the use of their personal data, there are ethically worrisome aspects to Suzie's case, relating to the accumulation of data and the asymmetry of power.
It is noteworthy that even when an intelligent and motivated user attempts to customize user data settings, this is in many cases difficult or even impossible. As a consequence, much of the process of data collection, storage and analysis is almost completely opaque to the regular user. At the same time, companies and other institutions that offer mHealth services know a lot about the user, including very intimate information, and are able to combine different bits of information into detailed profiles of their customers – typically for advertising purposes. For example, manufacturers of diapers are willing to pay in order to target users of period or ovulation tracking apps (often referred to as ‘Femtech’) who appear to be pregnant or to be trying to get pregnant. Similarly, manufacturers of chocolate might be interested in knowing at what time of the month they can best advertise their product. Independently of the question whether such targeted advertising is very effective (hardly), this asymmetrical situation is problematic purely from the perspective of a fair transaction: for the user and the mHealth developers to be on somewhat equal footing in their interactions, such power imbalances need to be adequately addressed. The European GDPR attempts to do just this, but in practice it is not easy to curb the power of ‘surveillance capitalism’.
Literature
Miller, Franklin G., and Alan Wertheimer. “The fair transaction model of informed consent: an alternative to autonomous authorization.” Kennedy Institute of Ethics Journal 21.3 (2011): 201-218.
Rosas, Celia. “The future is femtech: Privacy and data security issues surrounding femtech applications.” Hastings Bus. LJ 15 (2019): 319.
Source
Intim is a fictitious app, but period and fertility apps are common, and many of these monetise their data in much the same way as Intim. Android indeed ships with a pre-installed tool called “Digital Wellbeing”, but Suzie is confused about what it entails; it is unrelated to health apps.