Making the world a better place as an app developer

Penelope is an app developer with an idea that could make the world better. However, for now, the exact details must remain a developer’s secret. Her idea involves using biodata to improve users’ general health and well-being and to help prevent the most common causes of disease. She also plans to account for social determinants of health and illness, such as poverty.

As someone who follows debates in global health, Penelope is aware of some of the major challenges of our time, such as climate change, antibiotic resistance, and growing inequality, and wants to help find solutions. She is pragmatic and knows it will not be possible to ‘solve’ any of these problems with a grand plan. Rather, she believes everyone should strive to adjust their environment in ways that support health and make it easier to access health services.

At the same time, Penelope needs to make money to support her innovative project and herself. She has invested some of her savings in a start-up and managed to secure a loan from a bank. Additionally, her project receives funding from an NGO that encourages entrepreneurship in her home country, Greece.

Penelope wants to be successful, but she also wants to do this in the best way possible. She aims to develop an ‘ethical app’, one that does not prey on its users by extracting as much information from them as possible while disregarding their vulnerabilities. However, the field of ethical considerations is complex and daunting for her. There are many things to consider: algorithmic discrimination, unfairness due to differences in user skills, the blurring of boundaries in healthcare, shifting responsibilities from healthcare providers to users, the potential for harm, the psychological impact of preventative tools, data security, risks of stigmatization, and more. These categories of considerations partially overlap, and their impact emerges on different levels: individual, societal, and global. It is difficult to find ethical guidance that offers a clear and accessible overview of the relevant considerations.

This complexity motivated Penelope to contact an academic ethicist who was willing to discuss these issues with her. However, they quickly discovered that a large gap separated them. They met several times on video calls to establish common ground and explore possibilities for future cooperation. Yet whenever Penelope tried to explain how the app would work, she noticed that the ethicist’s attention wandered to the other side of the screen. Conversely, the ethicist, Christos, a rather abstract-thinking academic who sometimes gets lost in his own sentences, often failed to provide Penelope with hands-on concepts and approaches that she could apply. Neither Penelope nor Christos is willing to give up, but bridging the interdisciplinary gap and finding a common language is tougher than they had imagined.

Expertise and interdisciplinarity

While they deal with somewhat overlapping concerns, the development of health apps and the ethics of mHealth can be like two different worlds. Both fields involve specialized expertise, and many of the technical and ethical aspects of mHealth are not clear, or even noticeable, to people outside each field. How exactly apps work, which algorithms they use, and what this implies for how they collect, process, store, and potentially commercialize data is something only developers know. Despite the rather common idea that acting ethically just requires having a ‘moral compass’, the field of ethics works with a wide array of theories, concepts, and approaches, which it applies and translates into practice. These theories, and the ethical issues they account for, originate from particular parts of the world and cannot always be applied to every socio-cultural context, which complicates things further, especially when apps are used internationally.

Much of digital health ethics is concerned with data privacy, informed choice about user data, and algorithmic ‘bias’, but fewer ethicists have engaged deeply with issues of justice, including algorithmic and health-related discrimination, or with the structural and individual responsibilities of healthcare. These concerns include which data sets apps use, where those data come from, how the training data were labelled, which population groups the data do (or do not) represent, and what the implications will be at the individual, social, and health-political levels. For example, were apps trained with the health data of all population groups who might use them, with respect to sex/gender, racialization/ethnicity, or class? What if the exclusion of some groups leads to algorithmic discrimination, in that the app will over-diagnose or misdiagnose these population groups? What if an increasingly individualized concept of healthcare leads to losses at the level of public healthcare and the improvement of social determinants of health?
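To make these questions more concrete for a developer like Penelope, the two checks above (which groups the training data represent, and whether the model misses cases more often for some groups) can be sketched in a few lines of code. This is a minimal illustration with invented names, data, and thresholds, not a method from the article or a complete fairness audit:

```python
# Hypothetical sketch: auditing a labelled training/evaluation set for
# group representation and per-group error rates. All names, thresholds,
# and data below are illustrative assumptions.
from collections import Counter

def representation_gaps(records, group_key="group", min_share=0.10):
    """Return the share of each group that falls below min_share
    (an arbitrary illustrative threshold) of the data set."""
    counts = Counter(r[group_key] for r in records)
    total = sum(counts.values())
    return {g: n / total for g, n in counts.items() if n / total < min_share}

def per_group_false_negative_rate(records):
    """False-negative rate per group: among true cases (y_true == 1),
    the fraction the model failed to flag (y_pred == 0)."""
    misses, positives = Counter(), Counter()
    for r in records:
        if r["y_true"] == 1:
            positives[r["group"]] += 1
            if r["y_pred"] == 0:
                misses[r["group"]] += 1
    return {g: misses[g] / positives[g] for g in positives}

# Tiny synthetic example: group "B" is under-represented and misdiagnosed.
data = (
    [{"group": "A", "y_true": 1, "y_pred": 1}] * 45
    + [{"group": "A", "y_true": 1, "y_pred": 0}] * 5
    + [{"group": "B", "y_true": 1, "y_pred": 0}] * 3
    + [{"group": "B", "y_true": 1, "y_pred": 1}] * 1
)
print(representation_gaps(data))            # only "B" falls below the 10% share
print(per_group_false_negative_rate(data))  # B's miss rate far exceeds A's
```

Such disparities (here, a false-negative rate of 0.75 for group B versus 0.10 for group A) are exactly the kind of measurable signal that could anchor a conversation between a developer and an ethicist.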

A substantial body of empirical evidence now shows that algorithmic discrimination tends to affect already marginalized population groups, so this issue should be anticipated when designing an app or conducting an ethical analysis. But which population groups are most affected in different geographical contexts, and how can these negative effects best be mitigated? Can apps also contribute at the level of public health, health equity, and social determinants of health? What about the environmental impact of digital health in a world struggling to mitigate global warming? And how does one weigh these factors when some of them are in tension with one another? These are no easy matters, and finding a common, more accessible language to address these puzzles will be a good start on the journey. It will also be necessary to improve funding schemes and health-political or other support for app development that prioritizes ethical and justice-oriented goals. Achieving this will require the involvement of many actors and stakeholders, including academic ethicists.
