13 Ethics & Do No Harm
CONTENT
- Ethics & Do No Harm
- Consider The Following Examples
- Lyme Disease & Public Health
- Strava’s Global Heatmap
- Therac-25 Radiation Therapy Machine
Some of the material presented in this chapter will be discussed in lab 2. It is your responsibility to ensure you cover all the concepts presented both in lab and in this textbook.
Ethics & Do No Harm
As software designers, we must consider the potential harm the software we design may cause to our users, the environment, and society more broadly. We all want to believe that the things we build will only improve the lives of those around us, but good intentions do not free us from unintended consequences, or from the responsibility to ensure our designs cause no harm.
These considerations shouldn’t wait until the minimum viable product is produced. We must consider potential direct and indirect harm at every stage of design. This includes when researching to understand the problem we want to solve. We should ask ourselves:
- Whose knowledge is informing the problem? Is it the correct knowledge base to use? Is it the only knowledge base to use?
- Are we working backwards, identifying a solution in search of a problem?
- Does our solution affect the environment? Are large servers needed? What measures might mitigate these environmental concerns?
- Are we overcomplicating the solution? Could we provide our client with a simpler, more cost-effective solution?
- Is our design inclusive? Are we ensuring that groups aren’t inadvertently excluded by our design?
- Does our software protect the user’s data? Are we only collecting the data that are relevant to the needs of the software?
- Does our software’s use of a user’s data put them at risk? Does it predict something about their identity that could put them at risk?
- Are we ensuring our software doesn’t reinforce or introduce bias, discrimination, or inequality?
- If the software malfunctions or fails, might it cause harm?
- Does the software facilitate or encourage harmful behaviours?
- Are there legal obligations we need to meet?
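The data-collection questions above can be made concrete in code. The sketch below is a minimal, hypothetical example of data minimization: keep only the fields the software actually needs before storing a user record. The field names are assumptions for illustration, not from any real system.

```python
def minimize(record, allowed_fields):
    """Return a copy of `record` containing only the fields the
    software genuinely needs. Anything not explicitly allowed is
    discarded before storage, so it can never leak or be misused.
    (Field names here are hypothetical.)
    """
    return {key: value for key, value in record.items()
            if key in allowed_fields}

# A signup form might capture more than the service needs.
signup = {"email": "a@example.com", "dob": "1990-01-01",
          "location": "Guelph"}

# Store only what the software's features actually require.
stored = minimize(signup, allowed_fields={"email"})
```

Deciding which fields belong in `allowed_fields` is itself an ethical design decision; the point is that the decision is made deliberately, rather than defaulting to collecting everything.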
Consider The Following Examples
Lyme Disease & Public Health
Several years ago, the students of CIS3750 were tasked with designing a system for Wellington Dufferin Guelph Public Health. The system allowed users to learn about ticks and tick bites, and submit georeferenced photos of ticks and tick bites, with feedback provided regarding potential treatment options. Ultimately, the system was designed to track and prevent the spread of Lyme disease, which is transmitted by certain ticks. An early sign of Lyme disease is a skin rash that looks like a “bull’s eye” at the bite location.
To educate users about ticks and tick bites, it was decided that the system should include images of the “bull’s eye”. The students searched for “tick bite” on Google and found numerous examples. If you are inclined, search Google for images using the same query, and consider what you see. Yes, Google provides excellent photos illustrating what the “bull’s eye” looks like. But consider what Google doesn’t show.
If your search looks the same as what the students saw several years ago, you should notice the initial search results are images of people with white skin. At the time of writing this, the first image of a person of colour appeared far down the list of results. Further, that image didn’t show a tick bite. Instead, it depicted a young woman using bug spray.
Why is this important?
Our search results might suggest to software designers that ticks only bite white people. If that suggestion goes unchallenged, design decisions will lead to software that excludes a significant portion of the population, which could ultimately lead to worse health outcomes for people of colour. It also means that Public Health will have access to a much smaller set of georeferenced data for tracking purposes, and a lack of data can lead to less-than-optimal decisions.
Strava’s Global Heatmap
Strava, a social fitness app, released a global heatmap that visualized the activity of its users based on GPS data. While the heatmap was intended to showcase popular running and cycling routes, it inadvertently revealed the locations and movements of military personnel at secretive bases around the world. The algorithm behind the heatmap aggregated data from millions of users without sufficiently anonymizing or considering the sensitivity of certain locations. The data included routes taken by individuals on military bases, which could be pieced together to reveal patrol routes, base perimeters, and other sensitive information.
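One common mitigation for this kind of leak is a k-anonymity-style threshold: aggregate GPS points into coarse grid cells and publish a cell only if enough distinct users visited it. The sketch below is a simplified illustration of that idea, not Strava’s actual algorithm; the function name, cell size, and threshold are assumptions.

```python
from collections import defaultdict

def build_heatmap(points, min_users=5, cell_size=0.01):
    """Aggregate GPS points into grid cells, keeping only cells
    visited by at least `min_users` distinct users.

    `points` is an iterable of (user_id, lat, lon) tuples.
    Cells visited by fewer users are suppressed, so a lone runner
    circling a remote base never appears in the published map.
    """
    cells = defaultdict(set)
    for user_id, lat, lon in points:
        # Snap coordinates onto a coarse grid (~1 km at the equator
        # for cell_size=0.01 degrees).
        cell = (round(lat / cell_size), round(lon / cell_size))
        cells[cell].add(user_id)
    # Publish only cells that meet the anonymity threshold.
    return {cell: len(users) for cell, users in cells.items()
            if len(users) >= min_users}
```

Even a scheme like this has limits: choosing `min_users` and `cell_size` involves trade-offs between map usefulness and privacy, and sensitive locations may warrant exclusion outright.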
Why is this important?
This unintentional exposure compromised the security of military operations and personnel, leading to concerns about national security. Although Strava offered privacy settings, the default settings and the lack of awareness among users about how their data was being used contributed to this risk.
Therac-25 Radiation Therapy Machine
The Therac-25 was a computer-controlled radiation therapy machine used to treat cancer patients. A software design flaw led to several instances of massive overdoses of radiation, resulting in severe injuries and deaths. The software controlling the machine had concurrency issues where multiple commands could be executed in an incorrect order, leading to dangerous outcomes. The interface also did not provide clear feedback to operators, making it difficult to recognize when an error occurred.
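The flaw described above is a race condition: two fields that must change together could be observed in a half-updated state. The sketch below is a hypothetical Python illustration of the general pattern and its standard fix (a lock making the update atomic); it is not the actual Therac-25 code, and the class and field names are invented.

```python
import threading

class BeamController:
    """Toy model of the hazard: mode and safety hardware must always
    agree. Without the lock, a concurrent `fire()` could observe the
    mode changed but the attenuator flag not yet updated -- the class
    of concurrency flaw implicated in the Therac-25.
    """
    def __init__(self):
        self._lock = threading.Lock()
        self.mode = "x-ray"            # high-energy mode
        self.attenuator_in_place = True  # required for x-ray mode

    def switch_to_electron_mode(self):
        # Both fields must change together; the lock makes the
        # two assignments atomic with respect to fire().
        with self._lock:
            self.mode = "electron"
            self.attenuator_in_place = False

    def fire(self):
        with self._lock:
            # Safety invariant: never fire the high-energy beam
            # without the attenuator in place.
            if self.mode == "x-ray" and not self.attenuator_in_place:
                raise RuntimeError("unsafe configuration")
            return (self.mode, self.attenuator_in_place)
```

The lock alone is not a complete answer; safety-critical systems also need hardware interlocks and clear operator feedback, both of which the Therac-25 lacked.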
Why is this important?
At least six patients were overdosed, with several fatalities resulting from the software errors. This case is often cited in discussions of the critical importance of safety in software systems, particularly in the medical field.