From how to choose our dead to how to look after the living.

Sira Abenoza

Photo: Coronavirus nurses/John Mage

In the mid-20th century, Philippa Foot devised one of the best-known ethical thought experiments in history: the trolley problem. This is a moral dilemma in which we have to imagine there is a runaway trolley careering down a track on which there are five people tied up. If no one stops the trolley, it will kill them. The only alternative we have is to pull a lever so that the trolley switches to another track on which there is just one person tied up. The question is, what do we do?

Do we allow the trolley to continue along the track and kill five people, or do we pull the lever so that it kills only one person? For most people, the answer is immediate and obvious: utilitarianism prevails – it is better to kill one person rather than five. However, Foot herself did not agree with this solution.

For most people, the solution to the trolley problem is utilitarianism

This experiment has spawned dozens of variations. In one, a fat man stands on a bridge watching the scene; if we push him off the bridge onto the track, his body will stop the trolley. In another, the alternative track is a loop that rejoins the main track where the five people are tied up. The experiment has even inspired books such as Would You Kill the Fat Man? and engendered a 'discipline' called trolleyology.

But there is more. In 2016, a team of researchers at MIT launched the Moral Machine, a platform built on the same dilemma that has elicited responses to numerous variations of the trolley problem from millions of people around the world; this has made it possible to analyse whether the answer we give depends on religious, political, social, cultural or other factors.

The answers to the trolley problem vary according to culture and not to age, education, gender or social class

In 2018, the professors published an article in the journal Nature, in which they asserted that the answers varied according to culture and not to age, education, gender or social class. They defined several clusters with similar response profiles: the Western cluster (North America and part of Europe); the Eastern cluster (countries influenced by Confucianism and Islam); and the Southern cluster (Latin and Central America).

Through this platform, they sought to contribute to determining how self-driving vehicles and other machines that use artificial intelligence should operate, that is to say, who they should protect and who they should leave unprotected, 'kill' or 'allow to die' in the event of a possible collision, accident or other scenario.

Moral Machine
The Moral Machine has gathered responses to numerous variations of the trolley problem from millions of people around the world (Photo: Moral Machine/MIT)

Foot's dilemma, which could eventually condition how machines behave tomorrow, had its origins in war. It was inspired by the dilemmas that arose during the Second World War, when Hitler was bombing London and the British government had to decide whether it should use double agents to persuade him that his bombs were falling not on the centre of London, the seat of power, but on suburban areas. In other words, the British government had to decide who it would 'kill' or 'allow to die' and who it would save.

Clearly, these dilemmas are essentially very similar to the bioethical dilemmas currently faced by doctors on a daily or even an hourly basis. Who do I allow to die?

Ultimately, the dilemma faced by the British government and the dilemmas proposed by Foot and MIT boil down to the same thing: in a situation in which I am forced to choose between the lives of some and the lives of others, what choice do I make? Who do I save and who do I allow to die? Do I favour the youngest or the oldest? Humans or other species? Men or women? (These are questions the Moral Machine asks us if we decide to contribute to its experiment.)

The answers to ethical dilemmas are neither a game nor an experiment, but a sad reality

Right now, the answers to these questions are neither a game nor an experiment, but a sad reality, and in the case of the doctors, the questions should ideally be addressed in a committee.

In fact, these questions conceal a long chain of previous decisions which transcend them: decisions concerning the state of the public health system today; how much we have invested in it in recent years; how much attention we paid to the alarms raised by scientists once we began to hear about the cases in China, Iran, and so on. Indeed, doctors find themselves on the battlefront because, before them, many other battles were fought and lost until war broke out, to continue with the tiresome warlike terminology we have been using during the course of the virus.

A doctor during the coronavirus pandemic in April 2020 (Photo: Vinh Dav/iStock)

And if we look at the other examples, the question 'who do I allow to die?' also begs many questions that precede it. In the case of the trolley experiment, perhaps we would ask why on earth these people are tied up. How could it have happened? And in the case of the Second World War: can we manage to prevent these bombs from being launched? Can the zones under threat be evacuated before the missiles arrive?

Lawrence Kohlberg, the American psychologist well known for his theory of moral development, presented those he surveyed with the Heinz dilemma: a husband must decide whether or not to steal from a pharmacy a drug he cannot afford but which his wife needs to stay alive.

Once again, we find a deep dilemma between two evils – this is why it is a dilemma – which presents a black-and-white choice, a decision between A and B. In this case, the aim was to define the respondent's level of moral development; in the dilemmas raised by Philippa Foot and MIT, the aim is more about understanding the moral or ethical theory of the respondent.

When Carol Gilligan – an American philosopher and psychologist and a disciple of Kohlberg – began to read the results of Kohlberg's work, she gradually came to the conclusion that something didn't fit. Women were often scored at a lower level of moral development because they questioned the dilemma and its options.

Carol Gilligan discovered that women often returned a lower level of moral development because they questioned the dilemma and its options

They did not believe there was an imperative to steal or kill; they defended the idea that there had to be better solutions. This was the point of departure for Gilligan to investigate the differences between boys and girls, and she found that, in general, for educational rather than biological reasons, whereas the boys answered 'yes' or 'no', the girls would look for third options, seeking better alternatives.

Having verified this, the philosopher coined the term the 'different voice' to describe a way of looking at the world and relating to others traditionally exemplified by women, owing to the education they have received. However, Gilligan claims that in reality these characteristics can be found if we scratch beneath the surface of all human beings: the ethics of care, and a voice primarily concerned with finding the best alternatives and protecting others.

But let us return to the beginning. The British government, Philippa Foot, the millions of people who have responded to the Moral Machine, and thousands of doctors today are all asking themselves: who do I 'kill' or 'allow to die', and who do I protect or 'save'? This is, of course, not only a pertinent question, but also one which, when motivated by a real situation, must be answered, and quickly. But, as the girls who were angered by the dilemmas Gilligan put before them said: is there really no better option?

On the battlefield, there probably isn't. In other words, you have to decide 'who you allow to die' and 'who you save', however cruel this may be and however much it may anger us. But perhaps this anger should serve to make us realise, as the girls made us aware, that there must be better options. And these better options are related to not seeing or understanding ethics as a thought experiment where the answer is A or B, almost as if it were a board game, but rather as something that must help us take the best decisions beforehand, thereby avoiding the point at which we have to ask ourselves: 'who do I kill' or 'allow to die', and 'who do I save' or 'allow to live'.

All written content is licensed under a Creative Commons Attribution 4.0 International license.