Programmed To Kill

18 Apr 2019

Ethics

Ethics, in simple terms, means doing good. However, that alone still doesn’t explain what ethics is. There are many layers to what ethics is and what it does in programming. Programming-wise, ethics keeps the programmer in check; it’s essentially a set of guidelines that one must follow.

We must do good with our programs: only publish software that contributes to society and does as little harm to anyone as possible. The same goes for how we interact with others, like our bosses, co-workers, or clients; treat everyone fairly, and give extra consideration to those less privileged. The client comes first, after all.

But simply put, in most cases, it’s just common sense.

The Case

What isn’t so simple, however, is autonomous cars.

While “Programmed to Kill” may seem a bit drastic, that’s essentially what the problem is. The future of self-driving cars is upon us; in fact, there are already self-driving cars out on the road right now, Tesla being the big name people tend to think of when someone says “self-driving cars.” There’s a study at MIT, and many more underway, looking into this problem, but it’s still not something we can just brush aside as if it were nothing.

Accidents happen all the time, whether from a software failure, a hardware failure, or simple misuse of the product. That’s what we’re focusing on right now. Say the brakes don’t work, or don’t work in time, and there’s an unavoidable accident. Then what?

Someone has to die, that’s what.

If you’re a normal person, you don’t want that to happen. But in this case? It has to. So how are we going to handle it? As programmers, it’s in our hands to decide where the car goes in this unavoidable accident.

The Options

At first glance, the code of ethics (the ACM CoE) says the client comes first. So, we save the driver, case closed, right?

Wrong.

As programmers, we must do as little harm as possible. But what happens when it’s one driver against five pedestrians?

What do we do if it’s three occupants against three pedestrians? What if they’re three young people or three old?

Who decides whose life is more valuable? The boss? The programmer? The public? The client?

And if we do choose to go for the fewest deaths, what happens when it’s the driver who gets sacrificed?

Who’d want a car that would sacrifice them?
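To make the dilemma concrete, here’s a minimal sketch of what a “fewest deaths” policy could look like in code. Everything in it, the names, the numbers, the Outcome type, is a hypothetical illustration of the idea; no real autonomous-driving system decides things this simply.

```python
# Hypothetical sketch of a "fewest deaths" policy.
# All names and numbers are illustrative assumptions,
# not code from any real autonomous-driving system.
from dataclasses import dataclass

@dataclass
class Outcome:
    description: str
    expected_deaths: int
    kills_occupants: bool

def choose_outcome(outcomes: list[Outcome]) -> Outcome:
    """Pick the maneuver with the fewest expected deaths.

    Note what this policy does NOT consider: age, who is
    at fault, or whether the victims are the car's own
    occupants. Every one of those omissions is itself an
    ethical choice somebody had to program.
    """
    return min(outcomes, key=lambda o: o.expected_deaths)

# One driver against five pedestrians:
options = [
    Outcome("swerve into the barrier", expected_deaths=1, kills_occupants=True),
    Outcome("stay the course", expected_deaths=5, kills_occupants=False),
]

print(choose_outcome(options).description)  # "swerve into the barrier"
```

The min() call is the easy part. The hard part is everything the comment says is left out, which is exactly the point of the questions above.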

My Opinion

Of course, we need to minimize deaths as much as possible. Most people agree with that; it’s just when they’re the driver that they tend to change their opinion.

But the code of ethics says programmers have a duty to the public first, above any one client. While it’s not what some people want when they’re the driver, minimizing harm is what we have to do. We just have to hope that it never comes to that.