Ethics of Autonomy

03 Apr 2020

Overview:

Fully autonomous vehicles are the dream of many people. Right away I think of autonomous taxis, scheduled through an app on your phone that’s linked to your work schedule or calendar, that pick you up and drop you off. Being a science fiction fan, I also see a future in asteroid mining and in transporting those resources to wherever they need to be, for example, a base on a distant planet or the construction site of a large spaceship. Our main topic today, though, is autonomous cars, which will inevitably get into situations where they need to make difficult decisions that most people would have trouble making. The huge question on everyone’s mind is: how do we program the AI to make ethical decisions?

What is ethical programming?:

Ethics: “moral principles that govern a person’s behavior or the conducting of an activity”. With this definition in mind, and after reading the ACM Code of Ethics, I would say that ethical programming can be grouped into a few sections, mirroring the code itself: general ethical principles, professional responsibilities, professional leadership principles, and compliance with the code.

How does that apply to Autonomy?:

With autonomous cars, we can apply many of the same principles when writing our programs. If you had the chance to ride in the first autonomous taxi service, wouldn’t you feel safer knowing the creator of that vehicle follows the ACM Code of Ethics rather than some random team with sloppy coding standards? I’m sure we all know by now that clean code is easier to read and work on, making development quicker and less prone to bugs. You can see why a professional image is important.

Issues With Autonomy To Address:

It’s well known by now that cars are dangerous, so now imagine writing a program that does everything a human does while driving. There are a lot of rules to adhere to. First, the car needs to stay on the road at all times and avoid other cars and pedestrians. This is the basic package when it comes to autonomous cars. I would say Tesla is at this stage right now, because the driver still needs to stay focused on the road while the car is in autonomous mode. One of the most controversial topics with autonomous vehicles is the “lesser of two evils” problem. If you were about to crash and your only choices were to hit someone or turn directly into a brick wall, which would you choose? Which one should the AI choose for you? There are several versions of this problem, but the idea is always the same: who dies when death is the only choice?
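To make the shape of that choice concrete: once a car has a harm estimate for each possible maneuver, picking among them can be framed as a simple minimization. Here is a minimal sketch in Python, with hypothetical maneuver names and made-up harm scores. The real ethical difficulty lies in how those scores get assigned, not in the comparison itself:

```python
def choose_maneuver(options):
    """Return the maneuver with the lowest estimated harm.

    `options` maps a maneuver name to a harm estimate, where a
    higher number means a worse expected outcome. The names and
    numbers below are purely illustrative, not any real vehicle's
    actual logic.
    """
    return min(options, key=options.get)

# Hypothetical scenario: braking hard risks a minor rear-end collision,
# swerving risks hitting a wall, staying the course risks a pedestrian.
options = {
    "brake_hard": 2.0,      # low-speed rear-end collision
    "swerve_to_wall": 5.0,  # serious harm to the passenger
    "stay_course": 9.0,     # serious harm to a pedestrian
}

print(choose_maneuver(options))  # -> brake_hard
```

Notice that the code is trivial; the controversy is entirely in the weights. Whoever assigns `5.0` to the passenger and `9.0` to the pedestrian has already answered the ethical question before a single line of decision logic runs.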

Conclusion:

I am a strong supporter of autonomous vehicles because I believe they can make the roads safer for everyone. The “lesser of two evils” problem is a scenario that doesn’t happen very often, but it is an issue that needs a lot of attention.