The Ethics and Morals of Self-Driving Cars

Self-driving cars promise to make driving quicker, safer, and more convenient. But although self-driving cars can replace human beings behind the wheel, computers and artificial intelligence cannot escape the ethical and moral dilemmas that human drivers face on the roadway.

Who Gets Hit?

Ethicists, scientists, and philosophers have pondered the question of when a self-driving car should purposely be programmed to hit another car, or another person. You may ask why on earth this is even a question.

It’s because, in many cases, the driver of a vehicle, whether human or computer, must decide what to hit, or where to steer the car, in order to avoid a larger accident.

So, for example, imagine that ten pedestrians dart out in front of your vehicle, and your only option to avoid them is to veer off the road. Going off the road, however, may send your vehicle into a tree or another object, injuring the people inside it.

Do you drive off the road, reasoning that your car holds perhaps one to three people while ten pedestrians are in its path? Or do you hit the pedestrians, putting yourself and the friends or family inside your car first?

What about an animal in the road? Is it worth hitting the animal instead of driving off the road and potentially injuring the people in the car? The animal will certainly die, but the passengers in your car may only be injured. Does it matter what kind of animal it is?

Cars Need to Handle These Questions

There is, of course, no “right” answer to this dilemma. To some extent, these questions may seem like minutiae, or even silly. But they are questions that those who program self-driving cars must confront, because the cars must be programmed to react, and to make these ethical and moral decisions.

And people are no help either, because there is a contradiction between what we tend to say and how we really feel.

When asked, most people say they would want a self-driving car to make whatever decision saves the greatest number of people, or injures the fewest. But when you ask people whether they would drive a car that would automatically do that, most say no; people still want some degree of control over these kinds of decisions.

Who Is Liable?

Then there are legal questions, such as who gets sued if a self-driving car makes a decision to save one person at the expense of others. The car’s manufacturer? The owner of the car? There may well have to be some immunity for car manufacturers or the owners of self-driving cars.

Whatever happens, the ethical dilemmas raised by self-driving cars will need to be addressed, and we may all have to come to terms with the idea that computers, not people, are making ethical and moral decisions on the roadways.

Contact our Rhode Island personal injury lawyers at Robert E. Craven & Associates at 401-453-2700 today if you have been injured in any kind of car accident.

Sources:

popularmechanics.com/cars/a21492/the-self-driving-dilemma/

businessinsider.com/self-driving-cars-already-deciding-who-to-kill-2016-12
