What is life’s biggest "trap" people fall into?
Imagine you are 50 years in the future. You come to a car-sales event to purchase your very first self-driving autopilot car.
These cars promise to revolutionise road safety, with far faster reaction times and better split-second decision-making than any human driver.
However, the sales agent explains that in the extremely rare event of an unavoidable accident, the car will sacrifice you if doing so saves more lives.
Say one day, while you are driving along, an unfortunate chain of events sends the car towards a crowd of 10 people crossing the road. It cannot stop in time, so it avoids killing the 10 by steering into a wall, killing you instead.
You’re like, Fuck that, I'm out. And you leave the event without purchasing the car.
You see, this is the ethical dilemma of the self-driving car.
Researchers have polled large groups of people on various ethical scenarios involving self-driving cars, and the results are interesting. In general, people are comfortable with the idea that self-driving vehicles should be programmed to minimise the death toll.
Yet they wanted others to cruise around in utilitarian autonomous vehicles more than they wanted to buy one themselves.
People are in favor of cars that sacrifice the occupant to save other lives—as long as they don’t have to drive one themselves.
So what’s the moral of the story?
Ethics and morality are not straightforward. The biggest ‘trap’ people fall into is thinking that they are good people simply because they want to do good.
But having good intentions does not mean you are actually doing good. You lose sight of the big picture and develop tunnel vision, like looking out from the bottom of a well.
Sadly, not everyone understands this.