Remember the work BMW has been doing on autonomous vehicle technology–advancements like the BMW i3 that can park itself for you? Apparently the automaker has run into some difficulties, and not just technological ones.
As BMW has been testing autonomous (referred to as “highly automated”) driving technology in places like China and Europe, it has encountered a series of difficult situations. For instance, if a self-driving vehicle had no choice but to hit either a child running into the street or a passing car, which should it hit? How would it decide?
Those are the types of scenarios that have led BMW executives to report that self-driving or driverless technology is still “a long way off.”
Can You Program Morality in Driverless Cars?
Moral dilemmas that we face every day as drivers–and that are hard enough for us to resolve–are even more difficult to program into self-driving automobiles.
While automakers like BMW have already developed and released features that allow vehicles to independently move through stop-and-go traffic and park themselves, they’re not ready for unexpected stimuli that involve ethical choices.
“Fully-automated driving is in my view still a long way off,” said Ian Robertson, BMW’s head of sales, in an interview with Bloomberg. “The technology will be held back by the ultimate moral question on who’s responsible.”
The ultimate intent of such technology is to let drivers feel comfortable, focus on other tasks, and allow the car to do the work for them. So what happens when an accident occurs? Whose fault is it? This has also been one of the biggest open questions around insuring driverless cars.
“The technology is capable of doing these things in a much safer way than a human can,” stated Robertson. “However, an algorithm will make a decision which might not be acceptable from a cultural or societal point of view.”
Although many are claiming (or hoping) that driverless technology will be commonplace within the next ten years, that timeline now looks questionable.
News Source: Bloomberg