Self-driving cars were slated to be the next big thing just a few short years ago, but it looks like the hype has stalled out. Although self-driving navigation is still a work in progress and many are hopeful for the future of this technology, for now, it looks like the world will need to wait a little bit longer for autonomous vehicles to go mainstream.
So what went wrong? Did self-driving navigation just not live up to the promises of futurists? The fact that this technology is still being worked on means that it can’t be all bad, so what exactly is the issue? If you haven’t followed developments in autonomous navigation recently, you may be wondering how something that had so much positive press could have virtually disappeared overnight.
To help you understand a little bit about the barriers holding self-driving tech back, below are some of the most pressing issues affecting autonomous navigation today:
Judgment Calls in Question
One of the biggest questions concerning self-driving vehicles is whether people should put their lives in the hands of machines that can’t make moral judgment calls. When a human is driving and a snap decision needs to be made to protect the lives of others, the brain fields a flurry of moral questions in order to make a decision. If a squirrel darts out in front of your car while pedestrians are on the sidewalk next to you, there’s a very good chance you will make the unfortunate-but-necessary decision to hit the squirrel rather than risk the lives of the pedestrians.
A machine, on the other hand, searches through its programming to find the most efficient route to achieve its goal. If the vehicle’s goal is to protect the lives of people inside the autonomous car or to reach its destination as quickly as possible, pedestrians may become collateral damage when the vehicle must choose between staying its course and swerving to avoid something in the road. This weighty subject has been the topic of debate among ethicists, philosophers, computer scientists, lawmakers, automakers and more. As of this moment, no clear answer has been found to solve the dilemma.
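To see why this worries ethicists, consider a deliberately simplified sketch of how a cost-minimizing planner might pick a maneuver. The function, option names, and weights below are all invented for illustration; no real autonomous-driving system works this crudely. The point is that whoever sets the weights has quietly answered the moral question:

```python
# Toy illustration (NOT a real AV planner): a rule that simply
# minimizes a numeric "cost" will endanger pedestrians whenever
# the weights value occupants far more than people outside the car.

def choose_maneuver(options):
    """Return the name of the maneuver with the lowest weighted cost.

    Each option is (name, risk_to_occupants, risk_to_others, delay_seconds).
    The weights are arbitrary assumptions made up for this example.
    """
    OCCUPANT_WEIGHT = 10.0  # assumed: passengers' safety weighted heavily
    OTHERS_WEIGHT = 1.0     # assumed: people outside the car weighted lightly
    DELAY_WEIGHT = 0.1      # assumed: small penalty for losing time

    def total_cost(option):
        _, occupants, others, delay = option
        return (OCCUPANT_WEIGHT * occupants
                + OTHERS_WEIGHT * others
                + DELAY_WEIGHT * delay)

    return min(options, key=total_cost)[0]


options = [
    ("brake_hard", 2.0, 0.0, 5.0),          # slight risk to occupants, no one else harmed
    ("swerve_to_sidewalk", 0.0, 8.0, 0.0),  # occupants safe, pedestrians endangered
]
print(choose_maneuver(options))  # prints "swerve_to_sidewalk"
```

Because the occupant weight dominates, this toy planner swerves toward the sidewalk. Flip the weights and it brakes instead; either way, the "ethics" live entirely in numbers someone had to choose in advance.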
In keeping with the above, the legal status of an autonomous vehicle’s actions is up for debate as well. If a self-driving car navigates into a crowd of people in an attempt to take a shortcut through a road closed for a parade, who gets the blame?
Is the automaker liable, or is the programmer who designed the code used by the vehicle to blame? Or, if someone was in the vehicle when the accident occurred, can they be held responsible for not stopping the vehicle? These questions and many others have yet to be debated in court, meaning people considering autonomous vehicle purchases may find themselves at the center of controversy should an accident occur. As a result, autonomous navigation development has slowed.
Difficulty Navigating Special Terrain
Self-driving navigation has already been proven to work on a variety of terrain, but the technology only really performs well when the ground is clear. In snowy conditions, for instance, self-driving cars struggle to identify lanes and objects, and snow cover also interferes with their depth perception. This can lead to accidents and injuries if the owner of such a vehicle lives in a snowy climate.
Precipitation isn’t the only winter threat to self-driving cars, either. Most autonomous vehicles are electric, meaning they rely on battery power to start and get moving. In extremely cold climates, batteries can drain quickly, even when the vehicle is not in motion. This can make an autonomous electric vehicle difficult to start, and it may also greatly reduce the vehicle’s range.
The Future of Self-Driving Vehicles
Although the hype surrounding autonomous navigation technology has slowed, it hasn’t gone away entirely. As experts continue to look for solutions to some very challenging problems, the world will just have to wait and see how everything plays out in the end.