More and more driverless cars are hitting the road, as companies continue to develop and test prototypes. Sometimes, however, the drive hasn’t been smooth. According to a recent report by the Telegraph, one leader in the driverless car industry has admitted to a complication with an earlier form of autonomous vehicle that could have led to dangerous accidents, forcing the company to move in a different direction. These developments go to show that drivers should be cautious around this new technology.
What Is an Autonomous Vehicle?
An autonomous vehicle is a car that can navigate the road by itself to some degree. Guided by an onboard computer, such a car takes in information about its surroundings and uses that information to make decisions about how to drive.
There are varying levels of autonomy for these vehicles. Some, known as partially autonomous vehicles, require a human driver to make some decisions, and others, known as fully autonomous vehicles, need no human input whatsoever. Partially autonomous vehicles may have seemed safer at first, but as one company found, they may be even more dangerous than fully autonomous vehicles.
Self-Driving Car Trouble at Google
With an entire company, Waymo, devoted to autonomous vehicles, tech giant Google is one of the leaders in driverless car development. But according to the Telegraph, the company’s experiments hit a rocky patch in the road in 2013. At the time, Google was testing partially automated vehicles. The drivers were mostly in charge, with the cars “assisting” them in some driving functions.
As it turned out, many human drivers were unwilling to do their share of the work. Believing that the car had things under control, some drivers became distracted by other tasks, such as doing their makeup or even taking a nap. These habits left them dangerously unprepared when the partially autonomous cars needed them to take over.
When Google discovered that its test drivers were engaging in such dangerous behavior behind the wheel, it decided it had no choice but to pull the plug on its partially autonomous vehicle program and move toward fully driverless cars.
Problems With Fully Autonomous Vehicles
Even with driver error taken out of the equation, fully autonomous vehicles are not perfectly safe, either. Aside from the potential for malfunctions, driverless cars can ironically pose a risk to the drivers around them by being too cautious. The computers that run these cars follow traffic laws to the letter and never exceed the speed limit. They also take extreme care in basic maneuvers such as slowing down before a stoplight or making a left turn. Sometimes, human drivers collide with these cars after incorrectly assuming that they will behave like a human driver would.
What to Do If You Are in a Crash With a Driverless Car
Whether partially or fully automated, autonomous vehicles are not yet a perfect technology, but they may not be far from becoming a part of everyday life. It’s not out of the question that you could end up in an accident with a driverless car. If you find yourself in this situation, alert the authorities and seek medical attention, just as you would after any crash. If the driverless car was at fault, you may be able to secure compensation from the company operating it through a personal injury suit.
If you or someone you love was hurt in any type of traffic accident, contact the St. Louis car accident lawyers of the Bruning Law Firm today by phone or online for more information about how we can help you.