The future is now: driverless cars are closer than ever to becoming a reality for the American driver. Once merely the stuff of science fiction, driverless cars will redefine what it means to “drive.” Imagine: those who cannot drive themselves could enjoy greater freedom and mobility. Commuters could regain time once lost sitting in traffic. Countless accidents could be avoided simply by sitting back and letting the vehicle handle everything.
At the forefront of the race for self-driving technology are Google, Apple, and Uber, though the latter two are keeping their plans largely under wraps until they are ready to unveil their prototypes. Google is currently testing prototypes of “Level 3” vehicles in California and Texas; at this level, a human must be able to take over from the automated steering system, with a comfortable transition time. Level 4 vehicles will be designed to be fully autonomous. So far, four U.S. states (California, Nevada, Michigan, and Florida) and Washington, D.C. have passed legislation concerning driverless cars.
How do they work? Self-driving vehicles navigate city streets and highways using a combination of sensors, GPS tracking, radar, and special software that allows them to recognize and react to traffic signals and signs. Unlike human drivers, cars that drive themselves eliminate factors like distraction, aggression, and intoxication.
The legality of these vehicles, however, remains ambiguous, as questions about what constitutes a “driver,” along with various other ethical considerations, are still up in the air. For instance, who can be blamed when a self-driving car causes an accident? Google’s autonomous vehicles have already been involved in 13 “minor” accidents this year, one of which occurred while a self-driving vehicle was in manual steering mode, so this question is an important one when it comes to liability. Because there are currently no testing regulations in place for autonomous vehicles, the details of how and why these accidents occurred are still not fully understood.
South Carolinians Shape & Respond to Autonomous Vehicle Technology
Here’s what South Carolina has been up to on the topic of driverless cars:
- Just this summer, the city of Greenville, SC approved financial backing for engineering students at Bob Jones University to create a prototype self-driving golf cart to be used as a campus shuttle.
- A University of South Carolina student addressed this topic in an article titled “Sue My Car Not Me: Products Liability and Accidents Involving Autonomous Vehicles.” The student points out that no technology is perfect and that accidents are inevitable, but he adds that forcing people to accept liability when the car itself malfunctions defeats the very purpose of the technology.
Who Can Be Blamed When There Are No Hands on the Wheel?
Pioneers of driverless vehicles are fond of touting the statistic that 94 percent of accidents in the United States involve human error, because their vehicles promise to save countless lives by eliminating the human element of car accidents. The concern, however, is that self-driving cars will lull users into a false sense of security that nothing can ever go wrong, when in fact, much like autopilot on an aircraft, safe use depends to some degree on a human’s knowledge and monitoring of the car’s systems.
Even the most technologically advanced self-driving vehicles can malfunction, and if a user does not know which systems are or are not engaged, or how to override the vehicle, the potential for accidents remains.