
Google’s Driverless Cars, Crashes and Liability

March 11, 2013 by Steven M. Gursten

What if cars could drive themselves? It’s the stuff of science fiction. Except that it’s likely to become a reality within the next couple of years. Here is Eric Schmidt, former executive chairman of Google, interviewed by Daniel Franklin, executive editor of The Economist and editor-in-chief of “The World In…” publication, at The Economist’s World in 2013 Gala dinner on December 6, 2012, talking about his experience with driverless cars:

According to Schmidt:

“If you know someone who has lost someone in a traffic accident in America, you understand why we’re doing this. 30,000 people — we don’t even cover it in the news anymore, it’s so common. And by the way, that’s a record-low death rate given miles driven. We’re doing better at 30,000, which is ten times more people killed than in 9/11, every year. This is terrible.”

Driverless cars, talking cars — technology sure is amazing. But as a lawyer who sees the consequences of car accidents every day, I can’t help but wonder whether this technology will really take a huge bite out of traffic fatalities.

Of course, I certainly hope that it can and does. However, no technology is completely error-proof. And so, I wonder what the legal consequences of driverless car accidents might be. And I’m not alone. Over at The Volokh Conspiracy, Kenneth Anderson contemplates the legal infrastructure for driverless cars (citing Bryant Walker Smith’s Slate article):

Today’s self-driving systems, however, are “intended to work with existing technologies.” They use sensors and computers to act as individual vehicles responding to the environment around them individually, without having to be a cog in the larger machine. This means that they can adapt to the existing infrastructures rather than requiring that they all be replaced as a whole system. Smith’s real point, however, is to go on from physical infrastructure to include the rules of the road. Infrastructure also includes, he says,

“laws that govern motor vehicles: driver licensing requirements, rules of the road, and principles of product liability, to name but a few. One major question remains, though. Will tomorrow’s cars and trucks have to adapt to today’s legal infrastructure, or will that infrastructure adapt to them?”

Smith takes up the most basic of these questions – are self-driving vehicles legal in the US? They probably can be, he says – and he should know, as the author of a Stanford CIS White Paper that is the leading analysis of the topic. Self-driving vehicles

“must have drivers, and drivers must be able to control their vehicles—these are international requirements that date back to 1926, when horses and cattle were far more likely to be “driverless” than cars. Regardless, these rules, and many others that assume a human presence, do not necessarily prohibit vehicles from steering, braking, and accelerating by themselves. Indeed, three states—Nevada, Florida, and most recently California—have passed laws to make that conclusion explicit, at least to a point.

Still unclear, even with these early adopters, is the precise responsibility of the human user, assuming one exists. Must the “drivers” remain vigilant, their hands on the wheel and their eyes on the road? If not, what are they allowed to do inside, or outside, the vehicle? Under Nevada law, the person who tells a self-driving vehicle to drive becomes its driver. Unlike the driver of an ordinary vehicle, that person may send text messages. However, they may not “drive” drunk—even if sitting in a bar while the car is self-parking. Broadening the practical and economic appeal of self-driving vehicles may require releasing their human users from many of the current legal duties of driving.

For now, however, the appropriate role of a self-driving vehicle’s human operator is not merely a legal question; it is also a technical one. At least at normal speeds, early generations of such vehicles are likely to be joint human-computer systems; the computer may be able to direct the vehicle on certain kinds of roads in certain kinds of traffic and weather, but its human partner may need to be ready to take over in some situations, such as unexpected road works. A great deal of research will be done on how these transitions should be managed. Consider, for example, how much time you would need to stop reading this article, look up at the road, figure out where you are and resume steering and braking. And consider how far your car would travel in that time. (Note: Do not attempt this while driving your own car.)

Technical questions like this mean it will be a while before your children are delivered to school by taxis automatically dispatched and driven by computers, or your latest online purchases arrive in a driverless delivery truck. That also means we have time to figure out some of the truly futuristic legal questions: How do you ticket a robot? Who should pay? And can it play (or drive) by different rules of the road?”

And at Discover Magazine, Veronique Greenwood considers who should take legal responsibility for the accidents (citing a Popular Science article):

When a company sells a car that truly drives itself, the responsibility will fall on its maker. “It’s accepted in our world that there will be a shift,” says Bryant Walker Smith, a legal fellow at Stanford University’s law school and engineering school who studies autonomous-vehicle law. “If there’s not a driver, there can’t be driver negligence. The result is a greater share of liability moving to manufacturers.”

The liability issues will make the adoption of the technology difficult, perhaps even impossible. In the 1970s, auto manufacturers hesitated over implementing airbags because of the threat of lawsuits in cases where someone might be injured in spite of the new technology. Over the years, airbags have been endlessly refined. They now account for a variety of passenger sizes and weights and come with detailed warnings about their dangers and limitations. Taking responsibility for every aspect of a moving vehicle, however—from what it sees to what it does—is far more complicated. It could be too much liability for any company to take on.

I don’t doubt that driverless cars will revolutionize American roads. I like to believe that these cars will greatly reduce traffic accidents and fatalities.

However, I don’t think anyone can really predict how such a fundamental change to the operation of motor vehicles will impact liability in crashes. As an experienced lawyer, my guess is that there will be a lot of finger pointing. But how different is that from our current state of affairs? Not very.

Who do you think should be liable for a driverless car accident?


