Who is Responsible when a Driverless Car Crashes?
Federal investigators are probing a fatal crash in Florida involving a Tesla car that was in Autopilot mode. This was believed to be the first fatality involving a self-driving car.
The accident happened when a tractor-trailer made a left turn in front of the Tesla. According to the National Highway Traffic Safety Administration (NHTSA), neither Autopilot nor the driver noticed the white side of the tractor-trailer against a brightly lit sky, so the brake was not applied.
Tesla's cars use "artificial vision" technology: cameras sense when the car is at risk of rear-ending another vehicle and can trigger an emergency braking system.
Tesla claimed that its cars had driven more than 130 million miles in Autopilot mode without a fatal accident. The Autopilot feature is built into 70,000 Tesla vehicles made since 2014.
Unsafe at Any Speed?
A 2015 study published by the University of Michigan's Transportation Research Institute found that, per million miles driven, self-driving cars had a higher crash rate than traditional cars.
Even so, a study by the Pew Research Center suggested that almost half of Americans would be willing to ride in a driverless car.
When a driverless car gets in an accident, causes property damage or injuries, or breaks the law (for example, by speeding), who's responsible?
There are various candidates:
- The manufacturer of the car
- The owner of the car, who decided to operate it in driverless mode and failed to take over in time
- The maker of the software, sensors, or other systems that went into the car
Seventeen US states and the District of Columbia have considered laws governing driverless vehicles. Stanford's Center for Internet and Society has set up a wiki tracking other legislative and regulatory developments on the issue.
In Florida and DC, for example, a car manufacturer's liability is limited when an accident or injury occurs after the car has been made self-driving by after-market parts. The party that installed those parts would then be responsible.
In Florida, California, and Nevada, the "operator" of a driverless vehicle is defined as the person who engages the technology. Thus, if a driverless car heads out without an occupant to pick up a relative at the airport (or a pizza), the person who sent it on its way would be responsible for whatever trouble it gets into.
Limiting Driverless Vehicles
In 2015, the California Department of Motor Vehicles published a draft of regulations that would require all driverless cars to still have steering wheels and brake pedals and to have at least one human occupant with a valid driver's license.
At public workshops on the regulations, various groups -- including disability advocates -- opposed them.
More recently, a California Assemblywoman introduced a bill that would legalize cars without human drivers. GoMentum Station, a testing ground for self-driving vehicles, is located in her district, and she wants the cars to be tested on public roads there.
Since making (and testing) self-driving cars benefits the local economy, states may end up competing to create the most "driverless friendly" laws.