Who Should Be Responsible for Driverless Car Accidents?

Driverless cars might be the wave of the future, but self-driving carmakers must overcome public skepticism before it will be commonplace to see vehicles on the highway that have no human in the driver’s seat. Pilot projects claim that driverless cars have a good safety record — better on a per-mile basis than that of cars with drivers — but road tests are usually conducted under optimal conditions, often in secret, making it difficult to evaluate their true performance. How well driverless cars will function day-to-day in a variety of driving environments is not yet clear.

Commercial vehicles, including heavy trucks, may be the first driverless vehicles to log regular miles on the road. People who own cars typically like to drive them, while companies that own trucks might be happy to save labor costs by firing their drivers. With no need to compensate for driver fatigue, a driverless truck can deliver its cargo more quickly by operating 24/7. That’s bad news for truck drivers, who are likely to lobby against driverless vehicles. Other commercial drivers may be in the same boat, given Lyft’s optimistic announcement that it hopes to field a fleet of driverless taxis before the end of the year.

Changing Laws to Meet Changing Technology

Google, the current leader in developing self-driving car technology, points out that nearly all of the 35,000 American traffic accident fatalities each year are caused by driver error. Cars that drive themselves, that obey traffic laws, and that manage to avoid bumping into each other have the potential to save lives. Realizing that potential, however, is no easy task.

Designers of driverless cars face enormous challenges. Sensors (like drivers) have trouble distinguishing the color of traffic lights in the glare of bright sunlight. Sensors also have difficulty interpreting ambiguous data (is the dark patch on the road a hole, an oil spill, or a shadow?). Snow obscures the centerlines that driverless cars follow, while a heavy rain might cause sensors to read false data.

The “judgment” exercised by software may also be a poor substitute for the experience of a seasoned driver. For example, a self-driving Google car hit the side of a bus because the car expected the bus to yield. Drivers are more likely to understand the unwritten rules of the road, including the rule that smaller vehicles yield to bigger vehicles. (Google announced that it was adding “buses are less likely to yield” to its programming.) If an accident is unavoidable, drivers might also make better decisions than computers as to where the crash should occur. For example, given a choice, it’s better to hit an empty car than one that is occupied.

Assuming those problems can be overcome, some states may need to modify their laws before driverless cars are allowed on the road. New York law, for example, requires a driver to have at least one hand on the wheel at all times. That law defeats the purpose of a driverless car. Whether driverless cars will even have steering wheels is an open question.

On the other hand, state legislatures might want drivers to be in a position to take control of a driverless car if something goes wrong. Laws to that effect might displease the commercial delivery or taxi services that want to make vehicles truly driverless. And 50 different laws in 50 different states might make it impossible for companies to design a single driverless car that can be operated everywhere.

Liability for Self-Driving Car Accidents

When a driver fails to see a toddler in the street, the driver’s negligence is clear. Who is to blame when a driverless car strikes a pedestrian or another car? Should the car owner be held responsible when the owner was not driving? The manufacturer of the car (or of its software) is probably to blame, but being forced to sue a giant like Ford Motors every time a driverless car is involved in an accident — a lawsuit that may well depend on experts to prove that the car was at fault due to a design or manufacturing flaw — would make it much more difficult and costly to process routine accident claims.

The first fatality involving a driverless car occurred May 7, 2016 in Williston, Florida. According to news reports, the “car's cameras failed to distinguish the white side of a turning tractor-trailer from a brightly lit sky and didn't automatically activate the brakes.” The car’s owner was in the driver’s seat, but he failed to react to his car’s blunder in time to prevent a crash.

Who is responsible for the fatal crash? Perhaps the driver shared responsibility for failing to control the car. The automaker was at least partially responsible for failing to build a system that could tell the difference between sunlight and a truck. A lawyer at a recent conference on law and technology suggested that software developers, hackers, security providers, and data storage providers might all be potential defendants in lawsuits involving self-driving cars.

Another lawyer at the conference imagined that self-driving cars might be so safe that they would put auto insurance companies out of business. An alternative view, however, is that auto insurers will simply need to rewrite their coverage. Perhaps state laws should hold owners of self-driving cars responsible for (and require them to insure against) accidents that the self-driving car should have avoided. Laws of that nature would make it easier for injury victims to recover damages without forcing them to bring expensive products liability lawsuits against car manufacturers for minor accidents.


(Photo Credit: "Google Self-Driving Car" by smoothgroover22 is licensed under CC BY-SA 2.0.)