
Vol. 7, Iss. 4
May 9, 2018


Driverless Vehicles: This Is The Point I Was Making

In the last issue of Coverage Opinions I discussed certain liability and insurance issues associated with driverless cars. My main point was this:

No matter how much safer driverless cars turn out to be (let’s assume they are), they will not be foolproof. They can’t be. So, given the gargantuan number of auto accidents that take place in this country, there will still be a significant number of accidents, even when driverless cars are involved.

Right now, when there is an auto accident, it is rare to see the automobile manufacturer named as a defendant. Auto accidents are generally matters between the involved drivers. But when a self-driving automobile is involved, drivers will no longer be fighting over which one had the red light, but whose car is to blame.

In other words, if the car involved in the accident was designed not to have accidents, it is easy to see the automobile manufacturer, and the companies that made the component parts for the self-driving aspect, being named as responsible parties in lawsuits for countless automobile accidents. Accidents that are now simple, and quickly resolved, will become complex, drawn-out, technological fights between drivers and manufacturers over who’s to blame. Car crashes will go from one of the law’s simpler problems to resolve to complex product liability litigation.

On account of this shift in focus, the financial exposure for automobile manufacturers, and the companies that made the relevant component parts -- or their insurers -- will be astronomical. Even if the manufacturers ultimately win the suits, and establish that their cars were not at fault, they will likely be looking at seven-figure legal fees in many cases. Congratulations GM, you won the case….

Well, what do you know… This precise scenario is about to play out in California. In late March a driver was killed in Mountain View, California, when the Tesla he was driving crashed into a concrete highway median.

According to Tesla, “in the moments before the collision … Autopilot was engaged with the adaptive cruise control follow-distance set to minimum.” Tesla continued: “The driver had received several visual and one audible hands-on warning earlier in the drive and the driver’s hands were not detected on the wheel for six seconds prior to the collision.” Tesla has also stated that owners have driven the same stretch of highway with autopilot engaged “roughly 85,000 times... and there has never been an accident that we know of.”

The family of the driver killed retained legal counsel to explore its options. The law firm stated: “(Our) preliminary review indicates that the navigation system of the Tesla may have misread the lane lines on the roadway, failed to detect the concrete median, failed to brake the car, and drove the car into the median.”

So it sounds like Tesla, at least for now, sees this as a driver-error situation and the family of the driver is looking to blame Tesla’s technology.

In non-autonomous vehicle situations, when people hit highway medians in single-car accidents, they usually do not bring lawsuits. Who’s there to sue? [It could be a UM phantom vehicle insurance claim, but that’s a different issue.] But here, because the accident took place in an autonomous vehicle, what would very likely have been a no-lawsuit situation could now become one that takes years to resolve and costs Tesla (and/or its insurers) millions of dollars in defense costs. Ultimately Tesla may prove that its vehicle was not to blame and the accident was caused by driver error.

If so, congratulations Tesla, you won the case….

Copyright Randy Maniloff. All Rights Reserved.