In the last few weeks we have read several news reports about self-driving car accidents. Tesla and Uber, two companies leading innovation in driverless automobiles, have recently experienced fatal collisions that have hampered their autonomous testing. These are not the first fatal crashes involving self-piloting systems, but the collisions happened in such a close time frame that the public had to take notice.

On March 18, an Uber autonomous vehicle (AV) was involved in a fatal crash with a pedestrian. A Phoenix Business Journal article describes video of the incident as follows:

“The video shows the victim Elaine Herzberg walking her bike in the middle of the road. It does not show the actual collision “due to the graphic nature of the impact,” said Det. Liliana Duran in an email. The video also shows an interior view of the driver looking down at something off and on, possibly a phone or computer screen, before looking up in surprise right before the car hits the woman.”

Due to the graphic nature of the video, we have decided not to share it here. There appear to be elements of distracted driving involved in this crash. Human error seems to have combined with a failure of the autonomous (self-piloting) system to identify the pedestrian and brake or take evasive action to avoid the collision.

About five days after the Uber crash, Tesla experienced a similar incident while its Autopilot system was engaged. Engadget reported on the accident, explaining:

“The driver of a Model X has died after his electric SUV collided with a median barrier on Highway 101 in Mountain View and was subsequently struck by two other vehicles. The incident destroyed the front half of the vehicle and sparked a fire that involved the battery, leading to Tesla sending an employee to investigate. Witnesses reported seeing a fireball during the crash.”

In a follow-up article today, Engadget went on to state that the NTSB is unhappy that Tesla shared information about the accident. Tesla CEO Elon Musk went ahead and blogged that Autopilot was engaged but that the driver had removed his hands from the steering wheel for the six seconds prior to impact. The NTSB says Tesla has been cooperative in all previous accident investigations, but evidently it did not want this information made public. Also, it seems the deceased driver had concerns about the Autopilot system, according to his family.

The family claims “he had brought concerns to a Tesla dealership that his Model X had previously swerved toward the same median where the accident happened.”

What gets investigated when autopilot fails?

Readers may think that some elaborate investigation needs to take place since we are dealing with driverless automobiles. The truth is, this boils down to an automotive/vehicular accident reconstruction issue.

Certainly, there is advanced programming involved, and the crash data retrieval (CDR) may require new methods or new technologies to access the information, but the data must be recovered nonetheless.

The NTSB even states, “At this time the NTSB needs the assistance of Tesla to decode the data the vehicle recorded.” The agency probably requires help in accessing data from Tesla’s proprietary system, but it is still a matter of CDR. If Elon Musk knows that the driver removed his hands from the wheel for six seconds prior to impact, he must have learned of this through the data retrieval process used by Tesla.
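To give a sense of the kind of analysis involved, here is a minimal, purely illustrative sketch of how an investigator might query retrieved telemetry for hands-on-wheel activity in the final seconds before impact. The file name, column names, and timing values are assumptions made for the example; they are not Tesla’s actual CDR format.

```python
# Illustrative sketch only: parse a hypothetical telemetry export and check
# whether any hands-on-wheel signal appears in the six seconds before impact.
# The file name, column names, and timing values are assumptions for the
# example, not Tesla's actual CDR format.
import csv

IMPACT_TIME_S = 3600.0   # hypothetical timestamp of impact within the log
WINDOW_S = 6.0           # the six seconds prior to impact


def hands_off_during_window(path: str) -> bool:
    """Return True if no hands-on-wheel signal is logged in the final window."""
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            t = float(row["timestamp_s"])
            if IMPACT_TIME_S - WINDOW_S <= t <= IMPACT_TIME_S:
                if row["hands_on_wheel"] == "1":
                    return False
    return True


if __name__ == "__main__":
    print(hands_off_during_window("retrieved_telemetry.csv"))
```

A query along these lines, run against the recovered data, is the sort of thing that could support a statement about where the driver’s hands were in the moments before the crash.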

The same is true in the Uber crash. Investigators already have dash-cam footage showing that the vehicle did not slow before striking the pedestrian. In that instance, an accident reconstructionist, automotive engineer, or automotive software engineer will have to analyze the self-driving sensors, the recorded data, and the software’s response to determine why the car failed to react while in autonomous mode.
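Again for illustration only, the sketch below shows the kind of cross-check a reconstructionist might run: did any logged pedestrian detection go unanswered by a brake command within a reaction window? The data classes, field names, and two-second window are invented for the example and do not reflect Uber’s actual logging format.

```python
# Illustrative sketch only: cross-check whether logged pedestrian detections
# were followed by a brake command within a reaction window. The data classes,
# field names, and two-second window are invented for this example, not Uber's
# actual logging format.
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class Detection:
    t: float      # time of detection, seconds into the log
    label: str    # e.g. "pedestrian", "bicycle", "unknown"


@dataclass
class BrakeCommand:
    t: float            # time the command was issued, seconds into the log
    decel_mps2: float   # requested deceleration


def first_unanswered_detection(detections: List[Detection],
                               brakes: List[BrakeCommand],
                               react_window_s: float = 2.0) -> Optional[Detection]:
    """Return the first pedestrian detection with no brake command in the window."""
    for d in detections:
        if d.label != "pedestrian":
            continue
        answered = any(d.t <= b.t <= d.t + react_window_s and b.decel_mps2 > 0
                       for b in brakes)
        if not answered:
            return d
    return None
```

Whether the failure was in perception (no detection logged) or in the control response (a detection logged but no braking commanded) points the failure analysis in very different directions.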

Both of these accidents require failure analysis. What seems new to us as a society is that these crashes involved a failure of software rather than of brakes, tires, steering columns, or seat belts (failures that have become familiar and often result in a recall to fix the defect).

The technology and collection methods may change, but the theories of liability and the investigation remain fairly constant. We have two automobile crashes resulting in death. They require a thorough accident reconstruction investigation to determine the cause of each accident. Once the causes are determined, matters of negligence, product liability, and fault still apply.

 

Posted by nickrishwain
