Waymo Van Involved in Accident in Chandler, Arizona

May 16, 2018
Edward Smith

I’m Ed Smith, a Sacramento self-driving car accident lawyer. A recent accident in Chandler, Arizona, between a Waymo autonomous vehicle and a Honda sedan resulted in injury to the human operator of the self-driving car. Details were sketchy at first, but police later confirmed that the Waymo vehicle was in manual mode when the collision occurred. Reportedly, the Honda swerved to avoid a collision with another vehicle and ended up striking the driverless car.

Accident Details

A driver in a silver Honda was approaching the intersection of Los Feliz Drive and Chandler Boulevard when the traffic light turned red. A second vehicle was proceeding north through the intersection on a green signal when the Honda reportedly ran the red light at about 40 miles per hour, according to police reports. The Honda swerved to avoid hitting that vehicle, crossed the median and entered the westbound lanes of Chandler Boulevard while still traveling east. A third vehicle, a Waymo Chrysler Pacifica minivan, was already in the westbound lane and slowing as it approached the intersection. The Honda collided with the front end of the van.

Waymo Vehicle Not at Fault in the Accident

Police in Chandler said the Waymo driver was not at fault in the crash. The driver was in control of the self-driving van at the time it was hit by the Honda. The driver of the Honda was charged with running a red light.

Accidents Involving Autonomous Cars

California maintains a database of all accidents that involve autonomous vehicles. Arizona does not have a mandatory database, but in both of the state’s recent autonomous car accidents, the vehicles were in manual driving mode, meaning a human driver was in charge. Of all the crashes recorded in California between autonomous cars and other vehicles, the self-driving unit was responsible in only one instance, and in that case the vehicle was operating in manual mode while collecting data.

Difficult Decisions

One question that has surfaced is whether purely autonomous vehicles can make difficult, split-second decisions. It depends on the decision. Autonomous vehicles undergo rigorous testing and learning protocols, and millions of miles of driving are used to train the vehicle’s computer “brain” to determine the appropriate response for any given situation. However, an autonomous vehicle is designed to protect its occupants, and its decisions are based on that priority. Ethical decisions could be preprogrammed by the manufacturer.

How Learning Works for Autonomous Vehicles

The systems designed to ensure safety in autonomous vehicles are held to a standard well beyond ordinary computer accuracy. The cars are taught to make the right decision regardless of weather conditions, road surface or visibility. To do that, every possible driving scenario is introduced in an effort to teach the appropriate decision-making process, which involves petabytes of data. The vehicle’s “brain” must learn from new experiences without forgetting past ones. This is a massive undertaking, yet the technology is approaching a level where its ability exceeds that of the human driver.

Legal Issues

Once autonomous vehicles are operator-free, liability for an accident in which the driverless car is at fault will rest with the manufacturer as an auto products liability case. For now, the more pertinent question is who is liable when a human safety operator causes the accident. If the human operator is negligent and operates the vehicle in a way a reasonable person would not, the operator may be held liable. However, if a company such as Uber or Lyft trained the human drivers who operate the vehicle, it may also be held responsible for inadequate training and supervision. In addition, since Uber and Lyft both pick up passengers for transit to a destination, they might face elements of common carrier law, similar to taxis and buses.

Are Human Drivers More Capable?

The answer differs from driver to driver. Given the accident statistics, the overall human safety record is not excellent. Many traffic accidents are caused by drunk, drowsy or distracted drivers, something that will not happen with autonomous vehicles.

Ethical Issues

Autonomous vehicles act in the way they were trained to act, and ethical issues may complicate the outcome in some cases. For instance, suppose an autonomous vehicle is traveling down a narrow roadway with drop-offs on either side. If a child runs into the road, the vehicle must decide whether to avoid the child and plunge over the drop-off or hit the child. This is a difficult decision even for a human to make, and at this point there is no guarantee as to what an autonomous vehicle will do.

Sacramento Self-Driving Car Accident Lawyer

I’m Ed Smith, a Sacramento self-driving car accident lawyer. If you have been involved in an accident with a self-driving vehicle, you need the help of an attorney who has experience in this area. Call me at (916) 921-6400 or (800) 404-5400 for friendly and free advice. Reach out to me online if it is more convenient for you.

Since 1982, I’ve helped many residents of Sacramento and the surrounding areas obtain compensation for wrongful death claims and accidents resulting in traumatic brain injury.

I am a California member of the Million Dollar Advocates Forum, a group of trial attorneys from around the country who have obtained verdicts or settlements of $1 million or more for their clients.

Photograph Credit: https://pixabay.com/en/technology-police-car-roof-2500010/
