The back-up driver of a self-driving Volvo operated by Uber has been charged with negligent homicide after the vehicle hit and killed a pedestrian while the driver streamed an episode of the television talent show The Voice.
The victim was Elaine Herzberg, a 49-year-old woman who was wheeling her bicycle across a road in Tempe, Arizona, when she was struck by the vehicle, which was travelling at 39mph (63km/h). The back-up driver, Rafaela Vasquez, has pleaded not guilty to the charge and has been released awaiting trial.
The incident was the first recorded pedestrian death involving a self-driving car, and it led Uber to halt its trial of self-driving cars in Arizona. Given the severity of the incident, it was thoroughly investigated by the US National Transportation Safety Board (NTSB), which concluded that it had occurred primarily due to human error. It has also been reported that Uber itself will not face criminal charges, prosecutors having earlier decided that there was little basis for criminal liability on the company's part.
In this particular self-driving car, Ms. Vasquez remained in the driver's seat and so was in theory able to retake control at any time, including in an emergency. Dash-cam footage showed that at the moment of impact she was not looking at the road, and records from the streaming service Hulu showed that she had been streaming a television show at the corresponding time, providing clear evidence that she was distracted at the time of the crash. On these grounds, the police concluded that the accident could easily have been prevented had she simply been paying attention to the road.
Ms. Vasquez was initially charged with the offence on 27 August and appeared in court on 15 September. As she pleaded not guilty, a full trial will now take place in February of next year. Uber, however, has not been left entirely blameless: much criticism has been directed at the company, including reports from individuals inside the organization who claimed that warnings over the safety of the vehicles had been ignored.
The public has been reassured that self-driving cars are programmed to be ‘over-cautious’ and to spot potential threats ahead of time, yet in this case a person crossing the road in an entirely ordinary way was not recognized by the car as a collision hazard. Children, adults and animals dart into the road at unexpected moments, and part of being a good, competent driver is staying alert to these surprises and reacting quickly to avoid harm. Ms. Vasquez was clearly not watching the road as she would have been in an ordinary car, but was she lulled into a false sense of security by the firm about just how safe the car actually was? Was she given enough training, and was she warned of the need to remain fully vigilant at all times? Many of these questions will no doubt be addressed at trial, but the case raises an important question about where responsibility for any accident or damage should lie: with the self-driving technology or with the back-up driver?
Clearly this incident has sent shock waves through the self-driving community, and it does not appear to be regarded as a random, isolated event. There have reportedly been a whole host of collisions involving Uber’s self-driving cars that had already raised concerns among personnel internally, and competitors such as Waymo and Baidu have pushed back their trials and planned roll-outs. In Baidu’s case, it appears that plans for a full rollout of robo-taxis will now be delayed until 2025.
Whilst the benefits that self-driving vehicles could bring to our daily lives are clear, the reality is that until they can be proven ‘totally’ safe, it simply isn’t possible to roll them out on a large scale with confidence. Interestingly, I think the case also highlights a continuing challenge for AI and robotic technology: imitating human thought, intuition and the ability to respond at lightning speed to potential threats. Our natural fight-or-flight system is what allows us to step out of the way of an unexpected car before our brain has even processed the threat, or to swerve the second we see a potential hazard. Until a machine can react to unseen dangers in the same way a human driver can, a fleet of truly self-driving cars remains a pipe dream.