After autonomous vehicle death, vulnerable road users deserve answers
Earlier this week an autonomous vehicle in the U.S. state of Arizona collided with a pedestrian who was walking her bike across the road. The woman died a short time later — reportedly the first pedestrian death associated with self-driving technology. The incident has been a reminder that while much progress has been made in the autonomous vehicle space in recent years, there’s still a long way to go.
A little over 18 months ago we published an article on CyclingTips looking at the development of autonomous vehicles and what they might mean for cyclists. Now, in the wake of Monday’s tragic incident, we consider the progress that’s been made in the past year and a half and what the Arizona incident means for the future of autonomous vehicles and for cyclists.
The author of the following article works in transport policy and has more than 10 years’ experience in the field. He is also a keen racing cyclist. He has asked to remain anonymous as he is a government employee.
In my last article I discussed some of the broad changes to the transport system that autonomous vehicles may bring, and how these could benefit or penalise cyclists. Since then, some of the hype has given way to scepticism as progress has slowed. Some observers have begun issuing more pessimistic assessments of how long it will be before developers can safely release autonomous vehicles onto the roads at a commercial scale.
Some governments and developers have responded by redoubling their efforts. Arizona has ‘led’ the world in allowing driverless vehicles to be tested without a supervising driver behind the wheel, leading what some have characterised as a race to the bottom: irresponsibly slashing regulation of driverless vehicles in a desperate bid to attract industry, investment and jobs.
Then on Monday March 19, an Uber vehicle hit and killed a pedestrian (who was reportedly crossing the road, walking a bicycle), in Tempe, Arizona. There was a driver present, but the car was operating in autonomous mode at the time. What is known is that the autonomous driving system failed to avoid colliding with the pedestrian.
Tempe police chief Sylvia Moir was quick to defend Uber, saying “it’s very clear it would have been difficult to avoid this collision in any kind of mode (autonomous or human-driven) based on how she came from the shadows right into the roadway. I suspect preliminarily it appears that the Uber would likely not be at fault in this accident”.
Moir’s response highlights the challenge authorities face in adjudicating autonomous driving incidents. Who is responsible? Who is “the Uber” she refers to? The driver behind the wheel? Or the autonomous vehicle company (Uber, in this case)? And what driving standards should be applied?
With over a century of experience, we have a pretty good feel for what drivers can and can’t do behind the wheel. But Moir’s assessment is that Uber’s autonomous driving system would have struggled to avoid the collision. Does she really know that? Do we know that?
Automated driving systems don’t sense pedestrians and other objects the same way human drivers do. They rely primarily on video footage, which the vehicle’s computer processes against what it ‘knows’ about a given visual form – not much different from how humans identify objects, including cyclists. Some also use Light Detection and Ranging, or LIDAR, which builds a 3D model of surrounding objects by measuring reflected laser light.
As the methods of assessing video and LIDAR images improve, so does the accuracy with which they can identify smaller, more variable objects like pedestrians and cyclists. This reportedly involves building a database of cyclist images – the bigger and more sophisticated the database, the more likely a match with a cyclist’s image captured while driving. 2D video images are being augmented by 3D rendering of LIDAR and radar data.
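The matching idea described above can be sketched, very loosely, as a nearest-neighbour lookup: a detected object is compared against a labelled database and given the label of its closest match. The feature values and three-entry database below are invented purely for illustration; real perception systems learn from millions of images and far richer features than this toy summary of height, width and speed.

```python
import math

# Hypothetical labelled "database": (feature vector, label) pairs.
# Features here are an invented summary of a detection:
# (height in metres, width in metres, speed in metres per second).
DATABASE = [
    ((1.7, 0.5, 1.4), "pedestrian"),
    ((1.8, 0.6, 6.0), "cyclist"),
    ((1.5, 1.8, 15.0), "car"),
]

def classify(features):
    """Label a detection with its nearest neighbour in the database."""
    def distance(a, b):
        # Euclidean distance between two feature vectors.
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(DATABASE, key=lambda entry: distance(features, entry[0]))[1]

# A tallish, narrow object moving at 5 m/s is closest to the cyclist entry.
print(classify((1.75, 0.55, 5.0)))  # → cyclist
```

The sketch also hints at the article’s point about edge cases: a cyclist walking their bike sits between the “pedestrian” and “cyclist” entries, and a bigger, more varied database is what pushes such ambiguous detections toward the right label.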
Despite progress, developers are having trouble making their cars reliably recognise cyclists. Renault CEO Carlos Ghosn described cyclists in 2016 as “one of the biggest problems for driverless cars.” They confuse the vehicles, he said, because at times they behave like pedestrians, at other times like drivers, and “they don’t respect any rules, usually.”
Ghosn’s statement hints at why police chief Moir’s hasty defence of Uber in the death of a pedestrian on Monday deserves greater scrutiny. As Ghosn suggests, detecting another road user is one thing; programming the car to avoid them is another altogether.
Questions that should be asked of Uber in this collision include: at what point did the vehicle detect the presence of the pedestrian? And what was it programmed to do in response, whether or not detection succeeded? Unlike drivers, whose accounts are notoriously unreliable – or simply dishonest – an automated vehicle company can provide this information in precise form.
In my previous article I predicted that automated vehicles may turn out to be good for cyclists, because they would try to avoid colliding with us. But that optimism came with caveats. How conservatively should an automated vehicle be made to drive to reduce that risk? Should it be programmed to slow down if it detects a pedestrian on the median strip, adjacent to the lane in which the vehicle is travelling – as was the case with the fatality in Arizona? Or can it assume the pedestrian will not step in front of it?
A similar dilemma applies to automated vehicles anticipating cyclists’ next moves. Motor vehicles are more predictable – they don’t turn on a 10 cent piece. The kinematics of a cyclist are harder for a computer to interpret: subtle weight shifts and counter-steering give fainter visual cues than the progressive steering of a motor vehicle’s wheels.
Worryingly, in their haste to curry favour with high-tech automated vehicle companies, governments around the world are falling over themselves to remove any rules that might discourage investment by developers. Arizona is the ‘chief cheerleader’ on that front, but the US Federal Government is not far behind, emphasising that it sees its role as staying out of the way of industry development.
Elaine Herzberg, 49, was the victim of Monday’s collision with an Uber self-driving car. She died after being transported to hospital. Her death demands answers. What steps did Uber take in designing their vehicle to avoid this type of outcome? What steps will they and other developers take to prevent further collisions?
It’s not clear that regulators are willing to ask these tough questions, for fear of driving away investors in a high innovation, high growth, young industry. But at what cost? And what are the long-term implications for how automated vehicles will protect and be made accountable for protecting vulnerable road users, including cyclists?
It’s unclear, but we are off to a rocky start.