Professor Nick Reed, founder and CEO of Reed Mobility, urges a wider analysis of crash statistics, particularly when it comes to self-driving vehicles, in order to create systems that the public trusts
The legendary American test pilot Chuck Yeager is associated with the quote ‘If you can walk away from a landing, it’s a good landing.’ By the same rationale, one could assert that any road journey completed without a collision is a safe journey. After all, it’s a crash that leads to death and serious injury. No crash means no death or serious injury, which equals ‘safety’ – right?
While this may be statistically true, road transport is not the same as military aviation. Road design and driving behaviors have ripple effects that impact wider society. This distinction highlights the importance of human factors in road safety. Unlike the controlled environment of a test flight, road networks are dynamic, unpredictable and populated by a diverse range of users, each with varying levels of skill, experience and attentiveness.
This matters when it comes to the development of driver assistance systems. These technologies may prevent collisions, but we have to understand how humans interact with them – how they perceive and trust them and how these facets shape individual and societal behaviors. A salient example is over-reliance on such systems leading to complacency and reduced vigilance, potentially increasing the risk of collisions when the system fails.
Perhaps more significant is how this relates to the metrics we might use to assess safety in automated driving. It is commonly accepted that self-driving vehicles need to be safer than humans. A simplistic way to assess this would be to measure the rate of serious incidents caused by AVs and ensure this is lower than that for careful and competent human drivers.
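To see why even this ‘simplistic’ comparison is harder than it sounds, consider a rough sketch of the arithmetic involved. All figures below are hypothetical, chosen purely for illustration, and incident counts are treated as Poisson-distributed – a common simplifying assumption when dealing with rare events:

```python
import math

def incident_rate_ci(incidents: int, million_miles: float, z: float = 1.96):
    """Crude 95% confidence interval for an incident rate per million miles,
    treating the incident count as Poisson (a simplifying assumption)."""
    rate = incidents / million_miles
    half_width = z * math.sqrt(incidents) / million_miles
    return rate, max(0.0, rate - half_width), rate + half_width

# Hypothetical figures, purely for illustration:
HUMAN_BASELINE = 0.15                # serious incidents per million miles
av_incidents, av_miles = 12, 100.0   # 12 incidents over 100 million miles

rate, lo, hi = incident_rate_ci(av_incidents, av_miles)
print(f"AV rate: {rate:.3f} (95% CI {lo:.3f}-{hi:.3f}) per million miles")
print("Demonstrably below human baseline:", hi < HUMAN_BASELINE)
```

In this made-up example the AV point estimate (0.12) sits below the human baseline, yet the confidence interval still overlaps it, so the data cannot demonstrate superiority. Because serious incidents are rare, enormous mileages are needed before raw casualty rates settle – one more reason the metric cannot stand alone.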
I think we need to go further. I believe our assessment needs to include a measure of the degree to which the behavior and operation of automated vehicles are compatible with societal expectations – and a recognition that those expectations may be very different depending on who you ask and where they are from.
“We risk a backlash against rare but serious AV incidents that might be very different to human-caused crashes”
What might happen if we just accepted casualty rates as our sole metric? This risks a public backlash against rare but serious incidents that might be very different in nature to human-caused crashes, no doubt accompanied by media uproar about ‘killer robots’. The upshot could be delays to the uptake of the very technology we anticipate will improve safety.
There could be further undesirable consequences of assessing performance only by crashes. Requiring ‘careful and competent’ driving implies something more than just collision avoidance; it suggests being mindful of other road users, especially the vulnerable. Driving behaviors that are perceived as aggressive may intimidate those who are walking, wheeling, cycling or riding – again with the risk of raising antipathy towards the technology.
Ultimately, journeys free from serious collisions must be our ambition, but a truly safe road transport system must also acknowledge and address the complex interplay of human factors. We may not be breaking the sound barrier, but addressing societal barriers to technology deployment will be critical in creating a safe, sustainable road environment for all.
This article was first published in the May 2025 edition of TTi Magazine