22 August 2022

How self-driving vehicles took a major step forward this summer

In the space of a few short weeks this summer, self-driving vehicles moved from science fiction and limited trials to reality, raising complex political and ethical issues as governments and manufacturers race to lead this potentially lucrative market.

In May, Mercedes-Benz became the first vehicle manufacturer to offer self-driving functionality as a factory-fit option, with its DRIVE PILOT for the S-Class and EQS. The technology enables drivers to hand over control to the car at speeds of up to 60 km/h.

DRIVE PILOT guides the vehicle in its lane, controlling the car’s speed and distance from other traffic, and reacting to unexpected traffic situations by braking or taking evasive manoeuvres.

In the world of autonomous vehicles, DRIVE PILOT is ‘Level 3’, and is currently only authorised for use on 13,000 km of motorway in Germany, the first country to create a legal basis for the use of self-driving systems. Mercedes-Benz hopes to secure further approval for DRIVE PILOT’s deployment in two US states, California and Nevada, by the end of this year.

EC supports self-driving technology

Following hot on the heels of Germany, the European Commission has promised to adopt technical rules this summer to advance the uptake of Level 3 and 4 automation. The Commission wants to raise the speed limit for Level 3 automation on motorways from 60 km/h to 130 km/h in January 2023, catching up with a new UN regulation adopted in June.

Levels of autonomy
Level 1 Driver assistance technology, such as cruise control.
Level 2 Driver assistance, such as lane assist and automatic emergency braking, with the driver still in control.
Level 3 The vehicle can take over from the driver in certain environments, e.g. a motorway, but the driver may need to take back control with only a few seconds’ warning.
Level 4 The vehicle operates without human intervention along a programmed route, in use cases such as driverless taxis and public transport.
Level 5 The vehicle can drive itself anywhere with no human interaction.

The technical rules will set out type approval test procedures, cybersecurity requirements, data recording obligations, and the compulsory monitoring and reporting of collisions and near misses.

Robotaxis

The same test requirements will also pave the way for completely driverless vehicles, such as urban shuttles and robotaxis (Level 4 automation), to operate within the EU from September 2022.

However, the European Transport Safety Council (ETSC) has warned that the Commission’s proposals do not sufficiently protect vulnerable road users, such as pedestrians and cyclists. Under the proposals, driverless vehicles must be able to avoid pedestrians walking at up to 5 km/h and cyclists riding at up to 15 km/h. The ETSC argues these thresholds are too low: they are average urban speeds, so roughly half of pedestrians and cyclists will be travelling faster.

With human error blamed for 95% of traffic collisions, governments and OEMs believe that self-driving technology is essential to achieve road safety targets, although where the blame should lie in the event of failure has historically been less clear-cut.

OEMs responsible for self-driving crashes

Now, however, there is a growing body of opinion that the vehicle manufacturer, not the ‘driver’ sitting in a self-driving car, will be responsible for collisions.

This month, the UK Government promised legislation that would make manufacturers “responsible for the vehicle’s actions when self-driving, meaning a human driver would not be liable for incidents related to driving while the vehicle is in control of driving.”

The UK’s Department for Transport has pledged £100 million to help British businesses win the race to develop safe, self-driving technology, and said self-driving cars, coaches and lorries could be operating on UK motorways as early as next year, followed by the wider rollout of self-driving vehicles by 2025. 

“Self-driving vehicles have the potential to revolutionise people’s lives,” said Kwasi Kwarteng, UK Business Secretary. “This funding will help unlock the incredible potential of this industry, attracting investment, developing the UK’s growing self-driving vehicle supply chain, and supporting high-skill jobs as these new means of transport are rolled out.”

Last week (19 August), the UK Government also launched a consultation on standards for self-driving vehicles to ensure they are “as safe as a competent and careful human driver.”

Ethical considerations

The consultation coincided with a report from the Centre for Data Ethics and Innovation (CDEI), setting out proposals for the regulation and governance of self-driving vehicles.

“Under our current legal and regulatory systems, we license drivers as competent to drive and then hold them accountable for their actions,” said the CDEI. “In the context of vehicles that are self-driving, we will need new mechanisms to ensure that the systems these vehicles use, and the organisations that develop and deploy them, are similarly held accountable for performing in a safe and ethical manner.”

This includes requiring OEMs to install ‘black boxes’ that capture vehicle driving data and allow external investigators to analyse all of the key decisions taken by the self-driving system in the event of a collision or near miss.

Public anxieties

The CDEI acknowledges that there is no fact-based answer to the question, ‘how safe is safe enough’, but warns that if the public does not consider self-driving technology to be safe enough, it will not be accepted. Research into risk perception reveals that dangers which people consider to be new, uncontrolled, catastrophic and artificial are consistently amplified in the minds of the public.

“The hope is that AVs will offer dramatic improvements in overall road safety, but in changing the scale of risk, they will also affect the type and distribution of risks experienced by road users. Average improvements in road safety, even if they can be clearly demonstrated, will not engender public trust if crashes are seen as the fault of faceless technology companies or lax regulation rather than fallible human drivers,” said the CDEI.

“If AVs are seen by the public as equivalent to trains or aircraft, mobility technologies that users feel are not under their control, the public could expect a 100x improvement in average safety over manually-driven vehicles.”

Images: Shutterstock, Mercedes-Benz

Authored by: Jonathan Manning