An example: reflections on a Toyota advertisement
Toyota is running an advertisement in the US for one of its new cars. The ad shows the Toyota car driving behind a camper van and performing an automatic emergency braking procedure when the camper van in front suddenly brakes for a deer crossing the road.
Disclaimer: The text hereafter offers some reflections prompted by this advertisement, which, probably unexpectedly and unintentionally, also serves as an illustration of some societal issues. The quality and characteristics of the specific brand and car are not questioned and are not the subject of this discussion.
One could question whether this advertisement advocates responsible driving, or rather ‘sporty’ and less responsible driving.
But more fundamentally, why is the camper van in the advertisement at all?
Well, the presence of the camper van may be hiding the real issues to be addressed by society:
- a technical question: would the automatic braking system be able to brake for a deer?
- an ethical question: should the automatic braking system brake for a deer?
Answering these two questions is central to ‘assisted driving’ and ‘autonomous driving’. The same technical and ethical issues arise in many other, related or even seemingly unrelated, applications of smart technology, including in the medical domain.
The technical question is more complex than often realised; answering it requires, at least:
- complete knowledge of how the algorithms used actually work (there are, for example, face recognition algorithms in use that are not understood by the providers of the products and/or services applying them . . . )
- sufficient testing of the systems in a sufficient set of representative situations
- failure risk assessment
- maintenance and upgrade management
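The second point above — sufficient testing in a sufficient set of representative situations — can be made slightly more concrete. Purely as an illustrative sketch (the scenario names and the very idea of a fixed required list are hypothetical simplifications), one might track test coverage against a catalogue of representative situations:

```python
# Illustrative sketch only: checking whether an emergency-braking system has
# been tested against a required set of representative situations.
# All scenario names and the catalogue itself are hypothetical.

REQUIRED_SCENARIOS = {
    "vehicle_brakes_ahead",
    "pedestrian_crossing",
    "large_animal_crossing",   # e.g. a deer
    "small_animal_crossing",
    "night_low_visibility",
    "wet_road_surface",
}

def untested_scenarios(tested: set[str]) -> set[str]:
    """Return the required scenarios not yet covered by the test campaign."""
    return REQUIRED_SCENARIOS - tested

def coverage(tested: set[str]) -> float:
    """Fraction of required scenarios covered so far."""
    return len(REQUIRED_SCENARIOS & tested) / len(REQUIRED_SCENARIOS)

tested = {"vehicle_brakes_ahead", "pedestrian_crossing", "wet_road_surface"}
print(sorted(untested_scenarios(tested)))  # the gaps still to be tested
print(coverage(tested))                    # 0.5
```

The hard part, of course, is not the bookkeeping but agreeing on what a “sufficient set of representative situations” actually is.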
The ethical question may be the more difficult; it requires, at least:
- complete knowledge of the situation:
  - can the car be stopped in time / within the distance available?
  - can an emergency braking procedure be executed without risks for the persons seated in the car, for following cars, etc.?
  - can other vehicles be warned and/or consulted?
  - if not, what compromises are possible and what are the consequences?
- is there enough time to consult cloud-based applications advising on the impact as a function of all relevant parameters, including the size and estimated weight of the deer?
- an agreed guidance for taking decisions, understood by other road users
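To see why even the seemingly simple checks above are non-trivial to encode, consider this purely illustrative sketch. The physics (constant maximum deceleration), the field names and the two-way “brake or compromise” outcome are all hypothetical simplifications; a real system faces far more parameters and far murkier trade-offs:

```python
# Illustrative sketch of the situational checks listed above, reduced to a
# toy decision function. All thresholds, fields and the constant-deceleration
# physics are hypothetical simplifications.

from dataclasses import dataclass

@dataclass
class Situation:
    speed_mps: float               # current speed, metres per second
    distance_m: float              # distance to the obstacle, metres
    max_decel_mps2: float          # maximum safe deceleration, m/s^2
    occupants_at_risk: bool        # would hard braking endanger occupants?
    following_traffic_close: bool  # is a rear-end collision likely?

def stopping_distance(s: Situation) -> float:
    """Distance needed to stop at constant deceleration: v^2 / (2a)."""
    return s.speed_mps ** 2 / (2 * s.max_decel_mps2)

def decide(s: Situation) -> str:
    """Choose between a full emergency brake and some compromise action."""
    if stopping_distance(s) > s.distance_m:
        return "compromise"    # cannot stop in time: warn, slow down, swerve
    if s.occupants_at_risk or s.following_traffic_close:
        return "compromise"    # braking itself would create new risks
    return "emergency_brake"

s = Situation(speed_mps=25.0, distance_m=80.0, max_decel_mps2=8.0,
              occupants_at_risk=False, following_traffic_close=False)
print(decide(s))  # emergency_brake: 25^2/(2*8) ≈ 39.1 m < 80 m available
```

Note that the ethical difficulty is hidden inside the word “compromise”: the code can branch, but society has not yet agreed on what each branch should do.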
Legal versus Societal aspects
In many discussions the legal aspects are put in the foreground. It could be argued that the legal issues, while important, are less complex and less fundamental than often presented, and that the perceived legal issues overshadow the societal ones.
On the legal aspects
- an autonomous vehicle could be considered equivalent to a car equipped with an automatic parking function: the automatic parking function may cause damage or injure a person ‘on behalf of the user’
- the user may hold the owner or service provider (rental, lease) of the car liable; the owner may in turn hold the manufacturer of the car, its representative, or the maintenance service provider liable.
In this respect, additional legislation may be useful but may not be strictly required immediately: ground-breaking cases will establish case law and interpretations of the existing laws, and possibly give rise to additional laws.
Beyond ‘break-glass’: stopping an autonomous vehicle
In the medical domain the need for an overriding emergency function has been recognised; it is often referred to as a ‘break-glass’ function. Its use is assumed to be reserved for medical and paramedical personnel, or at least for persons with sufficient knowledge and training to use it.
While such a function is certainly necessary, it may not be sufficient: in the case of a life-saving implant such as a cardiac stimulator, for example, the ‘break-glass’ function may be too binary in nature, and it needs to be protected against misuse; hence the need for a more complete ‘Protected Emergency Function’.
Industrial applications likely require the equivalent of a ‘Protected Emergency Function’ (PEF):
- simply stopping a process may not be practical or safe, and may even create dangerous situations (stopping a steel furnace, a chemical process or a nuclear power plant, for example, may not be a great idea); one or more graceful interruption and arrest procedures will likely be required
- the PEF needs to be accessible in emergency situations while being sufficiently protected against misuse, abuse and terrorism
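The two requirements above — graduated rather than binary interventions, and protected access — can be sketched as follows. This is a toy illustration only: the intervention levels, the credential list and the in-memory check are hypothetical stand-ins for what would have to be a far more robust scheme:

```python
# Illustrative sketch of a 'Protected Emergency Function' (PEF): graduated
# interventions instead of a binary stop, behind a credential check.
# The token registry and action names are hypothetical placeholders.

GRADUATED_ACTIONS = ["reduce_output", "controlled_shutdown", "safe_hold"]
AUTHORISED_TOKENS = {"operator-1", "emergency-services"}  # hypothetical

def protected_emergency_function(token: str, level: int) -> str:
    """Execute the requested graduated intervention if the caller is authorised."""
    if token not in AUTHORISED_TOKENS:
        raise PermissionError("PEF access denied: unauthorised credential")
    if not 0 <= level < len(GRADUATED_ACTIONS):
        raise ValueError("unknown intervention level")
    return GRADUATED_ACTIONS[level]

print(protected_emergency_function("operator-1", 1))  # controlled_shutdown
```

The design point is that the interface itself forces a choice of severity, rather than offering only an all-or-nothing stop.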
Autonomous and assisted driving vehicles will need more than that:
- in addition to a ‘Protected Emergency Function’ (PEF), they need a ‘Policing Response Function’ with very clear rules and protection/security
- a ‘Protected Emergency Function’ and a ‘break-glass’ function need to be protected, but at the same time accessible in emergency situations to . . . anyone able to help prevent or contain danger; this may require a 100% reliable, fool-proof, hacking-proof system that provides, in real time, modular ‘emergency keys’ allowing local interventions to be made, in addition to remote interventions (it may be assumed, ‘according to Murphy’, that this will not work when really needed; maybe Machiavelli could have given us advice . . . )
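The idea of modular ‘emergency keys’ could look, very roughly, like the sketch below: keys valid only for one narrow scope and a short time window. Everything here is hypothetical (the scopes, the time-to-live, the in-memory registry), and, as the text notes, the hard part is making such a scheme reliable precisely when it is needed:

```python
# Illustrative sketch of time-limited, scope-limited 'emergency keys' that
# would allow a local intervention by a bystander. The scopes, expiry times
# and in-memory registry are hypothetical; a real system would need far
# stronger guarantees against misuse and against failing 'according to Murphy'.

import time

issued_keys: dict[str, tuple[str, float]] = {}  # key -> (scope, expiry time)

def issue_emergency_key(key: str, scope: str, ttl_seconds: float) -> None:
    """Issue a key valid only for one scope and a short time window."""
    issued_keys[key] = (scope, time.monotonic() + ttl_seconds)

def key_allows(key: str, requested_scope: str) -> bool:
    """Accept a key only if the scope matches and it has not expired."""
    entry = issued_keys.get(key)
    if entry is None:
        return False
    scope, expiry = entry
    return scope == requested_scope and time.monotonic() < expiry

issue_emergency_key("K-123", "stop_vehicle_locally", ttl_seconds=60.0)
print(key_allows("K-123", "stop_vehicle_locally"))  # True
print(key_allows("K-123", "unlock_doors"))          # False: wrong scope
```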