Tuesday 2nd January 2018
Clive Efford, Labour MP for Eltham
During a demonstration at the Government’s automated vehicle testing site in Greenwich, one of the onlookers threw a plastic chair into the path of an automated vehicle. The vehicle could not stop and smashed into the chair.
That incident was not a scientific test, but it demonstrated that, no matter how smart the technology, the unexpected will happen. When the choice is between damaging property and injuring a person, it is easy to programme software to protect people.
But let us fast-forward a few years from the Greenwich test to a busy high street, and it is not a plastic chair but a four-year-old child that rushes out into the path of an automated vehicle, too late for it to stop. In an instant, the software assesses the scene; in order to miss the child, its options are either to swerve into the path of an oncoming vehicle or to mount a pavement full of pedestrians. The people in the oncoming vehicle and the pedestrians are innocent bystanders, but they are about to be assessed as potential collateral damage in the vehicle’s calculations to minimise injury to people.
That is a version of what is known as the ‘trolley problem’, and experts say that the machine will be programmed to carry out a “manoeuvre of minimal risk”. In a situation where all of the choices put lives at risk, how should the machine be programmed to decide which option to take? Is it acceptable that an innocent bystander is injured because they offer the path of minimum risk?
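Stripped of its engineering detail, a “manoeuvre of minimal risk” amounts to choosing whichever option carries the lowest estimated harm. The sketch below is a deliberate caricature to make the ethical point concrete: the option names and harm scores are invented for illustration, and no real vehicle reduces the decision to a table like this.

```python
# Illustrative caricature only: the options and harm estimates below are
# invented for this sketch, not taken from any real system.

def minimal_risk_manoeuvre(options):
    """Return the (name, estimated_harm) pair with the lowest harm score."""
    return min(options, key=lambda option: option[1])

options = [
    ("brake only", 0.9),           # likely harm to the child
    ("swerve into traffic", 0.7),  # harm to the oncoming vehicle's occupants
    ("mount the pavement", 0.6),   # harm to pedestrians on the pavement
]

choice, harm = minimal_risk_manoeuvre(options)
print(choice)  # the innocent bystanders end up bearing the "minimal" risk
```

The uncomfortable point the article raises lives in those numbers: whoever scores lowest, however the scores are assigned, is the party the machine chooses to put at risk.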
Much is made of the potential safety improvements, with computers driving vehicles more efficiently than people because they can communicate with each other. However, the research focusses primarily on replacing employees, such as the drivers of long-haul trucks and the taxi and delivery drivers in our towns, rather than on safety.
The ethical issues these vehicles raise are a challenge for legislators. Just how happy are we to allow machines to make decisions for us? Computer software will have to make ethical choices, particularly in emergencies where humans are at risk. If these vehicles are going to operate safely while communicating with each other, how much of our freedoms of choice will we have to surrender to machines to remove human error from our roads?
According to the International Labour Organisation, over 55 million people are employed globally in transport-related jobs. That is an enormous incentive for large corporations to replace drivers with computers to cut wage bills and increase profits. It has been estimated that the airline industry alone could save £31 billion by removing pilots from the cockpits of airplanes. The technology for pilotless flights already exists; indeed, most of any airplane journey is already managed by computers, but public opinion will not accept being flown without a pilot present. In the not-too-distant future, a row similar to the one currently raging over guards on trains will erupt over pilots on airplanes.
Public opinion does not appear to be so reticent about allowing automated vehicles on our roads. The technology for general use on our roads is not as far advanced as it is in air transport, and it will be many years before we compete for road space with fully autonomous vehicles. To get to that point, researchers need to carry out real-life testing on our roads. There are two areas of research: vehicles that will undertake long journeys on motorways, where the hazards they are likely to encounter are relatively predictable; and vehicles that will operate within a closely defined geographical area of a city, where the environment, including potential hazards, is mapped in detail and the vehicle is guided by a computer programmed to react to every foreseeable circumstance.
Our roads are about to become laboratories for the development of automated vehicles, and we, as we go about our daily lives, are to become unwitting lab rats.
We cannot hold back this technology, and some of this change is unavoidable. But there are very important ethical issues, relating to people’s employment and to how much of our freedom of choice we are willing to concede, that we must continually scrutinise along the way.