Driving Morality to the Edge: Self-Driving Cars
Self-driving cars are soon to become a common sight on UK roads. With the responsibility of driving taken out of the hands of the person behind the wheel, concerns about how the car will react to ethical dilemmas have been a hot topic in recent news.
Automotive engineers are attempting to program new self-driving vehicles to make split-second, ethically sound decisions.
When following rules or laws does not provide an answer to an ethical dilemma, we rely on our morals to justify how we should act. (1) But how can a machine make the right decision when faced with a serious ethical dilemma?
Benefits of Autonomous Vehicles
Autonomous vehicles will be fitted with state-of-the-art technology designed to give passengers a safe, comfortable journey without having to worry about other road users, road works and potential accidents. In the next couple of years, a Super Cruise feature should be available in some vehicles, enabling the car to "see" the road. Using an array of sensors, lasers, radar, cameras and GPS technology, the car will be able to analyse its surroundings. (2)
Not only will vehicles be able to take over most of the driving, but they will also be able to communicate. Vehicle-to-vehicle communication will change how traffic flows and even how highways are designed. (2) This new technology will alert the driver to any hazards on the road ahead and share that information with other road users, preventing accidents and congestion.
Self-Driving Concerns
Self-driving vehicles could mean a significant rise in unemployment. People who work as bus, taxi, truck and delivery drivers could lose their jobs to autonomous vehicles. Self-driving cars will be more cost-effective, will never need a break or a holiday, and should not make mistakes on the road, which would make them superior to any human driver.
Relying heavily on self-driving vehicles could make us lazy drivers who stop paying attention to the road ahead. Although self-driving cars can make fast decisions on the road, that does not mean the driver should give up top-level control of the vehicle. The car cannot read your mind about where you want to go or whether you decide to re-route, so you may need to take back control, which requires that you were paying attention in the first place. (3) With Super Cruise technology, the vehicle will require the driver to take control if it detects a hazard ahead, so the machine will not be able to perform every type of road manoeuvre unassisted.
In the event of a self-driving car accident where people are injured or even killed, who would be held responsible? The driver, who may have been unable to change the outcome, or the manufacturers and engineers who pre-installed the particular set of moral codes that caused the vehicle to manoeuvre in a particular way? (4)
Will your new self-driving car be programmed to sacrifice you in order to save others? People are debating whether self-driving cars will be programmed to sacrifice your life, as the driver, in order to save people outside the vehicle. This concerning ethical issue is sure to be raised further as autonomous vehicles come closer to reality. (5)
Making Moral Manoeuvres
Ethical dilemmas occur in all aspects of life. Making the right decision can often depend on a number of factors and the circumstances at the time of the event.
Even for human beings, evaluating a situation, applying the right rule and so making the ethically right decision is difficult, let alone for a machine.
One clear example of an ethical decision an autonomous vehicle may face is approaching a cyclist on the left-hand side of the road. Should the car move over to the middle of the road and put its passengers at greater risk, or remain close to the cyclist and potentially intimidate them, which could lead to an accident?
Google’s new autonomous vehicles are claimed to have software installed that helps the vehicle make the right decision. Three questions are presented to the machine:
- How much information would be gained by making this manoeuvre?
- What’s the probability that something bad will happen?
- How bad would that something be?
If the information gained outweighed the risk, the car would proceed to make the manoeuvre. The question is, what would the car do if the risk and the information gained were equal? (4)
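The three questions above amount to weighing expected risk (probability of harm times its severity) against expected benefit. The following is a minimal, purely illustrative sketch of that trade-off; the function name, 0–1 scales and example values are assumptions for the sake of the example, not Google's actual software:

```python
def should_manoeuvre(information_gained, probability_of_harm, severity_of_harm):
    """Decide whether a manoeuvre is worth its risk.

    information_gained:  estimated value of the manoeuvre, e.g. a better
                         view past a cyclist (illustrative 0-1 scale).
    probability_of_harm: chance something bad happens (0-1).
    severity_of_harm:    how bad that outcome would be (0-1 scale).
    """
    # Expected risk = probability of a bad outcome x how bad it would be.
    expected_risk = probability_of_harm * severity_of_harm
    # Proceed only if the expected benefit strictly exceeds the risk.
    return information_gained > expected_risk


# Edging out slightly for a better view, with little chance of harm:
print(should_manoeuvre(0.6, 0.1, 0.5))   # True

# A risky overtake that gains little:
print(should_manoeuvre(0.3, 0.5, 0.9))   # False
```

Note that with a strict comparison, the tie case raised above resolves conservatively: when risk and information gained are exactly equal, the function returns False and the car would hold back. That is one possible design choice, not a documented behaviour.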
According to recent reports from KPMG (one of the UK’s leading providers of professional services, including audit, tax and advisory), self-driving cars will lead to approximately 150 fewer deaths a year. There is no doubt that the introduction of autonomous vehicles on UK roads will reduce the number of accidents that occur, but there are still concerns about leaving life-or-death situations in the hands of a machine. (8)
Think Insurance offers competitive motor trade insurance for businesses around the UK. For more information please visit https://www.think-ins.co.uk.