FRR Highlights New European Commission Report on Robot Law

February 20, 2017 by Noel Sharkey and Eduard Fosch-Villaronga

About

Regulations on the responsible application of robotics technology are long overdue. While robotics developments are moving very rapidly throughout the world, governments have been seeing them solely in terms of economic drivers. Many of the large companies and research laboratories consulted have put a very positive slant on the technology, while the overall societal, ethical and legal impact has been largely ignored or minimized.

Then, in May 2016, Member of the European Parliament Mady Delvaux took the first steps by drafting a report to the European Commission containing recommendations for Civil Law Rules on Robotics. This has certainly been welcomed by the Foundation for Responsible Robotics.

The recommendations included general remarks on legal issues such as liability, insurance schemes, intellectual property, safety and privacy. Different types of robot applications are included in the report, such as care robots, medical robots, drones, and autonomous cars. The report also covered different contexts of use and social aspects, i.e. unemployment, the environment, education, social division, and discrimination. There is a proposal for a European Agency, a Charter on Robotics, a License of Use and a Code of Conduct for Engineers. Each of these reflects important issues that need to be examined very carefully by a multidisciplinary committee of experts without a vested interest or agenda.

At the same time, however, some of the wording in the document is disappointingly based on science fiction. Notions of robot rights and robot citizenship are a distraction from the main thrust of the document and only served to attract the wrong type of media attention. There is also naiveté about Asimov’s fictional laws of robotics which, as anyone who has read the novels will know, were set up as foils to show that they were not workable.

On 16 February 2017 the European Parliament resolution with recommendations to the European Commission on Civil Law Rules on Robotics (2015/2103(INL)) was adopted. The resolution was passed by 396 votes to 123, with 85 abstentions. The Commission will not be obliged to follow the Parliament’s recommendations, but must state its reasons if it refuses to do so.

The final resolution has taken into account several documents, including the European Parliament Draft Report 2015/2103 from Mady Delvaux that was released in May 2016, and opinions from different European Parliament committees:
Opinion of the Committee on Transport and Tourism (*) (16.11.2016)
Opinion of the Committee on Civil Liberties, Justice and Home Affairs (*) (23.11.2016)
Opinion of the Committee on Employment and Social Affairs (9.11.2016)
Opinion of the Committee on the Environment, Public Health and Food Safety (14.10.2016)
Opinion of the Committee on Industry, Research and Energy (15.11.2016)
Opinion of the Committee on Internal Market and Consumer Protection (12.10.2016)

Subsequently, in October 2016, the European Parliament’s Legal Affairs Committee commissioned a study from Nathalie Nevejans. The aim was to evaluate and analyze the report from a legal and ethical perspective. There are several differences between the original Delvaux document and the study for the Legal Affairs Committee, including how the European Commission should define robots, whether or not robots deserve rights, and what ethical principles should surround robots. Unfortunately, the resolution that was adopted did not take the analysis led by Ms Nevejans into account.

The Foundation for Responsible Robotics highlights that while the European report includes many of the necessary conditions for regulating robotics applications and ensuring that innovative robotics can prosper, some of the regulations need considerably more detailed work. A large number of the proposals could have very serious repercussions, not only at the legal but also at the social level.

One example, of many, is that of conferring legal personhood on robots, similar to corporate personhood. The concern is that this could allow corporations to hide from their responsibilities by passing the buck to their machine creations. It is certainly not clear how a robot could meet its financial liabilities or legal responsibilities without being granted some sort of rights or citizenship. This is also discussed in the report, and it is where the document disappears down a science fiction rabbit hole of speculation about future technology; it distracts from the more important aspects of the report and yet it is highlighted. An autonomous car driving recklessly cannot be punished, and robot personhood should not allow the humans behind the robots to be shielded from their responsibilities.

The next step should be to seek advice from international interdisciplinary groups such as the Foundation for Responsible Robotics. We can provide objective feedback and advice that is not based on a financially vested interest in the development of the technology. Our members have a long track record of engaging critically with the issues discussed in the report and can foresee the potential ethical, legal and societal consequences of such robotics combined with artificial intelligence. This would help to reduce the possibility of unexpected results, and we look forward to assisting in the meaningful work ahead.