Do No Harm: Seven Challenges to Robotics Regulation

In their 2019 annual robotics review, Sberbank analysts examined the field's main regulatory, legal, and ethical problems.

Legal issues of robotics

1. The concept of a robot

There are dozens of interpretations and definitions of the concept of “robot” around the world. Sberbank's analysts offer an approach based on three criteria: perception of the surrounding world, interpretation of the information received, and subsequent impact on the physical world. But this is only one interpretation. For legal scholars, the concept of a robot should be neither broad nor narrow, but precise:

  • a definition that is too broad would sweep ordinary devices such as coffee machines into the concept of “robot”;
  • a definition that is too narrow would miss devices such as robot vacuum cleaners, which are also robots;
  • tying regulation to the word “robot” itself threatens to extend it to phenomena such as search or trading bots;
  • finally, it is unclear what to do with related concepts such as cyborg, drone, or android.

The way out, the report's authors believe, may be to abandon the search for a single verbal definition of a robot in favor of a set of criteria whose coincidence allows a device to be categorized as a robot. The 2017 UNESCO report on robotics lists four such criteria:

  • mobility;
  • interactivity (the ability to collect information from the surrounding world and to act on it);
  • information exchange;
  • autonomy (the ability to make decisions without human assistance).
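Purely as an illustration of the criteria-based approach (the names below are hypothetical and not taken from the report), a device profile could be scored against the four UNESCO criteria like this:

```python
from dataclasses import dataclass

@dataclass
class Device:
    """Hypothetical device profile scored against the four UNESCO criteria."""
    mobile: bool        # mobility
    interactive: bool   # collects information from the world and acts on it
    communicates: bool  # information exchange
    autonomous: bool    # makes decisions without human assistance

def is_robot(d: Device) -> bool:
    # Under the criteria-based approach, a device is categorized as a
    # robot when all of the criteria coincide.
    return d.mobile and d.interactive and d.communicates and d.autonomous

# A robot vacuum cleaner plausibly meets all four criteria;
# a coffee machine meets none of them.
vacuum = Device(mobile=True, interactive=True, communicates=True, autonomous=True)
coffee_machine = Device(mobile=False, interactive=False, communicates=False, autonomous=False)
```

The point of the sketch is only that a conjunction of criteria, rather than a single verbal definition, keeps coffee machines out while keeping robot vacuum cleaners in.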

2. Responsibility for harm caused

This is one of the most complex and confusing legal problems in robotics, and it allows for different solutions depending on the situation and the type of robot. Moreover, in some cases it is very difficult to reconstruct the actual circumstances in which the harm occurred. The report's authors identify seven main approaches to the liability problem:

  • complete exemption from liability, which requires recognizing the actions of autonomous machines as force majeure;
  • partial exemption from liability, which differs in that the injured party is awarded compensation; whether it comes from an insurance fund or from the robot's owner is a separate question;
  • fault-based liability: if the accident stems from a design defect, the manufacturer is responsible; if from a software failure, the developer; if the robot is self-learning, whoever contributed most to its training; if the robot was executing specific commands, the operator; and so on;
  • limited no-fault liability, that is, holding a third party (the owner or manufacturer) responsible even though it is not at fault, subject to conditions such as risk insurance;
  • full no-fault liability, under which a designated person is responsible by default for the robot's actions;
  • personal liability of the robot itself, which implies granting the machine legal personality;
  • a mixed liability regime, in which different approaches apply depending on how dangerous the robot is and on its other characteristics.
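The fault-based approach above is essentially a decision procedure mapping the cause of an incident to a responsible party. As an illustrative sketch only (the category names are hypothetical, not legal terms from the report):

```python
def liable_party(cause: str) -> str:
    """Map the cause of an incident to the responsible party under the
    fault-based liability approach described above (illustrative only)."""
    mapping = {
        "design_defect": "manufacturer",
        "software_failure": "developer",
        "self_learning": "primary trainer",
        "operator_command": "operator",
    }
    # Causes outside the enumerated categories remain undetermined,
    # reflecting how hard it can be to reconstruct the circumstances.
    return mapping.get(cause, "undetermined")
```

The fallback case is the interesting one: when the cause cannot be established, fault-based liability gives no answer, which is exactly why the report lists the no-fault and mixed regimes as alternatives.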

3. Intellectual Property Rights

If a robot creates music, a picture, or a text, who owns the rights to it? The robot's manufacturer, or the person who directed the creation of the work? The robot itself? Or perhaps these rights should belong to no one, automatically entering the public domain? The choice of a specific solution remains open.

4. Principles of regulation

The basic legal problem of robotics is what general principles should apply to all robots or to particular categories of them. Should the law require a robot to carry a “black box” and a red emergency stop button, and guarantee mandatory human control? These principles have not yet been enshrined anywhere, although they are being actively discussed in the expert community, including in Russia, Sberbank analysts say.

5. Rights of robots

The question of whether a “smart” machine has personal characteristics was first raised by Karel Capek, who coined the term “robot”. Nearly 100 years later there is still no clear answer; instead, several mutually exclusive proposals compete:

  • a robot as an object of law (roughly the current situation, so no wholesale rewriting of legislation would be needed);
  • a robot as an animal, i.e. bringing “smart” machines under pet legislation or creating a similar set of rules for robots;
  • a robot as a legal entity, which likewise would not require major legislative changes and would close a number of legal gaps along the way;
  • a robot as a person, which the review's authors consider the most radical, difficult, and least justified approach;
  • a robot as an electronic entity, that is, a fundamentally new subject of law, slightly different in meaning from a legal entity.

6. Information security

Information security currently receives minimal attention, but it will become a key issue as robots spread: what will machines do with the information they collect, and how can we ensure it does not fall into the wrong hands? This is directly tied to the protection of personal data and to the rules for processing large volumes of information. So far the problem has no solution.

7. Robots at war

Military equipment, such as unmanned aerial vehicles, is already among the most common types of robots by the number of active devices. From a national-security standpoint this is an extremely promising area for governments, which means it is generously funded. On the other hand, if such a killing machine gets out of hand, the consequences for people and society would be dire.

That is why, the review's authors write, UN experts and several dozen states are now actively lobbying to bring military robots under the 1980 Convention on Certain Conventional Weapons, alongside blinding laser weapons, anti-personnel mines, and other inhumane weapons.

The ethical side of the issue

A topic closely related to legal regulation is the ethical implications of introducing robots into human life. The foundational concept here is the laws of robotics formulated by Isaac Asimov back in 1942.

1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.

2. A robot must obey the orders given to it by human beings, except where such orders would conflict with the First Law.

3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.

Asimov’s laws have shaped the approach to robotics so strongly that even official documents, such as European Parliament resolutions, refer to them. Other researchers have tried to supplement and modify the three classical laws; the trend of recent years is to add ethical rules for artificial intelligence (AI) to the laws of robotics. For example, Microsoft CEO Satya Nadella proposed ten laws of AI in 2016, including requirements of impartiality, technological transparency, harm prevention, and so on. In Nadella's view, responsibility for a computer's decisions still rests with the human.

Another ethical problem of robotization is the prospect of mass unemployment caused by the widespread automation of labor. Algorithmization of procedures and the expanding use of robots may affect a third of jobs worldwide over the next 20 years. Meanwhile, the creators and operators of robots, armed with expertise, could become a narrow but powerful stratum of society, aggravating social inequality.

Ultimately, the automation of a number of professions will require changes to the education system, which will have to teach the skills needed in the new conditions and help people develop the strengths that a machine cannot replace. Sberbank analysts also offer several possible responses to the robotization of the economy:

  • introduction of a tax on robots;
  • guaranteeing a person an unconditional basic income;
  • universal job security;
  • creation of a social insurance system;
  • updating educational programs.
