Ready for Artificial Intelligence?

Artificial intelligence is a massive opportunity, but it also triggers risks that cannot be addressed through over-regulation that might damage the market.

Three simultaneous technological revolutions unleashing AI

One of the main topics of the World Economic Forum 2017 was artificial intelligence (AI). I found the interview with Ginni Rometty, the Chairwoman, President and CEO of IBM, extremely interesting. She said that we are in a unique period, since three technological revolutions are happening simultaneously, which makes this point in time different from ever before:

The rise of cloud computing;

The rise of data; and

The rise of mobility.

Because of these three revolutions, there is a huge amount of information that cannot be dealt with by humans; we need systems that can handle such data, reason about it and learn. This has led to the rise of artificial intelligence.

AI and robots are rapidly becoming part of our daily life, and the potential of such technologies cannot always be controlled by humans. Think of the Google DeepMind project, where the AI is not programmed/taught to solve problems, but needs to learn by itself how to solve them. This means that we will reach a stage where machines take decisions whose reasoning cannot be explained by humans!

The call for regulations on artificial intelligence

Ms. Rometty herself mentioned in her interview that a fourth revolution revolves around security and privacy, and that such issues might still derail the revolution that the three components mentioned above have combined to create.

And on this topic, it might not be a coincidence that the Legal Affairs Committee of the European Parliament approved a report calling on the EU Commission to introduce a set of rules on robotics. Such rules include:

Who is liable, and how should damages be recovered?

The Committee favours the introduction of strict liability rules for damages caused by robots, requiring only proof that damage has occurred and the establishment of a causal link between the harmful behaviour of the robot and the damage suffered by the injured party.

This would not solve the issue of allocating responsibility for “autonomous” robots like Google DeepMind that did not receive instructions from the producer. This is why the Committee is proposing the introduction of a compulsory insurance scheme for robot producers or owners (e.g. producers of self-driving cars). The issue is whether such an obligation would represent an additional cost that would either be borne by customers or even hold back the development of these technologies.

Robots treated as humans?

What sounds quite unusual, and honestly a bit “scary”, is that the Committee also calls for the introduction of a specific “legal status” for robots as “electronic persons” “with specific rights and obligations, including that of making good any damage they may cause, and applying electronic personality to cases where robots make smart autonomous decisions or otherwise interact with third parties independently“.

The report does not fully clarify how such a legal status would work in practice, but it seems that we are already attempting to distinguish the liability of the artificial intelligence itself from that of its producer/owner. This will need to be assessed on a case-by-case basis for autonomous robots, but civil law rules definitely need to evolve in order to accommodate such principles.

Are ethical rules needed?

The Committee stressed the need for a guiding ethical framework for the design, production and use of robots. This would operate in conjunction with a code of conduct for robotics engineers, a code for research ethics committees reviewing robotics protocols, and model licences for designers and users.

My prediction is that most of the companies investing in the area will, sooner rather than later, establish an internal ethics committee. But the issue is whether statutory laws on ethics are necessary, since they might limit the growth of the sector.

Privacy as a “currency” must not harm individuals

This is the first time I have seen privacy described as a “currency”. However, it is true that we provide our personal data in exchange for services. The matter becomes even more complicated in the case of complex robots whose reasoning cannot be mapped, a circumstance that might trigger data protection issues. It is therefore important that the Committee called for the guarantees necessary to ensure privacy and security, also through the development of standards.

The reaction from the industry

The European Robotics Association immediately reacted to this report, stating in a position paper that

“Whereas it is true that the ‘European industry could benefit from a coherent approach to regulation at European level’ and companies would profit from legal certainty in some areas, over-regulation would hamper further progress. This poses a threat to the competitiveness not only of the robotics sector but also of the entire European manufacturing industry”.

This is the usual issue that is also being discussed in relation to the recent European consultation on rules for Internet of Things technologies. It is hard to set very specific rules for technologies that are rapidly evolving. The concern is that regulations might risk restricting investments in the sector, while in my view we should welcome regulations that create more certainty and foster innovation.

If you found this article interesting, please share it on your favourite social media!


About Us

DLA Piper is a global law firm with lawyers in the Americas, Asia Pacific, Europe, Africa and the Middle East, positioning us to help companies with their legal needs around the world. We strive to be the leading global business law firm by delivering quality and value to our clients. We achieve this through practical and innovative legal solutions that help our clients succeed.

In Italy, the team is formed by Italian and foreign lawyers who offer all the advantages of a global team, combining strong knowledge and experience of the international business environment with a multi-jurisdictional, full-service approach.

Copyright

This information is intended as a general overview and discussion of the subjects dealt with. The information provided here was accurate as of the day it was posted; however, the law may have changed since that date. This information is not intended to be, and should not be used as, a substitute for taking legal advice in any specific situation. DLA Piper is not responsible for any actions taken or not taken on the basis of this information.