AI, data protection and data ownership
New technologies such as artificial intelligence (AI) and deep learning pose challenges for the law, for example in regulatory or civil matters. Two areas where companies developing or deploying AI will have to meet these challenges are data protection in regulatory law and data ownership in civil law.
Artificial Intelligence and Data Protection
Whenever AI technologies work with information relating to natural persons, data protection law applies. AI tools may even bring information within the scope of that law which previously fell outside it: collecting and connecting data from different sources that is initially not referable to a specific person, and then analysing it, can make a natural person identifiable.
Developers of AI technologies should therefore aim to integrate technical and organisational measures implementing data protection principles into the design of a tool (“data protection by design”). Using AI tools in business operations may additionally require a risk assessment regarding the protection of personal data (“data protection impact assessment”) and may trigger an obligation to designate a data protection officer.
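To make the idea of “data protection by design” more concrete, the following is a minimal, hypothetical sketch of two common technical measures, pseudonymisation and data minimisation, applied before records enter an AI pipeline. The field names, the secret key handling and the choice of an HMAC are illustrative assumptions, not a statement of what the law requires; key management and the legal assessment itself are out of scope.

```python
import hashlib
import hmac

# Assumption for illustration: in practice the key would live in a secure
# vault, never in source code.
SECRET_KEY = b"replace-with-a-securely-stored-key"

def pseudonymise(identifier: str) -> str:
    """Return a keyed hash that stands in for the real identifier,
    so raw names never reach the model or its logs."""
    return hmac.new(SECRET_KEY, identifier.encode("utf-8"),
                    hashlib.sha256).hexdigest()

def minimise(record: dict, allowed_fields: set) -> dict:
    """Keep only the fields the stated processing purpose actually needs."""
    return {k: v for k, v in record.items() if k in allowed_fields}

# Hypothetical support-hotline record:
record = {"name": "Jane Doe", "call_reason": "billing", "voice_sample": b"..."}

safe = minimise(record, {"call_reason"})      # drops name and voice sample
safe["caller_id"] = pseudonymise(record["name"])
```

The point of the sketch is the ordering: identifying data is stripped or replaced at the earliest possible stage of the pipeline, which is what “by design” (as opposed to an after-the-fact clean-up) means in practice.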
Lawfulness of Processing and Transparency
The General Data Protection Regulation (GDPR) requires a legal basis for any processing of personal data and obliges the data controller to provide transparent information. If, for example, a natural language understanding tool assists employees of a support hotline in finding possible solutions for callers, the use of such a tool must be made transparent to the person seeking support. This includes providing information on the purposes and the legal basis of the processing. Depending on the nature and extent of the data processing (e.g. storing and analysing the caller’s voice in order to enhance the tool), consent by the data subject may have to be obtained.
The GDPR grants individuals the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning them or similarly significantly affects them. With regard to AI, this may restrict a broad range of software agents in sectors such as banking and finance, retail, healthcare, insurance and advertising: the regulation applies where not a human but an algorithm decides whether a certain service, payment method or rate of interest is offered to the consumer, e.g. based on credit scoring, profiles of the consumer’s personal preferences or other data collected about the individual user.
However, automated decision-making can be legitimate, inter alia, if it is necessary for entering into or performing a contract with the customer. One example: where a web shop uses an algorithm that collects credit reports from third parties and declines delivery on account if the customer does not meet a certain credit score, this may be justified by the shop provider’s need to reduce its risks. In any event, the shop provider must inform the customer of the automated decision-making and grant him or her the right to obtain human intervention and to contest the decision.
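The web-shop example above can be sketched in code. This is a hypothetical illustration only: the threshold, the scoring source and the data structure are invented for the example, and the explanation text is not legal advice. What the sketch shows is that the transparency information and the human-review path are built into the decision object itself, rather than bolted on afterwards.

```python
from dataclasses import dataclass

# Assumed cut-off for illustration; not a legal or industry standard.
CREDIT_SCORE_THRESHOLD = 600

@dataclass
class Decision:
    delivery_on_account: bool
    automated: bool                      # disclosed to the customer
    explanation: str                     # purposes and basis of the decision
    human_review_available: bool = True  # right to obtain human intervention

def decide_delivery_on_account(credit_score: int) -> Decision:
    """Automated check whether delivery on account is offered."""
    approved = credit_score >= CREDIT_SCORE_THRESHOLD
    return Decision(
        delivery_on_account=approved,
        automated=True,
        explanation=(
            "This decision was made automatically based on a credit score "
            "obtained from a third-party credit agency. You may contest it "
            "and request review by a member of staff."
        ),
    )

decision = decide_delivery_on_account(550)
# The payment option is declined, but the customer is informed of the
# automated decision-making and can request human intervention.
```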
In a future that is rapidly becoming characterised by artificial intelligence, legal challenges such as these will have to be mastered.
Artificial Intelligence and Data Ownership
Another area where AI companies will have to break new legal ground is data ownership. Within the European Union, there is no coherent approach to ownership of data as such. Restrictions on the usage and disclosure of non-personal data mainly stem from contractual relationships. Developers and users of artificial intelligence tools will therefore have to consider carefully what they want to do with data and what the respective contracts allow. Data ownership should already be an item on the to-do list of any contract manager dealing with AI-related contracts.
Additionally, when looking at data ownership, artificial intelligence lawyers should stay alert to the fact that proprietary rights in certain kinds of data may exist. In the context of AI, such rights protect not the data itself but its compilation, and may comprise copyrights and database rights in the “products” of AI.
An artificial intelligence due diligence will help with assessing risks and ensuring compliance on the one hand, and with gaining an overview of IP assets on the other. Forward-looking contract design prevents conflicts: where copyrights come into existence, for example, co-authors of source code may contractually deal with their rights beforehand, specifically if they use AI in developing such code.
Please also see the section on “AI and patents” for this topic.
The scope of copyright in AI-generated content is subject to debate and depends on national regulation rather than the EU legal framework. Since only humans, not machines, have the legal capacity to be creators of copyrighted works, copyright presupposes a certain level of originality on the part of a human being. Drafting initial source code that autonomously collects information and enhances its own technique raises the question of whether the result, be it software source code or an artwork, will still be considered a product of human creativity. Where artificial intelligence solutions directly create software code, the answer might be “no copyright”. Artificial intelligence law firms are therefore developing contractual tools to deal with these issues.
Where no human creator is involved in the work result, AI companies may still hold IP rights in works created by their artificial intelligence solutions. Given AI’s technological capacity to structure and analyse information, database rights might apply. The EU Database Directive protects the investment in building a database rather than a creative act. In the context of AI business models, data compiled by AI solutions may therefore constitute a protected database under this directive.