AI and IP: Licensing (UK law)

Is the standard software licence still fit for purpose?

27 November 2017

"Artificial intelligence" is a broad term used to describe a range of software functionality. At present, it is used almost synonymously with "machine learning". In this series of Talking Tech posts we consider some of the intellectual property law issues arising from the increasing prevalence, and increasing capability, of AI software. In this post we look at whether and how the issues identified can be dealt with in the licence for the AI software between developer and operator. Given the enhanced capabilities of AI software, a standard software licence may no longer be fit for purpose.

Ownership of works generated by AI software

In our separate post on subsistence and ownership of copyright we concluded that copyright can only subsist in a work created through AI if the work can be attributed to a human author. AI systems are not yet acting entirely of their own volition, so the risk that AI-generated works are denied protection because no human author can be identified is low. For now at least, the more common question is likely to be whether it is the licensor of the AI system or the licensee that owns copyright works generated by the system.

The best way to address ownership is through the licence agreement between the parties. In a typical IP licence, the licensor might seek to retain ownership of improvements to the AI tool created under the licence. This is harder to justify in the context of machine learning, as the function of the software is to improve its analysis based on the data provided by the licensee. It is difficult to disentangle the licensee's data – representing the licensee's underlying knowledge and business processes – from improvements to the tool itself. As such, the common commercial position of each party retaining the rights to improvements to its own IP may not apply. A licensee should take care to ensure that its data is not swept up by an improvements clause in favour of the licensor.

The licence agreement for the AI system cannot, however, be used to cure the subsistence problem where the AI created the work independently of any human author (or deemed author), nor to change who the "author" is under the applicable statutory tests. The parties can agree that, if copyright exists, it belongs to a particular party, and can agree contractually how a particular work will be exploited (whether or not the work is capable of protection), but they cannot make copyright subsist in a work where it does not. When enforcing copyright, a key issue is often proving entitlement to the copyright in the work(s) in question. It would be open to an alleged infringer to argue that a work generated by AI on its own initiative has no author for the purposes of the Copyright, Designs and Patents Act 1988 (CDPA), and that without an author there is no copyright to enforce. If there are concerns as to whether a work would qualify for protection, because of the lack of an actual or deemed human author, the parties could instead rely on mechanisms for protecting confidential information (where possible).


If there is any cross-border element to the licence, thought needs to be given to where the AI will be "used". This can be a difficult question, particularly in the context of cloud computing. There is a risk that local law treats computer-generated works differently from UK law. This could, for example, mean that rights need to be assigned under the licence to the intended owner rather than vesting in the intended owner by operation of law. It could also have knock-on effects, such as tax implications or even export control issues.

The proposed structure for ownership of copyright works created during the licence must be compatible with applicable law. Local advice may be required as to the first owner of copyright works under local law, and the licence drafted to accommodate the applicable legislation, irrespective of the governing law of the licence.


Liability for infringement by the AI system

The licence agreement for the AI system should also address liability for intellectual property infringement "committed" by the AI system, as discussed in our separate post on the topic. The AI system cannot itself infringe a third party's rights, but the person most closely connected with the infringement (for example, the operator of the system) is likely to be held liable.

A typical liability clause for third party infringement excludes from the licensor's liability any infringement caused by the licensee's own use or modification of the software, but otherwise the licensor tends to be liable if use of the software within the terms of the licence infringes the rights of a third party. Such clauses have traditionally been drafted with the software itself in mind, i.e. whether the copyright in the software is owned, and validly licensed, by the licensor. In the context of AI the standard liability clause may no longer be fit for purpose, as the software may have the capacity to exploit third party works.

Liability for infringements by AI systems is likely to be highly fact specific. This emphasises the need for a mechanism for cooperation between the licensor and licensee if a third party alleges that its rights have been infringed by the AI system.