UK-based Uber drivers launch Dutch court proceedings to obtain Uber's algorithm

Is Uber a platform operator or an employer?

12 August 2020

On 21 July, a number of UK-based Uber drivers launched a court case in the Dutch district court of Amsterdam, the location of Uber's European headquarters. The drivers are seeking an insight into the operation and logic of the algorithm behind the Uber app, which distributes the trips and collects significant data about the drivers. The drivers want to use this information to prove that Uber contracts with them as an employer, not merely a technology platform facilitating the provision of taxi services.

Whilst potentially relevant to the employment status and rights of millions of Uber drivers worldwide, this case could also become a landmark with significant implications for companies that work with algorithms and artificial intelligence, as it is widely anticipated that the first instance decision will be appealed by both sides. It raises a key question: how far-reaching are the rights of data subjects, in this case the Uber drivers, under the General Data Protection Regulation (GDPR) to obtain insight into the workings of an algorithm? As calls grow for accountability in the use of algorithms for data processing, prompting TikTok to make its algorithm source code publicly available and to challenge its tech competitors to do the same, this case may clarify whether companies are legally obliged to follow suit, lay bare their valuable intellectual property, and even explain it in a meaningful way (which may be difficult where AI systems are self-learning).

The Uber drivers' underlying motive for seeking information on the algorithm relates to various pieces of litigation on their employment status in Europe and the United States. Is Uber merely a platform that matches drivers and passengers, or does it do more than that, with the consequence that it should be regarded as an employer?

The latest round of litigation commenced in the UK on 21 July before the Supreme Court. Uber is appealing the decision that Uber drivers qualify as 'workers' – rather than self-employed contractors – with the consequence that, as workers, they are entitled to the national minimum wage and paid holidays. The lower courts considered that, in spite of the convoluted, complex and artificial contractual arrangements in place between Uber and its drivers, in reality Uber runs a transportation business and the drivers provide the skilled labour through which that business delivers its services and earns its profits. More recently, on 11 August, a San Francisco Superior Court judge in California ruled that Uber and its competitor Lyft must treat their drivers as employees rather than self-employed contractors. This decision will be appealed. Dutch trade unions are also considering similar legal action.

What are the rights of data subjects under the GDPR?

The GDPR provides data subjects (individuals) with the right to obtain access to their personal data from the data controller. The right of access, commonly referred to as subject access, gives individuals the right to a copy of their personal data, as well as other supplementary information. This right is not conditional upon the data being sought for a specific purpose. For this reason, subject access requests have long been used in the UK as a 'fishing expedition' tool to see what useful information can be unearthed before engaging in a formal litigation process. In other EU countries this does not yet appear to have been the case.

The GDPR also gives individuals the right not to be subject to solely automated decision-making, including profiling, which has legal or other similarly significant effects on them. Automated decision-making occurs when decisions are based on personal data processed by computers, with little or no human involvement. Typically, an algorithm carries out the computations required to turn the input of raw data into an output of decisions. In these circumstances the GDPR subject access right requires the controller to provide meaningful information about the logic involved, as well as the significance and the envisaged consequences of such processing for the data subject. This is a safeguard included in the GDPR to protect data subjects.

What is meaningful information: the key question in this case

The nature and extent of the right of an individual to receive 'meaningful information' about the automated decision-making is not yet clear. What is meaningful information? How much detail and information does a company have to provide? Should it be focused on the individual making the request or can the information be of a more general nature?

How the court addresses these questions will have an obvious impact on Uber, but may potentially inform other companies using algorithms on the extent to which they may be required to explain the logic behind their product and how transparent they will have to be.

Until there is appellate case law and/or guidance from the relevant data protection authorities, companies have no clear steer on the level of transparency required of them. Indeed, transparency may be at odds with the desire to protect trade secrets. How will the court balance these two interests?

Before ruling on this question, the Dutch court needs to determine that the Uber app does indeed involve wholly automated decision-making processes which produce a substantial legal or similar impact. This does seem likely, as Uber's algorithms essentially determine whether an Uber driver gets a ride, and hence their earnings potential.

What's next?

The Dutch court has yet to set a date to rule on this case; no date has been announced publicly.

The extent to which details of the algorithm will have to be provided, and whether that detail will provide helpful evidence to support the various pieces of employment status litigation and/or give rise to any other claims, remains to be seen.

For example, some gig economy platforms do not hold data on protected characteristics, such as the race or gender of their workers, in order to avoid claims for direct discrimination. This also means there is no way to detect any indirectly discriminatory effects. Furthermore, it is unclear whether an algorithm in its entirety can be the basis of a claim, or whether the law will require specific functions to be targeted. Algorithms are complex, opaque and interconnected. Even with access to meaningful information relating to their data, it may not be possible for employees to isolate the process that is causing them a disadvantage in order to establish a claim.

There is no doubt that the use of automated decision-making has changed modern employment relationships, and this will be an interesting area to watch for legal developments in the near future.

Kristi Tucker, Trainee, contributed to the writing of this article.