
Facebook / Cambridge Analytica breach

Five lessons to be learned

24 July 2018

The Information Commissioner's Office (ICO) announced on 10 July that, following a wide-ranging investigation into the alleged misuse of personal data, Facebook is to be fined the maximum possible amount of £500,000. The investigation found that Facebook had contravened the UK Data Protection Act in two respects: (i) it had failed to safeguard its users' information; and (ii) it had failed to be transparent about how that data was harvested by others.

As every news cycle brings further revelations about companies and governments misusing personal data, regulators, politicians and the public are more concerned than ever about what happens to information placed online, who holds it and how it is used. Here are five lessons to be learned in the aftermath of the Cambridge Analytica breach.

Complicity through inaction

Many more details will emerge from the investigations into Facebook and the role it played in the Cambridge Analytica breach, but for now it is telling that a major focus of the ICO has been on Facebook's failure to adequately investigate its position and its denial of the allegations. Chief privacy officer Erin Egan has said the company should have done more to investigate claims about Cambridge Analytica when they were first raised in 2015. In this case the focus has been on the response rather than the event itself. As unforeseeable as such events may be, they are increasingly part of today's reality, so continuous monitoring and the right contingency plans are paramount.

Consider before investigating

In the interest of establishing the facts and responding to law enforcement agencies, governments, stakeholders and the public, it is tempting for businesses to rush into an internal investigation. However, in some jurisdictions the facts uncovered may not be protected by legal professional privilege and may therefore be disclosable to enforcement agencies or future litigants. In addition, there is a risk of enforcement agencies alleging that the 'crime scene has been contaminated' as a result of such an internal investigation. Before acting, it is therefore critical to give careful consideration to the objectives and scope of any investigation.

More attention on political parties' use of personal data

The ICO has issued political parties with notices compelling them to agree to a full audit of their data protection practices. This follows the warning in the regulator's report that it is concerned about the risks involved in many political parties' processing of personal data, particularly: (i) the purchase of marketing lists and lifestyle information from data brokers without sufficient due diligence; (ii) a lack of fair processing; and (iii) the use of third-party data analytics companies with insufficient checks around consent.

This heightened focus from the ICO, together with individuals' greater awareness of their own rights regarding personal data, means that political parties will have to be more mindful of where they get their data from and how they use it.

High-stakes – unprecedented sanctions and remedies for breaches

As the incident took place before the EU's GDPR came into force on 25 May 2018, Facebook only faces the maximum fine under the Data Protection Act 1998 of £500,000. Under the GDPR, however, the ICO would have been able to impose a much larger fine. A scheme of fines based on the anti-trust model has been introduced for serious breaches, amounting to whichever is higher of EUR 20 million or 4% of annual global group turnover. The implications of getting it wrong are therefore higher than ever before.
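
To put the new ceiling in concrete terms, the short Python sketch below shows how the upper limit would be calculated under the GDPR model; the turnover figure used is purely hypothetical and is not a statement of any company's actual revenue.

```python
# Illustrative sketch only: for the most serious breaches the GDPR ceiling is
# the higher of EUR 20 million or 4% of annual global group turnover.
def gdpr_maximum_fine(annual_global_turnover_eur: float) -> float:
    """Return the upper limit of a GDPR fine for a serious breach."""
    return max(20_000_000.0, 0.04 * annual_global_turnover_eur)

# Hypothetical turnover figure, used only to illustrate the calculation.
example_turnover = 35_000_000_000.0  # EUR 35 billion
print(f"Maximum possible fine: EUR {gdpr_maximum_fine(example_turnover):,.0f}")
# -> Maximum possible fine: EUR 1,400,000,000
```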

Self-sovereign identity

The disruptive effect that technology, and in particular blockchain and distributed ledger technology, is having on various sectors prompts consideration of "self-sovereign identity" systems. The theory behind self-sovereign identity is that individuals themselves, and not centralised authorities such as governments and private companies, should control the data elements that form the basis of their digital identities.

The concept is already being put into practice by the Swiss municipality of Zug, which last year launched an identity system called uPort that allows residents access to certain government services. The municipality has announced that it will also use the system for voting this spring. The process allows people and businesses to store their own identity data on their own devices and then efficiently provide that data to those who need to validate it. As this is a more cost-effective way of managing data, "self-sovereign identity" systems could well play a key role in the future of data.
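
To make the mechanism concrete, the minimal Python sketch below (using the open-source cryptography library) shows the basic pattern: the individual keeps the signing key on their own device, signs an identity claim locally, and a service verifies the claim against the individual's public key. The claim format and field names are invented for illustration and do not reflect uPort's actual protocol.

```python
# Minimal illustration of the self-sovereign identity idea: the user holds the
# signing key; a verifier only ever sees the claim and a signature.
# NOTE: the claim structure below is invented for illustration only.
import json
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# 1. The resident generates and keeps a key pair on their own device.
private_key = Ed25519PrivateKey.generate()
public_key = private_key.public_key()

# 2. They sign an identity claim locally; no central authority stores the data.
claim = json.dumps({"name": "A. Resident", "municipality": "Zug"}).encode()
signature = private_key.sign(claim)

# 3. A service that needs to validate the claim checks the signature against
#    the resident's public key (e.g. one anchored on a public ledger).
try:
    public_key.verify(signature, claim)
    print("Claim verified: data controlled and presented by the individual.")
except InvalidSignature:
    print("Claim rejected.")
```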