Ethics in a post-digital age
What does it mean to be human in the post-digital age?
15 February 2018
We have moved from a purely analogue life to a digital one. This merits a re-evaluation of our understanding of the fundamental rights and values of individuals. The incoming EU General Data Protection Regulation (GDPR) brings into effect the biggest change in data protection law in a generation. On 25 January 2018, the Ethics Advisory Group of the European Data Protection Supervisor (EDPS) published a report on digital ethics. This piece considers some of the issues and recommendations that the Report raises.
Creating a big data protection ecosystem
The use of emerging technologies, data mining and algorithms will directly impact the lives of individuals. Data processing is often so subtle that it goes unnoticed. For example:
- negative (automated) credit scores can impact an individual's ability to live
- targeted advertising can lead someone to make a decision that is not in their best interests
- 'nudging' individuals to buy a product or service as a result of behavioural analysis / profiling can be intrusive, albeit within the confines of democratic norms.
The Report emphasises the need for "a big data protection ecosystem".
The technical sophistication of emerging data processing systems, such as machine learning and deep learning algorithms, combined with the complexity of data protection rules, is taking control away from regulators.
The Report suggests that this data ecosystem would build in future-oriented regulation and make the persons responsible for the data accountable. Accountability is one of the key motivators for the GDPR – making businesses accountable for the personal data which they process. However, there is clearly a disconnect with the way that the digital world works in practice. Does accountability extend to digital ethics?
Even the right to data protection appears to be insufficient to understand and address all the ethical challenges brought about by digital technologies. While the GDPR, together with the right to data protection, remains valuable for protecting traditional data processing, the Report suggests that the GDPR and this right are not equipped to address the unprecedented challenges raised by this digital era.
According to the Report, the "ethical moment" is when autonomy is engaged in the name of something that can neither be calculated nor computed. The issue of ethical autonomy is not a new one – in law or ethics. It is however fairly novel in a digital context and the EDPS' concept of an "ethical moment" provides a useful starting point for teams to consider when designing new products.
Keep ethical foresight
Stakeholders should be encouraged to identify ethical issues at the development stage, to reflect in their designs and business planning on the impact that the new technology will have on society, and to generate their own guidelines for addressing those issues. This requires ethical foresight and a responsive element, providing solutions to unprecedented challenges. That foresight will be as key to commercial success in the digital economy as technological foresight is.
This means that an "ethical checklist" setting out measures to look out for will not be appropriate. The Report discourages an approach that equates data protection with the application of dos and don'ts. Proactive reflection on human values, including the right to data protection, will instead be key – particularly as tech innovation will often challenge these fundamental concepts.
The Report identifies several shifts defining the new landscape for digital ethics. One of these is how the identity of an individual is now moving towards their identity as a digital subject, established through digital constructs and patterns. However, data protection is not only about the protection of data, but primarily the protection of the persons behind the data. Data protection should always be a profoundly human matter and not a technical or legalistic one.
The new ecosystem will directly challenge traditional European values of dignity, autonomy, freedom, solidarity, equality, democracy and trust. The integration of these foundational values is necessary to maintain an ethically sustainable development of digital technologies.
The concept of responsible innovation involves finding ways to overcome ethical deadlocks and apparently insurmountable value-dilemmas.
This can be done by increasing transparency while observing confidentiality, strengthening accountability without breaches of security, or explaining the application of algorithms without reducing the functionality of IT systems. This is an obvious challenge for new forms of digital technology (e.g. complex artificial intelligence). But building these "privacy-centric" features into new tech sits neatly alongside the privacy by design requirements under the GDPR. Responsible innovation should therefore build these ethical considerations into product design at the outset – particularly as individuals become increasingly protective of their digital identity.
Re-think, re-articulate and re-purpose. This is the direction in which the Report concludes. The traditional concepts of value are shifting. The tension between assuring the continuity of legitimate practices and preparing for an unseen future can be eased by responsible innovation and by prioritising digital ethics now.
The EDPS makes some recommendations to help condense this task into five significant "directions" of thought and innovation:
- making individuals' dignity inviolable
- ensuring that personhood and personal data are inseparable from one another – a person endowed with moral qualities, rights and responsibilities is inseparable from the information produced by and pertaining to that person
- maintaining awareness that digital technologies weaken the foundation of democratic governance – e.g. automated decision-making may be incompatible with democratic processes and freedom of choice
- being alive to digitised data processing potentially creating new forms of discrimination (for example, via profiling)
- guarding against data commoditisation risks – e.g. value shifting where personal data is given an economic value, undifferentiated from other products on the market.
The use of personal data should serve humankind, not the other way round. At least, this is what the new EU data rules say. The new laws do not intend to stall business growth and innovation. However, the innovators that succeed will be those who build these ethical considerations into their design, attracting trust in their strategy and products.
Written by Rachel Pereira, Trainee, London