
ICO Requests Feedback on New Data Protection Profiling Provisions


If we stopped calling it ‘profiling’ and started calling it “creating composite, digital ‘mosaics’ by singling out, linking, and inferring personal attributes”, people might say “Well, it’s about time”

The UK Information Commissioner’s Office (ICO) has published a discussion paper seeking feedback on the profiling provisions under the EU’s General Data Protection Regulation (GDPR). The deadline for comments is short (28 April); the ICO will use the responses it receives to inform its work on this issue as part of the Article 29 Data Protection Working Party (WP). The WP is aiming to publish guidelines on profiling later in the year and, remember, the GDPR will come into force on 25 May 2018 (including in the UK, regardless of the Brexit negotiations underway).

Let’s remind ourselves of the definition of profiling and why it is an issue that causes legal concerns in a big data age. Under Article 4(4) GDPR, profiling is defined as “any form of automated processing of personal data consisting of the use of personal data to evaluate certain personal aspects relating to a natural person, in particular to analyse or predict aspects concerning that natural person’s performance at work, economic situation, health, personal preferences, interests, reliability, behaviour, location or movements”.

Notably, the GDPR does not prohibit profiling activities from being carried out. In other words, EU lawmakers recognise that profiling can be a useful tool for the economy and society generally, and that it can also potentially benefit individuals. However, personalised (solely, or indeed partly) automated targeting of information derived from data can have significant, detrimental effects (as well as legal ramifications) for individuals. Furthermore, the ways in which profiling is carried out (the underlying algorithms used) are often opaque to the people being profiled, as are the purposes for which they are being profiled.

The GDPR introduces new restrictions to protect individuals from this type of processing, including additional obligations on data controllers over and above the legal status quo. Remember, under the Data Protection Directive there is no definition of profiling, albeit the concept of ‘automated individual decisions’ is acknowledged. Article 15 of the Directive grants “the right to every person not to be subject to a decision which produces legal effects concerning him or significantly affects him and which is based solely on automated processing of data intended to evaluate certain personal aspects relating to him, such as his performance at work, creditworthiness, reliability, conduct, etc.”, unless such a decision is taken in the course of entering into or the performance of a contract, or is authorised by law.

Article 22 of the GDPR lays out similar rules: “[t]he data subject shall have the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her” (my emphasis – see below). However, the GDPR does make several important changes over and above inserting a formal definition of the term ‘profiling’. These changes include introducing ‘explicit’ consent as a new legal basis for profiling activities, as well as an obligation to inform data subjects specifically about profiling activities (including about “the significance and the envisaged consequences of such processing for the data subject”). Furthermore, a decision based on profiling must not be based on sensitive personal data unless the data subject has given their (explicit) consent to that processing (except where a law provides that such prohibition cannot be lifted by the individual’s consent), or such automated decisions are necessary for reasons of public interest (Article 9 GDPR).

To understand how these new rules came about and their significance, we must look back to 2012/13, when the importance of online behavioural advertising – and, more generally, the increased use of big data analytics across sectors – was formally recognised by EU lawmakers. Article 20 of the draft Regulation proposed by the European Commission in January 2012 first sought to limit the extent to which data subjects may be subjected to ‘profiling’ measures and to prohibit profiling except in certain circumstances. Following this, in May 2013, the WP published an advice paper on essential elements of a definition and a provision on profiling in the Commission’s draft text. Interestingly, the WP stated in that paper its belief that more must be done to explain and mitigate the various risks that profiling can pose. In particular, it endorsed the introduction of a new definition of profiling for inclusion in the GDPR, but also emphasised that new profiling provisions should focus on the collection of data for the creation and use of profiles, rather than merely on the outcome of profiling. The WP also suggested a need for greater transparency to make data subjects aware that their personal data is used for the purposes of profiling, and supported an approach whereby profiling would first have to be authorised by the data subject’s explicit consent.

Returning, then, to the ICO (which has, incidentally, previously summarised the law on profiling and automated processing in its overview guidance on the GDPR here). In its discussion paper, the ICO (interestingly) confirms the WP’s position that “[t]he GDPR applies to profile creation as well as to automatic decision-making using profiling”. The ICO specifically states that the GDPR’s definition “appears to include the analysis of personal aspects as well as processing that has a predictive element”. [Although, to note, the second question in the first bullet in bold below – on which feedback is sought – is exactly on this point – is there hesitancy here?! Should there be?] Regarding the predictive element, the ICO also recognises that profiling can create new data that needs to be GDPR-compliant in its own right (a point many organisations overlook, and yet the possibility of creating – in particular, sensitive – personal data through profiling should raise a big ‘red flag’ with anyone planning any profiling of data relating to persons).
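To make that last point concrete, here is a minimal, hypothetical sketch (in Python, with an invented data model, keyword list and inference rule) of how a profiling step can generate brand-new personal data: the inferred attribute does not exist in the input, yet it attaches to an identifiable person and, in this example, is arguably special category (health-related) data needing its own lawful basis.

```python
# Hypothetical illustration only: the data model, keyword list and inference
# rule are invented to show how profiling output becomes 'new' personal data.
from dataclasses import dataclass
from typing import List, Dict

@dataclass
class CustomerRecord:
    customer_id: str
    recent_purchases: List[str]

HEALTH_KEYWORDS = {"glucose monitor", "insulin pen", "blood pressure cuff"}

def infer_health_interest(record: CustomerRecord) -> Dict[str, object]:
    """Toy profiling step: infer a health-related flag from purchase history."""
    inferred = any(item in HEALTH_KEYWORDS for item in record.recent_purchases)
    # The dictionary returned below is newly created personal data about the
    # customer - and, because it touches on health, arguably special category data.
    return {"customer_id": record.customer_id, "possible_health_condition": inferred}

print(infer_health_interest(CustomerRecord("cust-001", ["glucose monitor", "running shoes"])))
```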

In seeking input from stakeholders, the ICO highlights other key areas where further consideration is required – such as direct marketing, the right to object, and data minimisation – and then sets out some key questions that it considers remain untested. These are (emphases added for particular points of interest):

  • When, how and why does your organisation carry out profiling? Do you agree that there has to be a predictive element, or some degree of inference for the processing to be considered profiling?
  • How will you ensure that the profiling you carry out is fair, not discriminatory, and does not have an unjustified impact on individuals’ rights?
  • How will you ensure that the information you use for profiling is relevant, accurate and kept for no longer than necessary? What controls and safeguards do you consider you will need to introduce, internally and externally, to satisfy these particular requirements?
  • Have you considered what your legal basis would be for carrying out profiling on personal data? How would you demonstrate, for example, that profiling is necessary to achieve a particular business objective?
  • How do you mitigate the risk of identifying special (sensitive) categories of personal data included in your profiling activities? How will you ensure that any ‘new’ special category data is processed lawfully in line with the GDPR requirements?
  • How do you propose handling the requirement to provide relevant and timely fair processing information, including “meaningful” information on the logic involved in profiling and automated decision-making? What, if any, challenges do you foresee?
  • If someone objects to profiling, what factors do you consider would constitute “compelling legitimate grounds” for the profiling to override the “interests, rights and freedoms” of the individual?
  • Do you consider that “solely” in Article 22(1) excludes any human involvement whatsoever, or only actions by a human that influence or affect the outcome? What mechanisms do you have for human involvement and at what stage of the process?
  • What is your understanding of a “legal” or “significant” effect? What measures can you put in place to help assess the level of impact? [This is key, as the GDPR does not explain what constitutes a “legal effect” or what “significantly affects” the individual, giving only limited examples: automatic refusal of an online credit application, or e-recruiting practices. Therefore, the interpretation of these concepts is likely to vary between EU Member States, depending on the national courts or data protection authority reviewing the controller’s profiling activities. The ICO notes that a “legal” effect might be something that “adversely impacts an individual’s legal rights, or affects their legal status” and that a “significant” effect “suggests some consequence that is more than trivial and potentially has an unfavourable outcome”. However, it also comments, “[i]t may be useful to establish an external recognised standard to measure such effects, instead of simply relying upon the subjective view of the controller or the data subject”. Who might pick up this mantle of responsibility is yet to be seen… but it is an interesting proposition for all those – like me – who are interested in issues around the conceptualisation and quantification of data protection harms.]
  • What mechanisms or measures do you think would meet the GDPR requirements to test the effectiveness and fairness of the systems you use in automated decision-making or profiling?
  • Do you foresee any difficulties in implementing the GDPR requirement to carry out a DPIA, when profiling?
  • Will your organisation be affected by the GDPR provisions on profiling involving children’s personal data? If so, how?

For background, this discussion paper should be viewed in the context of other work the ICO is doing on big data analytics and data protection, with a number of related projects currently underway. For example, the ICO recently published an updated version of its 2014 paper on big data, artificial intelligence, machine learning and data protection. In that paper, the ICO outlines the data protection implications of big data processing and offers practical advice on compliance tools to help organisations minimise privacy impact. It also presents six recommendations to help organisations achieve compliance: anonymisation, data protection impact assessments (DPIAs), appropriate privacy notices, privacy by design, the development of ethical principles, and auditable machine learning algorithms.

That paper also contains specific guidance on the practicalities of conducting DPIAs, a factor gaining frontline prominence for organisations dealing with big data, given that the GDPR (Article 35) will require such assessments to be carried out whenever processing involves – inter alia – “a systematic and extensive evaluation of personal aspects relating to natural persons which is based on automated processing, including profiling, and on which decisions are based that produce legal effects concerning the natural person or similarly significantly affect the natural person”. Guidelines on conducting DPIAs have also been promised shortly by the WP (watch this space for developments).

To note, finally: even if organisations have assessed that their profiling activities are lawful, they must still ensure that they have implemented appropriate measures to guarantee that individuals can exercise their rights (in particular, the right not to be subject to a decision based solely on automated processing). This may be done, for example, by applying data minimisation and pseudonymisation techniques aimed at minimising the risk to individuals’ privacy, and by carrying out DPIAs before conducting profiling activities, particularly if there is a risk of discrimination, identity theft or fraud, financial loss, damage to reputation, or other adverse effects for individuals.
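By way of illustration, below is a minimal sketch (in Python, using only the standard library) of one common pseudonymisation approach – replacing a direct identifier with a keyed hash – combined with simple data minimisation. The field names, key handling and coarsening rules are assumptions made for the example, not a prescribed method; and note that pseudonymised data remains personal data under the GDPR so long as the controller can re-link it.

```python
# Illustrative sketch only: field names, key management and the coarsening
# rules are assumptions, not a recommended pseudonymisation scheme.
import hmac
import hashlib

SECRET_KEY = b"store-this-key-separately-from-the-dataset"  # hypothetical key

def pseudonymise(record: dict) -> dict:
    """Replace the direct identifier with a keyed hash; keep only fields needed for the analysis."""
    pseudonym = hmac.new(SECRET_KEY, record["email"].encode(), hashlib.sha256).hexdigest()
    return {
        "pseudonym": pseudonym,                          # stable token, re-linkable only with the key
        "age_band": record["age_band"],                  # retained because the analysis needs it
        "postcode_area": record["postcode"].split()[0],  # coarsened location data (data minimisation)
    }

print(pseudonymise({"email": "jane@example.com", "name": "Jane Doe",
                    "age_band": "35-44", "postcode": "SO17 1BJ"}))
```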

Which leads me to ask one question: when will data protection authorities fully recognise that discussions about anonymisation and pseudonymisation – as a matter of jurisdictional determination (is the data personal?) – share the same legal analytic ‘space’ as the separate discussions being carried out about data protection law compliance and DPIAs? After all, the GDPR does state (Recital 28) that “[t]he application of pseudonymisation to personal data can reduce the risks to the data subjects concerned and [thereby] help controllers and processors to meet their data-protection obligations”.

Alison Knight
