big data / Data protection / data protection agencies / General Data Protection Regulation / Privacy impact assessment / Risk-based approach / sensitive data

New EU Guidelines on Data Protection Impact Assessments


Assessing the likelihood of a ‘deep impact’ – but how ‘deep’ is ‘deep enough’ and by whose standards? In other words, how exactly do you develop a methodology for determining whether processing is “likely to result in a high risk” to data subjects under the GDPR?

Draft guidelines on conducting data protection impact assessments (DPIAs) have been published by the Article 29 Working Party (WP). The guidelines seek to clarify how DPIAs will function and when they are necessary under the new GDPR. Let’s remind ourselves of the meaning of DPIAs and why this topic is gaining frontline prominence among regulators. More importantly, why do organisations dealing with big data need to sit up and pay attention in the light of new obligations upon controllers that will apply from 25 May next year?

The GDPR does not formally define the concept of a DPIA as such. However, Article 35 states:

“Where a type of processing in particular using new technologies, and taking into account the nature, scope, context and purposes of the processing, is likely to result in a high risk to the rights and freedoms of natural persons, the controller shall, prior to the processing, carry out an assessment of the impact of the envisaged processing operations on the protection of personal data” (emphasis added)

The DPIA’s role is also clarified by Recital 84 GDPR as follows:

“In order to enhance compliance with this Regulation where processing operations are likely to result in a high risk to the rights and freedoms of natural persons, the controller should be responsible for the carrying-out of a data protection impact assessment to evaluate, in particular, the origin, nature, particularity and severity of that risk” (emphasis added)

So what exactly is a ‘high risk’ situation? The GDPR side-steps this issue by setting out a non-exhaustive list of categories of processing that will fall within this provision (Article 35(3)):

“(a) a systematic and extensive evaluation of personal aspects relating to natural persons which is based on automated processing, including profiling, and on which decisions are based that produce legal effects concerning the natural person or similarly significantly affect the natural person;

(b) processing on a large scale of special categories of data referred to in Article 9(1), or of personal data relating to criminal convictions and offences referred to in Article 10; or

(c) a systematic monitoring of a publicly accessible area on a large scale.”

Outside these examples, it is clear that an assessment is required – in advance of intended processing activities, on the facts of each case – of whether such activities are likely to present high risks. Does that translate into a requirement to carry out a DPIA before you do a DPIA? The WP confirms its view that this should be the case (p.7):

“In cases where it is not clear whether a DPIA is required, the WP29 recommends that a DPIA is carried out nonetheless as a DPIA is a useful tool to help data controllers comply with data protection law”

To note, during the GDPR negotiation process, the European Parliament proposed to go a step further and supplement the DPIA requirement with a preceding obligation to carry out a “risk analysis of the potential impact of the intended data processing on the rights and freedoms of data subjects, assessing whether its processing operations are likely to present specific risks”. It then listed different processing operations likely to present ‘specific risks’ (EP Legislative Resolution, Amendment 127, Article 32a). However, this proposal was ultimately rejected. If the WP now recommends that – if controllers are in doubt over whether a DPIA is required – they should err on the side of caution and carry out a DPIA, isn’t this a ‘specific risk’ analysis recommendation by any other name?

So, how should DPIAs be carried out, bearing in mind that the GDPR (Article 35(8)) envisages that “[c]ompliance with approved codes of conduct … by the relevant controllers or processors shall be taken into due account in assessing the impact of the processing operations”? [We can only assume that these guidelines are a key code of conduct in this context]. The guidelines offer a general rule of thumb that processing operations meeting at least two of a list of criteria will require a DPIA. However, a processing operation meeting only one criterion may require a DPIA depending on the circumstances. The criteria are as follows:

  • Are you doing evaluation or scoring (including profiling and predicting) of aspects specific to the data subject?
  • Does the processing involve automated decision-making that produces a significant effect on the data subject?
  • Are you performing systematic monitoring of data subjects, including in a publicly accessible area?
  • Does the processing involve sensitive data (special categories of data as defined in Article 9 GDPR and data regarding criminal offences)?
  • Is the data being processed on a large scale?
  • Have datasets been matched or combined (“for example originating from two or more data processing operations performed for different purposes and/or by different data controllers in a way that would exceed the reasonable expectations of the data subject”)?
  • Does the data concern vulnerable data subjects (as laid out in Recital 75 GDPR)?
  • Is this an innovative use, or does it apply new technological or organisational solutions (for example, combining use of fingerprint and facial recognition)?
  • Are you transferring data outside the European Union?
  • Will the processing itself prevent data subjects from exercising a right or using a service or a contract?
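The WP’s rule of thumb above can be expressed as a simple screening heuristic. The sketch below is purely illustrative (and certainly not legal advice): the criterion names are my own paraphrases of the guidelines’ list, and the two-criteria threshold is a rule of thumb rather than a bright-line legal test – a single criterion, or none, still calls for case-by-case judgement.

```python
# Illustrative sketch of the WP29 rule of thumb: a processing operation
# meeting two or more of the listed criteria will generally require a DPIA.
# Criterion names are paraphrased; this is not a legal determination.

WP29_CRITERIA = {
    "evaluation_or_scoring",
    "automated_decision_with_significant_effect",
    "systematic_monitoring",
    "sensitive_data",
    "large_scale",
    "matched_or_combined_datasets",
    "vulnerable_data_subjects",
    "innovative_use_of_technology",
    "transfer_outside_eu",
    "blocks_right_service_or_contract",
}

def dpia_screening(met_criteria):
    """Return a rough screening outcome for the set of criteria met."""
    hits = met_criteria & WP29_CRITERIA
    if len(hits) >= 2:
        return "DPIA likely required"
    if len(hits) == 1:
        return "DPIA may be required - assess the circumstances"
    return "DPIA unlikely to be required - document the assessment"

print(dpia_screening({"large_scale", "sensitive_data"}))
# -> DPIA likely required
```

Note that this captures only the screening step; as the WP itself recommends, where the outcome is unclear the safer course is to carry out the DPIA anyway.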

While this list seems sensible in specifying the conditions for processing operations likely to present high risks, some criticisms can be made. Take, for example, the reference to “large scale” processing as a key indicator of high risk, which may be considered a bit of a ‘red herring’. The WP here (from 2012) previously suggested that the words “on a large scale” should be deleted from an earlier draft of Article 35, as risky processing activities requiring a DPIA can apply to small-scale processing operations (e.g. biometrics). Yet, in footnote 14 of the DPIA guidelines, the WP states, “[n]onetheless, if sensitive data are not processed systematically and on a large scale, their processing does not automatically present high risks for the rights and freedoms of data subjects”. [Surely the WP’s 2012 approach is the right one – processing scale does not equate to potential data impact upon individual data subjects! Or is this implied in the use of the word “automatically”: i.e. it requires a case-by-case assessment. Why does the WP not say this more plainly?]

A similar criticism can be made of the WP’s use of the phrase “significant effect” above without further explanation as to how this concept should be interpreted. As mentioned in my recent post on the GDPR’s profiling provisions, the ICO quite rightly pointed out the value in establishing “an external recognised standard to measure such effects, instead of simply relying upon the subjective view of the controller or the data subject”. The logic of this recommendation surely also applies to ensuring coherent interpretation of the related concept of ‘high risks’ under the GDPR, especially as the WP does not make clear what exactly the risk should relate to, although it does comment as follows (p.15): “A ‘risk’ is a scenario describing an event and its consequences, estimated in terms of severity and likelihood.”

[Indeed, should assessments of risks and likely negative effects that may be suffered by the data subject as a result of the processing be equated, as the WP later states (p.18):

“An example of an unacceptable high residual risk includes where the data subjects may encounter significant, or even irreversible, consequences, which they may not overcome, and/or when it seems obvious that the risk will occur.”

Furthermore, the WP (p.8) – in listing “automated-decision making with legal or similar significant effect” as an example of high-risk processing – gives the example of processing which “may lead to the exclusion or discrimination against individuals”, whereas “processing with little or no effect on individuals does not match this specific criterion”.]

Although it side-steps this issue, the WP does specify a minimum of four features that a DPIA must include, which will help assess the level of impact likely to result from a processing activity:

  • A systematic description of the envisaged processing operations, the purposes of the processing and, if applicable, the legitimate interests pursued by the controller;
  • An assessment of the necessity and proportionality of the processing operations in relation to such purposes;
  • An assessment of the risks to the rights and freedoms of data subjects; and,
  • The measures envisaged to address such risks (compare, Recital 84 GDPR: “The outcome of the assessment should be taken into account when determining the appropriate measures to be taken in order to demonstrate that the processing of personal data complies with this Regulation”).

Moreover, Annex 1 to the guidelines contains a list of links to examples of existing DPIA frameworks (both generic and sector-specific) and to international standards containing DPIA methodologies for reference. Annex 2, by comparison, sets out the criteria for an acceptable DPIA by reference to the relevant GDPR provisions. In this context, the WP is happy to leave it to controllers to choose the exact form and structure of each DPIA to fit each controller’s existing working practices, as long as they align their analyses with established impact assessment methodologies. [To note, the WP equates DPIAs with the more long-standing concept of privacy impact assessments (PIAs). Yet the equivalence of the two in terms of the underlying interests to be considered is not altogether clear-cut – a subject left open for a future post].

Either way, it is clear that – under the GDPR – controllers will be required to demonstrate accountability through a genuine assessment of risk, not only to facilitate compliance with the GDPR, but also to “demonstrate that appropriate measures have been taken to ensure compliance”, as controllers must document the (above) four features of their risk assessments (Article 35(7) GDPR).

Furthermore, the WP exhorts data controllers to carry out further DPIAs when there is a change in the risk represented by processing operations (risk being a context-driven assessment), such as a change of purpose. This makes the impact assessment more of a continuing obligation than a one-off exercise [although, to note, a single DPIA may address either a single data processing operation or multiple processing operations if they are similar in terms of risk, nature, scope, context, and purpose].

Finally, to note, while the WP advises that data controllers should consult the data subjects or their representatives on the intended processing as part of the risk assessment process (see Article 35(9) GDPR), it offers no guidance for determining when such consultation may be deemed appropriate. The WP only states (p.13) that a controller should document “its justification for not seeking the views of data subjects, if it decides that this is not appropriate”, and (if the data controller’s final decision differs from the views of the data subjects) “its reasons for going ahead or not”.

In conclusion, I can’t help feeling that we are still some way from understanding the ‘science’ behind the quantification of risk levels that sits behind the GDPR (e.g. when a risk to a data subject might jump from ‘low’ to ‘high’ risk, or vice versa), and there is more work to be done in this area to ensure harmonisation by delving into the legal concepts that sit on top (e.g. ‘risks’, ‘effects’, ‘harm’) and their meta-concepts (a typology of data protection-centric risks and harms) in turn. This is notwithstanding the fact that the WP encourages the development of sector-specific DPIA frameworks to better address particular types of processing operations; and Article 35(4)-(5) GDPR requires national data protection agencies to produce a blacklist and a whitelist of processing activities that should or should not be subject to an impact assessment (to be submitted to the European Data Protection Board, which will supersede the WP under the GDPR).

After all, the stakes are very high, with fines possible – including in respect of a failure to carry out a DPIA when one should have been carried out – of up to 2% of worldwide annual turnover. Moreover, prior consultation with a national data protection authority is also required if a processing operation is deemed to reveal what the WP calls “high residual risks” (see Article 36(1) GDPR: “The controller shall consult the supervisory authority prior to processing where a data protection impact assessment under Article 35 indicates that the processing would result in a high risk in the absence of measures taken by the controller to mitigate the risk”, discussed at p.18 of the guidelines).

Has the WP done enough to fill in the gaps as to what Article 35 GDPR requires in practical terms in determining levels of risk that rely upon contextual assessments, particularly as this analysis is also relevant to construing many of the GDPR’s other key risk-based provisions? Arguably, the GDPR is reaching out for a context-driven answer – one that requires detailed analysis of the different types of risks and harms that can arise through data processing – and, in particular, standards of measurement against which to assess their “origin, nature, particularity and severity”. Yet the WP appears to be holding back somewhat, relying upon pre-existing guidelines and formalistic concepts rather than grasping with both hands the challenge of setting out a new approach.

The guidelines are open to public consultation until 23 May 2017, after which a final version will be adopted. Comments should be sent to JUST-ARTICLE29WP-SEC@ec.europa.eu and presidenceg29@cnil.fr.

Alison Knight
