Big data may not be a ‘game-changer’ under data protection law, but it does ‘up the stakes’ for data-controller compliance obligations
On 6 May, the European Commission published a Communication on a Digital Single Market Strategy for Europe. This sets out various measures to be taken so that “individuals and businesses can seamlessly access and exercise online activities under conditions of fair competition, and a high level of consumer and personal data protection, irrespective of their nationality or place of residence”. One theme considered by the Commission to be central to achieving this vision is maximising the growth potential of the ‘big data’ sector in Europe – with the Commission estimating that it is growing by 40% per year, seven times faster than the IT market.
The technological capabilities of big data analytics are a major ‘hot topic’ under discussion in legal and policy circles internationally. The private and public sectors are making increasing use of its benefits across a range of industry applications, some of which use personal information. It is not surprising, then, that the UK’s Information Commissioner’s Office (the ICO) is interested specifically in ensuring that the emerging implications of big data for data protection and privacy are considered.
To this end, last July the ICO published a discussion paper on big data and data protection. This paper invited responses to its view that data protection principles remain relevant in a big data world. The ICO argues in the paper that compliance with the Data Protection Act 1998 (the DPA) – especially with key data protection concepts, such as using data fairly and only for specified and limited purposes – is compatible with reaping the benefits of big data. In particular, it invited responses to three key questions:
- Does the paper adequately reflect the data protection issues arising from big data or are there other relevant issues? If so, what are they?
- Should the ICO produce further guidance to help organisations doing big data analytics to meet data protection guidelines? If so, what should it cover?
- Are other practical measures and tools needed to protect data privacy in the context of big data analytics? If so, what are they?
The paper also includes practical suggestions on how organisations using big data can safeguard data protection rights, such as obtaining data subject consent and keeping personal data secure, while also evidencing that they are acting responsibly as controllers of such data. These suggestions include adopting an early-stage transparency approach, carrying out appropriate due diligence and using certain ‘privacy by design’ compliance tools (such as applying anonymisation techniques to personal data).
Moving forward to May 2015, the ICO has just published a summary of the feedback it received on its discussion paper. As an overview document, this summary picks out what the ICO considers to be the key themes addressed and responds to the main feedback under each topic. Most importantly, it notes an apparent consensus that the general views put forward in the paper are on the ‘right track’, although it recognises that the emphasis could shift in light of some respondents’ suggestions. In general terms, these include consideration of the argument that big data requires a regulatory focus on data processing at the point of use (or misuse), rather than solely on how data are collected, and recognition that data subjects may be affected in different ways by the application of big data analytics to their personal data.
One practical theme that emerges strongly in the summary is the importance of privacy ‘engineering’: organisations should have an adequate framework in place for predicting and mitigating privacy risks before they start a big data project. This framework potentially encompasses a range of organisational and technical ‘tools’ that should be addressed from the outset. In particular, both respondents and the ICO acknowledge the key role that privacy impact assessments (PIAs) play in meeting data-controller compliance obligations and in incentivising the use of less privacy-intrusive measures. The ICO says that it will continue to encourage businesses to follow the PIA code of practice it published last year, which sets out practical advice on how to carry out PIAs. It will also look to work with key stakeholders in selected industry sectors to develop more specific PIA guidance and case studies on big data. Finally, it plans to hold a seminar on privacy and big data later in 2015, as well as to issue a revised version of its 2014 discussion paper during the summer in light of the feedback received.
In conclusion, while the ICO confirms that the same ‘rules of the game’ apply to those engaging in big data analytics involving personal data, it appreciates the difficulties that controllers can face in a big data context (such as seeking consent from data subjects, providing adequate privacy notices, and ensuring adequate personal data security). Yet the regulator is at pains to show that the emphasis in this context should fall on organisational accountability for measures that help to protect data privacy, rather than on individual responsibility. To this end, it recognises that more work is needed, and it encourages industry to develop standards and sector-specific guidance on its own initiative.
While a spirit of regulatory partnership with the private sector is underlined here, we are sure to see enforcement decisions issued by the ICO in this area in the future. It seems timely, then, that the ICO has just announced the publication of updated guidance on monetary penalties (under sections 55A–55E of the DPA), the underlying objective of which is to promote compliance with data protection law as an integral part of any business activity involving personal data. The ICO may impose a monetary penalty notice if a data controller has ‘seriously’ contravened the DPA and if the contravention was of a kind likely to cause ‘substantial’ damage or distress. In addition, either the contravention must have been deliberate, or the data controller must have known – or ought to have known – that there was a risk that a contravention would occur and failed to take reasonable steps to prevent it. Organisations relying on big data analytics beware – ignorance is no excuse!
Alison Knight