Use Privacy Impact Assessments to measure the impact of data processing operations
Crash Test Dummy – why every prudent data processor should use Privacy Impact Assessments (PIAs)
We all feel more secure when we get into our cars knowing they are kitted out with multiple safety features developed through testing and predicting risks of force, impact and injury should things go wrong.
The same general principle applies when data is processed – data controllers will feel more comfortable engaging third-party data processors when they know that those processors use Privacy Impact Assessments to assess and address the risks that a processing operation poses to the rights and freedoms of individuals.
What is a Privacy Impact Assessment?
Simply put, a PIA is a tool to identify and reduce the privacy risks of any project, process or process change within your organisation, and it is a key part of the GDPR concept of “privacy by design”. By incorporating PIA into the fabric of your organisation, you can:
- More readily identify potential problems;
- Ensure that the organisation as a whole is more aware of privacy and data protection issues; and
- Increase the likelihood of complying with the GDPR (and of being able to demonstrate that compliance).
Who should conduct a PIA under GDPR?
Under Article 35 of the GDPR, data controllers are required to undertake PIAs.
When to carry out a PIA?
Article 35 states that a PIA should be undertaken prior to data processing where such processing is likely to result in a high risk to the rights and freedoms of individuals. Article 35 then sets out a non-exhaustive list of when a PIA must be completed.
As the GDPR does not define what “high risk” might mean, and the list is non-exhaustive, the PIA has been one of the first areas of focus for the Article 29 Working Party, which is issuing guidance on a number of key elements of the GDPR.
Article 29 Working Party: guidance to help interpret when to conduct a PIA
After issuing draft guidance in April 2017, the Article 29 Working Party issued its final guidance on PIA on 4 October 2017. The guidance provides a set of evaluation criteria to consider when assessing whether to carry out a PIA. The guidance suggests that where two (2) or more of the following criteria are present, an organisation should consider a PIA. The criteria are (taken directly from the guidance):
- Evaluation or scoring, including profiling and predicting, especially from “aspects concerning the data subject’s performance at work, economic situation, health, personal preferences or interests, reliability or behavior, location or movements” (recitals 71 and 91);
- Automated-decision making with legal or similar significant effect: processing that aims at taking decisions on data subjects producing “legal effects concerning the natural person” or which “similarly significantly affects the natural person” (Article 35(3)(a));
- Systematic monitoring: processing used to observe, monitor or control data subjects, including data collected through networks or “a systematic monitoring of a publicly accessible area” (Article 35(3)(c));
- Sensitive data or data of a highly personal nature: this includes special categories of personal data as defined in Article 9 (for example information about individuals’ political opinions), as well as personal data relating to criminal convictions or offences as defined in Article 10;
- Data processed on a large scale: the GDPR does not define what constitutes large-scale, though recital 91 provides some guidance. In any event, the WP29 recommends that the following factors, in particular, be considered when determining whether the processing is carried out on a large scale:
  a. the number of data subjects concerned, either as a specific number or as a proportion of the relevant population;
  b. the volume of data and/or the range of different data items being processed;
  c. the duration, or permanence, of the data processing activity;
  d. the geographical extent of the processing activity;
- Matching or combining datasets, for example originating from two or more data processing operations performed for different purposes and/or by different data controllers in a way that would exceed the reasonable expectations of the data subject;
- Data concerning vulnerable data subjects (recital 75);
- Innovative use or applying new technological or organisational solutions, like combining use of finger print and face recognition for improved physical access control, etc. The GDPR makes it clear (Article 35(1) and recitals 89 and 91) that the use of a new technology, defined in “accordance with the achieved state of technological knowledge” (recital 91), can trigger the need to carry out a PIA;
- When the processing in itself “prevents data subjects from exercising a right or using a service or a contract” (Article 22 and recital 91). This includes processing operations that aim at allowing, modifying or refusing data subjects’ access to a service or entry into a contract.
The guidance goes on to provide that any PIA should be “continuously reviewed and regularly re-assessed” as a matter of good practice, meaning that the PIA will become part of the accountability road map. In good news, the guidance softens the previous recommendation to conduct a PIA on all processing activities in place prior to May 2018, and instead suggests that organisations review which of those processing activities require a PIA and complete those by May 2018.
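The WP29 threshold described above – consider a PIA where two (2) or more of the nine criteria are present – can be sketched as a simple check. The criterion labels below are illustrative shorthand for the factors listed in the guidance, not official identifiers:

```python
# Illustrative sketch of the WP29 "two or more criteria" rule.
# The labels paraphrase the nine evaluation criteria; they are not
# an official vocabulary from the guidance.
WP29_CRITERIA = {
    "evaluation_or_scoring",
    "automated_decision_with_legal_effect",
    "systematic_monitoring",
    "sensitive_or_highly_personal_data",
    "large_scale_processing",
    "matching_or_combining_datasets",
    "vulnerable_data_subjects",
    "innovative_technology",
    "prevents_exercise_of_rights",
}

def pia_recommended(present_criteria: set) -> bool:
    """Return True when two or more WP29 criteria apply to the processing."""
    unknown = present_criteria - WP29_CRITERIA
    if unknown:
        raise ValueError(f"Unrecognised criteria: {sorted(unknown)}")
    return len(present_criteria) >= 2

# Example: background screening that handles criminal-record data at
# scale meets two criteria, so a PIA would be recommended.
print(pia_recommended({"sensitive_or_highly_personal_data",
                       "large_scale_processing"}))
```

This is a screening aid only – a processing operation meeting a single criterion may still be high risk on its own, and the guidance treats the list as non-exhaustive.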
How does this relate to the role of the data processor in the context of pre-employment background screening?
So we now know by whom and when to carry out a PIA. This means that a prudent data controller, when engaging a data processor, would be well advised to carry out a PIA on that data processor if two (2) or more of the evaluation criteria are present. In order to do that, it is likely that an audit of some form would be required. This could be a physical audit, or an audit in the form of a questionnaire.
This sounds straightforward, i.e. completing a questionnaire or preparing for an audit. However, in order to enable the data controller to complete its PIA, the data processor needs to have asked itself the same questions and gone through the same process.
In the context of background screening, it is highly likely that two (2) or more of the evaluation criteria are present by virtue of the services themselves. The most likely criteria are:
- processing of sensitive data (criminal, credit, adverse media, drugs screening);
- large scale processing (many organisations have their service providers handle large volumes of candidates);
- combining data sets (screenings are made up of a variety of diverse products); and
- innovative use of technical or organisational solutions (most screening providers utilise technology to more efficiently deliver services to an increasingly global market).
Therefore, a data processor should carry out its own PIA and put in place its own PIA policy.
What should a PIA process or policy look like?
In order for a PIA to work, a systematic approach should be taken that can be applied throughout the organisation in any department. Ideas to consider (as suggested by the Information Commissioner’s Office in the UK):
- set up a PIA core team from different departments that can be brought together to work on the PIA;
- prepare a question set for teams to complete in order to work out if a PIA is needed;
- prepare a template for teams to complete:
  - reasons for the PIA;
  - information flows and who needs to be consulted, including who has sign-off;
  - identify the risks;
  - identify the solutions;
  - record the PIA outcome and sign-off;
  - integrate the PIA into the project plan;
  - record follow-up sessions or the likely timing of a refresh PIA; and
  - circulate the outcome to all stakeholders.
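A template along these lines could be captured as a simple record structure so that every team completes the same fields. This is a hypothetical sketch of the checklist above, not an official ICO schema – all field names are illustrative:

```python
from dataclasses import dataclass, field

# Hypothetical PIA record mirroring the template fields listed above.
# Field names are illustrative, not an official ICO format.
@dataclass
class PIARecord:
    reasons: str                 # why the PIA is being carried out
    information_flows: str       # how personal data moves through the process
    consultees: list             # who needs to be consulted
    sign_off_owner: str          # who has final sign-off
    risks: list = field(default_factory=list)
    solutions: list = field(default_factory=list)
    outcome: str = ""            # recorded PIA outcome and sign-off
    in_project_plan: bool = False
    next_review: str = ""        # follow-up session or refresh timing
    stakeholders_notified: bool = False

# Example: a record opened for a new screening product.
record = PIARecord(
    reasons="New criminal-record check product",
    information_flows="candidate -> screening provider -> employer",
    consultees=["DPO", "IT security"],
    sign_off_owner="DPO",
)
record.risks.append("sensitive data processed at scale")
```

Keeping every assessment in one structure makes the “continuously reviewed and regularly re-assessed” step easier: the next review date and outcome are always recorded alongside the original risks.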
Whilst the primary obligation to carry out a PIA rests with the data controller, their data processors should also consider implementing a PIA policy and carrying out a PIA.
After all, you could say that data processors can and will be expected to serve the valuable safety role of “crash test dummies” for data controllers.
Learn more about how to prepare your screening programme for the GDPR