Profiling and automated decision making under GDPR


Articles 21 and 22 of the GDPR provide safeguards for individuals against the risk that a potentially damaging decision involving profiling or automated decision making is taken without human intervention.

What is Profiling?

Profiling can enable aspects of an individual’s personality or behaviour, interests, hobbies and habits to be discovered, analysed and predicted. Profiling has already found its way into many areas of life in the form of consumer profiles, movement profiles, user profiles and social profiles. Profiling is not always visible and may take place without an individual’s knowledge.
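
At its core, profiling is automated inference: behavioural data goes in, predicted characteristics come out, often without the individual ever stating them. As a purely illustrative sketch (the site categories and interest labels below are invented, not taken from any real system), a crude interest profile might be derived from browsing history like this:

```python
from collections import Counter

# Purely illustrative mapping from visited site categories to inferred interests.
INTEREST_MAP = {
    "cycling-shop": "sports",
    "recipe-site": "cooking",
    "price-comparison": "bargain hunting",
}

def build_interest_profile(visited_categories):
    """Infer a simple interest profile from a list of visited site categories.

    The individual never stated these interests; they are derived from
    behaviour, which is the kind of inference the GDPR treats as profiling.
    """
    counts = Counter(
        INTEREST_MAP[category]
        for category in visited_categories
        if category in INTEREST_MAP
    )
    return dict(counts.most_common())

profile = build_interest_profile(
    ["cycling-shop", "recipe-site", "cycling-shop", "news-site"]
)
print(profile)  # {'sports': 2, 'cooking': 1}
```

Even at this trivial scale, the output is an evaluation of personal aspects that the individual may never see or know about.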

What is Automated Decision Making?

Automated decision making is a decision made following the processing of personal data solely by automated means, with no human involved in the decision-making process. The rights of an individual in relation to an automated decision arise only where the decision can have a significant impact on the individual (e.g. where the decision relates to the individual’s job performance or creditworthiness).

These rights exist so that no potentially serious decision about an individual is taken without human involvement. They work in a similar way to existing rights under the current Data Protection Act 1998.

Some examples of sources of data used to profile individuals:

  • internet search and browsing history
  • education and professional data
  • data derived from existing customer relationships
  • data collected for credit assessments
  • financial and payment data
  • consumer complaints or queries
  • driving and location data
  • information from store cards and credit cards
  • consumer buying habits
  • wearable technology, such as fitness trackers and smart watches
  • lifestyle and behaviour data gathered from mobile phones
  • social network information
  • CCTV / video surveillance systems
  • biometric data

When do individuals have the right to NOT be subject to an automated decision?

When it is based purely on automated processing

Examples:

  • An individual applies for a personal loan online. The website uses algorithms and auto credit searching to provide an immediate yes/no decision on the application (a minimal sketch of such a check follows these examples).
  • A factory worker’s pay is linked to his productivity, which is monitored automatically. The decision about how much pay the worker receives for each shift he works is made automatically by reference to the data collected about his productivity.
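
To make the first example concrete, here is a minimal sketch, with invented criteria and thresholds rather than any real lender's policy, of what "solely by automated means" looks like in practice: the yes/no outcome is produced entirely by pre-defined rules, with no human involved at any point.

```python
# Illustrative thresholds only -- not a real lender's acceptance criteria.
MIN_CREDIT_SCORE = 650
MAX_LOAN_TO_INCOME_RATIO = 0.4

def automated_loan_decision(credit_score, annual_income, requested_amount):
    """Return an immediate accept/decline decision from fixed, pre-defined rules.

    No human sees the application before the outcome is issued, so this is a
    decision based solely on automated processing.
    """
    if credit_score < MIN_CREDIT_SCORE:
        return "decline", "credit score below threshold"
    if requested_amount > annual_income * MAX_LOAN_TO_INCOME_RATIO:
        return "decline", "requested amount too high relative to income"
    return "accept", "all pre-defined acceptance criteria met"

decision, reason = automated_loan_decision(
    credit_score=620, annual_income=30_000, requested_amount=5_000
)
print(decision, "-", reason)  # decline - credit score below threshold
```

Because the outcome significantly affects the applicant's access to credit, the safeguards described in the rest of this article would apply to it.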

When it produces a legal effect or a similarly significant effect on the individual that is deemed negative

Examples:

  • In the above example on monitoring the productivity of a factory worker, it is obvious that a decision about how much pay he is entitled to will have a significant effect on him.
  • An employee is issued with a warning about late attendance at work. The warning was issued because the employer’s automated clocking-in system flagged the fact that the employee had been late on a defined number of occasions. However, although the warning was issued on the basis of the data collected by the automated system, the decision to issue it was taken by the employer’s HR manager following a review of that data. So the decision was not taken by automated means.
  • An individual complains to a credit provider because his online application for credit was declined automatically. The application was declined because the information provided by the individual did not match pre-defined acceptance criteria applied by the automated system. The credit provider undertakes manual underwriting checks to review the original decision, so the final outcome is no longer based solely on automated processing.

In nearly all cases, you must ensure that individuals are able to do the following (a minimal sketch of how these safeguards might be recorded appears after the list):

  • Obtain human intervention
  • Be permitted to express their point of view
  • Obtain an explanation of the decision made
  • Be given the opportunity to challenge the decision
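
As a sketch of how those four safeguards might be wired into a system (the record structure and role names are hypothetical, not a prescribed design), each contested automated decision could carry the explanation given, the individual's point of view and the human reviewer's final call in a single auditable record:

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class ContestedDecision:
    """Hypothetical record tying the four safeguards to one automated decision."""
    decision_id: str
    automated_outcome: str
    explanation: str                       # explanation of the decision made
    individual_view: Optional[str] = None  # the individual's point of view
    reviewer: Optional[str] = None         # the human who intervened
    final_outcome: Optional[str] = None    # result of the challenge
    audit_trail: List[str] = field(default_factory=list)

    def record_individual_view(self, view: str) -> None:
        self.individual_view = view
        self.audit_trail.append(f"Individual's view recorded: {view}")

    def human_review(self, reviewer: str, final_outcome: str, note: str) -> None:
        # The final decision is taken by a person, not by the algorithm.
        self.reviewer = reviewer
        self.final_outcome = final_outcome
        self.audit_trail.append(f"{reviewer}: {final_outcome} ({note})")

case = ContestedDecision(
    decision_id="APP-001",
    automated_outcome="decline",
    explanation="credit score below threshold",
)
case.record_individual_view("My credit file contains an error that I am disputing.")
case.human_review("underwriter", "accept", "manual underwriting check passed")
print(case.final_outcome)   # accept
print(case.audit_trail)
```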

Does the right apply to ALL automated decisions?

No. The right does NOT apply if the decision:

  • Is necessary for entering into or performance of a contract between you and the individual.
  • Is authorised by law (e.g. for the purposes of fraud or tax evasion prevention).
  • Is based on explicit consent from the data subject under Article 9(2) of the GDPR.

Furthermore, the right does not apply when a decision does not have a legal or similarly significant effect on someone.

What else does the GDPR say about profiling?

The GDPR defines profiling as any form of automated processing intended to evaluate certain personal aspects of an individual, in particular to analyse or predict their:

  • Performance at work
  • Economic situation
  • Health
  • Personal preferences
  • Reliability
  • Behaviour
  • Location
  • Movements

Five tips for processing personal data for profiling purposes – you must:

  1. Ensure that appropriate safeguards are in place.
  2. Ensure processing is fair and transparent by providing meaningful information about the logic involved, as well as the significance and the envisaged consequences (see the sketch after this list).
  3. Use appropriate mathematical or statistical procedures for the profiling.
  4. Implement appropriate technical and organisational measures to enable inaccuracies to be corrected and minimise the risk of errors.
  5. Secure personal data in a way that is proportionate to the risk to the interests and rights of the individual and prevents discriminatory effects.
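
For tip 2, "meaningful information about the logic involved" can be as simple as returning, alongside the outcome, which pre-defined rules were applied and which ones the application failed. A minimal sketch follows; the rules and wording are invented for illustration only.

```python
# Invented rules for illustration -- each pairs a plain-language description
# with the check that the automated system actually applies.
RULES = [
    ("a credit score of at least 650",
     lambda a: a["credit_score"] >= 650),
    ("a requested amount of no more than 40% of annual income",
     lambda a: a["requested_amount"] <= 0.4 * a["annual_income"]),
]

def decide_and_explain(application):
    """Return the automated outcome plus a human-readable account of the logic."""
    failed = [description for description, check in RULES if not check(application)]
    outcome = "decline" if failed else "accept"
    explanation = (
        "Your application was assessed automatically against: "
        + "; ".join(description for description, _ in RULES) + ". "
        + (("It did not meet: " + "; ".join(failed) + ".") if failed
           else "It met all of these criteria.")
    )
    return outcome, explanation

outcome, explanation = decide_and_explain(
    {"credit_score": 620, "annual_income": 30_000, "requested_amount": 5_000}
)
print(outcome)       # decline
print(explanation)
```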

Processing of Special Data

Automated decisions taken for the purposes listed in Article 9(2)

Automated decisions must not concern a child, and must not be based on special categories of personal data unless you have the explicit consent of the individual, or the processing is necessary for reasons of substantial public interest on the basis of EU / UK law.

This must be proportionate to the aim pursued, respect the essence of the right to data protection and provide suitable and specific measures to safeguard fundamental rights and the interests of the individual.

Five tips as to what your business needs to do now

  1. Assess whether your business makes automated decisions (including the use of profiling), or is likely to do so, in relation to individuals.
  2. Consider whether any such automated decisions will be exempt.
  3. Review and amend any existing policies and procedures you have in place dealing with automated decision making (including profiling) so that they comply with the GDPR.
  4. Get GDPR-compliant policies and procedures drafted if you do not have any.
  5. Ensure your staff are aware of your obligations in relation to automated decision making (including profiling) under the GDPR.

Implementing appropriate safeguards for the future

The GDPR requires your organisation to use appropriate mathematical or statistical procedures to safeguard individuals’ rights and freedoms when carrying out automated processing or profiling under Article 22(3).

Your organisation must also introduce technical and organisational measures to avoid and correct errors and to minimise bias or discrimination against individuals. These requirements may involve you implementing:

  • Measures that identify and quickly resolve any inaccuracies in personal data.
  • Security appropriate to the potential risks to the interests and rights of the data subject.
  • Safeguards to prevent discriminatory effects on individuals on the basis of special categories of personal data.
  • Specific measures for data minimisation and clear retention periods for profiles.
  • Anonymisation or pseudonymisation techniques in the context of profiling (see the sketch after this list).
  • A process for human intervention in defined cases.
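
As a sketch of two of the measures above, pseudonymisation and a defined retention period (the key handling and the 12-month period are illustrative assumptions, not recommendations), a profile might carry a keyed pseudonym instead of a direct identifier and be dropped once its retention period expires:

```python
import hashlib
import hmac
from datetime import datetime, timedelta, timezone

# Illustrative values only: the key must be stored separately from the
# profiles, and the retention period should come from your retention policy.
PSEUDONYMISATION_KEY = b"keep-this-key-separate-from-the-profiles"
RETENTION_PERIOD = timedelta(days=365)

def pseudonymise(customer_id: str) -> str:
    """Replace a direct identifier with a keyed pseudonym.

    Re-identification requires the key (held separately), so the profile on
    its own no longer directly identifies the individual.
    """
    return hmac.new(PSEUDONYMISATION_KEY, customer_id.encode(), hashlib.sha256).hexdigest()

def within_retention(created_at, now=None):
    """Check whether a profile is still inside its defined retention period."""
    now = now or datetime.now(timezone.utc)
    return now - created_at <= RETENTION_PERIOD

profile = {
    "subject": pseudonymise("customer-12345"),
    "interests": ["sports", "cooking"],
    "created_at": datetime(2024, 1, 1, tzinfo=timezone.utc),
}
if not within_retention(profile["created_at"]):
    profile = None  # drop the profile once its retention period has expired
```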

Your organisation might also want to consider:

  • New ways to test your big data systems.
  • The introduction of innovative techniques such as algorithmic auditing (a minimal sketch follows this list).
  • Accountability/certification mechanisms for decision making systems using algorithms.
  • Codes of conduct for auditing processes involving machine learning.
  • Ethical review boards to assess the potential harms and benefits to society of particular applications for profiling.
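
Algorithmic auditing can start very simply. The sketch below (the group labels, sample data and 80% threshold, a common rule of thumb, are all illustrative) compares automated approval rates across groups and flags large gaps for human investigation:

```python
from collections import defaultdict

def approval_rates(decisions):
    """decisions: iterable of (group, outcome) pairs, outcome 'accept' or 'decline'."""
    totals, accepts = defaultdict(int), defaultdict(int)
    for group, outcome in decisions:
        totals[group] += 1
        if outcome == "accept":
            accepts[group] += 1
    return {group: accepts[group] / totals[group] for group in totals}

def audit_for_disparity(decisions, threshold=0.8):
    """Flag groups whose approval rate falls below `threshold` of the best group's rate."""
    rates = approval_rates(decisions)
    best = max(rates.values())
    return {group: rate for group, rate in rates.items() if rate < threshold * best}

# Illustrative sample: group_b is approved half as often as group_a.
sample = (
    [("group_a", "accept")] * 80 + [("group_a", "decline")] * 20
    + [("group_b", "accept")] * 40 + [("group_b", "decline")] * 60
)
print(audit_for_disparity(sample))  # {'group_b': 0.4} -- flag for investigation
```

A flagged gap is not proof of discrimination, but it is the trigger for the human review, documentation and remediation that the measures above are meant to provide.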
