A Perfect Pain? Privacy, Profiling & Big Data, the EU-Way

Creating big databases from disparate sources of personal data is all the rage. We have the technologies to mine them like never before, and this has set off something of a gold-rush to gain ever-greater insight into user/customer/human behaviour.

This picture obviously raises privacy issues. Once you add in the fact that many organisations want to re-use existing personal data stores in their quest for insight, the picture gets more complex still. Amongst other things, how much detail about what you are doing and intending do you have to explain to the individuals concerned? Does this information need to be future-proofed? Do you need to seek an individual's consent to what you are doing?

Until recently we had little privacy guidance to feed off, plus apparent regulator indifference to issues in this area. Now Google is being threatened with sanctions, Apple has just been dragged through the German courts, and the EU's privacy guidance body (the Article 29 Working Party, or A29WP) has produced a 70-page report to put some flesh on the bare bones of the law.

In this post, we look at what the A29WP's report has to say (and be warned, what follows may not make for palatable reading).

Why has the A29WP produced a report now?

  • The time is now. The law tends to lag behind commerce, and commerce woke up to the potential of personal data years ago. From experience, one might say many organisations are flouting the law in this area. The EU has been jarred into action as a result.
  • Politics. As we've previously blogged, the EU's draft General Data Protection Regulation is being considered at present. The EU's privacy guidance body appears to want to reinforce its previous comments on certain areas of the Regulation. 
  • Self-justification? Google's scrap with several EU member states' data protection regulators is partially linked to issues in this area. By issuing new guidance, the A29WP is certainly helping others to avoid making Google's mistakes; cynics might say it is also trying to justify the EU member states' stance against Google. 

Quick recap of the law

The A29WP's report focuses on Principle 2 of the EU Data Protection Directive. 

In broad terms this Principle requires that:

  • Personal data should only be used for purposes that are (a) specified, (b) explicit and (c) legitimate. 
  • Once collected, personal data cannot be "further processed in a way incompatible" with the original purposes for which it was collected. 

How does this principle relate to big data and profiling?

It does so in at least two key ways.

Firstly, organisations create databases and undertake profiling with particular aims in mind. Principle 2 effectively requires you to state these aims clearly and in public; the key question is at what level of detail. No one wants to bore consumers or users with details they will never read, but on the flip side, a high-level approach allows organisations to gloss over things those users and consumers may find unpalatable. The law has to strike the right balance between these competing interests.

Secondly, and as already touched upon, you can't chop and change these aims on a whim. If you take an existing database on people and want to use it in different ways, you may find the law prevents you from doing so if your new aims are "incompatible" with the original purposes for which you obtained the data. This kind of reuse of existing data is fundamental to many big-data and profiling projects.

What is the EU looking for?

Don't forget, all A29WP guidance is technically non-binding. It is however endorsed by the data protection authorities in all 27 member states, so absent much in the way of case law in this area, it is the best guide we have to interpreting the current EU Data Protection Directive and each member state's local implementing laws.

  1. You need to describe the purposes for which you use personal data. No surprises here then. Technically, you don't have to put the purposes in a privacy policy (although this is probably the easiest route); there are lots of different means of achieving the same end. 
  2. Timing is key. The purposes have to be specified at the time the data is being collected. 
  3. Tailor your description to your circumstances. You don't have to spell out the obvious, so think about what a reasonable person would expect to happen to their data based on what they directly see and experience. Don't push the boundaries though. Apply consumer-protection best practice and treat people with kid gloves. When seeking to change purposes, you should revisit this exercise. Anything beyond a person's reasonable expectation risks being "incompatible" and therefore unlawful.
  4. The more ambiguous, alien or unexpected your use, the more depth you need to go into. 
  5. Sweeping generalisations are not good enough. They don't tick the "specified" or "explicit" bits of Principle 2 (see above). You can't just say data will be used for "improving users' experience", "marketing", "IT security", "future research", "preventing fraud and abuse of the financial system", "complying with legal obligations requiring certain information to be reported to competent authorities", or (where you are doing sophisticated profiling) "for marketing purposes including providing customers with special offers and discount coupons".
  6. If you are doing sophisticated things, you also need to set out "how" you process data for each purpose. For example, the EU sees your decisional criteria and market segmentations for profiling as particularly key. Reading between the lines, this is probably because if made public (at least at a level consumers can understand) it allows a consumer to choose between organisations on the basis of which ones look least scary or "big-brother" in their approach.
  7. You need to layer the information you provide to make sure it is digestible. Again this can be done in lots of different ways. The EU has suggested using hover-over warning symbols, sub-purposes, linked documents, creative content (e.g. articles) as well as the mundane privacy policy.
  8. Don't do anything which is actually unlawful! In most cases this should be obvious (e.g. you can't use personal data to discriminate on racial grounds). Some laws are trickier to think through. For example, unfair trading regulations protect consumers against practices which unfairly distort their behaviour. In the UK, regulators are already considering whether misrepresenting or glossing over privacy issues could amount to such an unfair practice. The US already takes the attitude that it does (again, Google has fallen foul of the rules in this area).
  9. In many cases you do need free, specific, informed, unambiguous, "opt-in" consent. This is especially the case if you are changing purposes. The A29WP expressly said consent should be obtained for tracking and profiling for the purposes of behavioural advertising, direct marketing, data-broking, location-based advertising and tracking-based digital market research. It also used two retail-specific case studies to illustrate the point (retailers beware).
  10. Consider your market and especially, your direct competitors. For example, if you are the only player in a particular space, and you change the purposes for which you want to use data, you have a bigger duty to give individuals real choice over that change, because otherwise they have to lump it or lose access to your particular kind of service altogether.
  11. Tick box "consent" will not get you home for all purpose changes. The A29WP used the example of a dominant market player to illustrate (quite convincingly) why consent may not be truly "free" in all cases. If consent is not free, then any new purposes are likely to fall foul of the incompatibility rule.
  12. The more incisive your analysis, the more additional safeguards you should build in, otherwise any change in purpose is likely to be ruled "incompatible". The list of potential concerns the A29WP has is long: the scale of data being collected, whether it is all secure, whether the outcomes are sensitive even if the input data is not, the possibility of people having their secrets revealed, being embarrassed or suffering social exclusion, discrimination or economic prejudice, government surveillance, and so on. It wants to see organisations adopt approaches that mitigate these risks, especially user/customer-facing tools.
  13. If you are profiling people, you should be disclosing detailed information back to them. This point builds on point 6 above. The A29WP expressly said these disclosures should include an individual's actual profile, the algorithm or decisional criteria from which this is derived, and the relevant data sources. People should also have the ability to correct their data.  You might also want to consider giving the user/customer sophisticated choices, such as how long their profile is retained for, or even the right to take their data with them in electronic form (note that this is a principle enshrined in the draft EU's General Data Protection Regulation, and already looked at by the UK government's midata initiative). 
  14. How to go about obtaining consent. The A29WP implicitly approved the following steps for a website to achieve consent to a change in purpose (provided they have complied with all the above points of course!). This may sound positive in principle, but in reality it will appear quite onerous to many marketeers.
    • Send an email notice to every user/customer giving full details of the change and a detailed privacy policy (this email must not contain any marketing, though).
    • A detailed privacy policy should also be available, prominently, on the organisation's website giving full details of the changes.
    • A period of advance notice should be given (e.g. 30 days) in which the consumer/user can choose how they wish to respond.
    • Effective tools should be provided to allow the effect of the change to be mitigated if the consumer/user doesn't like it.
    • Consent to the change can be deemed at the end of the notice period.
    • A user/customer has to accept the new privacy policy before they can browse the site. 

What should your organisation be doing in practice?

  1. Be honest with yourself about what you do and may want to do in future with personal data. 
  2. Understand your algorithms in full (even if that means putting some colleagues or suppliers on the spot and recognising unpalatable truths).
  3. Consider scrapping your existing privacy policy if you are profiling (it is probably not thorough enough), but do this with real care and a clear process to get a new one in place (see below).
  4. If you are changing your privacy policy and purposes, at the very least plaster prominent details everywhere you can, give people full information, and a decent time period to make a decision that suits them.
  5. Think about how best to enable user/consumer choice on a change in purpose and yet still derive the benefits you want (this may need hard coding into your website though).
  6. Make that choice a real one; a Hobson's choice for the user/consumer is no good.
  7. Incentivise the outcomes of that choice that you want to see.
  8. Be prepared to "open up" in full in your privacy policy, including (in broad but accurate terms) about your data sources and why and how you profile people.
  9. If you don't want to "open up", think whether this is because you have something to hide and if you do, address it.
  10. If opening up in full in your privacy policy makes it unwieldy, split it up, use different documents and formats, and above all, get creative.
  11. Treat these issues as you would any other marketing exercise, and deploy your best user experience experts, copywriters etc; don't just leave it to your lawyers.
  12. Don't get too carried away with transparency though; the EU has not said you have to cough up all your trade secrets.
  13. Think the position through at the project inception stage, and thereafter as your project progresses. Your commercial thinking will no doubt evolve. Your privacy approach will have to evolve with it.
  14. Don't bury your head in the sand.  

The last point is particularly key. Get found out on this issue, and at the very least you may struggle to realise any benefit from your data project, because you won't be able to use your data lawfully or fully, however "big" and well-analysed it is. Stranded IT costs alone may be enough to make you think twice.

Finally, you may well be thinking "how likely are we to get found out?" (this has been the stance of many organisations to date, in our experience). If so, you should also think "can we afford to take this chance?". You may want to strike data "oil", but no one wants to end up with an Exxon Valdez or Deepwater Horizon on their hands.

This information is intended as a general discussion surrounding the topics covered and is for guidance purposes only. It does not constitute legal advice and should not be regarded as a substitute for taking legal advice. DWF is not responsible for any activity undertaken based on this information.