Many of my friends, family and contacts will already know how I feel about surreptitious digitized ‘spying’ by website and App providers (who call it something else and say they only track your behaviour for ‘personalisation’, of course). I’ve been known to rant about ‘sneaky tracking’ ever since it became possible to identify people (by their Internet-connected devices), connect all of their individual ‘dots’ together to monitor pretty much everything they do, predict what they’ll do in predefined situations and, of course, sell that data to third parties, often without telling you openly and clearly. These ‘Cookie Monsters’ make my blood boil…
The implementation of GDPR in 2018 tilted the balance slightly more in favour of the Consumer (that means you). I spent about a year as a specialist GDPR Consultant Business Analyst for some large Public and Private entities, and most are openly keen to comply and to make sure they don’t infringe on your rights.
However, even the potentially Company-busting penalties possible under GDPR don’t make the problem go away, for a variety of reasons:
Some providers don’t pay much attention to GDPR, to be honest, because it can’t “hurt” them for one reason or another – usually jurisdiction, limited enforcement resourcing, or the sheer scale of the problem (thousands of providers with millions of pages tracking billions of interactions every day).
Some providers comply with GDPR to the smallest extent possible (the MCP or ‘Minimum Compliant Product’ approach).
Some providers don’t actually know, even at this stage, whether they are GDPR-compliant with regard to your data, security & privacy.
Of course, the biggest problem in all of this is us: the consumers. Put simply, we allow providers to behave like this. The ubiquitous, impatient “Accept, Agree and Install” mindset that we have when accessing websites, Social Media platforms or Apps means that sneaky providers know they can simply get you to ‘actively, explicitly’ permit them to track you by requiring you to click more than one button to prevent it! They know you would rather accept (or ignore) the risk than have to click another button or (worse!) actually read something about your rights or their policies!
I skipped over this particular gem the first 50+ times I had reason to refer to the official #GDPR regulations but, for whatever reason, it jumped out at me this week. I’m curious to hear others’ views. I’m not looking for a definitive Legal interpretation (which can’t happen prior to May anyway!) – just interpretations & views.
This is the text of Article 12.1:
“The controller shall take appropriate measures to provide any information referred to in Articles 13 and 14 and any communication under Articles 15 to 22 and 34 relating to processing to the data subject in a concise, transparent, intelligible and easily accessible form, using clear and plain language, in particular for any information addressed specifically to a child. The information shall be provided in writing, or by other means, including, where appropriate, by electronic means. When requested by the data subject, the information may be provided orally, provided that the identity of the data subject is proven by other means”.
The last part is the bit I’m interested in:
“…the information may be provided orally, provided that the identity of the data subject is proven by other means”
This could (or should?) be interpreted to mean that any information provided orally (e.g. by phone) as a result of a GDPR rights request (e.g. a Subject Access Request under Article 15) can only be provided orally if the identity verification itself is not carried out orally (e.g. by phone).
In other words, an Organisation cannot orally give me details of information it holds on me if it has orally verified I am who I claim to be.
This seems bizarre, counter-intuitive and unnecessarily restrictive. It also seems to rule out the possibility of automated voice-based Identity Verification leading to subsequent oral provision of data since – even though there is no actual person involved in the Identity Verification process – it is an oral process.
Those of us involved in evaluating the General Data Protection Regulation (aka GDPR) and advising on how to implement it in different scenarios are well aware of the distinction between “Personal Data” and “Special Category” data, and we know that our Clients need to pay special attention to “Special Category” data. However, I have been thinking more and more about scenarios where I might consider recommending that Clients treat routine “Personal Data” as if it were “Special Category” data.
Read on for an example to illustrate the point, although there are more examples (and I expect there are many that I haven’t thought of). Dissenting and assenting opinions are welcome, so feel free to chime in with your views.