Monday, 26 November 2018

One size does not fit all: GDPR, family trees, employment law and financial services.

Everybody loves certainty.  Everybody positively adores predictability and foreseeability.

So the notion that the General Data Protection Regulation (EU 2016/679) ("GDPR") has swept some degree of certainty into data transactions must be a good thing, right?

Right.

But what if the certainty is fake?  Or does not resemble reality closely enough to be a useful instruction?

The economic objective of GDPR: an overhead barrier to entry


Those who have already read GDPR, and/or implemented it in the private-sector workplace, already know the real economic interest that the EU had in introducing GDPR.

Aside from playing trade politics with America, GDPR de facto requires all Data Controllers to compile contemporaneous documentation to prove the Data Controller's innocence in advance of any claims/investigations arising.  Data Protection By Design, remember (the Precautionary Principle for data)?

And, of course, documentation is inherently a manual task, therefore expensive, and must be funded by means of overhead resources.  Which means that GDPR has become a traditional European form of protectionism: a barrier to entry via an obligation to carry bloated overheads.

How effective is GDPR?


As to the actual effectiveness of GDPR in protecting the personal data of EU member nation citizens, the jury is still out.

The European Commission is not currently satisfied with the American approach to data protection.  One law firm's public report of 02Sep2018 pointed to a growing unease in Europe about the enforceable adequacy of the EU-US Privacy Shield (EU, wiki).

In part, this is presumably down to the sheer impossibility of getting Americans to understand anything that doesn't fit into their own pre-conceived, bureau-legalistic outlook on the world (as the IAPP lamented on 20Jan2018, "Explaining GDPR to an American"; my response: best of luck, lots of patience... and breeeeathe...).

How workable is GDPR?  Alien law in foreign courts?


Explaining the Napoleonic-code mentality to a common-law American lawyer is one of the major challenges that European jurisprudence faces in the wider, non-European world.

The tick-box absolutism of EU rules, directives and regulations has been a challenge for the British (and will remain so after the fake-Brexit of 29Mar2019 asunder the Withdrawal Agreement of Nov2018).

So how the same tick-box absolutism will square up to, say, Chinese jurisprudence is going to be more than merely entertaining.

Absolutism = absurd outcomes


The core problem for the Europeans is that the absolutism of GDPR results in absurd outcomes.  GDPR does not accommodate the Anglo-Saxon legal concept of "reasonableness", largely because GDPR is based upon a Germanic model of data protection, enforced by a European political system based on France's Fourth Republic.

The absurdity arises because the rules are designed to fit too narrow a range of objective outcomes, with an even narrower range of methods by which to achieve them.

The range of outcomes is narrow because of the underlying ideology, contaminated by corporatist lobbying along the way.

The net result is that GDPR serves as much as a barrier to trade/entry as it does as a limited protection of personal data.

Like all EU regulation, it meets some of its retail objectives, but at hidden costs to the consumer that the typical consumer might have reconsidered, had they been given a choice about whether to implement the regulation.

One such cost to the consumer - and therefore also to the value-chain of producers - is the narrow range of prescribed uses of personal data.

Consumer impact: case 1, Facebook et al


GDPR is designed primarily to discriminate against Facebook, Google, Microsoft and Apple, to ensure that they cannot re-sell the personal data of EU member nation citizens without the citizen knowing.

Of course, the care that the EU has for us ordinary taxpaying plebs is as miserably low as that of any tin-pot civil servant within any of the EU member nations.  The real issue for the EU is to stack Facebook et al with a tonne of irrecoverable overheads that are fundamentally tied to the core of their business, to reduce their profitability, and to reduce the perceived viability of operating in the EU and/or with the personal data of EU member nation citizens.

Has GDPR successfully constrained Facebook's re-sale of personal data?  Facebook is, for the moment (Nov2018), seemingly evading any meaningful investigation, but others claim to have found that Facebook is hiding known breaches of GDPR.  TechRepublic summarised the chronology on 25Oct2018.

Consumer impact: case 2, the consumer's employer


GDPR specifically did not address situations where data processing is inevitable, but otherwise unrelated to internet-based consumer services.

For example, take a business running in a business-to-business market, with no consumer customers.

GDPR applies to this business as much as it does to Facebook.  Its exposure to GDPR is much smaller than Facebook's, because Facebook is both a Data Controller and a Data Processor, whereas the small business is probably only a Data Controller, who sub-contracts the payroll to a Data Processing payroll bureau.  But, like Facebook, the small business needs to prepare tonnes of documentation to prove its innocence well in advance of something bad happening.

For those in English jurisprudence, the precautionary elements of GDPR make compliance with it akin to compliance with health & safety risk assessments, or fire risk assessments.  In effect, GDPR requires data risk assessments, even if the risk is self-evidently zero.  Don't assume, just keep typing (even if it detracts from running your business).
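As an illustration only, here is a minimal, hypothetical sketch (in Python) of the sort of "record of processing activities" (GDPR Article 30) that even a small B2B Data Controller is expected to maintain.  Every field name and value below is an assumption for illustration, not a template:

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class ProcessingActivity:
        """One hypothetical entry in an Article 30-style processing register."""
        purpose: str                 # why the personal data is processed
        lawful_basis: str            # which Article 6 basis is relied upon
        data_categories: List[str]   # what personal data is involved
        data_subjects: List[str]     # whose data it is
        recipients: List[str]        # e.g. the payroll bureau (Data Processor)
        retention: str               # how long the data is kept
        security_measures: List[str] = field(default_factory=list)

    # Example entry for the hypothetical B2B business described above:
    payroll = ProcessingActivity(
        purpose="Payroll administration",
        lawful_basis="Legal obligation / contract of employment",
        data_categories=["name", "bank details", "salary", "tax code"],
        data_subjects=["employees"],
        recipients=["external payroll bureau (Data Processor)"],
        retention="6 years after employment ends",
        security_measures=["access controls", "encrypted transfer to bureau"],
    )

    print(payroll.purpose, "-", payroll.lawful_basis)

Multiply that entry by every routine activity of the business - payroll, HR files, customer contacts, CCTV - and the overhead point above becomes obvious.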

Consumer impact: case 3, a family tree


For another example, take a family that seeks to centralise its family history.

The collation of family history of dead relatives is not a problem under GDPR, and can be done on paper or digitally.

The problem comes with living relatives.

The compilation of a family tree containing personal data of living relatives cannot meet a single basis of lawful processing asunder GDPR Article 6:

  • there is no documented consent mechanism within a family: a consent mechanism could be built, but only at humongous cost.  As at Nov2018, what software available on the market does that, and how could it work if one member of the family refuses to use (or cannot use) the internet?  A sketch of what such a consent register might have to look like appears below.  
  • there is no contract to perform: a contract between family and member could be drafted, but it would be an absurdity;
  • there is no legal obligation on the family to compile its family history;
  • there are few cases where a family history would be vital to the interests of the Data Subject, except to track medical conditions.  Ironically, this is a form of Special Data Processing subject to even more restrictions asunder GDPR Article 9.
  • there are no de facto legitimate interests that the family archivist could have to record personal data of Data Subjects, especially regarding minors (children).
And yet, much of the basic data that would justify a Data Regulator prosecuting the Data Controller (family archivist) is typically available from the Register of Births, Deaths and Marriages.  Ah, but they are state agencies, aren't they, so they enjoy a different legal system to that which the state inflicts on us ordinary plebs, don't they?
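To make the consent point concrete, here is a minimal, hypothetical sketch (in Python) of the consent register a family archivist would need in order to rely on Article 6(1)(a) consent for living relatives.  Every name, field and rule below is an assumption for illustration - including the assumed age of 16 at which consent granted by a parent would arguably lapse (Article 8 lets member states set the digital-consent age anywhere between 13 and 16):

    from dataclasses import dataclass
    from datetime import date
    from typing import Optional

    @dataclass
    class ConsentRecord:
        """Documented consent for one living relative (hypothetical)."""
        person: str
        date_of_birth: date
        consent_given_on: Optional[date] = None   # None = no documented consent
        given_by: Optional[str] = None            # parent/guardian for a minor
        withdrawn_on: Optional[date] = None       # e.g. an erasure request received

        def may_record(self, today: date) -> bool:
            """Could the archivist plausibly rely on consent alone for this person?"""
            if self.consent_given_on is None or self.withdrawn_on is not None:
                return False
            # Consent granted on a child's behalf arguably needs refreshing once
            # the child can decide for themselves (age 16 assumed here).
            age_years = (today - self.date_of_birth).days // 365
            if self.given_by is not None and age_years >= 16:
                return False
            return True

    # The in-law who married into the family and never signed anything:
    in_law = ConsentRecord("distant in-law", date(1985, 3, 1))
    print(in_law.may_record(date(2018, 11, 26)))   # False - no documented basis

Even this toy register says nothing about how the consent would actually be collected from relatives who refuse, or are unable, to use the internet.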

The blindingly obvious absurdity arises over time.  The generation that first compiles its family tree might be the easiest if they consent unanimously.  Fine.  But that might include their children, who may be too young to grant consent.  After a few years of wilful non-compliance with GDPR, the children grow up, mate with a complete stranger and spawn.  The archivist wants to record the mate and the spawn on the family tree.  And this decision by the archivist lays a landmine for the future, waiting to be triggered.

Roll-forward one more generation, and the family archivist - by then an elderly person - is unlikely to have met the full range of grand-children that the extended family would have spawned, especially if there were more than one child in the other families married into the family.  Yet, the same archivist will have wanted to document all of them to keep the family tree complete.  Managing consent would be a slight nightmare.  All it would take is a small family tiff, an unexpected discovery that a family archivist was recording data about the wider family, and then the archivist would be slapped with a Data Subject Erasure Request from the offended distant relative.  The Request would likely be followed by a notice of investigation from the Data Regulator.  Where, would demand the regulator of the archivist, is the signed proof that the mate and spawn consented to their personal data being in your hands?  Thus the archivist steps onto his own landmine (remember that?).  Oops.  Good luck defending against that one.


The European Right to be Forgotten


Taken in combination with the Copyright Directive (a proposal as at Nov2018) and the Right to be Forgotten (Google Spain SL, Google Inc v Agencia Española de Protección de Datos, Mario Costeja González (2014), partially codified in GDPR, wiki), we stare at a European jurisprudence that demands its populace zombies-out.  Literally, to throw its brains and memories out of the window, never to be used ever again.

How absurd is this outcome?  Yet, it is an inevitable consequence of European law.  Arguably a public choice to dumb down the population.  It makes it much easier for cultural Marxists to re-write history when the history has been erased by due process of law.

Making financial services hard again

Such absurdities undermine the integrity of the woolly concepts within GDPR.

The definitions of Data Subject, Data Controller and Data Processor sort-of make commercial sense in a simple commercial environment governed by one, or two, contracts of service.  But they don't work competently in a non-contractual environment like a family tree.

More pressingly, they don't work in a commercial environment where there are multiple intermediaries required to deliver a service to the end-consumer.  This is most acutely felt in the financial services industry.

Personal accident and travel insurance: the story of the customer's employee


Take a medium-sized business whose employees travel abroad on business.  The company needs to arrange some sort of travel insurance for its operations, to cover its employees.  The company might approach an insurer directly, but that is often impracticable (and undesirable, for both insured and insurer!), so most companies would use an insurance broker.  The insurer is regulated as a producer of financial services.  The broker is regulated as an intermediary of financial services.  The company is the end-user, and, in this context, is defined as "retail" asunder financial services law (MiFID).

So far, so good.  But translate these agents to GDPR, and the issue arises almost instantly.

The employee is clearly the Data Subject.  The employer is the Data Controller.  The insurer is.... erm... what, exactly?  A Data Processor?  No: the insurer is not being asked to receive personal data, churn it, and return it back in a particular form.  A Data Controller?  By default, it must be, because it's handling personal data.  That means that the broker is also a Data Controller, right?  Well, maybe not, because the broker is converting personal data from the employer into a format that the insurer might understand.  Umm..... but there is no contract between broker and insurer relating to data processing.... so does that mean the broker is really a Data Controller, or a non-compliant Data Processor?
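Here is a hypothetical sketch (in Python) of that classification exercise.  The parties and the candidate labels are assumptions, used purely to illustrate how poorly the Controller/Processor dichotomy maps onto the chain:

    from enum import Enum, auto

    class GdprRole(Enum):
        DATA_SUBJECT = auto()
        CONTROLLER = auto()
        PROCESSOR = auto()
        UNCLEAR = auto()     # no clean GDPR category presents itself

    # One plausible (and contestable) reading of the insurance chain above:
    chain = {
        "employee": GdprRole.DATA_SUBJECT,
        "employer": GdprRole.CONTROLLER,   # decides the purpose: insuring its staff
        "broker":   GdprRole.UNCLEAR,      # reformats the data, but holds no processing contract
        "insurer":  GdprRole.UNCLEAR,      # sets its own purposes, so arguably a Controller too
    }

    for party, role in chain.items():
        print(f"{party:9s} -> {role.name}")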

So far, so bad.  But it gets worse.  During a business trip to America, the employee stubs his toe on an airline chair.  He claims on the employer's travel insurance.  To do so, he would (nowadays) need to have installed the insurer's travel agent's app on his smartphone (let's assume it's a dedicated company-owned smartphone, to keep this absurdity as simple as possible) and claim from there.  But how does the insurer's travel agent identify the employee as being an employee of the insured company?  Ah.  Well, ultimately, the employer is the Data Controller responsible, and the employee's details would have originated from the employer (most likely, the employee probably registered his smartphone app with the travel agent's claim service on instruction from his employer).  But however the personal data got passed to the travel agent, it's impossible to characterise the relevant agencies in the service chain cleanly between Controllers or Processors, or Joint Processors.  Or a hybrid.  Which GDPR doesn't define!

And it gets worse still.  Having stubbed his toe, the following day, the poor mush suffers a stroke while in America.  The American emergency services rush his wallet to hospital, desperate to find a travel insurance card to prove that the patient can afford to pay for his treatment.  But this is a European insurer: it's paperless (prior to the business trip, the employer told the employee to protect his personal data, so don't print anything out that compromises the safety of your personal data).  The only evidence of insurance is on the employee's smartphone.  Only the employee knows the 4-digit PIN to unlock the smartphone.  But the employee is unconscious.  All the hospital knows is the employee's name and nationality, because - thankfully - his passport is still a physical document.  But the delay while the hospital finds a way to get paid proves fatal, and the employee dies later that day.  The issue is now to repatriate his body to his home country, with only his passport as a form of identity.  The American hospital now needs to breach just about every rule in GDPR to find out who the guy was, where he came from, who employed him, who insured him, who his next of kin is, where the next of kin lives, to where the body needs repatriating... And all GDPR can do is fudge it: GDPR apparently doesn't apply to dead people.  So that's alright then.

Other financial intermediaries


Insurance is the most likely area where the abnormal European mentality of "one size fits all" within GDPR causes more problems than it solves, with a wide range of absurdities waiting in the wings for future court cases to determine.  Travel insurance covering travel to non-EEA states without "adequacy assessments" by the European Commission is just the tip of this particular iceberg.

Beyond insurance, credit reference agencies, anti-money-laundering agencies, credit/liquidity agencies (including credit unions), investment manufacturers ("PRIIPS"), pension providers, annuity providers and identity agencies (in states where the state provides no identity confirmation as a service, normally in common-law legal systems) all run operations which necessarily require data processing without the consent of the Data Subject, but whose operations are prescribed by law only as objectives, not methods (methods being implied by case law and the tort of negligence, i.e. "Would a reasonable credit reference agency think such a shoddy job was acceptable?").

In the instances listed above, the certainty imbued by GDPR's "reasonable" concepts is a false and fake certainty.  The real world does not comply with the childlike innocence of GDPR's noddyesque concepts.

The issue for financial services is so cloudy that a lawyer wrote up a pre-sales script for impacted parties to consider.

Impacts on mergers, acquisitions and capital markets?

A fundamental tenet of capitalism is that under-performing companies are bought up by aggressive buyers, who turn the company around (or break up its weak management), and re-sell it.

In such deals, the names and positions of senior management are shared between buyer and seller, typically with sufficient secrecy so as not to pervert the management decisions of the managers under discussion.

Necessarily, the sharing of the names - personal data - happens without the Data Subject's express consent.  It has no bearing on either the vital interests or the legitimate interests of the Data Subject.

So, by virtue of normal operations, a merger/acquisition deal invariably - and unavoidably - breaches GDPR.

What would be the fix?  Under GDPR, the seller would need to obtain explicit consent from its employees whose names are to be shared with the buyer.  Were this to happen - especially for listed companies - the risk of insider trading would magnify considerably.  The confidentiality of the deal would be blown, and, consequently - especially for German capitalists - the oh-so-precious stability of equity markets would be in tatters.  Worse, if the management holds enough of the share capital, the deal would require a mechanism not yet designed to enable the transaction to happen at all.

There is no provision within GDPR for such circumstances, and no means of complying with both GDPR and current merger/acquisition customs.  GDPR thus appears arbitrary, wholly disconnected from the real world to which it supposedly relates.

This means that GDPR's designers wilfully did not consider these circumstances.  This much is quite obvious from a simple read of GDPR.  Or, more cynically, it could be that GDPR was designed to impede capital markets.  This much is likely when considering the mercantilist and protectionist outlooks of the top two EU member nations.

Vicarious liability under employment law

On 01Dec2017, in the case of Various Claimants v WM Morrison Supermarkets plc, the High Court held in favour of the claimants, resulting in Morrisons being vicariously liable for a data breach that a rogue employee wilfully committed as an act of revenge against his employer.  The Court of Appeal upheld the High Court's judgement (no online document from the court as at 25Nov2018; technical overview from a lawyer); Morrisons is said to plan a further appeal to the Supreme Court.

Irrespective of the chain of logic used by the courts to hold Morrisons vicariously liable for the torts of the criminal offender, the point is that the courts have pinned the blame for a criminal act on the employer.  In the High Court's judgement, paras 72-73 set out the chain of complacency that the court held against Morrisons, specifically that Morrisons failed to deploy the appropriate level of snooping, spying, repeated investigations (presumably without due cause), clairvoyance and omnipotence to have assessed the risk of the employee's foreseeable criminal actions.  It was apparently Morrisons' fault not to have correctly guessed what the rogue employee was going to do.

Reasonableness - a core concept in English law - no longer matters, even when considering the tort of negligence in a case of this topic (judgement para 67, "The seventh principle [of data protection] does not impose a duty to take "reasonable care" as such.  Those words do not appear in the Statute.  This might suggest that the draftsman was aiming at a rather different target when he required that "appropriate" measures be taken").

Indeed, from this judgement, we can reasonably (how ironic!) deduce that European jurisprudence is substantially about the ticking of boxes: if there is no box available to accommodate the messiness of real life, so be it.  The judgement even explains what real life entailed in this case (para 75, the testimony of Morrisons' officer responsible for data security): "it is impossible for any sizeable data controller completely to exclude the risk that data may be compromised, for example as a result of a criminal hack of its IT systems or the criminal misuse of data by its own employees."  The court accepted this testimony (para 80), but it appears to have had no impact on Morrisons' defence.

In a common law environment, the absurdity of GDPR's implementation in the Morrisons case enables claimants to fish around for the deepest pockets for a breach of their privacy asunder GDPR, even where there is no evidence that the breach resulted in foreseeable costs to the claimants.

And yet... in para 103 is the double-irony: "The Bluecoat server [firewall] keeps a record of every website request made by the end user.  Thus, if an authorised person wishes to know what an individual employee has attempted to look at on the internet at work, it is technically possible to get Bluecoat to provide a list."  The court ended up upholding vicarious liability against Morrisons because it didn't systematically snoop for stuff that failed an unspecified sniff test.  So much for privacy!  What was GDPR for...?!

The case underlines the fake certainties embedded within GDPR's vague definitions and relationships between concepts (sort-of) defined in GDPR.  GDPR makes no attempt to be compatible with any other statute in any other legal system, so wilfully disregards reality outside the immediate environs of GDPR.

Accordingly, the case also underlines the perversion of common law by GDPR and its underlying Napoleonic nature - as with all law originating from the European Union.

Conclusion


While the professions struggle with making GDPR workable in the real world, nobody seeks to make the case that GDPR is the wrong tool for the wrong job.

As the most far-reaching data protection law yet enacted, it pioneers a way to protect data.  But the way is backwards, so poorly designed and so poorly thought through that it provides a legal mechanism for the continuing censorship of the people, the propping-up of artificial barriers to entry into the marketplace, the monopolisation of consumer services, the protection of incumbent economic operators (far more effectively than protecting Data Subjects' personal data) and a march towards the silencing of opposition to cultural Marxism and political communism.  One size does indeed not fit all.

What started as a good idea by the innocents has been perverted by lobbyists and ideologues at the European level into an insidious form of state control, resulting in a minor war between state and IT sector as to who has the right to subjugate the masses.  The IT sector has embarrassed the state by showing how subjugation can be done bloodlessly; the state is jealous that the IT sector knows more about the state's serfs than the state does.

The lesson of GDPR - that one size does not fit all - will likely apply to any number of regulations in any number of jurisdictions, not just the European Union.  But for such a modern proto-state, it is unreasonable to expect the EU still to be using the tools of the mid-20th century to progress its agenda.  That the EU chooses still to work in the mid-20th century says an awful lot about the evil, longer-term objectives of the statist-globalists who continue to steer the EU towards its sinister end-game against the citizens of its ill-fated member nations.


End of post.
