Full Title

The GDPR is well-stocked with 40 different avoidance techniques. While these can be used to avoid the terms of the GDPR, it is not ethical to do so.

The Indictment

  • Lawyers who charge clients over $500 per hour were first at work filling the GDPR with exemptions and are now teaching companies how to exercise those exemptions.
  • The net result of these legal efforts is that the GDPR was never meant to apply to EU governments or businesses.
  • This is not an indictment of the legal activity itself, since the lawyers have assurance that their exemptions are lawful, but of the perversion of what appears to be a consumer-friendly law that does not meet its objectives.

An Example

This example will show how one standard was created with skirting the GDPR, while collecting data at low cost, as one of its purposes. It does so by claiming that its goal is to help companies reduce the cost of complying with anti-money-laundering legislation. Since efforts to reduce international crimes like drug trafficking or child abuse enjoy a high level of support, any effort to achieve those objectives (it is argued) should be a good idea. So the OpenID Connect for Identity Assurance 1.0 (OCIA) specification[1] was created to allow the German banks to collect your credential information from a variety of sources. The fact that you have already given the same information to the other banks you do business with is of no interest to them if they can get you to just give them everything. And, incidentally, that makes it easier for them to profile you for marketing purposes, which is something the GDPR was created to stop.
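
To make the scale of the collection concrete, here is a minimal sketch, in Python, of the kind of verified-claims request the specification describes, attached to an ordinary OpenID Connect authorization request. The endpoint, client identifier, and the exact set of claims are illustrative assumptions, not values taken from the spec.

  import json
  from urllib.parse import urlencode

  # Illustrative verified_claims request in the style of OpenID Connect
  # for Identity Assurance: one request asks the identity provider for a
  # whole verified identity record, not just a login.
  claims_request = {
      "userinfo": {
          "verified_claims": {
              "verification": {"trust_framework": None},
              "claims": {
                  "given_name": None,
                  "family_name": None,
                  "birthdate": None,
                  "address": None,
                  "nationalities": None,
              },
          }
      }
  }

  # Placeholder endpoint and client_id; a real deployment would use the
  # provider's published authorization endpoint.
  params = {
      "response_type": "code",
      "client_id": "example-bank-rp",
      "redirect_uri": "https://rp.example.com/cb",
      "scope": "openid",
      "claims": json.dumps(claims_request),
  }
  print("https://idp.example.de/authorize?" + urlencode(params))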

This analysis compares the OpenID spec to the prescriptions and proscriptions of the GDPR itself. It does not consider the exceptions. If you want an exception so that you can continue doing whatever it is that you do, hire an expensive lawyer. She will be happy to provide an opinion that the spirit and letter of the GDPR do not apply to you. The biggest exception is Article 1, paragraph 3 (A1P3), which allows the free flow of data within the EU after it is first collected. So we focus on the first collection of data, which seems to be the only place the user has any control at all. We will first skip over the 173 "whereas" clauses, which also prevent the implementation of the OCIA terms, and go straight to the articles.

Data Minimization (A5P1c) - "adequate, relevant and limited to what is necessary in relation to the purposes for which they are processed." The issue for OCIA is "limited to what is necessary": no case is made that the data collected in OCIA meets this restriction. Just because a bank claims that data is necessary does not make it so. It should be recognized that there is a long tradition of using personal attributes to determine assurance, which is documented in NIST 800-63 version one. The point is that alternate methods exist and were offered to the OCIA developers, but they were rejected because that "is not the way we do things". It should also be noted that the inadequacy of this clause was rectified in the California legislation, which would require a data processor to let the user know what data was required and what was desired but not required. This capability has been built into OIDC since the days of OAuth, in the tags that can be included in the request for data. The lack of concern for this clause was made starkly evident by the exclusion of that feature from AuthXYZ. The author of that spec eliminated it because "it was never used, the request always claimed all data was required."
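
The required-versus-desired distinction referred to above already exists in the OpenID Connect claims request syntax, where an individual claim can be flagged as essential and everything else is treated as voluntary. The snippet below is a small sketch of that tagging; the particular claims chosen are illustrative only.

  import json

  # Sketch of the OpenID Connect "claims" request parameter with the
  # essential/voluntary distinction: only claims marked essential are
  # asserted to be necessary; anything else is a voluntary request the
  # user (or provider) can decline.
  claims_request = {
      "userinfo": {
          "family_name": {"essential": True},   # needed for the stated purpose
          "birthdate": {"essential": True},     # needed for the stated purpose
          "address": None,                      # voluntary; may be refused
          "phone_number": None,                 # voluntary; may be refused
      }
  }
  print(json.dumps(claims_request, indent=2))

A provider that honors the distinction can then show the user which items are genuinely required for the stated purpose and which are merely desired.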

Data Accuracy (A5P1d) - "accurate and, where necessary, kept up to date; every reasonable step must be taken to ensure that personal data that are inaccurate, having regard to the purposes for which they are processed, are erased or rectified without delay". Most of the data collected by OCIA is granted without any period within which its accuracy is guaranteed, and there appears to be no limit placed on how long the data is archived.
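
One way to make the missing accuracy window concrete: if every verified attribute carried the time at which it was verified, a relying party could refuse evidence older than its own freshness policy. The check below is a hypothetical sketch; the verification-time field and the 180-day window are assumptions, not OCIA requirements.

  from datetime import datetime, timedelta, timezone

  MAX_EVIDENCE_AGE = timedelta(days=180)  # assumed policy, not from the spec

  def evidence_is_fresh(verified_claims: dict) -> bool:
      """Return True if the reported verification time is recent enough."""
      # Hypothetical field: a verification timestamp supplied by the
      # provider. Nothing in the spec forces an accuracy window.
      time_str = verified_claims.get("verification", {}).get("time")
      if time_str is None:
          return False  # no timestamp at all: accuracy is simply unknown
      verified_at = datetime.fromisoformat(time_str)
      return datetime.now(timezone.utc) - verified_at <= MAX_EVIDENCE_AGE

  print(evidence_is_fresh(
      {"verification": {"time": "2020-01-15T00:00:00+00:00"}}))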

Storage Limitation (A5P1e) - "kept in a form which permits identification of data subjects for no longer than is necessary for the purposes for which the personal data are processed". Data that is used for assurance should not be kept after the assurance has been completed. The OCIA does not require a report that the assurance has been completed and the data erased. This could be performed by mandating a Consent Receipt carrying that information.
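
A minimal sketch of what such a report might contain is shown below, assuming a simple home-grown receipt format rather than any particular Consent Receipt standard; the field names are illustrative.

  import json
  from datetime import datetime, timezone

  def erasure_receipt(subject_id: str, attributes: list) -> str:
      """Build an illustrative receipt confirming purpose completion and erasure.

      The field names are assumptions for this sketch, not a published schema.
      """
      receipt = {
          "subject": subject_id,
          "purpose": "identity assurance (anti-money-laundering check)",
          "purpose_completed": True,
          "attributes_erased": attributes,
          "erased_at": datetime.now(timezone.utc).isoformat(),
      }
      return json.dumps(receipt, indent=2)

  print(erasure_receipt("user-1234", ["given_name", "family_name", "birthdate"]))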

Accountability (A5P2) - there is no demonstration of compliance with paragraph 1 in any OpenID protocol. So the OCIA is not complete.

Conditions for Consent (A7) seem to be lacking in any OpenID protocol. So the OCIA is not complete.

Child's Consent (A8) seems to be lacking in any OpenID protocol. So the OCIA is not complete.

Biometric Data (A9) is used as an assurance factor but is not addressed in any OpenID protocol. So the OCIA is not complete.

Information from the Controller (A13 and A14) - in particular paragraph 1, on the mandatory data elements providing information about the controller to the subject, is not addressed in any OpenID protocol.

Data Protection by Design and by Default (A25) - this essentially reinforces the terms of data minimization. Data is collected by OCIA for the purpose of identity and authentication assurance that could be enabled by other means, and then the data is used for other purposes, including profiling. Note that requiring the data for identity assurance, which is claimed to be a requirement, should not permit the data to be used for other purposes, like marketing, which could not be claimed as a requirement.
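
To illustrate "could be enabled by other means", the sketch below shows a design in which the party that already holds the verified attributes answers only the question the bank actually needs (is this person identity-proofed to the required level?) and keeps the raw data to itself. The function, the lookup, and the assurance levels are hypothetical, not part of any OpenID protocol.

  # Hypothetical data-protection-by-default design: the verifier returns a
  # yes/no assurance assertion instead of handing over the underlying
  # attributes, so the bank never holds data it could reuse for profiling.

  def assurance_assertion(subject_id: str, required_level: str) -> dict:
      """Return a minimal assertion instead of the attributes themselves."""
      # Illustrative lookup; a real verifier would consult its own records.
      proofed_levels = {"user-1234": "substantial"}
      ranking = ["low", "substantial", "high"]
      level = proofed_levels.get(subject_id, "low")
      meets = ranking.index(level) >= ranking.index(required_level)
      return {"subject": subject_id, "meets_required_level": meets}

  print(assurance_assertion("user-1234", "substantial"))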

An Approved Code of Conduct (A41) needs to be referenced in protocols like the OCIA.

Certification (A42) - this is voluntary, but I would expect to see it enabled by the protocol.

Other examples

There is plenty of evidence that the GDPR is mostly a political attempt to punish large digital companies while providing exceptions for local companies that can afford lawyers to protect them. See the following stories:

  • The EU Parliament voted to collect all residents' biometric data into a single database.[2]
  • Stories will be added here as they are reported.

References

  1. The OpenID Foundation, OpenID Connect for Identity Assurance 1.0 (OCIA). https://openid.net/specs/openid-connect-4-identity-assurance-1_0.html
  2. Catalin Cimpanu, "EU votes to create gigantic biometrics database," Zero Day (2019-04-22). https://www.zdnet.com/article/eu-votes-to-create-gigantic-biometrics-database/