(Continued from the previous article)
P.S: This series of articles is an attempt to place some issues before the Government of India, which promises to bring a new Data Protection Law that is futuristic, comprehensive and perfect.
In our previous article, we discussed a new concept of “Privacy” in which I advocated that “Data is in the beholder’s eyes”. Hence, data in binary form which is accessed by an algorithm and processed without the data being released out of the algorithm as “identified personal data” should not be considered as “accessing” personal data for the purpose of personal data protection.
I am aware that supervisory authorities under GDPR would consider that when identified data is accessed by an algorithm without consent, it amounts to “automated processing” and constitutes a breach of identifiable personal data.
What I am advocating is that this approach needs to be changed.
“Data” is “Data” only when it is in a form in which a human can understand it.
Identifiable personal data in binary form which is not accessible by a human being, but is accessible only by an algorithm which ensures that even the admin of the algorithm cannot view it in identified form,
and
which is subsequently released for human consumption only in anonymized form, should be considered as not having been “accessed” and hence not an infringement of privacy.
In GDPR, this view is not supported. However, GDPR recognizes “Anonymisation” and considers “anonymized personal data” as “non-personal data”. If viewing by an algorithm in identified form is considered a “breach”, then every anonymization process is itself a processing of personal information and hence must be considered an “access” of identified personal data requiring consent.
I am in agreement with the views expressed in the article “Does anonymization or de-identification require consent?”.
According to this article, in Opinion 05/2014 of the Article 29 Working Party on Anonymisation Techniques, the Working Party stated:
“The Working Party considers that anonymisation as an instance of further processing of personal data can be considered to be compatible with the original purposes of the processing but only on condition the anonymisation process is such as to reliably produce anonymised information in the sense described in this paper.”
But if this view is correct, then access to identified personal data by an automated algorithm that reliably produces only anonymised output is not an objectionable access. There is therefore an inherent conflict in GDPR.
This principle is extended in the concept which I am trying to advocate as Privacy 2.0, drawing the principle that whenever a process accesses identifiable personal data and returns anonymised personal data, such that even the algorithm administrator has no access to the identified personal data, the process is compatible with the view that there is no infringement of privacy. Such a process will require that after the algorithm removes all identifiers, the identifiers are irrevocably destroyed and are not associated with the output of the process.
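By way of illustration only, the following is a minimal sketch in Python of what such a process could look like. The record schema, the IDENTIFIER_FIELDS set and the function anonymise_records are hypothetical assumptions made for this sketch; they are not a prescribed or legally vetted implementation of the principle.

from typing import Iterable

# Fields treated as direct identifiers in this illustrative schema
# (a hypothetical assumption, not drawn from any law or standard).
IDENTIFIER_FIELDS = {"name", "email", "phone"}

def anonymise_records(records: Iterable[dict]) -> list[dict]:
    """Consume identified records and emit only anonymised copies.

    The identified values exist only inside this function's local
    scope; they are never logged, stored or returned, so no human
    (including the administrator running the code) ever views them
    in identified form. Once the function returns, the identifier
    values are discarded and are not associated with the output.
    """
    anonymised = []
    for record in records:
        # Copy only the non-identifying fields into the output.
        anonymised.append(
            {k: v for k, v in record.items() if k not in IDENTIFIER_FIELDS}
        )
    return anonymised

if __name__ == "__main__":
    raw = [
        {"name": "A. Kumar", "email": "a@example.com", "age": 34, "city": "Bengaluru"},
        {"name": "B. Rao", "phone": "98xxxxxx01", "age": 29, "city": "Chennai"},
    ]
    # Only the anonymised output ever reaches a human.
    print(anonymise_records(raw))

In a real deployment, the “irrevocable destruction” guarantee would also have to cover logs, memory dumps and backups; the sketch only illustrates the core principle that identified values never cross the algorithm boundary into human view.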
This is an important clarification I am advocating in the New Data Protection Act as part of the definition of “Privacy” and the definition of “Sharing”.
In the current versions of both GDPR and PDPB 2019, the law does not define “Privacy” and proceeds to speak of various measures to protect “Information Privacy”. It is felt that it is not fair on the data processing industry that it is required to protect a “Right of Privacy” which even the experts in the Judiciary are not confident of defining. We therefore strongly feel that a definition of privacy is essential in this data protection law, though most of the law is related to “Information Privacy”.
The Privacy definition clause defines “Physical Privacy”, which is the old concept that the Supreme Court upheld in the Kharak Singh case. “Mental Privacy” is what Justice Chandrachud defined in the Puttaswamy judgement. The same Puttaswamy judgement also simplified the definition of Privacy into “Protection of the Right of Choice” as expressed, which leads to the “Consent” and “Lawful basis for processing” requirements. Further, the Puttaswamy judgement as well as GDPR mostly addressed issues of protecting “Information Privacy”, which is the protection of the data subject’s right of choice about the use of personal information when that information is in electronic form. Though stray articles/sections state that the principles of the law are also applicable to manual processing or filing systems, the essence of the law is the protection of personal information in electronic form.
We have tried to remove these attempts of law makers to hide behind vague concepts of “Privacy” and catch the industry for non-compliance at a time of their choosing, just like the traffic cop who prefers to hide behind a bend and catch violators rather than stand in the middle of the road and guide the traffic towards compliant driving.
In GDPR we do differentiate between automated processing leading to “profiling” and “automated decision making” without human involvement, but both require a lawful basis. This view, if extended to other data protection laws, results in a general belief that access of identifiable personal data by any automated process requires consent or another lawful basis, and that such access is otherwise considered a data breach.
We are challenging this interpretation and seeking validation in the new data protection law.
(For abundant caution, I clarify that what is suggested here is for the forthcoming new law in India and does not alter the established position under GDPR, where automated processing/decision making requires a consent/lawful basis.)
Naavi
P.S: These discussions are presently for debate and are a work in progress awaiting more inputs for further refinement. It is understood that the Government may already have a draft and may completely ignore all these recommendations. However, it is considered that these suggestions will assist in the development of “Jurisprudence” in the field of Data Governance in India, and hence these discussions will continue until the Government releases its own version for further debate. Other professionals who are interested in participating in this exercise, particularly Research and Academic organizations, are invited to participate. Since this exercise is too complex to institutionalize, it is being presented at this stage as only the thoughts of Naavi. Views expressed here may be considered the personal views of Naavi and not those of FDPPI or any other organization that Naavi may be associated with.