Google and Apple set up a new Nation “A&A” and opt to Govern it themselves

A few days back, Google and Apple, the owners of the Android and iOS systems and otherwise considered business rivals, came together in a surprise collaboration arrangement.

The collaboration appears to be an attempt to regulate the use of Contact Tracing apps, but it has long-term implications for the way the World Governance system functions.

If the UN does not wake up, we will have a new nation state under the control of Alphabet and Apple (A&A) Incorporated. Facebook-WhatsApp has already created its own nation state with its own currency, Libra. If A&A opts for a currency of its own, it will disrupt the current global system more than what the North Korea-China combine can do together.

Soon we may have a constitutional crisis of companies, incorporated under the laws of a sovereign State, trying to create their own constitutional islands. This idea was effectively used by Swami Nityananda, who purchased an island and declared it a nation, “Kailaasa”, with his own Governance system.

Naavi


Alphabet and Apple create a separate legal zone for Mobizens

According to this report in the Economic Times:

“Apple Inc and Alphabet Inc (Google) would ban the use of location tracking in apps that use a new contact tracing system the two are building to slow the spread of the novel coronavirus”.

The companies plan to allow “only” public health authorities to use the technology. At the same time, they said that they would prevent Governments from using the system to compile data on citizens, and that this was the primary goal of the joint exercise.

Though this appears to directly reflect on the Aarogya Setu app in India and its intended operations, on which a team of “Highly Concerned Privacy Activists” is working to prevent the Government of India from misusing the app for public surveillance, the issue is more universal. Several states in the USA, as well as other countries including the UK, have started using mobiles as instruments for locating individuals and tracing their movements, which in turn can lead to tracing the people they have come into contact with. If a person is found to be infected, it is considered useful to know his movements over the last few weeks and the persons he came into contact with, so that the potential risks can be identified and acted upon to reduce the spread of Covid-19.

The new system prevents the use of GPS location data for tracing and requires contact tracing apps to use Bluetooth, in a manner that Apple and Google dictate, which is considered less reliable.

Google and Apple also said that they will allow only one app per country to use the new contact tracing system. They will allow different states in the US to use the system independently, but in other countries they may or may not allow regions to use the system independently of the federal Government.

By these moves, Google and Apple are projecting themselves as the saviours of the privacy of people across the globe and dictating terms to sovereign Governments. They have thereby thrown a challenge to the global Governance system and are creating a “Nation State” comprising the users of Android-iOS driven mobiles.

In this newly suggested order, Android-iOS mobile holders are “Mobizens” of the Apple & Alphabet (A&A) state, and the responsibility for protecting the fundamental right of privacy in this nation lies primarily with A&A.

A&A opt out of protection under Section 79 of ITA 2000/8

Under the current laws prevailing in India, the activities of any organization dealing with “Electronic Documents” are regulated by several measures. The sale of mobiles is regulated by business licensing, and a mobile is a system consisting of the hardware, the OS, the default OEM apps and the apps downloaded and installed by the owner of the device.

Alphabet and Apple control their own app stores and are considered responsible for ensuring that only malware-free apps are listed there, a responsibility they have not been fully successful in meeting.

Under ITA 2000/8, the mobile is a computer and the OS and apps are accessories. Owners of these accessories are “Intermediaries” with their own responsibilities. Under Section 79 of the Act, Intermediaries are liable for any contravention committed by a user unless “Due Diligence” is exercised and the intermediary is not in complicity. For an entity to use this safe harbor clause, it is necessary that they fulfill the definition of an “Intermediary” and the conditions for availing the protection under Section 79.

The definition of “Intermediary” under Section 2(w) of ITA 2000/8 is:

“Intermediary” with respect to any particular electronic records, means any person who on behalf of another person receives, stores or transmits that record or provides any service with respect to that record and includes telecom service providers, network service providers, internet service providers, web hosting service providers, search engines, online payment sites, online-auction sites, online market places and cyber cafes.

Under Section 79(1),

Notwithstanding anything contained in any law for the time being in force but subject to the provisions of sub-sections (2) and (3), an intermediary shall not be liable for any third party information, data, or communication link hosted by him.

But the above provision would be applicable (besides due diligence and lack of complicity) only if

(a) the function of the intermediary is limited to providing access to a communication system over which information made available by third parties  is transmitted or temporarily stored; or

(b) the intermediary does not-

(i) initiate the transmission,

(ii) select the receiver of the transmission, and

(iii) select or modify the information contained in the transmission

By virtue of the above provision, the moment Alphabet and Apple take on the responsibility of deciding how the GPS or Bluetooth system works on their devices, they lose their status as an “Intermediary”.

Hence CERT-In should issue a notice to both companies, Alphabet and Apple, asking whether they are opting out of the Section 79 protection, if any, available to them under Indian law.

A&A are Data Fiduciaries/Data Controllers

Now, looking at the forthcoming data protection act envisaged in India, any data handler who determines the purpose and means by which personal data will be processed will be considered the “Data Fiduciary”. Elsewhere, the entity may be called a “Data Controller”.

The data fiduciary/data controller does not have an independent legal power to determine how the personal data may be handled. Either the data principal/subject should provide a consent under which the personal data has to be processed as per the choice of the data principal/subject, or the law should provide certain exemptions and derogations.

While Governments may use the powers of exemption because they have a duty towards public safety and health, it is not clear on what legal grounds the A&A state can claim immunity from giving the owner of the device a choice on whether to give permission for the use of his personal data.

Indian law has a provision by which Alphabet Inc or Apple Inc may register themselves as “Consent Managers”, who will also be data fiduciaries and will have the authority to determine how consents can be given, on behalf of data principals, for their personal data to other third-party data fiduciaries, including Governments. GDPR and other laws may not have similar provisions.

Since the DPA in India under PDPA is not yet in place, it may not be possible to examine the intentions of the companies under the provisions of PDPA.

However, a notice can be issued under ITA 2000 itself asking whether Apple and Alphabet would like to register themselves under Section 67C as “Digi Locker” service providers. Avoiding an available legal provision for obtaining the permission of the lawful authority is a clear violation of the law of the land and cannot be attributed to ignorance.

A&A should come under the Scrutiny of the Competition Commission

Looking from another angle, since Alphabet and Apple have a monopoly over 99% of the use of “Mobiles” and the activities of “Mobizens”, all their activities, including the current joint venture, should be examined for compliance with Competition law.

Today A&A is using the excuse that it wants to be the sole distributor of GPS access because it wants to protect privacy. Tomorrow it will make this an instrument for making money and become the sole supplier of GPS data for all application owners. This is a dangerous monopoly situation.

The Competition Commission should therefore issue a notice to both the companies to explain their stand.

Elliot Alderson should provide guidance for a public cause

I also need to add here that there is one most concerned French citizen who operates under the pseudonymous identity of Elliot Alderson and has written “Aarogya Setu: The story of a failure”.

This person may very well be a direct contact of some Indian politician and could even be a person sitting in Delhi, since he is the first to react to Indian developments even before other Indian security professionals get a scent of something happening here.

It is to be appreciated that he identified some bugs in Aarogya Setu and gave a notice to the Government to “respond …or else….”. He has explained his analysis of the app after decompiling the source code. Probably what he has pointed out is correct.

But many technical experts consider that the bugs pointed out are not significant weaknesses that can compromise the data, which lies inside the user’s device itself in an encrypted state. If accessed, it would amount to the hacking of individual device owners, the very people whose privacy Mr Elliot Alderson is so concerned about. (P.S: This is based on the Government’s announcement that the personal data is not transferred to a data server and is stored within the device.)

According to an expert:

“For apps of this scale that handle sensitive data, sophisticated code hardening and app security tools like DexGuard or Arxan need to be used. These tools modify the app at build time to add code and also have features like root detection and Frida detection built in”.
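Purely as an illustration of what the “root detection” mentioned in the quote means, here is a minimal, naive sketch (a hypothetical check written for this article; commercial hardening tools such as those named above inject far more sophisticated, obfuscated checks at build time, along with Frida detection):

```kotlin
import java.io.File

// Naive illustration only: looks for the "su" binary and related files at
// common locations on a rooted device. Real hardening tools inject many such
// checks, obfuscated and tamper-protected, rather than one plain function.
fun isLikelyRooted(): Boolean {
    val suspectPaths = listOf(
        "/system/bin/su",
        "/system/xbin/su",
        "/sbin/su",
        "/data/local/xbin/su",
        "/system/app/Superuser.apk"
    )
    return suspectPaths.any { File(it).exists() }
}

fun main() {
    println("Device likely rooted: ${isLikelyRooted()}")
}
```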

The Copyright Issue

However, we need to reflect:

If I just call myself an “Ethical Hacker”, does that give me the licence to overlook the Indian Copyright Act, the DMCA or French copyright law?

…to the extent of de-compiling the source code and publishing it?

If I am good enough to find the flaws, should I not give reasonable time to the app developer to make corrections? Or even better:

Should I not myself suggest to the app developer what corrections can be made?…particularly when we are talking of a non-commercial public safety app of a sovereign Government fighting the pandemic?

Declaring an App as a Protected System

Had the Government declared the app a “Protected System”, even an attempt to access the source code without authorization would have qualified for an imprisonment of 7 years. It is fortunate for these so-called ethical hackers that the Government did not remember Section 70 of ITA 2000 and how it could have been used against such motivated hackers.

The Government, which acknowledged the report of Mr Elliot and made some corrections it thought were necessary, should have thrown a challenge back to Mr Elliot to suggest how the code should be modified to prevent the bug he points out. Then we could have found out whether Mr Elliot was willing to help in the public cause, or was only trying to strengthen the hands of the Indian opposition and our own indigenous privacy activists who, along with their friendly media, keep criticizing all Government moves without suggesting any alternatives and call themselves the “Internet Azadi Brigade”.

If the Government does declare Aarogya Setu a “protected system” now, it will of course face the charge of “shooting the messenger”, and hence it may not have the courage to do so.

Need for better articulation

If, however, the privacy policy provides some warranties, such as storage of data within the device, deletion after a specified time, etc., and declares the purpose, then the only issue that remains for criticizing the app is the mandate that it has to be installed by all workers returning to work.

The Government could have articulated its measure by stating that “the lockdown continues in public interest, but relaxations are provided only for those who have installed the App”. This would have appeared to be a favour, rather than saying “all can return to work but they have to install the App”, which sounds like a punishment.

Naavi

(Comments invited)


Naavi is conducting another online Crash Course on PDPA. This will be a 12-hour course spread over two weekends. There will be two sessions of 75-90 minutes each day, between 4.00 pm and 7.00 pm.

Participants of this program would be eligible to take the Certification program from FDPPI for “Certified Data Protection Professional-Module I” with a further payment of Rs 5000/- towards membership (if they are not already members) and an examination fee of Rs 5000/- (total additional amount payable: Rs 10000/-). Contact for more information.

The coverage would be as follows:

1. Evolution of Privacy Law in India (ITA 2000, ITA 2008, Puttaswamy Judgement, etc.); Understanding the concept of Privacy and its relation with Data Protection; Applicability, Exemptions, Data Protection Obligations and Data Principal’s Rights

2. Grounds of Processing without Consent; Restrictions on Transfer of Personal Data outside India

3. DPA, Adjudication and Appellate Tribunal; Penalties and Offences; Grievance Redressal mechanism

4. Compliance Obligations (Transparency and Accountability Measures); Data Audits and DPO; Data Protection Challenges under New Technologies; Data Governance Framework; Interactive discussion and Review

The participation fee would be Rs 3000/- per participant.  Registration can be done by making the payment below:


WhatsApp and Fake News

(This is a reproduction of the Article that appeared in India Legal Magazine on April 18,2020)

 The spread of fake news through social media has been a cause of concern for quite some time. It was highlighted in the past during elections and now continues as Covid-19 threatens humanity.

Whenever an election nears, social media is used for campaigns promoting the electoral prospects of candidates. This is a legitimate advertising and promotion activity and cannot be faulted or curbed. Unfortunately, unscrupulous candidates and their campaign managers have focused more on projecting negative information of their opponents rather than positives of their own partymen. The matter has assumed greater importance today with the growth of fake messages which can cause untold damage to society and therefore, have to be curbed ruthlessly. In the past, attempts to curb them failed because whenever legislative controls were brought in to punish fake campaigns, politics would creep in. This would lead to both the supporters and opponents of a candidate being reluctant to identify and prevent fake messages. The attempt to do so was questioned as an assault on free speech and courts were dragged into the controversy.

The last time that the government tried to bring in some measures to prevent fake messages, it demanded that messaging platforms such as WhatsApp identify their origin. WhatsApp, however, refused to do so and stated that any such exercise would compromise its end-to-end encryption system. As a result, intermediary guidelines under Section 79 of the IT Act could not be amended when it was first presented in December 2018. It was a pre-election period and the government as usual did not press the change.

Experts had said that this contention of WhatsApp was wrong and that it was technically feasible for it to identify the originating device of a forwarded message without compromising privacy and the confidentiality of the messages. They said that when a message was forwarded several times, it was feasible to ensure that metadata was attached to the header so that at each stage of forwarding, the device could identify it and the date and time of forwarding were added to the message before it went into encryption. This was not different from a blockchain mechanism, where the message with the header information keeps evolving and each such evolved message continues to be encrypted, so that privacy and security are not compromised.
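As a rough sketch of the idea described above (hypothetical structures and field names; this is not WhatsApp’s actual protocol), each device would append its own forwarding record to a header that travels with the message, and the body and header would then be encrypted together so that confidentiality is preserved:

```kotlin
import java.time.Instant

// Hypothetical forwarding record added by each device before encryption.
data class ForwardRecord(val deviceId: String, val forwardedAt: Instant)

// The message body plus the growing trail of forwarding records.
data class Message(val body: String, val trail: List<ForwardRecord> = emptyList()) {
    fun forwardedBy(deviceId: String): Message =
        copy(trail = trail + ForwardRecord(deviceId, Instant.now()))
}

// Placeholder for end-to-end encryption: body and trail are encrypted
// together, so intermediaries still cannot read either of them.
fun encrypt(message: Message): ByteArray = message.toString().toByteArray()

fun main() {
    val firstForward = Message("some forwarded text").forwardedBy("device-A")
    val secondForward = firstForward.forwardedBy("device-B")
    println(secondForward.trail.map { it.deviceId })  // [device-A, device-B]
    encrypt(secondForward)                            // ciphertext sent onwards
}
```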

WhatsApp’s justification that it was technically unable to agree to the law enforcement requirement was unconvincing and dishonest. However, it yielded a little ground when it agreed to limit the sharing of a message at one point of time to only five recipients so that if a message had to be sent to 50 people, then the sender had to do so in 10 different attempts. This was an attempt to give the impression that it was assisting the government in combating the menace of fake messages without going all the way. WhatsApp also took action against some software developers who had developed applications for mass forwarding of messages through it so that the dispersion of fake messages could be slowed down. This was more to protect their IP than to prevent fake messaging.

When the Personal Data Protection Bill of 2019 was drafted, the government once again made an attempt to take control of fake messaging by introducing a mandatory requirement that social media intermediaries provide an option to users to get their messages displayed with a “Verified Tag”.

However, with the advent of Covid-19, the problem of fake information became more acute as people spread wrong information about its reach, the damage it can cause, likely remedies, etc. This time there was no political backing for the fake messages and hence, there was an apolitical response from WhatsApp with a new voluntary, technical measure meant to slow down their spread. The new system will identify the number of times a message is forwarded and after the first five forwards, this will be restricted to just one at a time. The message will also display an extra arrow to indicate that forwarding is in the restrictive stage. This, however, does not eliminate the message if it is fake. It will only delay the process of forwarding.

By initiating this restriction, WhatsApp has said that it is able to monitor whether a message is forwarded five times or more. This proves that its earlier contention to the government that it cannot identify the origin of a message is false.

Technically, if WhatsApp can count whether a message has been forwarded by one or more persons, then it will be able to identify the message and also where the forward has come from. All WhatsApp messages pass through its server before they land on the destination phone, as a message has to be re-sent if that phone is not connected at the time the message was first sent. Hence, it is implausible that the WhatsApp server cannot see the sender’s device by whatever ID it may recognise it with.

Legally, the government had the power to demand the assistance of WhatsApp not only for identifying the origin of a message but perhaps even for decryption. Section 69 of the Information Technology Act, 2000 gave the powers of interception, monitoring or decryption to a designated official of the government under a specific procedure. Such a procedure is already in place and though a notification to amend the rules issued in December 2018 was stalled, the availability of the power was never in doubt. Further, Section 69 also provided that if the service provider or any other person failed to assist the designated authority, the company and its executives could be imprisoned for up to seven years.

In several rounds of discussion between the Ministry of Electronics and Information Technology, WhatsApp and other social media representatives since December 2018, it must have dawned on these agencies that they stand on weak legal ground in resisting the moves of the government to curb fake news. But now, with the need to prevent fake news to protect the community from a pandemic and with no political support, whatever little courage these companies had in resisting the government earlier must have crumbled. Hence, they have come out with a voluntary offer of restricting the forwarding to a single destination.

With WhatsApp dropping its earlier resistance, it is up to the government to push it once again to institute a mechanism where a header is inserted for every message to identify the origin and each forward. WhatsApp can also initiate measures to monitor such meta data so that there is proactive identification of any forwards to identified groups and they are filtered. Filtering of messages on the basis of intended forwarding would help law enforcement authorities to identify suspect groups who are working against the interest of the public and they can be blocked from receiving messages.

There will, no doubt, be a charge that this would amount to censorship. But if the procedure laid out is stringent and its use is restricted to exceptional cases with hard evidence to back it, the filtering of fake and malicious messages and subsequent legal action can be undertaken by the police better than is possible now.

As regards end-to-end encryption, which WhatsApp claims to be impregnable and beyond its capability to decrypt, the existence of malware such as Pegasus proves that breaking into a mobile device and reading WhatsApp messages is feasible. Hence, end-to-end encryption is not a foolproof system.

End-to-end encryption of a messaging service like WhatsApp is different from that of a voice service like BlackBerry’s or Apple’s. Retrieving a voice message without the permission of the owner of a device, whether by a law enforcement agency or a hacker, requires not only access to the device but also the enabling of the storing of the voice files.

In the case of messaging applications, storage and subsequent retrieval is an inherent character of the service and therefore, technically, reduces one process compared to recording of a voice conversation and listening to the recorded files.

WhatsApp restricting the number of forwards, therefore, strengthens the hands of the government. The company can no longer use technical excuses when it is ordered by law enforcement to reveal the identity of the devices originating and forwarding fake messages. This will now also possibly extend to decryption of end-to-end encryption.

Naavi


Don’t Shoot the Messenger, the Media often says… INS should first remember this policy

(This is in continuation of the earlier article)

The circular issued by the INS (Indian Newspaper Society) regarding the posting of some publications in certain WhatsApp groups by over-zealous members contains the following advice:

1. Take legal action against offenders, especially against WhatsApp and Telegram admins who’re offending and trigger legal notices (WhatsApp group admins are liable for anything illegal that happens in their groups)

2.Additionally, also for any legal action taken, publish  few news stories to talk about the huge fines and lawsuits initiated  against offenders to deter others from doing it.

I would like to draw the attention of the INS secretariat to the following.

The media often accuses the Government and the Police, when they take action against journalists, with the advice “Don’t shoot the messenger”. It is common for investigative journalists to adopt bribery and other illegal means to obtain a story, which these publications gladly publish. Has INS ever sent any advisory to the publications that their journalists should not adopt such practices and should instead use ethical means of obtaining their stories?

Suddenly, INS has decided to shoot the WhatsApp admins instead of the individual member who has infringed. The threat itself is illegal and violates the principle of “free speech” by creating a “chilling effect”, as discussed by the Supreme Court in the Shreya Singhal case.

The INS secretariat must learn the law: a WhatsApp admin is only a manager and not an “Editor”. The messages do not get moderated and are posted directly, because the person posting the message sends it to the WhatsApp group server and the server distributes it to the group. The “group” only represents a mailing list maintained by the WhatsApp server, and the admin has no control other than removing a member.

Further, a message in the group is meant only for the members and not for the public. Newspapers are shared by family members, and in libraries they are shared by many others. Will INS go after the librarians also? If not, on what grounds do you discriminate against the WhatsApp admin? Your suggested action is therefore discriminatory and against public policy. If properly pursued, the INS registration may have to be suspended or cancelled for acting against public policy.

The INS secretariat may kindly read the following article, where I have explained the WhatsApp aspects in somewhat more detail.

“police target WhatsApp admins and FaceBook posters once again”

Police, Prosecutors and Judiciary: Please Don’t Create Fake Laws out of your misinterpretation

It is wrong to say that WhatsApp group admins are responsible for all that happens in the group.

If an illegal advertisement appears in a newspaper, will you put the Editor in jail?

In the Information Technology Act there is something called “Due Diligence”, and the WhatsApp admin’s due diligence involves certain responsibilities. As soon as a prima facie illegal activity takes place, the admin has to advise the member to withdraw the post, since the post can be withdrawn only by the member who posted it. The only punitive action the admin can take is to remove the member, which is like sacking a reporter for one fake report. Many WhatsApp admins do this when the message is sensitive.

Please let me know whether you advise your newspapers to sack a reporter if he sends in one wrong report. If not, why treat WhatsApp admins differently?

Secondly, the advice to harass the WhatsApp admins for the infringement with huge fines, and to further defame them with publicity in the publication’s own pages, is not proper advice. It is a conspiracy to threaten members of the public and goes beyond the copyright law, which provides for reasonable compensation in case of violation, as determined by a Court.

First of all, in any copyright infringement, one has to see whether the infringer made any unfair gain from the infringement and whether there was any notice of copyright, etc. The Courts will consider what is a reasonable penalty. A civil claim has to have some relationship to the loss suffered by the victim and the wrongful gain made by the offender. Arbitrarily claiming a large amount is not provided for in law.

If newspapers are losing customers because they have become irrelevant in the age of TV and social media, don’t suggest that they recover their losses by suing the WhatsApp admins. The WhatsApp admins of groups where this kind of infringement has taken place will not be worth even a few thousand rupees in compensation claims. The publication will not even recover the lawyer’s fee for the notice. The publication can, however, bribe a policeman and try to harass the WhatsApp admin, in which case the Police, the publication and INS would all be liable for human rights violations.

INS as a society of responsible publications should show some maturity before issuing such circulars.

As a remedy, INS should withdraw the part of the circular which targets the WhatsApp/Telegram admins and apologize to the community. You are well within your rights to advise the publications to institute security measures to prevent downloading, which many publications already do. If you have not done so so far, it shows your incompetence. If your members do not want to spend money on hosting a secure website, you cannot advise them to go after WhatsApp admins.

I look forward to a positive action from your end.

Naavi


Circular of INS Secretariat on copyright violation by WhatsApp and Telegram

(This is in continuation of the earlier post)

I have received a copy of a communication supposed to have been sent by the Secretary General of the INS (Indian Newspaper Society) to the publications as an advisory, which is reproduced below:

Dear Esteemed Members,
Greetings from the INS Secretariat !!
It has come to our attention that some Publications are facing issues with distribution of the print copies and a lot of piracy and theft of newspapers is happening, especially in the digital format.
A lot of Newspapers are available in the ePaper format online in the morning every day, some of them being paid and some being free. Many users are actually copying the newspaper and creating PDFs which they circulate in WhatsApp and Telegram groups to the readers – leading to a loss in both subscription revenue for the print newspapers as well as ePapers digitally.
This is completely illegal and  Publications are trying to battle it in their own ways.  It is therefore recommended as below: –
1.      Communicate clearly in the Apps, Websites and Newspapers – that circulating any copies or part thereof, is ILLEGAL and strict legal action will be taken against individuals with heavy penalties.
2.      Additionally, also for any legal action taken, publish  few news stories to talk about the huge fines and lawsuits initiated  against offenders to deter others from doing it.
3.      Take legal action against offenders, especially against WhatsApp and Telegram admins who’re offending and trigger legal notices (WhatsApp group admins are liable for anything illegal that happens in their groups)
4.      Build certain product features which prevent piracy or at least slow it down   
a.       Limit downloading as PDFs, Images
b.      Add Java script code on pages to prevent copying
c.     Insert a user identifier code which is not human visible, so circulated PDFs on Social Media can be tracked back to individuals
d.      Auto generate list of users downloading greater than a certain number of PDFs per week and block them
This is for your kind information.
Kind regards,
Signed
Secretary General
While we appreciate the measures taken by the INS to protect the interest of their members, we are awaiting a response from the secretariat on why publications which have reduced the size of their print editions continue to charge the same price as before.
From the point of view of consumers, this is an unethical act by the newspapers, and we expect the INS to show the same zeal in advising the members to reduce the cover price of the publications, at least temporarily.
Naavi
For the information of all:
The WhatsApp admin policy suggested by Naavi in the Cyber Law Compliance Center has the following paragraph.

Quote:

Sharing of Content

The electronic space represented by the messages sent and received by a member of the group is considered as a “Private Message Space”.

The messages delivered by a member through this group are meant only for the other members of the group; Non-Members have no authorization to access these messages, nor are the messages meant for them.

If any member shares any message with any Non-Member, such member shall be solely responsible for the consequences thereof. Also he shall be considered to have indemnified the other members of this group including the admins for any adverse consequences arising thereof.

If any Non-Member accesses the messages without specific permission, it shall be deemed to be an unauthorized access as per Section 43 of ITA 2000/8 and also liable for payment of compensation and prosecution under Section 66 of ITA 2000 of India.

UNQUOTE:

WhatsApp admins are advised to use such a clause and adopt the model policy suggested.


Redefining “Personal Data” for the purpose of PDPA

I refer to an article today in Financial Express titled “Personal Data Protection Bill: Will it disrupt our data eco system?”.

This article discusses the importance of the early passage of PDPB 2019 and at the same time highlights the possibility of the act impairing the digital economy of the country by referring to the difficulty arising out of the wide scope of the definition of personal data.

There are no two opinions that the Act, when it comes, will cause disruption in the industry, and the Government departments, which have no clue about Privacy Management now, will be the worst hit. The private sector will be in a far better position, since professionals in the private sector are aware of Privacy protection because of their exposure to GDPR and other laws. This could be one of the reasons why Government departments may have to be given a slightly longer time frame for implementation than the private sector, though it would raise a hue and cry of discrimination in industry circles.

The concerns expressed in the article are:

  1. The wide scope of the definition of personal data deviates from the core proclaimed purpose of the legislation, which is protecting the privacy of individuals.
  2. Curtailing the expansion of digital-technology-driven activities on the false pretext of privacy could lead to a decline in the growth trajectory. There is no legitimate need to regulate the creation and use of every data set or processing of data.
  3. Restricting data storage is thus of no use.
  4. Giving notice to everyone is not possible and does not ensure better rights to data subjects.
  5. The economic impact of this legislation should be deeply examined and reconciled before moving ahead with it.

The article is well written and the views are well articulated. However, we need to present our views on the concerns expressed above.

It is clear from the last concern above that the author has advocated a possible deferment of the passing of the law. It is strange that, two years back, all these advocates were shouting that the Indian Government did not want to enact a privacy protection law because the Government did not want to bind itself to a discipline in the usage of the personal data of its citizens. They all pushed the Supreme Court to come up with a hurriedly conceived judgement on Privacy and the Aadhaar-related decision, in which the Supreme Court declared that Privacy was a fundamental right of a citizen of India protected under Article 21 of the Constitution. The Court also extracted an assurance from the Government that it would soon introduce a robust law for privacy protection.

The Government went ahead, constituted the Srikrishna committee and came up with the first draft of PDPA 2018 as presented by the committee to the Parliament. When it was sent for public comments, elections intervened and a new version had to be introduced as PDPB 2019.

But now, the same people who wanted the legislation earlier have realized that the law would create greater hurdles for business than for the Government itself, and they are using all their skills to not let the Government go ahead with the passage of the Bill. There are frequent articles in newspapers providing suggestions which in the end only mean that another version of the Privacy Protection Bill has to be worked out by the Government. This game has been going on for several years now, and several draft bills have been presented to Parliament in earlier regimes, only to be kept pending in JPCs until Parliament ended its term. We hope this Government will be different and will finally see the Act passed, or it will face a serious contempt charge from the Supreme Court.

We therefore need to consider how we can move ahead with the current version of the Bill with minor modifications. Fortunately, the Bill has enough flexibility to ensure that regulations from the DPA can address most of the concerns, and it is not necessary for all concerns to be addressed only in the Act.

The author (of the FE article) has spoken about the consent mechanism and considered it impractical to obtain consent from every data principal. However, by the very definition of “Privacy” being an ability to exercise “Choice”, there will be no “Privacy Protection” without giving a choice to the data principal to determine how the data may be processed. PDPB takes into account several practical instances in which consent may not be necessary, both for the Government and the private sector. Hence the concern is addressed.

The author of the article has also objected to the data storage limitation principle. However, since the permission is linked to the purpose of processing, and the data storage period can be extended if the purpose demands it or the legitimate interest of the data fiduciary requires the extension, the concern has been adequately addressed.

The concern that the Act tries to regulate every bit of data that is created and this would hamper the industry has to be seen in the context of what is “Data” and what is “Personal Data”.

Personal data is a part of data as a whole, and hence, if we want to regulate personal data as the Supreme Court wants, it is impossible not to touch non-personal data in some form. Personal data and non-personal data are like two sides of the same coin.

Hence PDPA, while regulating personal data, has to also say what it leaves out as non-personal data, since personal data is carved out of the total data.

Regulating personal data therefore hinges on what data we carve out of the total as “Personal Data”, so that the regulations can be applied therein.

Hence the definition of “Personal Data” is the most critical part of the regulation, and if we can agree on the definition, most of the disagreements that different segments of the industry have with the Act will perhaps reduce or even evaporate totally.

Currently, PDPA defines personal data as:

 “personal data” means data about or relating to a natural person who is directly or indirectly identifiable, having regard to any characteristic, trait, attribute or any other feature of the identity of such natural person, whether online or offline, or any combination of such features with any other information, and shall include any inference drawn from such data for the purpose of profiling;

Under GDPR,

‘personal data’ means any information relating to an identified or identifiable natural person (‘data subject’); an identifiable natural person is one who can be identified, directly or indirectly, in particular by reference to an identifier such as a name, an identification number, location data, an online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that natural person;

The two definitions have a small difference, intended or otherwise. The GDPR definition refers to “identifiers” and gives examples such as a name, location data, an online identifier, etc. The author of the FE article makes a reference to the European Court of Justice and even adds the “answer sheet in an examination” as an identifier.

The PDPA does not name the identifiers, but it is natural for people to treat the GDPR identifiers as identifiers for PDPA as well, to differentiate between personal data and non-personal data.

We need to think deeply here about when data in the hands of a data fiduciary becomes “Personal Data”. No data is born “personal”; it acquires that status during a life cycle which starts from raw data and journeys through the states of non-personal data, personal data and sensitive personal data, until it is destroyed or converted into other states such as de-identified data or anonymized data.

So, if there is a piece of data such as

01110110 01101001 01101010 01100001 01111001 01100001 01110011 01101000 01100001 01101110 01101011 01100001 01110010

it is simply data, and neither personal nor non-personal.

If a viewer sees this through an ASCII converter, his computer would display a conversion of this data into

vijayashankar
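Any ASCII decoder performs the same conversion; a minimal sketch for readers who want to verify it:

```kotlin
fun main() {
    val bits = "01110110 01101001 01101010 01100001 01111001 01100001 " +
               "01110011 01101000 01100001 01101110 01101011 01100001 01110010"
    // Interpret each 8-bit group as an ASCII code point and join the characters.
    val text = bits.trim().split(" ").joinToString("") { it.toInt(2).toChar().toString() }
    println(text)  // prints: vijayashankar
}
```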

Now, in this context, is the first set of binaries “Personal data”? It perhaps became so because somebody decided to convert it. Is this not similar to re-identifying de-identified data?

The law is not clear about this.

Now, having converted the binary stream into text read as “vijayashankar”, does this amount to personal data? Does it identify a living natural person? What makes one think that vijayashankar is the name of a person? Why can’t it be the name of a place?

In the absence of further clarification, will “vijayashankar” be called personal data? The law is not clear.

If we adopt the logic expressed in the FE article, which is also what prevails worldwide, a name is an identifier, an IP address is an identifier, an email address is an identifier, and so on. But who says that something is a name or an email address? If I name my company “Naavi@Naavi.org” and register it, is that the name of the company or the email address of naavi? And who is naavi: an object or a person? These are the questions which make it uncertain whether the information can be identified as personal information.

Hence we must accept a definition under which no information is personal or otherwise per se. It becomes personal when the binary data is converted into a human-experienceable form and, in the eyes of the beholder, represents a person.

This is the concept which Naavi’s theory of data adopts as the “Definition Hypothesis” of data.

Does PDPA accept this principle, or does it fall into the checklist approach of the rest of the world, giving a list of 18 parameters (as in HIPAA) or whatever number of parameters we can infer from GDPR?

As of now the definition in PDPA remains unclear. Hence “vijayashankar” or “naavi” or “naavi@naavi.org” as independent data elements are not automatically “Personal Data”. But if the “beholder” knows that there is one natural person who responds when you call out “vijayashankar” or “naavi” or send an email to naavi@naavi.org, because of such knowledge, the data becomes personal data in his custody.

The same data in the custody of somebody else, who has no clue what “vijayashankar” is, remains non-personal data.
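To illustrate this “beholder” idea in the simplest possible terms (a purely hypothetical sketch, not a legal test), the same string is personal data only for a holder whose own records can link it to a natural person:

```kotlin
// A holder's "knowledge": a mapping from identifiers it can resolve to persons.
data class Holder(val knownIdentifiers: Map<String, String>)

// The same data element is "personal" only relative to a holder who can link it.
fun isPersonalDataFor(value: String, holder: Holder): Boolean =
    holder.knownIdentifiers.containsKey(value)

fun main() {
    val fiduciaryA = Holder(mapOf("naavi@naavi.org" to "a known natural person"))
    val fiduciaryB = Holder(emptyMap())
    println(isPersonalDataFor("naavi@naavi.org", fiduciaryA))  // true: personal data in A's custody
    println(isPersonalDataFor("naavi@naavi.org", fiduciaryB))  // false: non-personal data for B
}
```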

The definition of personal data should therefore incorporate the “User of the Data”, who may be a data fiduciary in this context, and his knowledge, in determining whether any set of characters is personal data or not.

I am not sure whether this should be done by amending the definition of personal data, or whether we should leave it to the DPA to clarify.

As a suggestion, I would recommend consideration of a revised definition of “Personal Data” to ensure that this definitional uncertainty is removed.

‘personal data’ in the context of its use by a data fiduciary and the knowledge of the data fiduciary, means any information relating to an identified or identifiable natural person (‘data subject’); an identifiable natural person is one who can be identified, directly or indirectly, in particular by reference to an identifier such as a name, an identification number, location data, an online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that natural person;

Under such a definition, no single stream of binary data is called “personal” unless it is associated with one or more other binary streams which together indicate that the data set is identifiable personal information. Hence “vijayashankar, email: naavi@naavi.org” would together be called personal data, while individually, “vijayashankar” or “naavi@naavi.org” cannot be called personal data.

Comments of experts are invited.

Naavi
