The New Human Order… Are we Ready for Techno Philosophy?

Presently we are exposed to a low level of Artificial Intelligence, where software, sometimes embedded in specific hardware such as a robot, can do a specific job efficiently. Just as the computer was initially used for repetitive tasks and worked as a tool of the human, Artificial Narrow Intelligence (ANI) can work as a tool to relieve humans of some repetitive tasks.

ANI machines are only as good as they are programmed. The programming of an ANI device includes the “training” it receives, where the developer feeds standard inputs and elicits specific types of responses from the algorithm. The responses of such machines can be biased, intentionally or otherwise, because of the training process.
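To make this concrete, here is a minimal, purely illustrative sketch (the loan-applicant labels and the training data are hypothetical assumptions, not taken from any real system) of how a trivially “trained” algorithm simply reproduces whatever skew exists in the developer’s training set:

```python
# A minimal sketch (illustrative only): a trivial "ANI" that learns a label
# per input by majority vote over the developer's training examples. If the
# training set is skewed, the learned response is skewed too -- the bias
# comes from the inputs, not from any intent of the algorithm itself.
from collections import Counter

def train(examples):
    """Learn the most frequent label for each input value from (input, label) pairs."""
    votes = {}
    for value, label in examples:
        votes.setdefault(value, Counter())[label] += 1
    return {value: counts.most_common(1)[0][0] for value, counts in votes.items()}

# Hypothetical, skewed training data supplied by the developer:
training_data = [("loan_applicant_A", "approve"), ("loan_applicant_A", "approve"),
                 ("loan_applicant_B", "reject"), ("loan_applicant_B", "reject"),
                 ("loan_applicant_B", "approve")]

model = train(training_data)
print(model["loan_applicant_B"])  # "reject" -- the skew in the training set decides
```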

The next generation of machines may have self-learning capabilities based on the feedback received from users in the course of their service. In such a process, the continued learning is a form of automated input generation, and the quality of the response may still depend on the quality of those inputs.
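A minimal sketch of such a feedback loop, under the assumption that user suggestions and ratings are fed straight back as the training signal, shows why the quality of the response tracks the quality of the inputs:

```python
# Minimal sketch (assumptions: one numeric "answer" per question and a user
# rating between 0 and 1). The model nudges its stored answer toward whatever
# users reward, so noisy or malicious feedback degrades the response.
class FeedbackLearner:
    def __init__(self):
        self.answers = {}                     # question -> current best guess

    def respond(self, question, default=0.0):
        return self.answers.get(question, default)

    def learn(self, question, user_suggestion, reward):
        """Blend the user's suggestion into the stored answer, weighted by reward (0..1)."""
        current = self.respond(question)
        self.answers[question] = (1 - reward) * current + reward * user_suggestion

bot = FeedbackLearner()
bot.learn("ideal_temperature_c", 21.0, reward=1.0)   # helpful feedback
bot.learn("ideal_temperature_c", 500.0, reward=0.9)  # low-quality feedback dominates
print(bot.respond("ideal_temperature_c"))            # far from a sensible value
```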

In such a scenario it is possible to introduce ethical guidelines to ensure that the algorithm behaves in a controlled manner.

At the third level, “Deep Learning” or “Artificial Neural Networks” are algorithms that can recognize patterns in data and make decisions or predictions. In such a process there could be several layers of analysis, as the learning proceeds step by step towards a final decision, which may be stored in memory for responding to the standard use case.
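The following is a minimal sketch, with random placeholder weights rather than a trained model, of the layer-by-layer analysis such a network performs before arriving at a final decision:

```python
# Minimal sketch of the layered analysis in a feed-forward neural network.
# Weights here are random placeholders -- in practice they are learned from data.
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(0.0, x)

# Layers of analysis: raw input -> intermediate patterns -> final decision scores.
layer_sizes = [8, 16, 16, 3]      # 8 input features, two hidden layers, 3 output classes
weights = [rng.standard_normal((m, n)) * 0.1 for m, n in zip(layer_sizes, layer_sizes[1:])]

def forward(x):
    activation = x
    for w in weights[:-1]:
        activation = relu(activation @ w)   # each layer extracts a higher-level pattern
    scores = activation @ weights[-1]       # final layer produces decision scores
    return scores.argmax()                  # the "decision"

sample_input = rng.standard_normal(8)
print(forward(sample_input))
```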

As long as the AI is dependent on its training, its functions may remain under human control.

However, the latest generation of AI is designed to break out of instruction-led or prior-data-led decisions and adopt a human-like approach of finding an alternative way of doing a thing.

It is in this context that AI is expected to become capable of self-learning, both from its own earlier activities (meta-learning) and from exploratory expeditions. With its ability to gather and process large amounts of data, this capacity to learn from its own learning can create outcomes outside the imagination and expectation of the developer.

Such AI, capable of applying its learning from one field to another, will exhibit an ability which may be called “Artificial General Intelligence” (AGI).

Legally speaking, at this stage the applicability of laws like the Information Technology Act 2000, which attribute the activity of software to its creator, starts failing. We need new laws to take care of such scenarios.

While creating an Artificial Intelligence Act that addresses the creation, maintenance and destruction of self-learning AI algorithms is within human capability, there remains the possibility of a Super Intelligent AI which learns, through its general intelligence, that it can break away from the limitations of human intelligence and human controls, and re-set itself to be beyond human control.

Those familiar with “hypnotism” know that the hypnotist can implant a “key suggestive trigger” which can instantly bring the hypnotized subject out of the trance or make them stop a specific task. Some stage demonstrators of hypnotism use a term such as “Sleep” to suddenly make the hypnotized subject drop into a deep sleep.

Similarly, we have seen in movies that robots may have a “kill switch” which can disable them. But a super intelligent AI could alter its own program to kill the kill switch. Even the oft-quoted rules such as “Don’t harm humans” could be overridden by a super intelligent robot, and the robot could turn rogue at its own will. This may today be a movie script, but it is well within the realms of possibility in the next 50 years.

While Naavi today advocates AI regulation to ensure that the human race is not destroyed by future generations of AI-led robots, Naavi also feels that the human race should be ready to allow itself to be replaced by the new generation of robots, as if they were our own progeny.

Today we are aware that our children are not necessarily exact replicas of their parents. The parents may be good people but the children may not be so. Partly this may be due to genes, and partly it may be driven by the environment… like an orphaned child growing up in a hostile atmosphere.

Similarly we may have to develop a philosophical attitude that our robotic children, who may be humans with implanted chips (cyborgs) or totally artificial robots, are actually part of our own progeny, and accept them as part of the new human race.

This perhaps is the “Techno Philosophy” we need to develop as Plan B, in case our Plan A to rein in the uncontrolled development of AI that could destroy the human race fails.

Your views?

Naavi

 


Privacy Pledge.. has become the norm of Privacy Day Celebration in India in 2023

On January 19, FDPPI launched a “Privacy Pledge” program inviting professionals to take a pledge as a mark of celebrating Privacy Day 2023.

The Pledge stated as follows:

Pledge of Data Privacy

On the occasion of International Data Privacy Day 2023,  I hereby take a voluntary pledge to uphold the cause of “Privacy as a Human Right” by taking all steps necessary for Protection and Privacy of Personal data which I shall come across in my Professional and Personal life with due regard to the Principles of Fairness and Lawfulness of processing.

In particular:

I shall adhere to the requirement of obtaining informed consent of the data principals whose personal information comes within my control and shall use, disclose such information only as per the choice of the data principal and in accordance with the applicable laws.

I shall adhere to the principle of Minimal and  purpose oriented Collection of personal data and shall ensure that it shall be shared only on a need to know basis.

I shall take necessary steps to stop using personal information if the purpose for which it came into my possession has been completed.

I shall take necessary steps to ensure that the personal data is kept updated from time to time.

I shall not disclose the personal information except as provided under law or in the genuine interest of the individual or the community.

I shall at all times take steps to ensure the security of the personal data from unauthorized access or modification or denial of access for authorized purposes.

I shall take all necessary steps to comply with the data protection law with regard to reporting of data breach or any other requirement of compliance.

I shall endeavour to keep myself aware of the data protection laws and also spread awareness in my organization and with my professional and personal contacts.

Those who took the pledge through the link CLICK HERE TO TAKE THE PLEDGE were issued a Certificate of pledge in acknowledgement of their action.

Today, I was pleasantly surprised to see that another privacy organization in India, namely DSCI, also initiated a Privacy Pledge program with the following pledge:

This Data Privacy Day, I reaffirm my commitment to the spirit of Privacy and Pledge to:

Though the contents of the pledge are slightly different from the FDPPI pledge, I am glad to note that “Requesting and Obtaining the pledge from an individual as a demonstration of his commitment to a cause” has become a trend. 
While the DSCI pledge is more towards the individual protecting his own privacy right, the FDPPI pledge is directed at protecting the privacy of the community of which the professional is a part.
In this context, I also draw attention to an earlier article, “Data Protection Hexagon… An approach to being compliant”, where I unveiled the Data Protection Hexagon as a depiction of the model for implementing Data Protection in an organization.
We may observe that one of the elements of this model, which is meant for “Motivating a Privacy and Data Protection Culture” in an organization, is “Acceptance”, which follows “Awareness”.
It has been my firm belief, which I first propounded as a “Theory of IS Motivation” in September 2009 and used as part of the Ujvala framework of HIPAA compliance audit, that merely creating “Awareness” about a law among employees does not convert itself into action. Hence we require a commitment from the employees in writing.
While getting a written commitment does not mean that an employee cannot still violate it, it at least creates an ethical barrier which the employee will have to cross before doing so.
Hence this principle was included in the PDPSI (Personal Data Protection Standard of India). Now the “Pentagon Model” has been upgraded into a “Hexagon Model” and included in PDPSI Version 2023. The addition is “Role Identification”.
In the PDPSI (those who have not studied the PDPSI framework may request Naavi for more information), one of the key Model Implementation Specifications (MIS) is “Distributed Responsibility”. Under distributed responsibility, every employee of an organization is expected to shoulder the responsibility of a DPO within his own data control space. This may at first glance appear to differ from the “Accountability” principle for an organization, where one “Designated DPO” is required to take responsibility for Privacy and Data Protection. But I think it is an extension of the same at the micro level. While the designated DPO continues to be accountable to the external world, within the organization he needs to be supported by every employee who, as part of his work, gets access to personal data and can misuse it if he wants before a control catches him/her.
In this direction, the detailed standard/specifications suggest the recognition of “Internal Data Controllers” and “Internal Data Processors”, where individual employees shoulder the responsibility of ensuring that the Data Protection Principles are always followed. This is also relevant in the case of unstructured data in the possession of an employee.
Hence, after becoming aware of the data protection requirements and accepting them, the employee also has to identify himself as an Internal Data Controller or Internal Data Processor with reference to a specific micro-level activity and apply his internal DPO obligations accordingly.
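As an illustration only (the class names, categories and decision rules below are assumptions made for this sketch, not part of the PDPSI specification), such role identification could be recorded per employee and per activity as follows:

```python
# Minimal, illustrative sketch of recording "Role Identification" per employee
# and per activity. Names and rules are assumptions, not PDPSI artefacts.
from dataclasses import dataclass
from enum import Enum

class InternalRole(Enum):
    INTERNAL_DATA_CONTROLLER = "decides purpose/means of processing within the activity"
    INTERNAL_DATA_PROCESSOR = "processes personal data on instructions within the activity"
    NOT_APPLICABLE = "does not handle personal data in this activity"

@dataclass
class RoleIdentification:
    employee: str
    activity: str
    handles_personal_data: bool
    decides_purpose: bool

    def role(self) -> InternalRole:
        if not self.handles_personal_data:
            return InternalRole.NOT_APPLICABLE
        if self.decides_purpose:
            return InternalRole.INTERNAL_DATA_CONTROLLER
        return InternalRole.INTERNAL_DATA_PROCESSOR

print(RoleIdentification("analyst_01", "customer support tickets", True, False).role())
```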
Hence “Role Identification” has now been added to the Pentagon Model to redefine the motivational framework for Data Protection implementation in an organization. The “Tools” represent the policies, the different software tools made available to the individual, as well as the training opportunities that could go beyond “Awareness” into related skill development. Incentives and Sanctions are the inevitable parts of the puzzle required to motivate compliance and discourage non-compliance.
While these principles are part of the FDPPI training of DPOs, I thought that on this auspicious occasion of Data Privacy Day 2023, I should share these thoughts with the community.
(Comments welcome)
Naavi
PS: It is still a matter of intrigue why this concept, initiated before 2009 and implemented in the HIPAA and ITA 2008 audits of Ujvala, has taken 14 years to become a trend despite being published. I hope that the adoption of PDPSI will not take another 14 years to become a trend. For Naavi, who has taken legislation for Neuro Rights and AI Rights as goals for the near future, convincing the professional community that PDPSI Version 2023 is the Privacy and Data Protection audit framework to follow will be another goal for 2023 and a Privacy Day commitment.

Right to be Forgotten under Indian Law.. Participate in the Discussion

The Right to be Forgotten was a distinct feature of GDPR and a concept recognized by the EU Court of Justice.

In the case against Google, the CJEU found that the law required a halt to the online publication of search results that were no longer relevant after a certain amount of time had passed and that the individual wanted removed. Google was found to be a data controller that had to respect the right of individuals to control their own data.

This arose out of an interpretation of Article 17 of GDPR which states as follows:

Article 17:Right to erasure (‘right to be forgotten’)

1. The data subject shall have the right to obtain from the controller the erasure of personal data concerning him or her without undue delay and the controller shall have the obligation to erase personal data without undue delay where one of the following grounds applies:

(a) the personal data are no longer necessary in relation to the purposes for which they were collected or otherwise processed;
(b) the data subject withdraws consent on which the processing is based according to point (a) of Article 6(1), or point (a) of Article 9(2), and where there is no other legal ground for the processing;
(c) the data subject objects to the processing pursuant to Article 21(1) and there are no overriding legitimate grounds for the processing, or the data subject objects to the processing pursuant to Article 21(2);
(d) the personal data have been unlawfully processed;
(e) the personal data have to be erased for compliance with a legal obligation in Union or Member State law to which the controller is subject;
(f) the personal data have been collected in relation to the offer of information society services referred to in Article 8(1).

2. Where the controller has made the personal data public and is obliged pursuant to paragraph 1 to erase the personal data, the controller, taking account of available technology and the cost of implementation, shall take reasonable steps, including technical measures, to inform controllers which are processing the personal data that the data subject has requested the erasure by such controllers of any links to, or copy or replication of, those personal data.

3. Paragraphs 1 and 2 shall not apply to the extent that processing is necessary:

(a) for exercising the right of freedom of expression and information;
(b) for compliance with a legal obligation which requires processing by Union or Member State law to which the controller is subject or for the performance of a task carried out in the public interest or in the exercise of official authority vested in the controller;
(c) for reasons of public interest in the area of public health in accordance with points (h) and (i) of Article 9(2) as well as Article 9(3);
(d) for archiving purposes in the public interest, scientific or historical research purposes or statistical purposes in accordance with Article 89(1) in so far as the right referred to in paragraph 1 is likely to render impossible or seriously impair the achievement of the objectives of that processing; or
(e) for the establishment, exercise or defence of legal claims.

In India, PDPB 2019 specified under Section 20 that the Right to be Forgotten could be exercised with the order of an Adjudicating Officer. The section stated as follows:

20. Right to be forgotten.

(1) The data principal shall have the right to restrict or prevent the continuing disclosure of his personal data by a data fiduciary where such disclosure—

(a) has served the purpose for which it was collected or is no longer necessary for the purpose;
(b) was made with the consent of the data principal under section 11 and such consent has since been withdrawn; or
(c) was made contrary to the provisions of this Act or any other law for the time being in force.

(2) The rights under sub-section (1) may be enforced only on an order of the Adjudicating Officer made on an application filed by the data principal, in such form and  manner as may be prescribed, on any of the grounds specified under clauses (a), (b) or clause (c) of that sub-section:

Provided that no order shall be made under this sub-section unless it is shown by the data principal that his right or interest in preventing or restricting the continued disclosure of his personal data overrides the right to freedom of speech and expression and the right to information of any other citizen.

(3) The Adjudicating Officer shall, while making an order under sub-section (2), have regard to—

(a) the sensitivity of the personal data;
(b) the scale of disclosure and the degree of accessibility sought to be restricted or prevented;
(c) the role of the data principal in public life;
(d) the relevance of the personal data to the public; and
(e) the nature of the disclosure and of the activities of the data fiduciary, particularly whether the data fiduciary systematically facilitates access to personal  data and whether the activities shall be significantly impeded if disclosures of the relevant nature were to be restricted or prevented.

(4) Where any person finds that personal data, the disclosure of which has been restricted or prevented by an order of the Adjudicating Officer under sub-section (2), does not satisfy the conditions referred to in that sub-section, he may apply for the review of that order to the Adjudicating Officer in such manner as may be prescribed, and the Adjudicating Officer shall review his order.
(5) Any person aggrieved by an order made under this section by the Adjudicating Officer may prefer an appeal to the Appellate Tribunal.

In DPDPB 2022, there is Section 13, which states as follows:

13. Right to correction and erasure of personal data

(1) A Data Principal shall have the right to correction and erasure of her personal data, in accordance with the applicable laws and in such manner as may be prescribed.

(2) A Data Fiduciary shall, upon receiving a request for such correction and erasure from a Data Principal:

(a) correct a Data Principal’s inaccurate or misleading personal data;

(b) complete a Data Principal’s incomplete personal data;

(c) update a Data Principal’s personal data;

(d) erase the personal data of a Data Principal that is no longer necessary for the purpose for which it was processed unless retention is necessary for a legal purpose.

Additionally, DPDPB 2022 (as well as PDPB 2019) recognized the right of a search engine to present data as part of public interest and deemed consent.

Hence it appears that India has not endorsed the EU Court’s view that a Right to Forget exists automatically.

Whether Section 13 of DPDPB 2022 has to be interpreted as restricted to the “erasure” of data for further processing only, or extends to sending the identifiable data into “oblivion”, is a point that is not clear.

Followers of GDPR would interpret Section 13 of DPDPB 2022 as both a Right to Erasure and a Right to be Forgotten. But followers of PDPB 2019 may interpret that India wanted to distinguish the Right to be Forgotten from the Right to Erasure, and hence created two sections in PDPB 2019 but decided to completely drop the Right to be Forgotten in DPDPB 2022.

There have been a few High Court decisions in the past where Courts have agreed to redact the names of acquitted accused persons from published judgements. Whether this has created the law of the “Right to be Forgotten” in India is a moot point.

Recently the Kerala High Court had an opportunity to analyse the law in depth and has come up with a very elaborate analysis of the Right to be forgotten.

These details have already been presented at Naavi.org over the following five articles:

Hats off to the Kerala Judgement on Right to Forget-5: Evolution of the Right to be forgotten
Hats off to the Kerala Judgement on Right to Forget-4: Need for Transparency in Judiciary
Hats off to the Kerala Judgement on Right to Forget-3: Right to Forget is not Right to Anonymity..
Hats off to the Kerala Judgement on Right to Forget..2: Ratio Decidendi in Puttaswamy Judgement
Hats off to Kerala High Court for it’s treatise on Right to Forget

As part of the Privacy Day activities, FDPPI is conducting a webinar on 28th January 2023 at 4.00 pm IST, consisting of a Moot Court discussion on the right of litigants in court cases to request the redaction of their names from published judgements and from the publication of the judgements in the Indian Kanoon database or Google Search.

Interested persons can register for this Zoom Webinar at

Register in advance for this webinar here

Naavi

 


The UNESCO Guideline on AI Regulation

A resolution adopted at UNESCO’s 41st Session in Paris, held in November 2021, put out a guidance note for member nations, which was published in 2022.

A copy of the note is available here: 

The G20 meet of 2020 had also flagged the issue. The EU even brought out a draft AI Act last year.

However, with the release of ChatGPT and the possibility of other similar AI models being released by Google shortly, the world has reached a stage similar to when a nuclear reactor goes critical. The pot of AI is now boiling. If we can control its use, we may direct its energies to the benefit of society.

But if we remain complacent, or think that doomsday is far off and relax, we will soon see uncontrolled developments of AI that will kill society.

In the long run the human race faces an existential threat. In the short term we will see a spurt of cyber crimes and a disruption of a magnitude we are not yet aware of.

Before the matter goes out of control, society needs to act positively and try its best to delay the inevitable.

The danger of AI should be seen in combination with the developments in neuro technology, which will provide AI a direct entry into human minds, and the developments in VR as a new immersive way for computers to take over human faculties.

We therefore request all responsible members of society to start addressing this immediate need.

Naavi.org has already started a community on the Telegram platform to discuss this. Naavi has also raised this issue in a G 20 forum.

Additionally, Naavi is addressing the public today through a YouTube live session to place these concerns on a public platform.

Please do attend the session, either on Zoom or on the YouTube stream, today, 26th January 2023, at 11.30 am (IST).

The link is given below.

Zoom Meeting
https://us02web.zoom.us/j/88675200348?pwd=cGZLOGZ4eVM5TzdFMEJHTXdsSEhPZz09

Or

https://www.youtube.com/@VijayashankarNa/streams.

Today is “Republic Day” in India, but it is also the right time to start a discussion on this topic, which is most relevant from the point of view of preserving the future of the human race.

Naavi

PS:

 


Data Protection Hexagon.. An Approach to being compliant

To be compliant with data protection, or privacy protection through personal data protection, an organization needs to implement a systematic approach, like a project implementation. “Privacy by Design” is the term used in the industry to indicate this approach.

In implementing an effective Personal Data Protection Program (PDPP), we need to consider that the most important requirements are to:

a) Involve the entire workforce in the compliance plan as a team effort

b) Keep the workforce motivated to implement the plan and maintain it as a continuing requirement.

Naavi recommends a six-step process to motivate the workforce to collectively implement the privacy program for an organization.

The six steps, shown in the diagram as the six elements of a hexagon, are:

    1. Awareness
    2. Acceptance
    3. Role Identification
    4. Tools
    5. Incentives
    6. Sanctions

Awareness building is the common implementation step, easily understood as conducting the necessary trainings so that the target audience (employees) understands the requirements of the data protection laws. This can be done at two levels, namely one at the management level and another at the workforce level.

Acceptance building is a process where the workforce agrees, from the bottom of their hearts, with the learnings of the awareness building exercise. A commitment from each member of the workforce to be compliant is always a good strategy to ensure that trainings do not remain mere check-box exercises.

Role identification is a process where the knowledge of what is required for data protection compliance, built during awareness building, is applied to an individual’s work responsibilities, so that they can identify whether they access personal data and, if so, how they need to implement the compliance requirements within their sphere of influence.

Tools provision is the responsibility of the organization and consists of policy documents (properly explained to the workforce) and the technical tools required for discovery of personal data, consent tagging, encryption, data leak prevention, etc.

Incentives are an important aspect of positive motivation, so that a good compliance culture exhibited by the workforce is rewarded in some manner, whether financially or otherwise.

Sanctions are also essential, since non-conformance needs to have a consequence; without it the value of incentivisation is diminished and complacency sets in.

This Hexagonal Approach to Data Protection Motivation is inspired by the Theory of Information Security Motivation and the Pentagon model that Naavi had published several years back.

As in the Pentagon model, where the five elements of motivation were considered as the five walls of a pentagon rather than as a hierarchy of one after another, the Hexagonal Model of Data Protection Compliance should also be considered a “Compact Hexagon”, where each of the elements is a wall of the hexagon and the hexagon is closed.

As a closed hexagon, all six elements are expected to be present simultaneously, rather than built hierarchically, where only some elements such as training and policy documents are provided and the workforce is then expected to maintain a compliance culture on its own.
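As a rough illustration (the representation below is an assumption made for this article’s sketch, not an FDPPI artefact), the closed-hexagon idea can be thought of as a checklist in which a compliance program is complete only when none of the six walls is missing:

```python
# Minimal sketch: treat the "Compact Hexagon" as a checklist in which all six
# walls must be present at once; the program is "closed" only if none is missing.
HEXAGON_ELEMENTS = ("Awareness", "Acceptance", "Role Identification",
                    "Tools", "Incentives", "Sanctions")

def hexagon_status(implemented):
    """Return the missing walls; an empty result means the hexagon is closed."""
    return [element for element in HEXAGON_ELEMENTS if element not in implemented]

# A hierarchical roll-out that stopped at training and tools is not enough:
partial_program = {"Awareness", "Tools"}
print(hexagon_status(partial_program))   # four walls still missing
```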

FDPPI’s framework of Data Protection Compliance Standard of India (DPCSI) is geared towards implementing a compliance program in conformity with this Hexagonal motivational model.

The “Distributed Responsibility” concept used in DPCSI is a unique binding factor that enhances the efficiency of the compliance program, and to make it work, this hexagonal model of motivation would be useful.

Comments welcome.

Naavi


Naavi responding to the UNESCO call for AI Regulation

Naavi has been advocating a Neuro Rights Act for India for some time now. The website www.neurorights.in captures the developments in neuro technology, building the case for Neuro Rights legislation.

In the meantime, the advent of GPT3/GPT4/DALL-E etc. has opened up new doors of excitement in the AI world and simultaneously triggered the concern of one section of civil society that an AI takeover of the human race is nearer than we think.

Many experts watching the development of AGI (Artificial General Intelligence) and ASI (Artificial Super Intelligence), as against the ANI (Artificial Narrow Intelligence) that is common today, suggest that in the next 30-40 years there is a potential risk of ASIs taking over decision making in the creation and development of AI devices/algorithms.

A horizon of 30-40 years is within the lifespan of today’s youth, and this looks more dangerous than any other risk to mankind other than an alien attack. Alternatively, the ASI robots may themselves represent the aliens who will wipe out mankind. An asteroid-hit risk today is manageable, and a nuclear war may affect only parts of our planet. But a rogue army of ASI robots could enslave humankind the way “Rise of the Planet of the Apes” suggests, and this could happen sooner than we think.

To some, this may look like speculation and fear-mongering. But there is no harm in guarding against the feared outcome even if it does not materialize. The current generation is being urged to plant trees, reduce the use of fossil fuels, etc., to preserve the planet for future generations. A whole lot of activities are geared towards protecting the Earth from plundering through mining, deforestation, etc.

We now need a movement in the IT domain to ensure that AI does not become a threat to mankind; we need to start flagging this possibility and working towards finding solutions.

UNESCO has already called upon member nations to work on regulation, the way UNCITRAL gave a call for e-commerce laws in 1996, which gave birth to the Information Technology Act 2000. Now India is on the verge of a new Digital India Act. It is the right time to consider having the Digital India Act (DIA) include the requirements of an Artificial Intelligence Act (AIA). More appropriately, just as the Telecom Regulatory Act, UIDAI Act, DPDPB 2022, etc. stand apart from ITA 2000, the Artificial Intelligence Act can be a separate Act, since it has many nuances to be considered before it becomes a full-fledged law, and combining it with the amendment of ITA 2000/8 would delay other amendments for which the Government might be ready now.

Naavi.org will therefore start taking action to mobilize experts into a task force for developing an Artificial Intelligence Act of India. At some point in the future, MeitY may set up a similar committee. However, in order not to waste time, we have initiated some action immediately.

This year, India is presiding over the G 20, and the G 20 had already adopted a preliminary resolution in 2020 about working on AI regulation. It is therefore suggested that the G 20 this year work on taking the discussion on AI regulation further in India.

Naavi has tried to bring together like-minded persons into a common message group, and those interested in joining this group may contact Naavi. This group will work not only on the AI Act but also on the Neuro Rights Act, and will try to develop draft legislation for both.

Naavi

 

 
