AI Enabled Data Analytics and DPDPA Risk.. DGPSI..3

One of the hallmarks of DGPSI (Digital Governance and Protection Standard of India) is that it recommends a “Process Based Approach” to compliance, with process-level compliance aggregated to arrive at “Enterprise Level Compliance”.

In other words, the DGPMS (Digital Governance and Protection Management System) is an aggregation of Process 1, Process 2, … Process n, where each process is a technology process in which personal data is received as an input, generated as an output, or is stored, modified or disclosed.

One example of this approach is website compliance. Here, the “Corporate Website” is treated as a process, and compliance under the DPDPA applies to the personal data collected during the visit of a data principal to the website serving corporate information. Since the purpose of the website is the serving of corporate information, the collection of personal data should be limited to that purpose, retained only as long as the purpose requires, and secured for the duration of the purpose. DGPSI discourages the use of “Omnibus Privacy Notices” and recommends a process-specific privacy notice and consent.
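To make the idea concrete, here is a minimal sketch in Python of how a process-level view of compliance could be modelled. The class and field names (PIIProcess, notice_given and so on) are my own illustration and are not prescribed by DGPSI; a real assessment would also verify purpose limitation, retention and security.

# A minimal, hypothetical model of the process based approach; the class and
# field names are illustrative and not prescribed by DGPSI.
from dataclasses import dataclass

@dataclass
class PIIProcess:
    name: str               # e.g. "Corporate Website"
    purpose: str            # the single declared purpose of the process
    data_items: list        # personal data collected for that purpose only
    retention_days: int     # retained only as long as the purpose requires
    notice_given: bool      # process-specific privacy notice served
    consent_obtained: bool  # process-specific consent recorded

    def is_compliant(self) -> bool:
        # Simplified check: a real assessment would also test purpose
        # limitation, retention and security during the purpose.
        return self.notice_given and self.consent_obtained

def enterprise_compliance(processes: list) -> bool:
    # DGPMS view: enterprise-level compliance is the aggregation of the
    # process-level results; one non-compliant process breaks it.
    return all(p.is_compliant() for p in processes)

website = PIIProcess("Corporate Website", "Serving corporate information",
                     ["IP address", "contact form entries"], 90, True, True)
print(enterprise_compliance([website]))   # True only when every process passes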

Similarly, under this principle, AI enabled Data Analytics can be considered a “PII Process” which is required to be compliant with the DPDPA, and it can be assessed separately and certified for compliance.

DPDPA Compliance (DP.COM) for AI Enabled Data Analytics can be seen as a combination of the “DP.COM of the AI algorithm” used and the “DP.COM of the Data Analytics algorithm” used. The AI itself can be defective due to bias and hallucination, while the Data Analytics layer may ignore notice and consent requirements. Together, the DPDPA risks may not merely add up but compound, a doubling or even a “squaring” of the risk.
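Purely as an illustration of why the risk compounds, consider the AI layer and the analytics layer as two independent failure channels. The probabilities below are assumed for the example, not measured:

# Illustrative arithmetic only; the probabilities are assumed, not measured.
p_ai = 0.10         # assumed chance of a violation from AI defects (bias, hallucination)
p_analytics = 0.10  # assumed chance of a violation from analytics ignoring notice/consent

# Probability that at least one of the two layers causes a DPDPA violation,
# treating the two failure modes as independent.
p_combined = 1 - (1 - p_ai) * (1 - p_analytics)
print(round(p_combined, 2))          # 0.19: nearly double either individual risk

# Probability that both layers fail together, the compounded ("squared") case.
print(round(p_ai * p_analytics, 2))  # 0.01: rarer, but with a far higher impact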

During last week’s ETCIO conference in Bengaluru, the presentations of many companies indicated an aggressive use of AI Enabled Data Analytics to draw different “Insights” into the behaviour of customers and to generate automated decisions that could persuade the customers of a service towards a desired objective, such as a purchase on an e-commerce website.

While, as an ex-Marketing professional, I do agree that businesses should have the ability to profile their customers and direct their marketing efforts to bring maximum customer satisfaction even in the “Post Purchase Experience”, as a Privacy and Data Protection professional I am constrained to point out that a “Consent” is required from the customer before his personal data is collected, and that it cannot be collected deceptively and manipulated to conclude a sale.

It is not correct to object to “Data Subject Manipulation” only when a Cambridge Analytica uses personal data to create ads for an election campaign, while ignoring an e-commerce entity that makes you buy things which you do not want.

When I pointed out that AI + Data Analytics carries the probability of being applied with negative intent, I was indicating that “Dark Patterns” and “Deceptive Marketing” are legally not allowed. This could become a non-compliance issue and lead to DPDPA fines.

In this connection, I want to draw the attention of the audience to the Consumer Protection Act 2019 and the notification on Dark Patterns issued on 30th November 2023, which states:

“dark patterns” shall mean any practices or deceptive design pattern using user interface or user experience interactions on any platform that is designed to mislead or trick users to do something they originally did not intend or want to do, by subverting or impairing the consumer autonomy, decision making or choice, amounting to misleading advertisement or unfair trade practice or violation of consumer rights;

For details of the Consumer Protection Act and penalties, refer here:

The rules also provide a list of practices that may be considered as “Dark Pattern Practices” which include “False Urgency”, “Basket sneaking”, “Confirm shaming”, “Forced action”, “Subscription trap”, “Interface interference”, “Bait and Switch”, “Drip Pricing”, “Disguised Advertisement”, “Nagging”, “Trick question”, “SaaS billing”, “Rogue Malware”, etc.

Under DPDPA 2023, the “Data Fiduciary”, who is in the position of a trustee of the Data Principal, is obligated to process personal data only for a “lawful purpose”. The intention of the Consumer Protection Act and the above rule is to indicate that it is not lawful to use “Dark Patterns”, and doing so could lead to a penalty of up to Rs 250 crore under the DPDPA.

I request all the Tech Experts to review the AI Enabled Data Analytics patterns used by them and check that they are not “impairing the consumer autonomy, decision making or choice” or tricking users into doing something they originally did not intend to do.

DGPSI therefore recommends that the use of AI enabled Data Analytics be audited and brought into compliance with DPDPA requirements. DGPSI also recommends a specific policy for “Monetization” as well as for “Discovery Consent”.
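A minimal sketch of what such a process-level audit could look like is given below. The check names (monetization_policy, discovery_consent and the rest) are my own illustration and not a DGPSI artefact:

# Hypothetical audit checklist; the questions are illustrative, not a DGPSI specification.
DARK_PATTERNS = [
    "False Urgency", "Basket sneaking", "Confirm shaming", "Forced action",
    "Subscription trap", "Interface interference", "Bait and Switch",
    "Drip Pricing", "Disguised Advertisement", "Nagging", "Trick question",
    "SaaS billing", "Rogue Malware",
]

def audit_ai_analytics_process(answers: dict) -> list:
    # Return the list of open findings for one AI-enabled analytics process.
    findings = []
    if not answers.get("process_specific_notice"):
        findings.append("No process-specific privacy notice")
    if not answers.get("consent_recorded"):
        findings.append("No recorded consent for this purpose")
    if not answers.get("monetization_policy"):
        findings.append("No policy covering monetization of insights")
    if not answers.get("discovery_consent"):
        findings.append("No 'discovery consent' for newly derived insights")
    for dp in answers.get("dark_patterns_used", []):
        if dp in DARK_PATTERNS:
            findings.append("Dark pattern in use: " + dp)
    return findings

# Example: a process that uses "False Urgency" and lacks a monetization policy.
print(audit_ai_analytics_process({
    "process_specific_notice": True,
    "consent_recorded": True,
    "monetization_policy": False,
    "discovery_consent": True,
    "dark_patterns_used": ["False Urgency"],
}))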

I suggest that the interesting equation that ETCIO coined for their conference needs to be modified as

where i is the imaginary unit of complex number theory, representing the DPDPA impact.

(P.S.: Sorry to use complex number theory in explaining the concept. Ignore it if you want.)

If you disagree, please let me know why. If you agree, please let me know how you are going to meet the compliance gap when the DPDPA becomes effective, whenever the Government notifies the date from which the penalties take effect.

Naavi
