(P.S: DGPSI=Digital Governance and Protection Standard of India)
The EU AI Act, from a compliance perspective, mainly addresses the handling of AI in three specific contexts. The first is the development of AI (manufacturer), the second is the deployment of AI (provider or deployer), and the third is the distribution of AI software (importer or distributor).
Article 2(1) states that the regulation applies to
“providers placing on the market or putting into service AI systems or placing on the market general-purpose AI models in the Union, …, importers and distributors of AI systems, product manufacturers … and … affected persons”.
The term “manufacturer” has been used in the EU AI Act because one of the legislative concerns is the use of AI within other products, such as AI in automobiles. Here AI may be used as part of an automated product system, mainly for enhancing the quality and security of the product, as distinct from the use of AI in privacy contexts, where the emphasis is on “Profiling”, “Targeted Advertising”, “Behavioural Manipulation”, etc.
In terms of compliance, we need to look at each of the three contexts differently. Of these, development and deployment are the key areas of compliance. The “affected persons” are relevant from the perspective of identifying the “Harm” or “Risk” in deployment.
At the development stage, the AI developer/manufacturer needs to be transparent and ensure that the algorithm is free from bias. At the same time, the developer should ensure that the machine learning process uses data without infringing copyright.
When a physical product manufacturer such as an automobile manufacturer embeds an AI system, say for efficient braking based on visual detection of an obstruction, it may be using the AI as a “Component”. The responsibility for compliance as a developer should then rest primarily with the AI software manufacturer, though it gets transferred to the automobile manufacturer by virtue of the embedded product being marketed by them. In the IT scenario, such use of embedded products is more accurately identified as a case of “Joint Data Fiduciaries” or “Joint Data Controllers”. In the context of an automobile manufacturer, the role of the automobile manufacturer as a “Data Fiduciary” is not clearly recognized, but DGPSI recognizes this difference and treats the component as a “Data Processor”, the responsibility for which lies with the component manufacturer unless it is consciously taken over by the auto manufacturer.
The developer needs to establish an appropriate process for compliance during the development of an AI system, which includes a proper testing document that can be shared with the deployer as part of the conformity assessment report.
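To make the idea of a shareable testing document concrete, here is a minimal sketch of what a machine-readable conformity test record passed from developer to deployer might look like. The field names and structure below are purely illustrative assumptions; neither DGPSI nor the EU AI Act prescribes this format.

```python
# Hypothetical sketch of a testing record a developer might include
# in a conformity assessment report shared with a deployer.
# All field names here are illustrative assumptions.
from dataclasses import dataclass, field, asdict

@dataclass
class ConformityTestRecord:
    system_name: str
    version: str
    bias_tests_passed: bool                 # outcome of bias testing
    training_data_provenance_documented: bool  # copyright diligence
    known_limitations: list[str] = field(default_factory=list)

record = ConformityTestRecord(
    system_name="example-vision-model",
    version="1.0.0",
    bias_tests_passed=True,
    training_data_provenance_documented=True,
    known_limitations=["performance degrades in low light"],
)

# A plain dict is easy to serialize and hand over to the deployer.
report = asdict(record)
```

A structured record like this lets the deployer verify, at the point of taking over control, exactly which developer-side checks were performed.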
At the deployment stage, control of the AI system has passed to the “Deployer”, and hence the role of the developer in compliance during usage diminishes.
However, Article 61 of the EU AI Act prescribes a post-market monitoring system that is required to be set up by “Providers” to ensure compliance with the EU AI Act. Here the EU AI Act appears to use the term “Provider” from the perspective of both the developer and the deployer.
DGPSI, however, wants to maintain the distinction between the “Developer” and the “Deployer” and build compliance separately. Under DGPSI, both the development monitoring process and the deployment monitoring process can be expressed in terms of a Data Trust Score (DTS), which is how DGPSI expresses the maturity of compliance in general.
AI-DTS-Developer and AI-DTS-Deployer could be the two expressions used to denote this compliance.
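One way such a score could be computed is as a weighted aggregate of per-criterion assessments, kept separate for the developer and deployer roles. The criteria, weights, and 0–100 scale below are illustrative assumptions for discussion, not the actual DGPSI scoring methodology.

```python
# Hypothetical sketch of separate AI-DTS scores for developer and
# deployer roles. Criteria names, weights and the 0-100 scale are
# illustrative assumptions, not part of DGPSI itself.

def data_trust_score(scores: dict[str, int], weights: dict[str, float]) -> float:
    """Weighted average of per-criterion scores (0-100 scale)."""
    total_weight = sum(weights.values())
    return sum(scores[c] * weights[c] for c in scores) / total_weight

# Illustrative developer-side criteria: transparency, bias testing,
# copyright diligence over training data.
dev_scores = {"transparency": 80, "bias_testing": 70, "copyright_diligence": 90}
dev_weights = {"transparency": 0.4, "bias_testing": 0.4, "copyright_diligence": 0.2}

# Illustrative deployer-side criteria: consent handling, harm
# monitoring, grievance redressal.
dep_scores = {"consent": 85, "harm_monitoring": 60, "grievance": 75}
dep_weights = {"consent": 0.5, "harm_monitoring": 0.3, "grievance": 0.2}

ai_dts_developer = data_trust_score(dev_scores, dev_weights)  # 78.0
ai_dts_deployer = data_trust_score(dep_scores, dep_weights)   # 75.5
```

Keeping the two scores separate mirrors the DGPSI position that developer and deployer compliance should be monitored and expressed independently.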
AI deployers are the “Data Fiduciaries” under DPDPA 2023, and the compliance concern is mainly how the personal data collected in India is being processed by the AI system.
Article 61 of the EU AI Act sets out the requirement of “post-market monitoring” by providers for high-risk AI systems. Let us look at Article 61 as the basis for AI-DTS-Deployer.
Article 61.1 states that
“Providers shall establish and document a post-market monitoring system in a manner that is proportionate to the nature of the artificial intelligence technologies and the risks of the high-risk AI system.”
Article 61.2 states
“The post-market monitoring system shall actively and systematically collect, document and analyse relevant data which may be provided by deployers or which may be collected through other sources on the performance of high-risk AI systems throughout their lifetime, and allow the provider to evaluate the continuous compliance of AI systems with the requirements set out in Title III, Chapter 2.”
The wording of 61.2 indicates that the developer of the AI system has to receive post-market feedback, which is a debatable prescription.
It appears that through this prescription, the EU AI Act is legitimizing the installation of a backdoor by the developer.
Under DGPSI we reject this suggestion and identify the responsibilities of the developer separately from those of the deployer. It is open to them to determine whether they will be “Joint Data Fiduciaries” sharing the compliance responsibilities, or whether the deployer takes over the responsibilities entirely.
This is a key point of difference between the compliance requirements of the EU AI Act and the approach of DGPSI as a compliance framework. ISO 42001 may adopt the EU AI Act position, as it is expected to do, but DGPSI will maintain the distinction, while remaining flexible enough to treat the “AI backdoor” as a legitimate prescription only with the “Consent of the Deployer”.
This requires a full scale debate…
Naavi
P.S: This debate is intended to develop Privacy Jurisprudence. Experts are requested to treat this as a brainstorming debate and add their positive thoughts, to guide lawmakers in India to develop a better AI act than the EU AI Act.