New Aadhaar App to assist Age Verification for DPDPA

The UIDAI has launched a new Aadhaar App which, according to the Secretary of MeitY, can be used for age verification under DPDPA. Necessary amendments have been made to the SWIK Rules, i.e. the Aadhaar Authentication for Good Governance (Social Welfare, Innovation, Knowledge) Rules, 2020, to enable private entities to provide services using Aadhaar authentication on a secure basis.

This was expected and is a welcome move to resolve the difficulty of “Verifiable Consent” envisaged under DPDPA.

The new Aadhaar app is an official mobile application developed by UIDAI that enables digital, offline, and consent‑based Aadhaar verification. Unlike earlier apps, it allows users to verify their identity using Face Authentication or QR scanning without revealing their Aadhaar number. It offers features such as selective data sharing via QR codes, biometric lock/unlock, authentication history, and management of up to five family Aadhaar profiles. The app supports use cases like hotel check-ins, hospital visits, age verification, and gig worker verification.
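To see why offline, consent-based verification is possible at all, it helps to note that a verifier only needs to check that the disclosed claims were signed by a trusted issuer; no network lookup of the Aadhaar number is required. The sketch below is a purely illustrative toy, not the actual Aadhaar Secure QR format: the real QR payload is digitally signed with UIDAI's RSA private key and verified against its published public key, whereas this dependency-free sketch uses an HMAC with a hypothetical shared key, and all names (`ISSUER_KEY`, `make_qr_payload`, `verify_qr_payload`) are invented for illustration.

```python
import base64
import hashlib
import hmac
import json

# Hypothetical issuer key for this sketch only. The real Aadhaar Secure QR
# is signed with UIDAI's RSA private key; HMAC stands in here to keep the
# example self-contained.
ISSUER_KEY = b"demo-issuer-key"

def make_qr_payload(attributes: dict) -> str:
    """Serialize selected attributes and append a tamper-evident tag."""
    body = json.dumps(attributes, sort_keys=True).encode()
    tag = hmac.new(ISSUER_KEY, body, hashlib.sha256).hexdigest()
    return base64.b64encode(body).decode() + "." + tag

def verify_qr_payload(payload: str):
    """Return the attributes if the tag checks out, else None.

    This check needs no network call, which is what makes
    offline verification possible."""
    encoded, tag = payload.rsplit(".", 1)
    body = base64.b64decode(encoded)
    expected = hmac.new(ISSUER_KEY, body, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(tag, expected):
        return None
    return json.loads(body)

# Selective disclosure: the QR carries only derived claims such as
# "age over 18", never the Aadhaar number itself.
qr = make_qr_payload({"age_over_18": True, "name_initials": "S.K."})
claims = verify_qr_payload(qr)
```

The design point is that the data principal chooses which derived claims go into the payload, and a tampered payload simply fails verification.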

The new Aadhaar app offers several advantages over older verification methods.

  • Eliminates the need for physical Aadhaar cards.
  • Enhances privacy through masked and offline verification.
  • Faster identity verification for daily services.
  • Reduces risk of Aadhaar data misuse.
  • Works even in low or no‑internet environments.
  • Government‑backed and officially launched by UIDAI.
  • Several personal information updates can be completed using the app without visiting an Aadhaar Kendra.

Naavi

Reference:

The Hindu

About the new App at cleartax

Posted in Privacy | Leave a comment

PIL Filed in Madras High Court on Section 63 of BSA

Whenever an electronic document is to be presented as evidence in a court of law, ITA 2000 expected a certificate about the reliability of the document to be produced for admission purposes. Earlier, this was done through Section 65B of the Indian Evidence Act.

Naavi was the first person in India to produce such a certificate in a court case, starting with the State of Tamil Nadu vs Suhas Katti case in 2004. Since then, more than 125 such certificates have been produced by Naavi in different courts. While Naavi has now stopped providing such certificates due to the inability to attend court hearings to verify them, other associates have been developed for the purpose in Bangalore.

After the replacement of the Indian Evidence Act with the Bharatiya Sakshya Adhiniyam (BSA), the new section relating to certification is Section 63.

The MHA made some changes to the Section 65B certification requirements in the process.

Now a PIL has been filed in Chennai by Sri S. Balu, former Additional SP, on behalf of the Cyber Society of India. Mr Balu was in charge of the Chennai Cyber Crime Police Station and has worked on many cyber crime cases along with Naavi. The petition has been filed in the Madras High Court as writ petition WP/0047513/2025 in the Court of the Honourable Chief Justice G. Arul Murugan. It will be taken up along with WP 37423, 27426 and 27880 of 2024 on a future date (not yet announced).

The petition prays for an amendment to Section 63(4)(c), citing the impracticality of certain provisions of the Section.

The response of the Court will be interesting, and we shall follow the developments here.

Section 63(4) of BSA is reproduced here for reference:

In any proceeding where it is desired to give a statement in evidence by virtue of this section, a certificate doing any of the following things shall be submitted along with the electronic record at each instance where it is being submitted for admission, namely:-

(a) identifying the electronic record containing the statement and describing the manner in which it was produced;

(b) giving such particulars of any device involved in the production of that electronic record as may be appropriate for the purpose of showing that the electronic record was produced by a computer or a communication device referred to in clauses (a) to (e) of sub-section (3);

(c) dealing with any of the matters to which the conditions mentioned in sub-section (2) relate,

and purporting to be signed by a person in charge of the computer or communication device or the management of the relevant activities (whichever is appropriate) and an expert, shall be evidence of any matter stated in the certificate; and for the purposes of this sub-section it shall be sufficient for a matter to be stated to the best of the knowledge and belief of the person stating it in the certificate specified in the Schedule.

Naavi

Refer: Earlier article  Section 63 of Bharatiya Sakshya Adhiniyam

 


FDPPI would like to see the DPDPA petitions in the Supreme Court cleared at the earliest

India has witnessed a continued battle over the introduction of privacy laws for two decades. Every attempt by the Government has faced opposition on one ground or the other, whether in 2006, when the Personal Data Protection Bill 2006 was introduced in Parliament by the then Government of which Mr Kapil Sibal was a part, or during 2017 to 2023, when the Supreme Court, through the Puttaswamy judgement, pushed the need for a law.

Now, after a long delay, the Government of the day has taken steps to announce the timeline of implementation. The law was enacted on 11th August 2023, but implementation is happening only on 13th May 2027, nearly four years later.

From 2023 till date, activists had the freedom to assist the Government in making appropriate changes, provided they were willing to show some flexibility and understand that “Privacy cannot be allowed to be a tool for criminals to hide behind”. They however waited till the date of implementation was frozen and have now gone to the Supreme Court.

On the face of it, the petitions of the Reporters’ Collective, Mr Venkatesh Nayak and NCPRI are focussed on the dilution of the RTI Act, but they are not limited to the controversy over Section 44(3). The prayer extends to the scrapping of the DPDPA and the Rules.

The grounds, apart from the RTI Act, include “unfettered powers to the Government for surveillance”, “the DPB being susceptible to executive control”, provisions that are “vague, overbroad and arbitrary”, “disproportionate to the needs”, “enabling unreasonable digital searches”, and “lack of balance between the protection of privacy rights and the right to information”.

One of the petitions specifically asks for striking down of Sections 5, 6, 8, 10, 17, 18, 19, 36, and 44(3), alongside Rules 3, 6, 7, 8, 9, 13, 16, 17, and 23 of the 2025 Rules.

In the past we have seen that the Government of India has not adequately defended the rights of citizens in the Supreme Court against powerful advocates such as Mr Kapil Sibal, Prashant Bhushan and Vrinda Grover. These are firebrand advocates considered capable of swaying the views of the Court through their commendable skills of articulation.

FDPPI is committed to ensuring that DPDPA is implemented without further delay. While we do support many changes to the Rules to enable “Compliance without Pain” and “Penalty without Grudge”, the objective of “DPDPA implementation at the earliest” remains at the forefront.

We shall therefore publish a series of informative articles here explaining each of the DPDPA clauses on which objections have been raised.

We may also have to take a look at the Subash Chandra, Ankit Garg and Girish Ramachandra Deshpande cases, which have been cited along with the Puttaswamy case in support of the petitions.

We believe that the Rules are flexible and can be tweaked if necessary. The Supreme Court also has the power to read down any of the provisions of the law. A combination of “reading down” and tweaking of the Rules could together satisfy the petitioners without the need to scrap the law.

We hope the information provided here will help other professionals understand and follow the case more effectively. (The next hearing is on March 23.)

Follow us and contribute your thoughts…

Naavi


10 year journey with GDPR

On 25th May 2016, GDPR became law. It provided a window of two years for implementation, and the law became effective from 25th May 2018. We now have the experience of eight years of implementation and hundreds of cases where penalties were imposed. According to enforcementtracker.com, 2,775 fines have been recorded, amounting to a total of about EUR 6.8 billion. We are not aware of how much has actually been collected, or of the state of the litigation. About 30-40 fines are now being imposed each month (refer to the tracker report, 2025).

The highest fine imposed was EUR 1.2 billion, on Meta Platforms. Some other countries have mocked the astronomical fines imposed by GDPR authorities in various countries. Many of these fines remain under dispute, and we may need to wait a long time before they become a reality. Since the EU had a Data Protection Directive even before GDPR, some trials undertaken after 25th May 2018 were based on the earlier Directive.

Many countries that followed the EU with their own laws also adopted measures to impose their own fines, and a global cost of data management was imposed on the industry. Of these, the UK has imposed fines of about GBP 15 million. Cumulative data for other countries is not easily available.

The practice of imposing fines on a global-turnover basis and on foreign entities created fear and an urgency for compliance, but has not endeared GDPR to organizations.

Organizations incurred high costs of compliance, particularly during 2018-2020, and have maintained substantial expenses since then. According to one survey, investment in compliance during 2016-2018 was around $7.8 billion; since then, about 40% of organizations have spent around $10 million each year, while around 88% spend less than $1 million. In 2025, the global market for GDPR tools was estimated at around $3.7 billion. A conservative estimate at the global level indicates more than $20 billion invested in compliance.

In India, it is estimated that the industry will spend around Rs 10,000 crore on compliance over the next three years.

The transparency brought about by GDPR is good for the public, but there are still problems of consent fatigue, and the realization that in the long run this cost can ultimately only be borne by consumers, since large data processors have continued to prosper.

The smaller entities in the industry, despite the exemptions provided under GDPR, have however borne the brunt of the increased compliance burden.

India now has an opportunity to learn from these developments and ensure that SMEs and MSMEs are not unduly harassed as if this were a new tax regime. The responsibility for this falls squarely on the Data Protection Board and MeitY.

While many other organizations will look at the so-called “Rs 10,000 crore market” and how they can exploit it, FDPPI is concerned about:

a) How to increase awareness of compliance, particularly at the industry level

b) How to ensure that the penalty system remains fair

c) How to ensure that the rules of compliance are  practical

We have miles to go before we sleep… to achieve “Compliance without Pain and Penalty without a Grudge”.

Naavi

 


The AI Summit ..Sarvam AI mayam…But where is AI security?

The India AI Impact Summit has been a great success despite the first-day problems of crowd management and the needless embarrassment caused by one of the exhibitors. It has created a high degree of awareness among the Indian public and has also drawn international attention to India’s progress in the field. It will take some time for the current status of AI to be fully understood amid the “Sarvam AI mayam” euphoria created by the event.

Despite the many reports about the event in the media, there is not much coverage of the “AI risks”, both to users and to society.

Normally, innovators are not concerned about the impact of a new technology on society. The talk of “Ethics” is simply eyewash. Until “Ethics” is enforced through a law that is a sufficient deterrent, no commercial organization can be expected to recognize “Ethics” beyond the word being repeated in speeches.

It is the responsibility of society to consider whether India has to recognize AI risks and take regulatory steps to ensure that they do not become a problem for society, the way cyber crimes have.

AI-driven risks may manifest both as operational risks and as AI-driven cyber crimes. They will create a larger challenge for society which cannot be ignored.

These are in addition to the debates over whether AI will result in job losses, businesses going bust, AI taking over from humans, and so on.

Were there any stalls at the summit on these themes? Were there panel discussions? Were there expert talks? Were solutions discussed? We need to explore.

In the meantime, I leave below some instances of AI-related issues in health care which I collected a few days back, and which should open our eyes to the operational risks in the use of AI.

  • UnitedHealth & Humana “nH Predict” algorithm (2025): An AI algorithm used to deny coverage to elderly patients had a 90% error rate on appeal. The system, optimized for cost-cutting, disproportionately impacted patients, with human reviewers overturning 9 out of 10 denials.
  • Dermatology AI bias (2024): A study on skin cancer detection AI found that most systems struggled to perform on non-white skin, with significant drops in sensitivity for dark-skinned individuals.
  • Pulse oximeter overestimation (2024): A UK review confirmed that pulse oximeters, often aided by AI, tended to overestimate oxygen levels in people with darker skin, leading to potential delays in treatment.
  • Epic sepsis model (2022/2024): A sepsis prediction model widely deployed in hundreds of U.S. hospitals was found to perform far worse than advertised, missing 67% of sepsis cases while triggering excessive false alarms.
  • Fake medical information (2025): Studies showed that AI chatbots such as GPT-4 failed to gather complete medical histories and sometimes generated incorrect, dangerous diagnoses based on simulated patient conversations.
  • ECG misinterpretation (2025): In a 2025 trial, an AI-enabled ECG tool wrongly flagged a heart attack in a healthy 29-year-old woman, illustrating how models can be “statistically confident while still being clinically wrong”.
  • NEDA “Tessa” chatbot (2023): The National Eating Disorders Association had to disable its chatbot, Tessa, after it was found to be providing dangerous weight-loss advice and calorie-tracking recommendations to people with eating disorders.
  • Data privacy violations (DeepMind): Google’s DeepMind received criticism after it was revealed that the NHS had provided data on 1.6 million patients to train its “Streams” app without proper patient consent.
  • Robotic surgery failures (2023): AI-powered robotic systems have shown failures where electrical current can leave the robot, resulting in accidental burns to surrounding tissue.

Let us study such incidents and try to find solutions in the form of technology and governance.

We need to start discussing solutions to AI risks and the need for new regulations, including modification of ITA 2000 and the introduction of the concept of Neuro Rights within DPDPA.

Naavi


The DPDPA Challenge in Supreme Court

On February 16, the Supreme Court heard three petitions challenging the DPDPA as well as the Rules.

The key aspects of the disputes raised are:

  1. Section 44(3), which amends Section 8(1) of the RTI Act, will dilute the current provisions.
  2. The Government seeks powers to demand data from data fiduciaries.
  3. The Act fails to strike a balance between the right to privacy and the right to freedom of information.

Naavi.org will analyse the petitions in detail in due course. We are in receipt of a copy of one of the petitions, posted by Mr Apar Gupta on his website. Copies of the other two petitions are not yet available.

Naavi
