The Digital Personal Data Protection (DPDP) Act of 2023 lays down the obligations of a Data Fiduciary. It received presidential assent on August 11, 2023, and will come into force on a date to be subsequently notified. The law reinforces the right to privacy and underlines the necessity of processing data only for lawful purposes.

Definition of Data Fiduciary

According to Section 2(i) of the Act, "Data Fiduciary" means any person who, alone or in conjunction with other persons, determines the purpose and means of processing of personal data.

General Obligations

Section 8 of the Act sets out the general obligations of Data Fiduciaries.

First and foremost, Data Fiduciaries bear the overarching responsibility for ensuring compliance with the provisions of the DPDP Act and its allied rules. This responsibility holds true irrespective of any agreement to the contrary or the failure of a Data Principal to carry out the duties provided under this Act. In essence, Data Fiduciaries are entrusted with the duty to ensure that all data processing activities, whether conducted by them directly or on their behalf by Data Processors, are in strict compliance with the law.

Data Fiduciaries may engage Data Processors to process personal data on their behalf, but only under a valid contract. Such contracts are integral to maintaining data integrity when personal data is processed for the purpose of offering goods or services to Data Principals.

Furthermore, when personal data is likely to be used to make a decision that affects the Data Principal, or is to be disclosed to another Data Fiduciary, the Data Fiduciary must ensure the completeness, accuracy, and consistency of that data.

To ensure effective compliance with the law, Data Fiduciaries are required to implement appropriate technical and organizational measures. These measures are central to safeguarding personal data and averting data breaches.

In the same vein, Data Fiduciaries must ensure the protection of the personal data within their custody, as well as that processed on their behalf by Data Processors. To accomplish this, they are required to put in place reasonable security safeguards designed to prevent data breaches.

In the event of a personal data breach, the Act mandates that Data Fiduciaries promptly inform the Data Protection Board and each affected Data Principal of the breach, in such form and manner as may be prescribed.

On data retention, Data Fiduciaries are expected to erase personal data on the earlier of two events: when the Data Principal withdraws consent, or when it is reasonable to assume that the specified purpose of processing is no longer being served. The time-frame for making this determination may vary with the nature of the Data Fiduciary and the purposes of processing, and will be prescribed later.

In addition to these obligations, Data Fiduciaries must publish the business contact information of a Data Protection Officer, where the requirement to appoint one applies to them, or otherwise of an individual able to answer questions raised by Data Principals about the processing of their personal data. Data Fiduciaries must also establish an effective grievance redressal mechanism, which is essential for ensuring transparency and accountability and for fostering trust.


The obligations of Data Fiduciaries under the DPDP Act are substantial and have far-reaching implications. These provisions are designed to safeguard personal data, cultivate trust between Data Fiduciaries and Data Principals, and ensure ethical, responsible, and compliant data use in today's data-centric world. With penalties of up to Rs. 250 crore for non-compliance, adherence to the Act is imperative.

Major Privacy Updates of the Week

Government of India Rolls Out First Edition of India AI 2023: Expert Group Report

The Ministry of Electronics and Information Technology (MeitY) set up seven specialized working groups to collaboratively develop the guiding principles, goals, and tactical plans for each of the key areas of IndiaAI. The first edition of the report, endorsed by the Prime Minister, will serve as the cornerstone of India's comprehensive AI strategy. The initiatives under the IndiaAI program demonstrate that the government's approach to AI spans a wide range of spheres.

The initiative adopts a dedicated strategy to address shortcomings in the current AI landscape, from computing infrastructure, data management, and research and development to funding and skill development. The seven working groups have laid down detailed steps to create Centres of Excellence (CoEs) and have outlined a governance structure for data collection, processing, and storage, to be supervised by the National Data Management Office.

The report further provides insights on how India can leverage its large population and IT expertise to grow AI skills and, ultimately, fortify its AI computing infrastructure through public and private collaborations.

Clearview AI Successfully Appeals $9 Million Fine in the U.K.

Clearview AI, a New York-based company that created a facial recognition app by scraping billions of public photos from the internet, has been exempted from paying a £7.5 million ($9.1 million) fine by a British appeals court. The court ruled that the UK’s chief data protection agency lacks jurisdiction over how foreign law enforcement uses data from British citizens. Despite this, regulators in Australia, Canada, and Europe have found Clearview AI in violation of their privacy laws for collecting citizens’ data without consent. The company has also been fined 20 million euros ($21 million) each by data protection agencies in France, Italy, and Greece. These fines pose a significant financial risk to Clearview AI, which has raised just over $38 million in investment. However, the company may be able to overturn these fines based on the argument it successfully used in the UK, according to James Moss, a data protection specialist.

Hackers Leak Millions More 23andMe User Records on Cybercrime Forum

A hacker known as Golem has leaked a new dataset containing records of four million 23andMe users on the cybercrime forum BreachForums. This comes two weeks after the same hacker leaked other user data from the genetic testing company. The newly leaked data reportedly includes information on individuals from Great Britain and wealthy people in the U.S. and Western Europe. 23andMe is currently reviewing the data to verify its legitimacy.

Previously, on October 6, 23andMe announced that some user data had been compromised through a technique known as credential stuffing. In response, the company advised users to change their passwords and enable multi-factor authentication. They also launched an investigation with third-party forensic experts. The company blamed the breach partly on customers reusing passwords and an opt-in feature called DNA Relatives, which could potentially allow hackers to scrape data from multiple users by accessing a single account.

There are still many unanswered questions, including the techniques used by the hackers, the extent of the data stolen, and the hackers’ intentions. An earlier claim by a hacker on another forum, Hydra, suggested possession of 300 terabytes of 23andMe data, but this has not been verified. The full extent of the data leak remains unclear.


Google Unveils Legislative Framework for Protecting Kids Online

The Legislative Framework to Protect Children and Teens Online has been released to outline principles for laws aimed at enhancing online safety for young internet users. The framework emphasizes the need for age-appropriate features and services designed with safety in mind. Google is actively working on such protections, including parental controls in Family Link on YouTube and non-personalized advertising for children and teens. The framework supports the idea that technology companies have a responsibility to create safer online environments and endorses legislative models based on age-appropriate design principles.

However, the framework also urges policymakers to be cautious in their approach to avoid unintended consequences, such as blocking access to essential services or requiring unnecessary personal information. The goal is to collaborate with a diverse set of organizations to establish strong, consistent standards for online safety while respecting the unique perspectives of different societies.

California Enacts In-Vehicle Camera Privacy Law

California Governor Gavin Newsom has signed Senate Bill 296, which requires motorists to be notified if their in-vehicle cameras capture images. Introduced by Sen. Bill Dodd, the legislation also prohibits the sale of these images to third parties or for advertising purposes. The law aims to give consumers more control over their personal information and addresses concerns about the potential misuse of in-vehicle camera data by data brokers and other third parties. However, the law will not affect the use of cameras for traffic safety purposes.

Curated by: Prajwala D Dinesh, Ritwik Tiwari, Ayush Sahay


Keep pace with this high-impact weekly privacy newsletter, which features significant data privacy updates, trends, and tools that can help make your life more secure and easier every day!
