Introduction
The Digital Personal Data Protection Act (DPDPA), 2023 and the Draft DPDP Rules, 2025 have ushered in a new era of data privacy in India. The framework's emphasis on enforcement has created a sense of regulatory urgency: while it signals a significant shift towards stronger privacy protections, it has also generated considerable uncertainty, particularly among startups and businesses built on emerging technologies.
AI startups often house a wealth of sensitive data, making them prime targets for attackers seeking to exploit that information. The challenge is to understand the needs of each startup and adopt compliance practices that are not excessively resource intensive. For startups navigating lean resources and rapid scaling, compliance with the DPDP framework can be daunting. AI startups in particular must treat compliance as a priority, since AI systems depend on enormous datasets, many of which contain personal data, to operate effectively. AI has repeatedly shown that it can scan and analyze data to infer sensitive facts that people might not want to disclose, which raises serious privacy concerns. In the absence of adequate guardrails, this can result in misuse of, or unauthorized access to, confidential data.
From appointing Data Protection Officers to managing consent, startups are grappling with how to operationalize compliance under the DPDP Framework in a manner that aligns with their business needs.
Informed Consent
Under Section 4 of the DPDPA, data fiduciaries may process personal data only for a lawful purpose for which the Data Principal has given consent (or for certain specified legitimate uses). Such consent must be free, specific, informed, unconditional and unambiguous, signified by a clear affirmative action. The Draft DPDP Rules emphasize that consent notices must be clear, specific, and understandable independently of any other information. Startups must therefore establish systems to collect, manage, and document user consent securely. This starts with providing standalone notices that set out the types of personal data collected, the purposes of collection, and the uses to be enabled by such processing.
These notices must also explain how users can withdraw consent and exercise their rights under the DPDPA. The Rules further provide for a Consent Manager framework to facilitate the management of user consent. Startups must implement systems that enable users to give, review, and withdraw consent transparently at any time. AI startups can meet these requirements by building a user-friendly consent mechanism, offering sufficiently granular consent options, and being transparent about how personal data is used in AI training and decision-making.
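To make this concrete, the sketch below shows one way a startup might record purpose-specific consent grants and withdrawals against a versioned notice. It is a minimal illustration: the class names, fields and in-memory storage are assumptions, not structures prescribed by the DPDPA or the Draft Rules.

```python
# Illustrative sketch only: field names and structure are assumptions,
# not prescribed by the DPDPA or the Draft DPDP Rules.
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional


@dataclass
class ConsentRecord:
    data_principal_id: str
    purpose: str                      # one record per specific purpose
    notice_version: str               # which standalone notice was shown
    granted_at: datetime
    withdrawn_at: Optional[datetime] = None


class ConsentRegistry:
    """Keeps an auditable log of granular consent grants and withdrawals."""

    def __init__(self) -> None:
        self._records: list[ConsentRecord] = []

    def grant(self, principal: str, purpose: str, notice_version: str) -> ConsentRecord:
        record = ConsentRecord(principal, purpose, notice_version,
                               granted_at=datetime.now(timezone.utc))
        self._records.append(record)
        return record

    def withdraw(self, principal: str, purpose: str) -> None:
        # Withdrawal should be as easy as giving consent; here it simply
        # timestamps every active record for that principal and purpose.
        for record in self._records:
            if (record.data_principal_id == principal
                    and record.purpose == purpose
                    and record.withdrawn_at is None):
                record.withdrawn_at = datetime.now(timezone.utc)

    def is_active(self, principal: str, purpose: str) -> bool:
        return any(r.data_principal_id == principal and r.purpose == purpose
                   and r.withdrawn_at is None for r in self._records)
```

In such a design, any training or inference pipeline would check `is_active()` for the relevant purpose before touching personal data, so processing halts automatically once consent is withdrawn.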
Data Security Measures
AI startups handle large-scale, structured, and unstructured data, making them prime targets for cyber threats. The DPDPA requires organizations to implement robust organizational and technical safeguards to protect personal data against unauthorized access, use, disclosure, alteration, or destruction.
- Startups must prioritize the adoption of advanced security measures. This includes implementing encryption to secure sensitive information during storage and transmission, as well as deploying strong access controls to ensure that only authorized personnel can access personal data.
- The reasonable security measures contemplated under the Draft DPDP Rules include encryption, obfuscation, masking, and the use of virtual tokens mapped to specific personal data (a minimal tokenization sketch follows this list).
- Further, regular security audits, vulnerability assessments, and penetration testing to identify and address potential risks form part of the organizational measures that may be undertaken.
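As a minimal illustration of the masking and virtual-token measures listed above, the sketch below swaps direct identifiers for random tokens held in a separate, access-controlled mapping. The function names, token format and in-memory vault are assumptions made for illustration, not requirements of the Rules.

```python
# Minimal sketch of pseudonymisation via virtual tokens. The mapping store,
# function names and token format are illustrative assumptions.
import secrets

_token_vault: dict[str, str] = {}   # token -> original value (keep access-controlled)


def tokenize(value: str) -> str:
    """Replace a direct identifier (e.g. an email) with a random virtual token."""
    token = f"tok_{secrets.token_hex(8)}"
    _token_vault[token] = value
    return token


def detokenize(token: str) -> str:
    """Resolve a token back to the original value; restrict to authorised code paths."""
    return _token_vault[token]


def mask_email(email: str) -> str:
    """Crude masking for display or logging: keep the first character and the domain."""
    local, _, domain = email.partition("@")
    return f"{local[:1]}***@{domain}"


record = {"email": tokenize("asha@example.com"), "age_band": "25-34"}
print(record, mask_email("asha@example.com"))
```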
It is equally crucial for AI startups to secure the AI models themselves, since training data, model artefacts, and model outputs can all expose personal data.
Apart from the security measures, it is also important for organizations to have a strong breach response plan. Since AI systems continuously learn and process data, breach response strategies must be tailored to dynamic AI models.
The Draft Rules also lay down timelines for intimation of a breach that must be adhered to. In the event of a breach, AI startups, as data fiduciaries, must act promptly and notify both the Data Protection Board and the affected Data Principals. They must then provide further information, such as the facts of the event, the circumstances and reasons behind the breach, the remedial actions taken, and a report on the notifications given to affected Data Principals.
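One simple way to operationalize this is to track each incident against its notification duties. In the sketch below, the 72-hour window for the detailed report is a placeholder assumption that should be aligned with the timelines in the final Rules; the data structure itself is illustrative.

```python
# Sketch of a breach-notification tracker. The 72-hour detailed-report window
# is a placeholder assumption; verify it against the notified Rules.
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

DETAILED_REPORT_WINDOW = timedelta(hours=72)   # assumption, not a statutory citation


@dataclass
class BreachIncident:
    detected_at: datetime
    description: str
    affected_principals: list[str]
    board_notified_at: datetime | None = None
    principals_notified_at: datetime | None = None

    def detailed_report_due(self) -> datetime:
        return self.detected_at + DETAILED_REPORT_WINDOW

    def overdue_actions(self, now: datetime) -> list[str]:
        actions = []
        if self.board_notified_at is None:
            actions.append("Intimate the Data Protection Board without delay")
        if self.principals_notified_at is None:
            actions.append("Notify affected Data Principals without delay")
        if now > self.detailed_report_due():
            actions.append("Detailed report to the Board is past its window")
        return actions


incident = BreachIncident(datetime.now(timezone.utc), "Model API key leaked", ["dp-102"])
print(incident.overdue_actions(datetime.now(timezone.utc)))
```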
Cross-border Data Transfer
The global nature of tech and AI operations adds another layer of complexity. Different countries enforce different data protection regulations, making it challenging for organizations operating across jurisdictions to remain compliant. AI startups with global operations that engage in cross-border data transfers must adhere to the requirements and standards prescribed by the Central Government under Section 16 of the DPDPA for transfers of personal data outside the territory of India. Additionally, AI startups must comply with any sectoral regulations governing the cross-border transfer of personal data.
Data Retention and Deletion
The Act requires organizations to retain personal data only for as long as necessary to fulfil the purposes for which it was collected, and to establish and implement clear retention policies that reflect this. The Draft DPDP Rules provide for specific retention periods based on the purpose for which the data is collected and processed, and require that Data Principals be informed at least 48 hours before their data is erased. Once the data is no longer needed, startups should ensure its secure deletion or anonymization to prevent unauthorized access or misuse. This process can include automated systems for tracking data lifecycles, regular audits to identify redundant data, and secure erasure in line with industry best practices. By adopting these measures, startups can reduce data-related risks and demonstrate accountability in handling personal information.
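One way to operationalize such lifecycle tracking is sketched below. The retention periods, the 48-hour notice lead time applied here, and the notify/erase hooks are illustrative assumptions that each startup would replace with its own policy and the periods specified in the Rules.

```python
# Sketch of retention-lifecycle tracking. Retention periods, the notify/erase
# hooks and the record structure are illustrative assumptions.
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

RETENTION_BY_PURPOSE = {                      # assumed policy table, per purpose
    "model_training": timedelta(days=365),
    "support_tickets": timedelta(days=180),
}
ERASURE_NOTICE_LEAD = timedelta(hours=48)     # notice to the Data Principal before erasure


@dataclass
class StoredData:
    data_principal_id: str
    purpose: str
    collected_at: datetime
    notified_of_erasure: bool = False

    def erase_at(self) -> datetime:
        return self.collected_at + RETENTION_BY_PURPOSE[self.purpose]


def run_lifecycle_sweep(items: list[StoredData], now: datetime) -> None:
    for item in items:
        if not item.notified_of_erasure and now >= item.erase_at() - ERASURE_NOTICE_LEAD:
            print(f"Notify {item.data_principal_id}: erasure of '{item.purpose}' data is due")
            item.notified_of_erasure = True
        if now >= item.erase_at():
            print(f"Securely erase '{item.purpose}' data of {item.data_principal_id}")


items = [StoredData("dp-102", "support_tickets",
                    datetime.now(timezone.utc) - timedelta(days=181))]
run_lifecycle_sweep(items, now=datetime.now(timezone.utc))
```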
Data Principal Rights Management
The DPDP Act grants Data Principals various rights, such as the right to access information about their personal data, the right to correction and erasure, and the right of grievance redressal. Startups must establish processes that enable users to exercise these rights efficiently. This includes creating clear, accessible mechanisms for submitting requests, publishing transparent response timelines, and ensuring compliance with the Act’s requirements. The mere existence of these rights is not enough: Data Principals must be made aware of them, in a manner that is easy to understand, at the time their consent is obtained. For AI startups, building explainability and transparency into the AI model makes it easier for both the organization and its users to understand, enforce, and manage these rights.
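The sketch below illustrates one way to queue and track rights requests against published response targets. The request categories loosely mirror the rights named above, while the response-time targets are internal policy assumptions rather than statutory deadlines.

```python
# Sketch of a Data Principal rights-request queue. Request types loosely mirror the
# Act; the response-time targets are assumed internal SLAs, not statutory values.
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone
from enum import Enum


class RightsRequestType(Enum):
    ACCESS = "access"
    CORRECTION_ERASURE = "correction_erasure"
    GRIEVANCE = "grievance"


RESPONSE_TARGET = {                           # assumed SLAs; publish them to users
    RightsRequestType.ACCESS: timedelta(days=7),
    RightsRequestType.CORRECTION_ERASURE: timedelta(days=7),
    RightsRequestType.GRIEVANCE: timedelta(days=30),
}


@dataclass
class RightsRequest:
    data_principal_id: str
    request_type: RightsRequestType
    received_at: datetime

    def respond_by(self) -> datetime:
        return self.received_at + RESPONSE_TARGET[self.request_type]


req = RightsRequest("dp-102", RightsRequestType.ACCESS, datetime.now(timezone.utc))
print(f"Respond to {req.request_type.value} request by {req.respond_by():%Y-%m-%d}")
```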
Children’s Data Protection
The DPDP Framework imposes certain special obligations on data fiduciaries processing the personal data of children (persons under the age of 18). Before collecting or processing children’s data, startups must obtain verifiable consent from a parent or lawful guardian and put reliable verification mechanisms in place. It is also essential to limit the collection of children’s data to only what is strictly necessary, ensure its secure handling, and avoid practices that could exploit minors. The Draft DPDP Rules further elaborate on verifiable parental consent, stating that verification can be done using reliable existing details, self-declared identity and age details, or government-issued virtual tokens, such as those issued through a Digital Locker service provider. Data Fiduciaries must also ensure that they do not undertake tracking or behavioural monitoring of children, or targeted advertising directed at them.
AI startups must take steps to implement a clear parental consent mechanism, ensure compliance in exempted cases, adhere to data minimization and purpose limitation requirements, safeguard children from harmful content, and conduct regular audits for ongoing compliance.
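As an illustration, the sketch below records which verification route was used when capturing parental consent. The verification methods mirror those described in the Draft Rules, but the enum, field names and flow are assumptions; no real Digital Locker or identity API is called.

```python
# Sketch of recording verifiable parental consent. The checker flow is a stub:
# each method would trigger its own real verification step before a record is written.
from dataclasses import dataclass
from datetime import datetime, timezone
from enum import Enum


class VerificationMethod(Enum):
    EXISTING_RELIABLE_DETAILS = "existing_reliable_details"
    SELF_DECLARED_DETAILS = "self_declared_details"
    DIGILOCKER_VIRTUAL_TOKEN = "digilocker_virtual_token"


@dataclass
class ParentalConsent:
    child_user_id: str
    guardian_reference: str            # opaque reference, not raw identity data
    method: VerificationMethod
    verified_at: datetime


def record_parental_consent(child_user_id: str, guardian_reference: str,
                            method: VerificationMethod) -> ParentalConsent:
    # Only the outcome of verification is captured here, for audit purposes.
    return ParentalConsent(child_user_id, guardian_reference, method,
                           datetime.now(timezone.utc))


consent = record_parental_consent("child-77", "guardian-ref-41",
                                  VerificationMethod.DIGILOCKER_VIRTUAL_TOKEN)
print(consent.method.value, consent.verified_at.isoformat())
```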
Privacy by Design
There is a growing emphasis on the concept of “privacy by design,” which advocates for incorporating privacy considerations into the development lifecycle of AI systems. This proactive approach aims to minimize risks associated with data collection and processing from the very beginning, ensuring that privacy is a fundamental aspect of AI system design rather than an afterthought. Adopting the privacy-by-design approach will play a significant role in easing regulatory compliance.
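A small example of privacy by design in practice is gating collection on a per-purpose allow-list, so that fields a purpose does not need are never stored. The purposes and field names below are purely illustrative assumptions.

```python
# Sketch of a privacy-by-design gate: fields not on the purpose's allow-list are
# dropped at collection time. Purposes and field names are illustrative assumptions.
ALLOWED_FIELDS = {
    "model_training": {"text", "language", "age_band"},   # no direct identifiers
    "billing": {"name", "email", "invoice_id"},
}


def minimise(raw: dict, purpose: str) -> dict:
    """Keep only the fields the declared purpose actually needs."""
    allowed = ALLOWED_FIELDS[purpose]
    return {k: v for k, v in raw.items() if k in allowed}


print(minimise({"text": "hello", "email": "a@example.com", "age_band": "25-34"},
               "model_training"))
```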
Conclusion
For AI startups in India, compliance with the DPDP Act and the Draft DPDP Rules is essential for fostering growth and building trust. By embedding privacy by design, ensuring data minimization and purpose limitation, and adhering to data retention requirements, startups can demonstrate a commitment to responsible data practices. Obtaining consent as mandated by the DPDP Framework, conducting Data Protection Impact Assessments (DPIAs), and maintaining transparency in data handling will not only help comply with the law but also provide a competitive edge in the Indian market. However, balancing AI innovation with privacy compliance, addressing bias and fairness, and managing the high costs of implementing privacy-enhancing technologies remain critical challenges for startups striving for sustainable growth. Ultimately, navigating these challenges effectively will enable AI startups to thrive while maintaining public trust and ensuring long-term success in a privacy-conscious market.
Tsaaro Consulting, in collaboration with PSA Legal Counsellors and Advertising Standards Council of India, has authored a whitepaper titled ‘Navigating Cookies: Recalibrating Your Cookie Strategy in Light of the DPDPA’. If you want to learn more about cookie consent management, read the whitepaper by clicking here.