Safeguarding children’s personal data has become a priority for businesses, organisations, governments and regulatory bodies. As children increasingly engage with the digital world, there is an urgent need to establish strong privacy frameworks and practices, and both businesses and regulators must adopt measures that ensure children’s data is handled responsibly. On 28th March 2024, Singapore’s Personal Data Protection Commission (PDPC) released the Advisory Guidelines on the PDPA for Children’s Personal Data in the Digital Environment (the Guidelines), to be read with Chapter 8 of the PDPC’s Advisory Guidelines on the PDPA for Selected Topics, to further clarify how the PDPA applies to children’s data.
Scope
These Guidelines apply to any organisation whose products or services may be accessed by children (any individual below the age of 18), including social media services, EdTech platforms, online games and smart devices.
Communication and Notice requirements
When communicating with children, organisations should adopt age-appropriate language and media and ensure that notices and communications are easily understandable by children. The Guidelines specify that consent and purpose notifications, data protection policies, and terms and conditions must be written in language children can readily understand; for example, plain and simple wording accompanied by audio or visual elements.
Consent
Section 13 of the PDPA allows organisations to collect, use and disclose personal data with the valid consent of the relevant individual. Under Singapore’s PDPA, children aged 13 to 17 can give valid consent for the collection, use and disclosure of their personal data if they understand the implications, i.e., if the purpose and consent requirements are conveyed to them in readily understandable language. However, if the organisation has doubts about the child’s understanding, or if business requirements necessitate a higher age of consent, parental consent becomes necessary. The mechanism for withdrawing consent must be as simple as the process for giving it, and the PDPC regards consent obtained from an individual or their parent during childhood as remaining valid after the individual turns 18.
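To illustrate how these age thresholds and the parental-consent fallback might translate into a consent flow, here is a minimal sketch; the names (`ConsentRequest`, `requires_parental_consent`) and fields are hypothetical, not taken from the Guidelines.

```python
from dataclasses import dataclass

MIN_SELF_CONSENT_AGE = 13   # Guidelines: children aged 13-17 may consent for themselves
AGE_OF_MAJORITY = 18

@dataclass
class ConsentRequest:
    age: int
    notice_in_plain_language: bool                 # purposes explained in child-friendly terms
    higher_age_set_by_organisation: bool = False   # e.g. business requires a higher consent age

def requires_parental_consent(req: ConsentRequest) -> bool:
    """Return True if parental consent is needed before collecting the child's data."""
    if req.age >= AGE_OF_MAJORITY:
        return False                               # adult: own consent suffices
    if req.age < MIN_SELF_CONSENT_AGE:
        return True                                # below 13: parental consent required
    # 13-17: the child's own consent is valid only if the notice was readily
    # understandable and the organisation has not adopted a higher age threshold.
    return (not req.notice_in_plain_language) or req.higher_age_set_by_organisation

# Withdrawal of consent must be as simple as giving it, e.g. the same one-step flow.
```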
Reasonable Purposes for processing of children’s data
Section 18 of the PDPA sets out the purpose limitation obligation: an organisation may collect, use or disclose an individual’s personal data only for purposes that a reasonable person would consider appropriate in the circumstances. The PDPC adopts a principles-based approach, emphasising the protection of children given their vulnerability in the digital environment. The Guidelines outline some reasonable purposes, including:
- Age assurance to ensure children access only age-appropriate content.
- Protecting children from harmful content.
- Using behavioural data to guide children towards safety resources.
Organisations are also expected to limit the collection and sharing of children’s data. For example, children’s account information should not be public or searchable by default.
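For illustration only, privacy-protective defaults of this kind could be expressed as a settings object applied whenever an account is identified as a child’s; every setting name below is an assumption made for the sketch, not a PDPC-mandated field.

```python
from dataclasses import dataclass

@dataclass
class ChildAccountDefaults:
    """Hypothetical default settings applied to accounts identified as belonging to children."""
    profile_public: bool = False                   # profile not public by default
    searchable: bool = False                       # not discoverable via search by default
    share_data_with_third_parties: bool = False    # limit disclosure by default

# The child (or a parent) may later opt in to broader sharing; the defaults stay restrictive.
defaults = ChildAccountDefaults()
```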
The Guidelines specifically discuss the need for data minimisation in the context of geolocation data and data processed for age assurance. The PDPC allows organisations to use age verification or estimation methods beyond the point of account registration to safeguard children, while stressing data minimisation and discouraging the collection of national IDs unless mandated by law. If behavioural or telemetric data collected for age assurance can identify users, it is personal data under the PDPA. Similarly, the PDPC treats geolocation data as personal data if it can identify an individual when combined with other identifiers. To protect children, organisations should minimise data collection, disable geolocation by default, and opt for approximate rather than precise location data.
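A simple way to act on this guidance is to keep geolocation off by default and, where location is genuinely needed, retain only an approximate position. The sketch below is illustrative; the two-decimal-place rounding (roughly one kilometre) is an arbitrary choice for the example, not a threshold set by the PDPC.

```python
GEOLOCATION_ENABLED_BY_DEFAULT = False   # children's accounts: location stays off until enabled

def approximate_location(lat: float, lon: float, decimals: int = 2) -> tuple[float, float]:
    """Coarsen precise coordinates before they are stored or used.

    Rounding to two decimal places keeps the position to roughly a kilometre,
    which is often enough for region-level features without tracking a
    child's precise movements.
    """
    return (round(lat, decimals), round(lon, decimals))

# Example: a precise GPS fix becomes an approximate one before processing.
print(approximate_location(1.352083, 103.819836))   # (1.35, 103.82)
```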
Protection of Children’s Data
Children’s personal data is considered sensitive under the PDPA and requires higher protection standards. Organisations should implement Basic Practices like ICT security policies and risk assessments for outsourcing, and Enhanced Practices such as using 2FA/MFA for admin access and conducting penetration tests before launching new systems. PDPC’s Guide to Data Protection Practices for ICT Systems provides a list of basic and enhanced practices that may be implemented by organisations.
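As a rough sketch of the Enhanced Practice of gating administrative access behind 2FA/MFA, an access check could look like the following; the class and function names are invented for this example.

```python
from dataclasses import dataclass

@dataclass
class AdminLoginAttempt:
    user_id: str
    password_verified: bool
    second_factor_verified: bool   # e.g. TOTP code or hardware security key

def allow_admin_access(attempt: AdminLoginAttempt) -> bool:
    """Admin consoles that can reach children's data require both factors."""
    return attempt.password_verified and attempt.second_factor_verified
```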
Data breach notification
Under Part 6A of the PDPA, organisations must notify affected individuals, including children, of notifiable data breaches. Where possible, parents or guardians should also be informed to help mitigate harm. If parental contact details are unavailable, notifications should be written in child-friendly language so that children understand the consequences of the breach. Organisations should also advise children to inform their parents or guardians about the breach.
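The notification flow described above might be sketched as follows: inform a parent or guardian where contact details are held, notify the child in child-friendly language, and advise the child to tell a parent or guardian. The helper names and the `send` callback are assumptions for the example.

```python
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class AffectedIndividual:
    contact: str
    is_child: bool
    parent_contact: Optional[str] = None   # may be unknown to the organisation

def notify_of_breach(person: AffectedIndividual, send: Callable[[str, str], None]) -> None:
    """Route a notifiable-breach notice under Part 6A; `send(recipient, message)` stands in
    for the organisation's own channel (email, in-app message, etc.)."""
    if not person.is_child:
        send(person.contact, "Standard data breach notification")
        return

    if person.parent_contact:
        # Where possible, inform a parent or guardian to help mitigate harm.
        send(person.parent_contact, "Parent/guardian notification of the breach")

    # Children are notified in child-friendly language so they understand the
    # consequences, and advised to tell a parent or guardian about the breach.
    send(person.contact, "Child-friendly explanation of the breach; please tell a parent or guardian")
```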
DPIA
The Guidelines advise organisations to conduct Data Protection Impact Assessments (DPIAs) to develop appropriate policies in fulfilment of the PDPA’s Accountability Obligation. Organisations are encouraged to carry out a DPIA before launching products or services accessible to children, in order to identify and mitigate data protection risks. Sample questions to consider when conducting a DPIA are annexed to the Guidelines.
Conclusion
The PDPC’s guidelines underscore the importance of safeguarding children’s personal data in the digital environment. By ensuring age-appropriate communication, obtaining valid consent, and implementing robust security measures, organisations can enhance compliance with the PDPA while protecting young users. The emphasis on DPIAs, data minimisation, and responsible data processing further reinforces the need for proactive privacy measures. As digital interactions continue to evolve, organisations must remain vigilant in adapting their data protection practices to keep children safe online.
If your organisation is dealing with large volumes of personal data, do visit www.tsaaro.com.
Tsaaro Consulting, in collaboration with PSA Legal Counsellors and Advertising Standards Council of India, has authored a whitepaper titled ‘Navigating Cookies: Recalibrating Your Cookie Strategy in Light of the DPDPA’. If you want to learn more about cookie consent management, read the whitepaper by clicking here.
The Ministry of Electronics and Information Technology (MeitY) has released the Draft DPDP Rules, 2025 for Public Consultation!
Learn more about the Draft Rules here:
News of the week

- Lawsuit Against LinkedIn Over AI Data Use Dismissed
A class action accusing LinkedIn of violating user privacy by sharing Premium customers’ private messages to train AI models has been dismissed. Plaintiff Alessandro De La Torre withdrew the case nine days after filing, following LinkedIn’s assurance that no private messages were used for AI training. Although LinkedIn had updated its privacy policy in September 2024, the company confirmed that private messages were never disclosed for that purpose.

- DeepSeek Data Leak Exposes Over One Million Sensitive Records
On January 29, cybersecurity firm Wiz Research revealed that Chinese AI company DeepSeek suffered a major data leak, exposing over one million sensitive records due to a misconfigured cloud storage instance. The leaked data included chat logs, API secrets, and system details. Although DeepSeek secured the database within an hour of notification, the breach raises concerns over data security and potential regulatory scrutiny under GDPR and CCPA.

- Italy Blocks DeepSeek Chatbot Over Privacy Concerns
Italy’s data protection authority, the Garante, ordered Chinese AI startup DeepSeek to block its chatbot in the country after it failed to address privacy concerns regarding data collection, legal basis, and storage. DeepSeek’s insufficient responses and refusal to recognize the regulator’s jurisdiction led to the immediate block. Data regulators in Ireland, France, South Korea and the Netherlands are also scrutinizing DeepSeek’s privacy practices.
Click here to learn more about the privacy concerns associated with DeepSeek.

- Government allocates Budget of ₹5 Crore for Data Protection Board of India
The Ministry of Electronics and IT (MeitY) has increased the Data Protection Board of India’s (DPBI) budget 2.5-fold to ₹5 crore for FY26, up from ₹2 crore in FY25. Of this, ₹50 lakh is allocated for capital expenditure, including a digital portal, and ₹4.5 crore for salaries and operations. The DPBI, key to enforcing the Digital Personal Data Protection Act, 2023, will function as a digital office, with the draft data protection rules expected to be notified mid-year. The board will address personal data breaches and user complaints.

- Community Health Center Notifies Over 1 Million Patients of Data Breach
Community Health Center (CHC), a leading Connecticut healthcare provider, is notifying 1,060,936 patients of a data breach that exposed personal and health data, including names, Social Security numbers, medical diagnoses, and treatment details. The breach occurred in mid-October 2024 but was only discovered on January 2, 2025. Although the hacker accessed sensitive information, CHC reported that no data was encrypted by the attacker and that operations were not disrupted. This breach highlights a growing trend of data-theft extortion in healthcare, which has prompted the U.S. Department of Health and Human Services to propose HIPAA updates to strengthen patient data protection.