Ireland’s online safety watchdog will prioritise child protection on digital platforms throughout 2025, as part of its growing role under the EU’s landmark Digital Services Act (DSA).
Coimisiún na Meán, Ireland’s media and online safety regulator and a key DSA enforcer, is actively contributing to forthcoming EU-wide guidelines on child safety, expected to be published later this year. These guidelines are intended to help major tech companies such as Meta, TikTok, and YouTube meet stricter obligations to prevent harm to minors online.
As the lead supervisory authority for several global digital giants with EU headquarters in Dublin, Ireland holds a pivotal position in shaping and enforcing EU digital policy. Under the DSA, Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs) are required to assess and mitigate systemic risks, particularly those affecting children, such as exposure to harmful content or algorithmic amplification of dangerous behaviour.
Coimisiún na Meán’s increased focus on child safety reflects broader EU concerns about the mental health and well-being of young people in digital spaces. In its 2025 enforcement strategy, the regulator is expected to ramp up scrutiny of how platforms design their services, including age verification, content moderation, and advertising transparency where minors are concerned.
The finalised EU guidelines will help harmonise standards across member states, reinforcing obligations already set out in the DSA, including Article 34, which requires platforms to assess systemic risks with particular regard to vulnerable groups such as children.
For more information on the DSA and Coimisiún na Meán’s role, see the European Commission’s DSA overview and Coimisiún na Meán’s official website.