- Both platforms have been placed under Enhanced Supervision and must regularly account for progress in implementing rectification measures until IMDA is satisfied that issues have been resolved
- They are also required to provide supporting data and information to demonstrate effectiveness of their rectification measures by 30 June 2026
SINGAPORE – 31 MAR 2026
1. The Infocomm Media Development Authority (IMDA) has issued Letters of Caution to X and TikTok for serious weaknesses in their measures to proactively detect and remove child sexual exploitation and abuse material (CSEM) and terrorism content respectively. Both services have also been placed under Enhanced Supervision. IMDA found a 120% increase in cases of CSEM on X originating from or targeting Singapore users, up from 33 cases in 2024 to 73 cases in 2025. On TikTok, IMDA found 17 cases of terrorism content shared by Singapore-based accounts for the first time in 2025. CSEM and terrorism content are egregious harms, and the Code of Practice for Online Safety – Social Media Services (the “SMS Code”) requires designated Social Media Services (DSMSs) to proactively detect and swiftly remove CSEM and terrorism content, through the use of technologies and processes, before users encounter such content.
2. These findings are part of the second Online Safety Assessment Report 2025 (the “Report”) on DSMSs, which assesses the presence, comprehensiveness and effectiveness of the online safety measures implemented by DSMSs to mitigate risks from harmful content, as required by the SMS Code. The inaugural Report published last year assessed that DSMSs had put in place baseline safety measures. This second Report builds upon that baseline and highlights the areas of weakness DSMSs need to address, as well as the improvements they have made over the past year. The Report allows users, including parents, to make informed decisions for themselves and their children about the risks and available safety measures on the various DSMSs.
X and TikTok placed under Enhanced Supervision
3. As part of its assessment for the Report, IMDA conducted tests for CSEM and terrorism content to obtain an indicative sample of such content across the DSMSs. The 73 CSEM cases detected on X all had a Singapore nexus, and involved the sharing of, or linking to, CSEM, as well as self-generated CSEM. This occurred despite IMDA sharing its analysis of the CSEM cases and their indicators with X in 2024. All 73 cases also violated X’s own policies against CSEM, and were removed only after IMDA flagged them to X.
4. IMDA detected 17 cases of terrorism content shared by Singapore-based accounts on TikTok. These cases primarily comprised videos with edited footage or audio related to known transnational terrorist organisations. When some of these cases were reported to TikTok via its in-app user reporting mechanism, TikTok found that the content did not violate its community guidelines, demonstrating that TikTok did not accurately assess the terrorism content when it was reported by users. TikTok removed the content only after IMDA flagged the cases to it.
5. Both X and TikTok have accepted IMDA’s findings and committed to putting in place specific measures to rectify these serious weaknesses, in particular to enhance their automated detection systems through the use of AI and the incorporation of additional signals, so as to improve their proactive detection of CSEM and terrorism content respectively. To ensure accountability for the effective implementation of these rectification measures, IMDA has issued Letters of Caution to X and TikTok to place both services under Enhanced Supervision and require them to:
- Provide regular updates to IMDA on their progress in implementing the rectification measures they have committed to, until IMDA is satisfied that the issues are adequately resolved.
- Provide supporting data and information to IMDA in their next annual online safety report due on 30 June 2026, to demonstrate the effectiveness of their implementation of the rectification measures.
6. Should X or TikTok fail to satisfy IMDA that they have improved the effectiveness of their measures to address the specific types of CSEM and terrorism content that IMDA has detected, IMDA will not hesitate to take further steps, including regulatory action under the Broadcasting Act.
More can be done by DSMSs to improve child safety measures
7. While the 2025 Report notes improvements in some of the areas highlighted in the 2024 Report, IMDA also identified areas of weakness that the DSMSs will need to account for. IMDA urges the DSMSs to continue strengthening these measures.
Overview of DSMSs' Online Safety Ratings
8. In particular, some DSMSs can do more to improve their safety measures for all users and for children, their user reporting and resolution mechanisms, as well as their data accountability. Facebook, YouTube and HardwareZone were found to have weaknesses in the effectiveness of their child safety measures, which could allow children to easily access age-inappropriate content. The comprehensiveness of child safety measures also varied greatly across DSMSs: Instagram and TikTok reported the most comprehensive child safety measures, while HardwareZone and X had only a few baseline measures. Given the rapidly evolving online safety risk landscape, especially for children, DSMSs must continue to prioritise enhancing the comprehensiveness and effectiveness of their measures to minimise children’s exposure to harmful and age-inappropriate content.
9. Most DSMSs improved the effectiveness and timeliness of their responses to user reports. All DSMSs except TikTok took action on a greater proportion of legitimate user reports on content that violated their own community guidelines in 2025, with action rates ranging from 54% to 93%, compared to approximately 50% or less in 2024. TikTok was the only DSMS whose user reporting mechanisms became less effective, with its action rate for legitimate user reports declining from 39% in 2024 to 25% in 2025. All DSMSs nonetheless acted on such user reports more quickly than before.
Overview of DSMSs’ Action Rates on Legitimate User Reports
Online safety of users, especially children, of utmost priority
10. IMDA’s main priority as Singapore’s online safety regulator is to ensure a safe online environment for users in Singapore and to protect children, in particular, from harmful content. Throughout the year, IMDA has engaged the DSMSs on weaknesses in their online safety measures, flagged harmful content, and raised concerns when risks were detected. While IMDA adopts a collaborative approach to engage with the DSMSs, we will hold the DSMSs accountable when we assess that their online safety measures do not adequately achieve the outcomes of the Code. Under the Broadcasting Act, IMDA also has powers to direct social media services to block access to egregious content found on their services. Our overriding objective remains to ensure the online safety of Singapore users, especially children.
Next steps for online safety in Singapore
11. All DSMSs will need to provide IMDA with updates on the steps taken to address their areas of weakness in their next annual online safety report. At the same time, IMDA will continue to engage the DSMSs regularly throughout the year to highlight emerging online safety risks and ensure the DSMSs have the required measures in place to protect Singapore users.
12. Online safety risks continue to evolve, as technology has made it easier to create and disseminate harmful content. DSMSs will have to remain vigilant and continually improve the effectiveness of their online safety measures, especially for children.
13. IMDA constantly monitors the rapidly evolving online safety risk landscape and reviews the relevance of its regulations, including the SMS Code. In 2025, IMDA made it a requirement for Designated App Distribution Services to implement age assurance measures to ensure children do not download apps that are inappropriate for their age. To ensure that online safety measures are effectively and accurately applied to children, IMDA plans to extend age assurance requirements to DSMSs. We are also studying how online safety requirements for children can be further enhanced. IMDA is currently in discussions with DSMSs, and more details will be announced later this year.
14. IMDA’s Online Safety Assessment Report and the DSMSs’ annual online safety reports are published in full on IMDA’s website at www.imda.gov.sg/online-safety for public reference.