Instagram to alert parents if teens repeatedly search self-harm content 

February 26, 2026

RED FM News Desk

Instagram says it will begin notifying parents if their teenager repeatedly searches for suicide or self-harm-related terms within a short period, as governments increase pressure on social media companies to better protect young users. 

Starting next week, parents in Canada, the United States, Britain and Australia who are enrolled in Instagram’s optional supervision feature will receive alerts if their child attempts to access such content. 

The platform said its current policy already blocks searches related to suicide and self-harm and directs users to support resources instead. 

The move comes amid growing global concern about online safety for children, including scrutiny over artificial intelligence tools such as the Grok chatbot, which has been criticized for generating non-consensual sexualized images. 

Several governments are considering stronger restrictions. Britain said in January it was reviewing potential measures to better safeguard children online, following Australia's decision in December to ban social media access for those under 16. Spain has become the latest European country to announce plans to ban social media for children under the age of 16.

“We will protect them from the digital Wild West,” Prime Minister Pedro Sánchez said at the World Governments Summit in Dubai. France, Denmark and Austria have also announced that they are considering national age limits of their own. The U.K. government has opened a public consultation on potentially banning social media use for children under 16, part of a broader set of measures it says is aimed at safeguarding young people’s wellbeing.

Instagram’s existing “teen accounts” require parental permission before users under 16 can change certain settings. Parents can also opt into additional monitoring features with their teenager’s consent. These accounts restrict access to sensitive material, including sexually suggestive or violent content.