Instagram introduces new features to combat sextortion scams

Instagram will soon prevent users from screenshotting or screen-recording images and videos intended for one-time viewing, as part of Meta’s ongoing efforts to tackle sextortion on the platform. The new safety features, announced by Meta on Thursday, aim to protect teens from being coerced into sharing intimate images that scammers could use for blackmail.

Meta, Instagram’s parent company, said the update builds on its broader commitment to strengthening safety measures for young users. Tools that blur nude images in direct messages and hide follower and following lists from potential sextortion accounts will also become permanent features on Instagram.

This move comes amid growing concerns over the rise in sextortion scams, particularly those targeting teenage boys. Law enforcement agencies around the world have reported a spike in these scams, often involving the use of social media platforms to exploit vulnerable users. According to the UK’s Internet Watch Foundation, 91% of the sextortion reports it received in 2023 involved boys.

In response to these alarming trends, Instagram’s latest tool will block the ability to screenshot or screen-record images and videos sent through its “view once” or “allow replay” mechanisms in Direct Messages. This restriction will apply to both the mobile app and the web version of Instagram, further reducing opportunities for scammers to capture and misuse sensitive content.

Antigone Davis, Meta’s head of global safety, said the measures are designed to safeguard teens without requiring parental intervention, with protections built in so that parents do not have to do anything to protect their teens. However, she acknowledged the evolving nature of sextortion, noting that perpetrators will continually attempt to bypass protective features. To help combat this, Meta is launching a new campaign aimed at educating teens and parents on how to recognize sextortion attempts and prevent exploitation.

While the move has been welcomed by child protection advocates, some experts have questioned why similar protections are not being rolled out across all Meta platforms, including WhatsApp, where sextortion and grooming also occur. Richard Collard, associate head of child safety online policy at the NSPCC, praised the steps taken by Meta but urged the company to extend these protections further, noting that online abuse occurs at scale across various platforms.

In the UK, regulators have taken notice of the increased risks to children online. Ofcom, the UK’s communications watchdog, has warned that social media companies will face heavy fines if they fail to protect young users. Dame Melanie Dawes, Ofcom’s chief executive, stressed that it is the responsibility of tech firms to ensure safety, not parents or children. This warning comes ahead of the Online Safety Act, which is set to be implemented next year and will impose stricter regulations on online platforms.

Instagram is also tightening its privacy settings for under-18s, moving teens into accounts with stricter default settings that require parental supervision to modify. These changes are part of Meta’s broader strategy to minimize risks for younger users while maintaining a safer online environment.

As sextortion scams continue to rise, Instagram’s new safety measures mark an important step in the fight to protect children from online exploitation.

 
