
Meta will auto-blur nudity in Instagram DMs in latest teen safety step

Meta has announced that it is testing new features on Instagram intended to help protect young people from unwanted nudity or sextortion scams. These include nudity protection in DMs, a feature that automatically blurs images detected as containing nudity.

The tech giant will also urge teens to protect themselves by sending them a warning encouraging them to think twice before sharing intimate images. Meta hopes this will increase protection against scammers who might send nude images to trick people into sending their own images in return.

It is also making changes that, it says, will make it more difficult for scammers and potential criminals to find and interact with teenagers. Meta says it is developing new technology to identify accounts “potentially” involved in sextortion scams, and to apply certain limits on how these suspicious accounts can interact with other users.

In another step announced Thursday, Meta said it has increased the data it shares with multi-platform online child safety program Lantern – to include more “sextortion-specific signals.”

The social media giant has long had policies prohibiting the posting of unwanted nudes and attempts to coerce other users into sending intimate images. However, this has not stopped these problems from running rampant online, causing misery for scores of teens and young people, sometimes with extremely tragic consequences.

We have rounded up the latest changes in more detail below.

Nudity screens

Nudity protection in DMs aims to protect teenage Instagram users from cyberflashing by placing nude images behind a safety screen. Recipients can then choose whether or not to view the image.

“We’ll also show them a message encouraging them not to feel pressure to respond, with an option to block the sender and report the chat,” Meta said.

The nudity safety screen will be enabled by default for under-18s worldwide. Older users will see a notification encouraging them to turn it on.

“When nudity protection is enabled, people sending images containing nudity will see a message reminding them to be careful when sending sensitive photos and that they can unsend those photos if they change their minds,” it adds.

Anyone who tries to forward a nude image will see the same warning encouraging them to reconsider.

The feature is powered by on-device machine learning, so Meta says it will work even in end-to-end encrypted chats, because the image analysis is performed on the user’s own device.
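Meta has not published implementation details, but the architecture it describes, with classification running entirely on the recipient’s device so the encrypted pipeline never sees the image, can be sketched roughly as follows. This is a minimal illustration in Python; the classifier stub, the threshold, and the message fields are all invented for the example:

```python
# Illustrative sketch only; Meta has not published its implementation.
# The key architectural point: the nudity check runs entirely on the
# recipient's device, so it works even when the chat is end-to-end
# encrypted and the server never sees the decrypted image.

NUDITY_THRESHOLD = 0.8  # invented confidence cutoff


def local_nudity_score(image_bytes: bytes) -> float:
    """Stand-in for a hypothetical on-device ML classifier (e.g., a small
    image model shipped with the app). Returns a probability that the
    image contains nudity. Stubbed here to keep the sketch self-contained."""
    return 0.0


def render_received_image(image_bytes: bytes, protection_enabled: bool) -> dict:
    """Client-side decision: blur behind a safety screen, or show normally."""
    if protection_enabled and local_nudity_score(image_bytes) >= NUDITY_THRESHOLD:
        return {
            "blurred": True,
            # Options Meta describes: view anyway, block the sender, or report.
            "actions": ["view_image", "block_sender", "report_chat"],
            "notice": "You don't have to respond.",
        }
    return {"blurred": False, "actions": []}
```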

Safety tips

As part of another protective measure, Instagram users who send or receive nudes will be directed to safety tips – containing information about the potential risks involved – which Meta says were developed with guidance from experts.

“These tips include reminders that people may screenshot or forward images without your knowledge, that your relationship to the person may change in the future, and that you should review profiles carefully in case they’re not who they say they are,” it wrote. “They also link to a range of resources, including Meta’s Safety Center, support helplines, StopNCII.org for those over 18, and Take It Down for those under 18.”

It’s also testing pop-up messages for people who may have interacted with an account Meta has removed for sextortion; these will likewise direct them to relevant expert resources.

“We’re also adding new child safety helplines from around the world into our in-app reporting flows. This means that when teens report relevant issues – such as nudity, threats to share private images, or sexual exploitation or solicitation – we’ll direct them to local child safety helplines where available,” it adds.

Technology to spot sextortionists

Although Meta says it deletes sextortionists’ accounts as soon as it becomes aware of them, it must first identify bad actors in order to shut them down. Meta is therefore trying to go further: it says it is “developing technology to help identify where accounts may potentially be engaging in sextortion scams, based on a range of signals that could indicate sextortion behavior.”

“While these signals aren’t necessarily evidence that an account has broken our rules, we’re taking precautionary steps to help prevent these accounts from finding and interacting with teen accounts,” it continues, adding: “This builds on the work we already do to prevent other potentially suspicious accounts from finding and interacting with teens.”

It’s not clear what technology Meta is using for this, or which signals might denote a potential sextortionist (we have asked for more detail), but presumably it can analyze communication patterns to try to detect bad actors.
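Purely as an illustration of what such a system might look like, one common shape is a weighted heuristic score over behavioral signals, with precautionary limits applied above a threshold. Every signal name, weight, and limit below is invented; Meta has not disclosed its actual signals or logic:

```python
# Hypothetical sketch; Meta has not disclosed its real signals or scoring.
# A weighted heuristic: sum the weights of the signals present on an
# account and apply precautionary interaction limits above a threshold.

SIGNAL_WEIGHTS = {
    "recently_created_account": 1.0,       # all names and weights invented
    "mass_follows_of_teen_accounts": 2.5,
    "high_rate_of_first_contact_dms": 2.0,
    "prior_reports_for_image_threats": 3.0,
}

SUSPICION_THRESHOLD = 4.0


def suspicion_score(signals: dict[str, bool]) -> float:
    """Sum the weights of whichever signals are present on the account."""
    return sum(weight for name, weight in SIGNAL_WEIGHTS.items() if signals.get(name))


def precautionary_limits(signals: dict[str, bool]) -> list[str]:
    """Limits rather than a ban, mirroring Meta's framing that these
    signals are not proof an account has violated the rules."""
    if suspicion_score(signals) < SUSPICION_THRESHOLD:
        return []
    return [
        "route_dms_to_hidden_requests",
        "hide_message_button_on_teen_profiles",
        "exclude_from_teen_search_results",
    ]


# Example: a new account mass-following teens crosses the threshold.
print(precautionary_limits({
    "recently_created_account": True,
    "mass_follows_of_teen_accounts": True,
    "high_rate_of_first_contact_dms": True,
}))
```

The "limits, not removal" design matches Meta's stated approach for the restrictions described below: flagged accounts keep operating but lose visibility into, and easy contact with, teen accounts.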

Accounts flagged by Meta as potential sextortionists will face restrictions on how they can message or interact with other users.

“[All] messages that potential sextortion accounts attempt to send will go directly to the recipient’s hidden requests folder, meaning they won’t be notified of the message and never have to see it,” it writes.

Users who are already chatting with potential scam or sextortion accounts will not have their chats shut down, but they will be shown safety notices “encouraging them to report any threats to share their private images, and reminding them that they can say no to anything that makes them feel uncomfortable,” per Meta.

Teen users are already protected from receiving DMs from adults they aren’t connected with on Instagram (and from other teens, in some cases). But Meta is going a step further: it will not display the “Message” button on a teen’s profile to potential sextortion accounts at all, even when the two are connected.

“We’re also testing hiding teens from these accounts in people’s follower, following and like lists, and making it harder for them to find teen accounts in search results,” it adds.

It’s worth noting that the company has come under increasing scrutiny in Europe over child safety risks on Instagram, with authorities asking questions about its approach since the bloc’s Digital Services Act (DSA) came into force last summer.

A long, slow journey to safety

Meta has previously announced measures to combat sextortion – most recently in February, when it expanded access to Take It Down.

The third-party tool lets users generate a hash of an intimate image locally on their own device and share it with the National Center for Missing and Exploited Children, creating a repository of non-consensual image hashes that companies can use to search for and remove revenge porn.
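The article doesn’t say which hashing algorithm Take It Down uses; the key property is simply that only a short fingerprint ever leaves the device, never the image itself. As a rough illustration of the idea, here is a minimal perceptual “average hash” in Python using Pillow. Real systems presumably use more robust perceptual-hash algorithms:

```python
# Minimal average-hash sketch (illustrative only; the actual Take It Down
# tool presumably uses a more robust perceptual-hash algorithm).
# Only the short hex fingerprint would be shared, never the image.

from PIL import Image


def average_hash(path: str, hash_size: int = 8) -> str:
    """Downscale to grayscale, compare each pixel to the mean brightness,
    and pack the resulting bits into a hex string."""
    img = Image.open(path).convert("L").resize((hash_size, hash_size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = "".join("1" if p > mean else "0" for p in pixels)
    return f"{int(bits, 2):0{hash_size * hash_size // 4}x}"


# Computed locally on the user's device; only the hash is submitted:
# fingerprint = average_hash("photo.jpg")
```

Unlike a cryptographic hash, a perceptual hash like this changes only slightly when an image is resized or recompressed, which is what lets platforms match re-uploads of a known image against the repository.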

An earlier approach Meta took was criticized because it required young people to upload their nudes. In the absence of strict laws regulating how social networks must protect children, Meta was left to self-regulate for years, with uneven results.

However, with certain requirements placed on platforms in recent years – such as the UK’s Children’s Code, which came into force in 2021, and, more recently, the EU’s DSA – tech giants like Meta have finally had to pay more attention to protecting minors.

For example, in July 2021, Meta defaulted teens’ Instagram accounts to private just before the UK compliance deadline. Even stricter privacy settings for teens on Instagram and Facebook followed in November 2022.

In January, Meta also announced it would default teens on Facebook and Instagram to stricter messaging settings, with limits on who can message teens they are not already connected with, shortly before the DSA’s full compliance deadline kicked in in February.

The slow, iterative evolution of Meta’s protections for younger users raises questions about why it took so long to apply stronger safeguards, and suggests the company opted for a cynical minimum of protection in a bid to manage the impact on usage and prioritize engagement over safety. (This is exactly what Meta whistleblower Frances Haugen repeatedly denounced her former employer for.)

When asked why it isn’t also rolling out these latest protections to Facebook users, a Meta spokeswoman told TechCrunch: “We want to respond to where we see the biggest need and relevance – which, when it comes to unwanted nudity and educating teens on the risks of sharing sensitive images, we think is on Instagram DMs – so that’s where we’re focusing first.”

Source: TechCrunch
