Meta’s removal of end-to-end encryption (E2EE) technology from its Instagram Direct Messages (DMs) could pose serious risks to internet users, including children, warn cybersecurity experts at SMEX.
Late last week, Meta quietly announced in an update to the Instagram Help Center that E2EE will no longer be supported on the platform starting May 8, 2026.
A Meta spokesperson justified the move by saying that the optional E2EE feature was not widely used. The change also comes amid increasing concerns about child safety on social media platforms.
Digital rights advocates warn that removing E2EE could introduce significant security and privacy risks, adding to growing scrutiny over Meta’s recent practices, which SMEX has documented and analyzed numerous times.
With end-to-end encryption, a message is scrambled into unreadable code before it leaves the sender’s device, travels that way across the internet, and is only unscrambled when it reaches the receiver’s device, explained Dia Kayyali, Senior Policy Analyst and digital rights expert at SMEX.
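The principle Kayyali describes can be sketched in a few lines of code. This is a deliberately simplified toy (a one-time-pad XOR, not the actual cipher suite apps like Signal or WhatsApp use), meant only to show that the key never leaves the two devices, so anything in transit is unreadable:

```python
import secrets

def encrypt(plaintext: bytes, key: bytes) -> bytes:
    # Toy one-time-pad XOR. A stand-in for the real machinery
    # (e.g. the Signal Protocol) that actual E2EE apps use.
    return bytes(p ^ k for p, k in zip(plaintext, key))

decrypt = encrypt  # XOR is its own inverse

# The key exists only on the sender's and receiver's devices;
# the platform's servers never see it.
message = b"meet at noon"
key = secrets.token_bytes(len(message))

ciphertext = encrypt(message, key)          # what travels across the internet
assert decrypt(ciphertext, key) == message  # readable only at the endpoints
```

Without the key, the server relaying `ciphertext` learns nothing about the message content, which is exactly the guarantee that disappears when E2EE is removed.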
Removing encryption is not the solution
“You cannot solve child sexual abuse by giving more people access to sensitive chats,” says Madeleine Belesi, a senior cybersecurity analyst at SMEX.
In many cases, abuse occurs between individuals who already know each other, according to Belesi. The content in question is often already stored on personal devices, and perpetrators can easily create new accounts if banned. Simply increasing surveillance over private messages risks exposing millions of ordinary users without effectively addressing the root of the problem.
In West Asia and North Africa, the stakes are even higher for regular users who rely on E2EE for protection. SMEX’s Digital Safety Helpdesk has confirmed through multiple tickets that the exposure of private messages can often lead to public shaming in environments where legal and social protections for women and vulnerable groups remain limited.
These concerns are accompanied by growing distrust in how platforms manage user data. Meta has repeatedly banned the accounts of journalists and activists, particularly in Palestine, for allegedly violating its community guidelines.
Removing E2EE has not been shown to protect children, whereas encrypted communications have well-documented benefits for many groups, including children.
This move makes WhatsApp Meta’s only end-to-end encrypted platform. However, Meta was recently scrutinized for adding Meta AI to WhatsApp, which compromises user privacy and even E2EE by allowing the chatbot to access some messages when invoked by one of the participants.
WhatsApp is encrypted, but “metadata reveals a lot”
Apps like WhatsApp have long implemented this model. But even with encryption, companies still collect metadata, the digital equivalent of information written on the outside of a sealed envelope.
This includes who you are talking to, when messages are sent, how often you communicate, your device type, and your approximate location based on IP address. While this data does not reveal message content, it can still paint a detailed picture of a user’s behavior and network.
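How much those "envelope" details reveal can be illustrated with a toy example. The log below is entirely hypothetical (the contact labels and timestamps are invented for illustration), but it shows how patterns emerge from metadata alone, with no message content at all:

```python
from collections import Counter
from datetime import datetime

# Hypothetical metadata log: only the "outside of the envelope",
# never the message content itself.
metadata_log = [
    {"to": "clinic",     "at": datetime(2026, 5, 1, 9, 0),   "ip": "203.0.113.7"},
    {"to": "clinic",     "at": datetime(2026, 5, 8, 9, 0),   "ip": "203.0.113.7"},
    {"to": "journalist", "at": datetime(2026, 5, 8, 23, 30), "ip": "203.0.113.7"},
    {"to": "clinic",     "at": datetime(2026, 5, 15, 9, 0),  "ip": "203.0.113.7"},
]

# Even without reading a single message, patterns surface:
# a weekly contact with a clinic hints at a health condition,
# a late-night message to a journalist hints at a source relationship.
contacts = Counter(entry["to"] for entry in metadata_log)
assert contacts.most_common(1) == [("clinic", 3)]
```

This is the sense in which, as Kayyali puts it below, metadata can expose health conditions or political associations even when the messages themselves stay encrypted.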
“Metadata reveals a lot about people, including information about health conditions, political and religious associations, and more,” says Dia Kayyali. “You are trusting Meta with that information by using WhatsApp. That means it is also available for government requests,” they add.
A malicious threat actor (such as a hacker) who intercepts your network traffic can also collect this metadata and use it to trace your communication patterns.
Meta can also use this metadata to tailor ads to you or for other marketing purposes. Professional WhatsApp contacts, for example, may surface as friend suggestions on Instagram, explains Kayyali.
Meta primarily uses this data internally to profile users and sell highly targeted ads to businesses looking to sell a product. This is why, after searching for shoes on Facebook, you might start seeing shoe ads across your Instagram feed. It is a highly monetized business model.
Signal as a safer option
A viable alternative recommended by digital rights advocates and cybersecurity experts is Signal, an end-to-end encrypted messaging platform built on a strong and private foundation. Signal’s code is open source, meaning any internet user can inspect it, identify weaknesses, and help improve it.
Signal, unlike WhatsApp, collects only minimal metadata. According to Belesi, Signal stores just the date an account was created and the last time the user connected to the app.
The main difference between the two platforms is that Signal is operated by a nonprofit, whereas Meta is a for-profit corporation that monetizes user data for revenue.
The main privacy “trick,” explains Belesi, is that Signal uses the “sealed sender” model: Signal’s servers do not know who sent a message and cannot tie it to the sender’s IP address or other identifying information.
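The sealed-sender idea can be sketched as follows. In this toy version (names and field layout are illustrative, not Signal’s actual wire format), the sender’s identity travels inside the message payload rather than in the routing header, so the relay server only ever learns the recipient. In real Signal that inner payload is encrypted with keys the server does not hold; here it is left as plaintext purely to show what lives where:

```python
import json

def seal(sender: str, recipient: str, body: str) -> dict:
    # The sender's identity goes INSIDE the payload. In real Signal this
    # inner part is encrypted end-to-end; here it is plain JSON so the
    # structure is visible.
    inner = json.dumps({"from": sender, "body": body})
    return {"to": recipient, "sealed_payload": inner}

envelope = seal("alice", "bob", "hello")

# What the routing server actually needs to see:
server_view = {k: v for k, v in envelope.items() if k != "sealed_payload"}
assert server_view == {"to": "bob"}  # a recipient, but no sender field
```

The design choice is that delivery only requires knowing where a message is going, not where it came from, so the server is simply never given the sender’s identity in readable form.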
The broader issue raised by Meta’s decision is not only about one feature on one platform. Rather, it is about its general disregard towards privacy and human rights. Earlier this year, SMEX called out Meta for its lack of substance in addressing important issues in its 2024 human rights report. Meta was also scrutinized several times by SMEX and other members of civil society for their unfair content moderation practices, including their failure to fairly moderate political ads during Israel’s genocide on Gaza.
Ultimately, what experts are saying is that encryption is not an obstacle to user safety. On the contrary, E2EE is an essential part of it, especially in the WANA region, where the ability to communicate privately remains vital for at-risk communities.
