Over the past year, digital rights have faced increasing pressure. As governments and corporations develop and deploy new technologies at a rapid pace, the regulations needed to safeguard these rights continue to lag behind.
Against the backdrop of escalating repressive measures, we’ve witnessed corporate complicity in the AI-facilitated genocide in Gaza, regressing standards for content moderation, and neglect of vulnerable groups online.
On this Human Rights Day, SMEX remains dedicated to documenting violations by both governments and tech companies and advocating against them. This piece outlines our observations on the state of digital and human rights this year, and the steps we took to respond to the growing threats to digital rights.
Using AI without necessary safeguards puts human rights at risk
While companies marketing AI promise efficiency, precision, and improved security outcomes, its deployment by malicious actors in Palestine and across West Asia and North Africa tells a different story.
AI has been used to target civilians accused of terrorism or of belonging to Hamas in Palestine, while in other states, such as the United Arab Emirates (UAE), it has been widely used for surveillance, particularly of political dissidents.
In another real-world application of AI, refugees have reported being coerced into handing over their biometric data to authorities in order to obtain vital humanitarian aid.
These patterns are not isolated. Insights from our Digital Safety Helpdesk and SMEX’s broader research fed into our submission to the United Nations Special Rapporteur on Countering Terrorism (UNSRCT), which focused on the patterns and human rights impacts of the expanding use of AI systems in “counterterrorism” operations in the region, particularly in Palestine and the Gulf countries.
Our submission argues that the use of AI in the WANA region at this stage is simply too dangerous, given both its occasional technological failures and the oppressive legal contexts these AI tools are being deployed in.
SMEX recommended that tech corporations halt the sale of AI services to governments. We also called on states, especially those outside the WANA region, to uphold the United Nations Guiding Principles on Business and Human Rights (UNGPs).
The UNGPs urge states to exercise due diligence by strictly monitoring and prosecuting corporations domiciled within their borders that may be aiding and abetting human rights abuses.
Internet access and infrastructure under threat
SMEX has documented cases where access to information is being constrained through unequal access to internet infrastructure and misapplied platform moderation policies. These barriers disproportionately affect vulnerable communities living under authoritarian control and can limit their ability to organize and access critical information in times of crisis.
In October, SMEX joined the UC Irvine International Justice Clinic in calling out the Tunisian government for its forced relocation of sub-Saharan migrants into remote areas with weak or nonexistent internet coverage and digital connectivity.
In the joint submission to the United Nations Committee on the Elimination of Racial Discrimination (CERD), we argued that isolating migrants from areas where digital literacy is accessible, paired with the government’s failed efforts to address online hate speech, has effectively excluded migrants from the digital sphere. This exclusion violates their rights, including the right to equal treatment, the right to freedom of movement, and other rights protected under the International Convention on the Elimination of All Forms of Racial Discrimination (ICERD).
Building on this focus on systemic exclusion from the right to information, SMEX submitted a public comment to Meta’s Oversight Board (OSB) on the company’s removal of two posts documenting the fall of the Assad regime. These posts were published by members of Hay’at Tahrir El Sham, previously designated as a terrorist organization by the U.S. When the OSB overturned these removals, it directly cited SMEX’s comment, acknowledging that Meta’s policies had been misapplied.
Our submission highlighted how the removal of the two posts at issue reflects a recurring pattern: Meta’s persistent over-moderation of Arabic content and a systematic failure to account for the political and human rights realities of the WANA region, as well as a simultaneous failure to understand coded hate speech such as that directed at Syria’s Druze communities. Meta’s over-moderation practices suppress vital content during political crises, restricting Syrians’ right to access information during a crucial time.
The issues raised in SMEX’s submission to the OSB are only part of a broader systemic problem with the company’s content moderation practices. For example, Meta’s Crisis Policy Protocol (CPP) remains opaque, offering no meaningful transparency about how decisions are made.
Without a contextual understanding of the situation in Syria, Meta’s policies risk silencing people in conflict zones and removing posts that may be the only information available to civilians seeking safety. These posts are also essential for holding perpetrators of human rights violations accountable at a later stage.
Cross-border engagement for digital rights
While our primary focus remains on the WANA region, we cannot overlook the inherent cross-border element of digital rights.
SMEX strives to engage with other jurisdictions, such as the European Union, whose regulations often affect regions outside its territory. One of the EU’s most promising laws, the Digital Services Act (DSA), offers an avenue for such engagement.
As part of our efforts to engage with the EU, we answered the European Commission’s call for submissions on recurrent and prominent systemic risks in the EU and on measures for their mitigation (Articles 34 and 35 DSA) last spring.
Our submission to the Commission highlighted the effect that crises outside the EU can have on freedom of expression within its territory. One example was the wrongful moderation of content related to the genocide in Gaza; another was the fact that faulty moderation of Arabic, a widely spoken language inside the EU, has direct consequences for users residing in the Union.
SMEX remains committed to advocating for meaningful reforms and more transparent policies grounded in the lived realities of people across the region. As part of our steadfast commitment to holding governments accountable, our submissions addressed multiple human rights issues in Lebanon, focusing on the lack of cooperation with international human rights mechanisms, restrictions on freedom of expression, and attacks on and harassment of bloggers, journalists, human rights defenders (HRDs), and the political opposition.
The submissions also addressed the dire state of data protection and privacy, the worsening state of digital inclusion and connectivity, and technology-facilitated gender-based violence.
With two partners, we made two submissions to Lebanon’s Universal Periodic Review (UPR), offering detailed, actionable recommendations aimed at strengthening digital rights and dismantling systemic barriers to internet and telecom access.
Our policy team also engaged directly during advocacy meetings in Geneva with diverse UN state delegations to push for our work to be included in the final set of recommendations given to the state of Lebanon in early 2026.
You can read our full recommendations to the UPR here.
On Human Rights Day, we remember that digital rights are human rights. As we continue to advocate for a safer online and offline space, we urge governments and companies to uphold their human rights obligations and refrain from prioritizing profit over people’s safety.
We remain hopeful that a rights-centered digital future is still possible, but only if governments and tech platforms move beyond mere symbolic commitments and actually address the structural barriers that limit access and safety online for users.
