Nour Mohamed Owais
In May 2025, sexually explicit images allegedly depicting Syrian activist Eva Rachdouni spread online, accompanied by abusive posts revealing her personal information—including her place of residence and religious affiliation.
The images were fake, created using artificial intelligence, and circulated by dozens of fake accounts loyal to Syria’s former regime. The campaign reflected a recurring pattern of gender-based digital violence targeting female Syrian activists. This report documents a series of coordinated attacks involving defamation, gendered stereotyping, and legal and digital threats, all occurring in a context where no effective legal framework protects women in digital spaces.
In May 2025, activist Eva Rachdouni was subjected to an organized online campaign in which deepfake images were circulated portraying her in sexually suggestive situations. The attack went beyond moral defamation, incorporating direct sectarian incitement through the publication of her personal details, including her home address and Christian identity.
The offensive content spread through dozens of fake accounts on Twitter and Facebook using hashtags such as #إيفا_فاضحة (Eva Exposed), alongside accusations of “serving Western agendas” and “insulting Muslims,” and other sectarian rhetoric.
Analysis conducted by the report’s author indicates that organized entities, including the Syrian Electronic Army network, helped amplify the campaign through more than 30 fake accounts that coordinated closely in timing, language, and content.
Images shared in the defamation campaign were examined using AI or Not, a tool for detecting AI-generated visuals. The analysis covered several fabricated images spread by fake accounts in order to verify their digital composition and origins. Preliminary results confirmed that the images bore distinct signs of synthetic generation, with a high likelihood of having been created using AI tools.
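For readers who want to reproduce this kind of check on a batch of suspect images, the short Python sketch below illustrates one possible workflow: each file is uploaded to an AI-image-detection service over HTTPS and the returned verdict is printed. This is a minimal illustration under stated assumptions, not the exact procedure used for this report: the endpoint URL, the DETECTOR_API_KEY environment variable, and the “verdict” and “confidence” response fields are placeholders that must be replaced with whatever the chosen detector (such as AI or Not) actually documents, and a verdict from any single tool should be treated as indicative rather than conclusive.

    import os
    import sys
    import requests  # third-party HTTP client

    # Placeholder endpoint and credentials: substitute the values documented
    # by the detection service you actually use (e.g. AI or Not).
    API_URL = "https://api.example-detector.com/v1/image-report"  # assumption
    API_KEY = os.environ.get("DETECTOR_API_KEY", "")              # assumption

    def check_image(path):
        """Upload one image file and return the detector's JSON report."""
        with open(path, "rb") as handle:
            response = requests.post(
                API_URL,
                headers={"Authorization": f"Bearer {API_KEY}"},
                files={"image": handle},
                timeout=60,
            )
        response.raise_for_status()
        return response.json()

    if __name__ == "__main__":
        for image_path in sys.argv[1:]:
            report = check_image(image_path)
            # Field names below are illustrative; real APIs name them differently.
            print(image_path,
                  report.get("verdict", "unknown"),
                  report.get("confidence", "n/a"))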
Although Syrian law criminalizes incitement to hatred under Article 28 of the Cybercrime Law, it contains no provisions prohibiting the use of deepfakes to generate and distribute defamatory images of women. This legal gap leaves women exposed to ongoing violations that can easily extend beyond the virtual realm and cause real psychological and social harm.
According to digital security researcher and trainer Raya Sharbain, “The absence of community and institutional support systems drives many women activists to temporarily withdraw from digital platforms as a preventive strategy against this kind of violence, which faces no meaningful legal or societal accountability.”
On May 6, 2025, a comment by Syrian journalist Jude Hamadeh on a video showing interim president Ahmad al-Shar’a playing basketball triggered a large-scale campaign of online abuse. Hamadeh had written, “The basketball video is vulgar and shameless,” prompting a coordinated backlash within hours, led by pro-government journalist Ahmad al-Oqda, who urged his followers to “do your duty” while insulting her publicly.
Two days later, al-Oqda retracted his attack after counter-campaigns emerged defending Hamadeh, using hashtags such as #ابنة_أخ_شهيد (Niece of the Martyr), referencing her late uncle.
After activist Ghada al-Shaarani published a video criticizing Mustafa al-Bakour, the Suwayda governor under the new Syrian administration, she was met with a massive online assault. Around 85% of related posts contained direct insults, labeling her “hysterical” and “disrespectful.”
Al-Shaarani also faced legal prosecution on charges of “undermining state sovereignty,” along with digital defamation involving AI-generated images depicting her as a “snake.” Pro-regime accounts shared hostile content under hashtags such as #غادة_لا_تمثلنا (Ghada Does Not Represent Us). The campaign peaked on May 21, 2025, when edited videos of her criticism were widely circulated.
Researcher Raya Sharbain notes that “protecting activists requires preventive measures such as awareness of hostile campaign tactics and response strategies that include strengthening digital rights organizations, providing legal and psychological assistance, and engaging with platforms to shut down attacking accounts—which often respond positively to such interventions—to help build a technological and emotionally supportive environment for women.”
This report was produced as part of a training organized by Arabi Facts Hub in collaboration with the “Women Who Won the War” program, the first training of its kind in Syria focused on detecting false narratives. The training chose to highlight systematic violence against women as one of the most dangerous and underreported issues in the Syrian context.