            From fake profiles to deepfake porn: The war Pakistani women cannot win alone

            Thursday, December 11, 2025 - 07:11:15
            Arya News - Experts say AI isn’t creating misogyny, but magnifying it, making harmful gender norms appear socially acceptable and widely endorsed.

            ISLAMABAD – Shukria Ismail, a young woman from Khyber Pakhtunkhwa’s conservative Kurram district, faced an uphill battle convincing her family to let her pursue journalism as a career. After her parents half-heartedly allowed her, she made sure to follow all the cultural norms expected of a Pathan woman: keeping her dupatta on and being careful about what she said or did.
            She threw herself into her work, attending training sessions, reporting from the field, appearing on-screen, and building professional networks. Things were going well for her. She was happy.
            In March 2024, however, her world came to a sudden halt when her sister discovered a fake Facebook profile in Shukria’s name. The profile picture was a screenshot of her reading news on TV, while the cover photo carried a sexually explicit image. “At first, I brushed it off as something common in our field,” she says. “But then my sister told me that obscene messages were being sent to our relatives and even to people who had old property disputes with my father and uncle.”
            The targeting was precise, suggesting someone close to her was behind it. Eventually, she found out a relative was involved, acting out of a personal grudge. “They attached indecent images to my photos and spread them online,” Shukria recalls.
            The harassers didn’t stop there. Her family began receiving calls insinuating she was having affairs or questioning her character simply because she was a journalist. Even her younger brother’s college was dragged into the mess, with texts sent to his classmates saying she was involved in “questionable activities.”
            “It was clear they wanted to push me towards suicide or provoke an ‘honour killing,’” she adds.
            Through it all, her family stood by her, never blaming or abusing her. Yet the biggest blow came when they forced her to quit journalism entirely. “Being on air, reporting, writing — journalism became a crime in my own home. My parents forbade me from going to work. That was the day I died mentally,” she says.
            Rising cost of being a woman online
            Shukria is one of thousands of women whose lives have been upended by technology-facilitated gender-based violence (TFGBV). The term refers to acts of violence committed, assisted, aggravated, or amplified by the use of information and communication technologies (ICTs) or digital media against a person because of their gender.
            This year, the 16 Days of Activism against GBV runs from November 25 to December 10, under the UN theme “UNiTE to End Digital Violence against All Women and Girls,” spotlighting cyber harassment, deepfakes, and attacks that spill into real-world harm.
            In Pakistan, the damage rarely stays online: it hits mental health, careers, and, for women like Shukria, can end professional lives altogether.
            The Digital Rights Foundation’s Digital Security Helpline Annual Report 2024 shows a sharp rise in online harassment and TFGBV in the country, with 3,171 new cases recorded. Per the report, the year began with a surge in gendered disinformation and AI-generated images targeting women politicians and journalists during the February elections, aimed at humiliating and discrediting them both personally and professionally.
            Women journalists also faced blasphemy allegations, a tactic carrying severe offline risks, including mob violence. One journalist told DRF she feared for her life for the first time, while another faced a hacked X account and coordinated attacks on her identity and credibility.
            Pakistan remains one of the world’s most dangerous countries for women, and rising internet access has extended violence into digital spaces. Despite broadband reaching 58.6pc of the population, a gender gap persists: only 33pc of women regularly use the internet, and 23pc access it via someone else’s phone, leaving many without the knowledge or tools to protect themselves from TFGBV, the report said.
            Journalists encounter heightened hostility as well. Pakistan dropped two places to 152 on the 2024 World Press Freedom Index. Women journalists face sexualised threats, gendered disinformation, and image-based abuse, including AI-generated content.
            DRF research shows 55pc of women in media have experienced online abuse, yet only 14.2pc sought help; 70pc fear misuse of their images online, and 40pc report stalking or harassment.
            Meanwhile, data from the National Commission on the Status of Women (NCSW)’s 2023 report shows that almost 40pc of Pakistani women surveyed had faced cyberbullying or online harassment. Conviction rates remain low: only 92 cybercrime convictions were secured out of 1,375 cases in 2023.
            As the nature of online abuse grows more complex, those on the frontlines are seeing the impact most clearly. Digital rights expert and DRF founder Nighat Dad told Dawn that the organisation was witnessing “a disturbing new frontier of gender-based violence,” noting that the Digital Security Helpline had seen a steady rise in cases involving deepfake pornography, doctored images, voice manipulation, and AI-powered impersonation, especially targeting women, most often women public figures.
            “These aren’t just new tools; they are intensifying old patriarchal patterns of control, blackmail, and public shaming,” she said.
            She added that “AI has lowered the barrier for abuse. With just a photo and a free app, anyone can manufacture a scandal. That level of scale and invisibility makes it harder than ever for survivors to find redress.
            We urgently need stronger platform-level safeguards, quicker takedown mechanisms for synthetic media, and gender-sensitive digital safety policies. But just as importantly, we need a survivor-centric ecosystem where helplines like ours are resourced, trusted, and recognised as vital for frontline support in fighting this evolving form of violence.”
            Smear campaign without borders
            Prominent journalist and prime-time anchorperson Mona Alam from Islamabad also faced a different but equally vicious online smear campaign that took a heavy toll on her mental health.
            “The campaign against me, involving the circulation of pornographic videos, began in December last year. It wasn’t a morphed image or a superimposed video; it was a real video of a sex worker filmed years ago. That same video had previously circulated on the dark web under the title ‘Nimra Mehra’s leaked video,’ to malign a British-Pakistani singer, and that wasn’t Nimra either — just a woman who slightly resembled her. In December 2024, it was made viral again, this time labelled as ‘Mona Alam’s leaked video.’
            “I became aware of it because the campaigners were Pakistani. It was deliberately pushed in WhatsApp and Facebook groups, especially media groups with journalists, politicians, and professional contacts. Among the initial campaigners were individuals affiliated with political parties and some media professionals. Soon, others began making vlogs and videos, amplifying it further.”
            The journalist immediately approached the police and the Federal Investigation Agency’s (FIA) cybercrime wing in Islamabad, saying she was grateful to the then–DG FIA, Ahmed Ishaq Jahangir, who pursued the case diligently.
            “He had the initial campaigners arrested, kept in FIA custody for a few days, questioned, and their mobile phones confiscated for forensic examination. Courts eventually granted bail. Thanks to the ex-DG, the case was progressing well.”
            However, in February 2025, at a critical stage when all the phones had been seized and forensic work was pending, the DG was removed, and the FIA “abandoned the case”, she claimed. Despite repeated follow-ups, the case did not move.
            She did not give up and repeatedly contacted FIA Additional Director Ayaz Khan. Later, at a ceremony in the Presidency, she explicitly asked Interior Minister Mohsin Naqvi for help. “He listened at first, but as soon as I mentioned the FIA’s negligence, he walked past me without answering. I had hoped he would intervene, but he offered no help.”
            The campaign quickly shifted abroad. “When the initial campaigners were arrested, they moved it to Dubai and India, where they had accounts with millions of followers.
            Indian tabloids, social media accounts, and Bollywood-style gossip pages — all amplified it aggressively. Despite Pakistani officials’ usual reactions to Indian attacks, this malicious campaign received no attention here.”
            Mona was also disappointed by the silence of Information Minister Attaullah Tarar. “Although I didn’t explicitly ask him for help, he didn’t offer any either.” Dawn reached out to both Naqvi and Tarar for comment, but neither responded.
            Women politicians equal targets
            While Mona received no support from the political elite, even senior women lawmakers face similar treatment.
            Case in point is Punjab Minister for Information and Culture Azma Bokhari, whose sexualised deepfake video circulated in 2024, leaving her deeply distressed.
            Deepfakes, digitally manipulated videos or images that superimpose a person’s likeness onto another or fabricate content, are increasingly weaponised against women in the public sphere, particularly in conservative societies like Pakistan.
            Despite her position as a senior minister, Bokhari endured the same online harassment and character assassination as countless other women, illustrating how digital violence affects women across all levels and exposing the systemic nature of gendered abuse online.
            “Unless such cases are treated as test cases, where at least one perpetrator is actually punished, nothing will change,” she said, pointing out that Punjab now has dedicated defamation courts, which are civil in nature.
            “Cases like mine should go to such defamation courts. And once pursued, courts must show zero tolerance. No matter what the investigation does, the ball eventually lands in the court’s court,” she said.
            Bokhari stressed that without strict judicial action, digital violence cannot be curbed. She also warned that “if we turn such incidents into political issues, granting favours, or portraying someone as a victim to secure their release, this problem will never end.”
            Unseen scars of digital abuse
            For Maria (a pseudonym), a non-public figure and marketing professional from Karachi, a deepfake cost her a potential marriage. She had been in a relationship with a former boyfriend during university, but they parted ways and didn’t stay in touch.
            “I didn’t have any contact with him for two years. I was about to get married in three months and shared an engagement photo on Instagram. That’s when he suddenly messaged me, angry and jealous, and began threatening me that I had ‘betrayed him.’ He said he had kept my intimate pictures on his phone, claiming, ‘I had saved them for such an occasion only, and I will teach you a lesson.’”
            Though her images did not contain nudity, just photos of her hugging him, he sent them to her fiancé after doctoring them, sparking a huge fight that ended her engagement. “At that time, I was devastated and depressed for months.
            I was scared. I told my family, and though they were disappointed, they supported me. I told them that if this ever happens again, I would face it without fear. My biggest fear is over. The marriage is called off, but my izzat [honour] is intact. I did nothing wrong. If anyone is to blame, it’s him,” she said.
            The psychological impact of such harassment is severe and long-lasting. Huma Pervaiz, clinical psychologist and lecturer at Karachi’s DHA Suffa University, explained that manipulations like deepfakes “tend to demean, silence, or discredit women, particularly those in social or professional positions. Cyberbullying, trolling, and doxing create a constant sense of insecurity, which can harm mental well-being, erode self-confidence, and blur the line between online and offline abuse.”
            Samiha Sajid, clinical psychologist, added that even a single threatening message or manipulated image can trigger anxiety, hypervigilance, shame, and withdrawal from social spaces.
            She continued: “We must recognise online harassment as a serious mental-health threat. Pakistan needs stronger, gender-sensitive reporting systems, public awareness campaigns, and digital-literacy initiatives. Protecting women online is not just a legal responsibility, but it is essential for their psychological well-being and their full participation in society.”
            Journalist Gharida Farooqi, herself a victim of AI-fuelled harassment campaigns, told Dawn, “Artificial intelligence or automated insolence? Yet another unsafe space for women across the globe. Having been a victim of fake campaigns and real threats myself since 2014, even when AI was distant, it feels nothing has changed in 2025, rather intensified manifold for women.”
            She said that before stricter laws and implementation, the sensitisation and sanitisation of minds—both human and AI—towards women is urgently required, “so that the benefits of modern-day technology are harnessed and shared not just by mankind, but by all of humankind.”
            Threat of AI-driven misogyny
            The rise of AI has not only accelerated online abuse but reshaped how misogyny is produced, amplified, and weaponised in Pakistan’s digital spaces. Experts warn that without urgent systemic safeguards, women will face an increasingly hostile environment online.
            Digital rights activist Usama Khilji has been warning for years that technology would eventually supercharge the violence women face online. That moment has now arrived. He describes deepfakes and AI-generated intimate images as “a huge, huge risk,” especially in intimate relationships where former partners weaponise fabricated content to keep women silent or under control.
            Political networks, he added, have also begun deploying deepfakes to discredit women journalists and public figures, using misogyny as a strategic tool. Yet the cybercrime system remains ill-equipped. “Threats to create a deepfake don’t fall neatly into any section,” he told Dawn, noting that most women receive no meaningful legal protection despite the severity of the harm.
            Tech journalist Sindhu Abbasi sees the same pattern unfolding across the digital ecosystem, where AI has lowered the barrier for large-scale manipulation. Her research has taken her from AI “nudifier” apps — “Meta has already taken legal action against some of these developers,” she pointed out — to professional settings where photos are altered without consent.
            She cited a LinkedIn case where a woman’s standard headshot was modified to add cleavage before being posted publicly. “If companies are using AI, they should let you know,” she emphasised. “It affects your reputation and your professional image.”
            Abbasi also highlighted a rising trend of doctored YouTube thumbnails featuring Pakistani actresses, the unchecked spread of sexualised images, and the dangerous access gaps within state surveillance systems. In one case she investigated, a government officer used data from a woman’s official application to locate her home and assault her daughter. “What could go wrong? Possibly everything,” she said.
            For journalist and researcher Annam Lodhi, these incidents are not anomalies. Rather, they are the logical outcome of a gendered digital culture that predates AI but is now amplified by it. Her research on tech-facilitated gender-based violence shows just how vulnerable women remain.
            “All women in Pakistan’s digital space are highly vulnerable because they are the rare ones who voice their opinions,” she explained. Journalists, activists and political workers face the harshest climate, operating in spaces where harassment is already “widespread and normalised.”
            Lodhi noted that even before AI, morphed images caused serious harm. Now, she said, “AI makes harassment faster, more personalised, and harder to trace.”
            One interviewee told her that abusers were able to find her residence through “minor details shared over the years,” illustrating how AI-driven data scraping collapses the line between digital and physical danger. Deepfakes, Lodhi argued, weaponise cultural stigma around honour: “A woman’s face can be lifted from a profile photo and placed onto sexualised content within minutes.”
            Her survey of 439 users found that 73pc of women on X had faced harassment, and many now self-censor or withdraw altogether.
            For her, AI isn’t creating misogyny; it “magnifies it,” making harmful gender norms appear socially acceptable and widely endorsed.
            System struggling to protect women online
            For Ambreen, another pseudonym, the terror began quietly after a stranger began using her photos to create fake profiles, messaging her at random, disappearing instantly, and reappearing with new accounts. “He’d message me at 3am and then immediately deactivate his account so we couldn’t trace him,” she recalled. The man repeatedly pushed her to get on a Skype call, something she didn’t fully understand at the time but instinctively felt was unsafe.
            Isolated and afraid, she first approached the DRF, followed by the FIA’s cybercrime office in Islamabad. “I was in university at the time, studying, so I was very scared but also brave enough to take that step,” she said. She kept the ordeal hidden from her family, relying instead on friends and the few responsive officers she encountered.
            Eventually, with DRF’s support and multiple complaints filed, the harassment stopped. “Since I took multiple steps at once, I don’t know which one worked, but the messages stopped, and all the fake pages were taken down,” she said.
            This, however, was not the case with Ayesha, who chose to be identified by her first name only. She told Dawn that her harasser kept reappearing with “obscene profiles and lewd messages for five years before I mustered the courage to tell my husband.”
            “My husband was very upset with all this and did not agree to go to the FIA to complain,” she said. “He thought we would make matters worse by involving the police.”
            Ayesha’s experience reflects a broader failure of systems meant to protect women, a reality high court advocate Syed Miqdad Mehdi sees regularly. He said that while Pakistan’s cybercrime law, the Prevention of Electronic Crimes Act (Peca), 2016, has evolved, its implementation still lags behind the scale of abuse. “Law is evolving with the change of modern challenges… but public awareness raising is very important, and capacity building is essential, especially with regards to GBV and TFGBV,” he said.
            He stressed the critical need to sensitise officers, build trust, and create victim-friendly mechanisms so women can report cases without fear of humiliation. “There is no doubt that the FIA… has been getting a lot of successful stories,” he acknowledged, “but despite this, there is a lot of pendency and a lot of load on their work.”
            “The specialised courts need to be sensitised,” he added, stressing that TFGBV cases often fall through the cracks of existing legal structures.
            For Fauzia Yazdani, a gender and governance expert with over three decades of experience, the core issue lies deeper, in social norms and the pace at which digital harms evolve.
            “When you come into the digital space, your socials and social apps were curated to be gender-blind,” she said. As new forms of online violence emerged, laws were drafted, “but as the digital space is evolving at such a speed, oversight and rules and regulations are not evolving.”
            Despite the presence of strong laws, from the Pakistan Penal Code to the women protection acts, she asked: “are they being implemented? That’s where the issue is.”
            Yazdani described deepfakes, harassment bots, and algorithm-driven amplification as “crude corners of digital space that need to be monitored and regulated,” warning that the gap between law, oversight, and reality is widening.
            She argues that the surge in TFGBV is inseparable from Pakistan’s social and political climate. “Our societal narrative is tilted towards gender inequality… it stereotypes women, it categorises or typecasts them into certain roles,” she explained. The result is a landscape where political polarisation and misogyny intertwine and where women’s visibility, whether as journalists, politicians, or ordinary citizens, can become a liability.
            “It’s very convenient… you’re anonymous, you can harass someone, you can generate bots behind them… and still remain anonymous.”
            Turning tide against online GBV
            Technology-facilitated gender-based violence is not just a digital threat. It can change lives, careers, and families. As Yazdani explains, “For transformational change, we have to change the society’s narrative, we have to change the society’s thought process.”
            Without shifting norms, strengthening oversight, and bringing more women into tech, she warns, the country will continue to see TFGBV grow faster than the systems meant to contain it.
            Yet even with awareness, enforcement, and advocacy, the human cost is clear. Shukria’s story brings this painfully to life: nearly two years after being targeted online and forced out of journalism, she remains unable to write, speak publicly, or practise her profession.
            “I am engaged to be married soon, but I feel like I am sitting in a cage, only dreaming of the life I once had,” she says. “I hope that once I am with my husband, I can revive the career that was forcibly taken from me.”
            Imran Gabol from Lahore and Nadir Guramani from Islamabad contributed to this report.