ReportWire

Tag: glamour talks…consent

  • Zara McDermott: ‘It’s time we, as women, reclaim technology’

    The big question is: who is responsible? How can tech companies and social media platforms continue to allow such devastating abuse to unfold on their platforms every second of every day? How, in 2025, is it still acceptable for perpetrators to hide behind the so-called “safety” of a screen? And perhaps most concerningly, is our society breeding a generation where tech abuse is normalised?

    The reality is that tech-facilitated abuse will not stop until tech companies listen to survivors and implement safety measures by design. Tackling this issue also requires urgent government action, including investment in lifesaving support services for survivors.

    But we must remember: tech-facilitated abuse is not just a digital threat – it is a human issue. In conversations about tech abuse, perpetrators often become invisible, shielded behind screens and devices that offer a convenient scapegoat. The impact of technology lies in its use. Although it can be manipulated by perpetrators to inflict harm, it also holds immense potential to protect and empower women and girls.

    In September, I supported Refuge’s Tech Safety Summit, which explored not only the biggest issues facing survivors of tech-facilitated abuse, but also the most innovative ways technology is being used to keep women and girls safe. This includes a new platform called Survivor AI, a feminist AI letter generator that supports survivors in getting abusive content removed from the internet.

    Another example discussed in my documentary was Operation Atlas, created by the Metropolitan Police. It’s a digital processing tool used by police to speed up stalking and harassment investigations by analysing call records and messages in a matter of seconds. It means that there are now specific, trained officers dedicated to gathering this digital evidence.

    Through my work with Refuge, I’ve seen first-hand how their tech team supports women to use technology safely. Beyond the screen, I’ve also recently learned about how technology is being used to protect women and their children living in Refuge’s safe accommodation.

    To meet the needs of the diverse survivors it supports, Refuge launched its dispersed accommodation model in 2023 – an alternative to communal refuges that includes standalone properties ranging from flats and bungalows to three-bedroom houses.

    Not only is dispersed accommodation more accessible for many survivors, including those with disabilities or large families, but these properties are equipped with the latest technology to keep survivors safe. Discreet CCTV and video doorbells provide peace of mind and real security for survivors who have recently fled domestic abuse. I can’t help but think how beneficial this would be for survivors I’ve spoken to who have experienced, or are experiencing, stalking and harassment.

    To provide safe havens for more survivors than ever before, Refuge will soon be expanding its dispersed accommodation – made possible by its new partnership with Omaze. These properties will be owned by Refuge, improving the sustainability of its housing model, and will be equipped with built-in safety technology to ensure that survivors can finally begin healing.

    Although I’ve witnessed and experienced the devastating effects of tech-facilitated abuse, I remain hopeful. When used for good, technology has the power to help create a future where women and girls can live free from fear. It’s time we, as women, reclaim technology – not by retreating from the digital world, but by having meaningful conversations about tech-facilitated abuse and pushing for safer, more equitable digital spaces.


    To learn more about tech-facilitated abuse, visit www.refugetechsafety.org.

    Find out more about Refuge’s campaign with Omaze: www.Omaze.co.uk/pages/refuge.

    Refuge’s 24-hour National Domestic Abuse Helpline is available on 0808 2000 247, and its confidential live chat is accessible online at www.nationaldahelpline.org.uk.

    Zara McDermott

  • I’m a professional dominatrix. Here’s how I fought back after a client stole my intimate images

    As the initial shock began to fade, Madelaine decided to take action. “I thought, I don’t want to live in a society where this is just par for the course, where this is just what happens,” she tells GLAMOUR. “It took years to get over it, but I knew that I was going to make a change; I just didn’t know how.”

    For around seven years, Madelaine turned her attention to campaigning. She participated in roundtables and interviews that informed the UK’s 2025 Pornography review, spoke out about financial discrimination against sex workers, and co-authored a piece on improving labour standards in the online sex industry. But Madelaine wanted to move quickly. “I knew I needed to do more, and I reached a point where I was exhausted by it all and thought to myself, ‘I just need a guardian angel’. I want to send that image safely. And I don’t think that’s too much to ask.” And so Image Angel was born.

    Earlier this year, Madelaine attended GLAMOUR’s parliamentary roundtable about image-based abuse. In one of the most memorable speeches of the evening, Madelaine handed out her Image Angel business cards, encouraging people to pass them around the room until one found its way back to her. She held up the business card and pointed out that, thanks to fingerprint technology, she could technically find out the name of every single person who had touched the card. Similarly, Image Angel utilises watermark technology to track who has accessed an image or video shared on a platform, serving as a powerful deterrent against image-based abuse while also respecting the victim’s autonomy.

    Once Madelaine came up with the idea, she searched on LinkedIn for someone who could help make it a reality. “I emailed people at various tech companies and said, ‘Look, here’s the problem. Here are the current solutions. Please, can you help me or point me in the right direction to someone who can build this for me?’ Eventually, one person agreed. Over several months, we worked together to build this. It took so long, but it’s finally ready, it’s finally installed, and it’s finally protecting people.”

    “We need more people to insist that platforms use this technology,” says Madelaine. “We need more platforms to take on the technology, and we need the law to tighten up and say that prevention is better than cure.” She reflects on her own experience of image-based sexual abuse: “If Image Angel had been installed, I could have at least found out which platform it had come from. The platform could have then banned that user. They could have helped me add that user, username, or user’s data to a hashing list, ensuring no one ever interacts with that person in an online forum again.”

    While much of the rhetoric surrounding ‘sending nudes’ focuses on victim-blaming, Image Angel offers something new. “Denying people the freedom to send a picture or shaming someone because they choose to send a picture isn’t a progressive society,” says Madelaine. “We should allow people to have fun, play and flirt, but knowing that they can safely do that.

    “We used to roll about in the hay, and now we send images and messages. And those life experiences build you. It’s exciting and thrilling. You get a flutter when you receive that message. So why can’t you respond in a way that feels authentic?”

    For more from Glamour UK’s Lucy Morgan, follow her on Instagram @lucyalexxandra.

    Lucy Morgan

  • The stories we tell (and consume) about sexual violence can actually make the world a better place for survivors

    What’s fascinating is that seeing the play either live or on the screen inspires viewers not only to reflect, but also to act. Remarkably, it has led to lasting and ongoing real-world change.

    “Through NT Live and NT at Home, we have made it possible for over half a million people to see Jodie Comer’s peerless performance and experience the power of this remarkable play,” says Kate Varah, executive director of the NT. “It simply wouldn’t be possible to reach these numbers in a theatre. This accessibility allows the powerful stories told through theatre to drive real-world change.”

    For one thing, it has inspired numerous women to find their own voice. “Women who had never spoken about a rape perpetrated against them found courage to tell their close people and many gave evidence to law enforcement,” playwright Suzie Miller tells us. “I know Jodie Comer and the producers join me in the humbling experience of reading so many messages of individual life changing experiences that came about after watching the play live or on NT Live.”

    So many women reached out about the life-changing impact of the play that the film’s producer, James Bierman, approached Everyone’s Invited, a charity that offers a safe space for victims of sexual assault to share their stories.

    “The act of watching Prima Facie enables and encourages survivors to confront and share their own real-world experiences, helping individuals have the confidence to share their stories and underlining the need for a safe space for survivors,” says Soma Sara, Founder of Everyone’s Invited.

    But the play has also inspired changes to the system itself.

    The filmed version of the play is now being used for judicial education in a number of countries. It’s included as a module for secondary school consent education, used as a training tool in the continuing education of various parts of the UK police force, and has even been used as a source for legal changes here in the UK.

    “A Northern Ireland-born High Court Judge at the Old Bailey used her influence to make a viewing of the NT Live version of the play mandatory for judges in Northern Ireland,” says Miller. “Another judge called me to say that, after seeing the play live, she had redrafted the direction to the jury on rape law, incorporating some of the language and messages of the play.”

    The play has also led to the creation of TESSA (The Examination of Serious Sexual Assault Law) by four London barristers, as Miller says, “to interrogate how they can contribute their knowledge to changing the law from within.”

    Kate Parker, a former barrister and founder of the Schools Consent Project, has used the play as part of her work to educate young people about the nuances of consent.

    “As far as the Schools Consent Project is concerned, Prima Facie has been transformative,” she says. “Since the play’s first run in London in 2022, we have seen a 52% yearly rise in workshops, which means 245 additional consent workshops were delivered to approximately 8,500 young people. This brings our total number to 55,000 students educated about consent.”

    Parker also launched a New York branch of the charity when the play travelled to Broadway. “We’ve now taught consent to over 5,000 students in NYC, including in Spanish. We’ve fundraised over £150,000 globally as a direct result of the play.”

    Meg Walters

  • The image-based sexual abuse announcement is meaningless nonsense. Here’s why

    For survivors of IBSA, this empty announcement is a slap in the face. Many survivors, including myself, have experienced firsthand how difficult it is to get justice for these types of offences. Despite the existing laws, the criminal justice system often fails to take these cases seriously. The process is confusing, support services are lacking, and police often don’t take survivors seriously.

    In response to this announcement, many survivors have voiced their frustrations, with some even writing an open letter to the Department for Science, Innovation, and Technology (DSIT). The consensus is clear: this announcement is not an improvement. It does nothing to address the systemic failures in the handling of image-based sexual abuse. It offers no additional protections, no better support services, and no new tools for survivors to pursue justice. This isn’t a meaningful change—it’s lip service.

    If the government truly cared about tackling IBSA and making the internet a safer place for women and girls, it would go beyond announcing meaningless administrative changes and address the core issues that allow these abuses to continue.

    First, there needs to be significant investment in survivor support. Legal aid, counselling services, and survivor advocacy are all woefully underfunded.

    Second, the government should focus on prevention. Educational campaigns about consent, respect, and the consequences of IBSA should be rolled out across schools and online platforms. Media literacy programs are crucial to raising awareness and helping individuals, particularly young people, understand the harmful effects of sharing non-consensual images.

    The UK government’s announcement about reclassifying IBSA as a priority offence is awful. As someone who has experienced IBSA and seen the devastating effects it has on women, I’m saddened and angry that it’s being used as lip service. It is a routine administrative procedure being spun as a significant new measure, designed to boost public perception without delivering any real change.

    Survivors of image-based sexual abuse deserve far better than this. If the government truly wants to tackle online abuse, it needs to go beyond PR stunts and take meaningful action—by providing more support for survivors, and focusing on preventing these offences in the first place.


    Find out more about GLAMOUR’s campaign in partnership with the End Violence Against Women Coalition (EVAW), Not Your Porn and Professor Clare McGlynn, demanding that the government introduces a dedicated, comprehensive Image-Based Abuse law to protect women and girls.

    Revenge Porn Helpline provides advice, guidance and support to victims of intimate image-based abuse over the age of 18 who live in the UK. You can call them on 0345 6000 459.

    Patsy Stevenson

  • Success! It will soon be illegal to create (not just share) deepfake porn

    Creating sexually explicit deepfake pornography will be made a criminal offence in England and Wales, thanks to a new amendment to the Criminal Justice Bill.

    It comes after GLAMOUR’s Consent survey, in partnership with Refuge and Rape Crisis England & Wales, found that 91% of women think that deepfake technology poses a threat to the safety of women. Since then, we have been shouting about this issue as loudly as possible.

    In February earlier this year, GLAMOUR partnered with Greg Clark, Chair of the Science, Innovation, and Technology Committee, to host a parliamentary roundtable about the threat of deepfake technology to women. And last month, we were proud to officially support the world’s first global virtual summit on deepfake abuse.

    Our work hasn’t gone unnoticed. During an exclusive interview, the Minister for Victims and Safeguarding, Laura Farris, told us that she’d seen our campaign – including our demo outside Parliament – on Instagram and had since followed our coverage of the issue.

    What will be the new law on deepfake porn?

    At present, there is no legal recourse against those who create deepfake pornography of other people without their consent. Under the Online Safety Act, only the distribution or sharing of deepfake porn is criminalised.

    The new offence, proposed by Conservative MP Laura Farris and the Ministry of Justice, will be punishable with an unlimited fine and a criminal record. Only if the image is then shared could offenders face prison time.

    Per a government press release, “The new law will mean that if someone creates a sexually explicit deepfake, even if they have no intent to share it but purely want to cause alarm, humiliation or distress to the victim, they will be committing a criminal offence.”

    “It will also strengthen existing offences as if a person both creates this kind of image and then shares it, the CPS could charge them with two offences, potentially leading to their sentence being increased.”

    Farris described the amendment to the Criminal Justice Bill as “an important opportunity to deal with the creation of deepfake images”.

    She further describes the creation of deepfake pornography as a “gateway offence”, which is reflected by the decision not to impose prison time on offenders.

    “One of the realities is that some of the perpetrators of this offence are teenage boys”, Farris explains. She notes that the Law Commission has previously flagged concerns about over-criminalising young people.

    “If you create a [deepfaked sexually explicit image] in the privacy of your own bedroom, it’s still a crime. It could be punishable with up to an unlimited fine, and you would get a criminal record.

    “To reflect the fact that creating the image is a gateway offence, there won’t be a custodial sanction, and you won’t join the sex offenders’ register.”

    This changes if you share an image of deepfaked sexual content, Farris clarifies, which could result in a two-year custodial sentence and a place on the sex offenders’ register.

    Farris also anticipated that there may – almost ironically – be privacy concerns for offenders, who believe that it shouldn’t be a crime to create a deepfaked image for their own gratification in the privacy of their own home. She has a convincing analogy:

    “We have an offence in this country of creating a dangerous explosive. So even if you create an explosive in your kitchen, you’re committing an offence – albeit a low-level one. We recognise that if that material falls into the wrong hands or if the motive of the creator changes, then it has the potential to cause catastrophic harm.”

    You can use the same analogy, in a psychological sense, for creating deepfakes, Farris explains. “It’s not enough to say that ‘I just created it to use for my own gratification’,” she says. “If that gets shared, it can have a catastrophic effect on a person’s life.”

    What do the experts and campaigners think?

    Professor Clare McGlynn, an internationally recognised expert on tech-facilitated abuse who spoke at GLAMOUR’s roundtable, has welcomed the amendment, saying:

    “The Government’s announcement is a welcome recognition that deepfake porn is now an invisible threat pervading the lives of all women and girls. Deepfake technology is now so easy to use and access that being deepfaked can happen to any of us at any time, and there is little we can do about it.

    “Deepfake porn steals women’s identities and autonomy; it’s a digital forgery.”

    “Right now, someone can make deepfake porn of you without your consent, tell you they’ve done that, tell you they’re using it for sexual arousal, and there’s nothing you can do; it’s not unlawful.

    Lucy Morgan

  • It’s not just Taylor Swift; all women are at risk from the rise of deepfakes

    But seeing high-profile women victimised in this way also has a profound impact on regular women and girls. When Ellie Wilson, an advocate for justice reform, tweeted about the troubling response to the deepfakes of Swift, she was met with her own flurry of online abuse. “People threatened to make similar deepfake images of me,” she tells GLAMOUR. “These attacks for merely stating my opinion highlight just how dangerous it is for women to simply exist on the internet.”

    Olivia DeRamus, the founder and CEO of Communia, a social network created by and for women, notes that even speaking up against deepfaking puts other women in danger. “Just talking about [deepfaking] as a woman paints a target on my back, along with other advocates, the female journalists covering this, the #swifties speaking out, and even female politicians who want to tackle the issue.”

    Professor Clare McGlynn emphasises that deepfaking represents a threat to all women and girls, citing the “potentially devastating impact on our private and professional lives.”

    It’s clear that deepfake technology is rapidly hurtling out of control. Amanda Manyame cites “rapid advances in technology and connectivity” that make it “increasingly easy and cheap to create abusive deepfake content”. She adds, “Cyberspace facilitates abuse because a perpetrator doesn’t have to be in close physical proximity to a victim.

    “In addition, the anonymity provided by the internet creates the perfect environment for perpetrators to cause harm while remaining anonymous and difficult to track down.”

    Moreover, most countries are ill-equipped to deal with tech-facilitated harms like deepfaked image-based abuse. In the UK, it is an offence – under the Online Safety Act – to share deepfake pornographic content without consent, but the Act fails to cover the creation of such images. “This gap,” Manyame explains, “has created an enabling environment for perpetrators who know they are unlikely to be discovered or punished. The situation is worsened by the lack of legal accountability governing the tech sector, which currently does not have to ensure safety by design at the coding or creation stage.”

    Meanwhile, the tech sector itself is alienating victims. As Manyame tells GLAMOUR, “Content moderation on tech platforms relies primarily on reporting by victims, but reporting mechanisms are generally difficult to use, and many platforms frequently do not respond to requests to remove abusive content or only respond after a long time.”


    What is the law on deepfakes in the UK?

    According to Michael Drury, Of Counsel at BCL Solicitors, “There is no direct law prohibiting the sharing of ‘deep fakes’ unless those images are pornographic. In that case, the recently created offences under the Online Safety Act 2023 will mean that a crime has been committed as long as the person whose image is shared (real or fake) has not consented and the person sharing does not believe they have consented.

    “There is no direct civil wrong allowing the person said to be shown in the image to sue. For those in the same position as Taylor Swift, the obvious solution is to rely upon the copyright of one’s image (if copyrighted), a breach of privacy or data protection laws; harassment (as a civil wrong), perhaps defamation, or criminal law more generally.”


    Can anything be done about deepfake technology? Let’s start with legislation. The Online Safety Act criminalises the sharing – not the creation – of non-consensual deepfake pornography, which could, as Sophie Compton, co-founder of #MyImageMyChoice, a movement tackling intimate image-based abuse, tells GLAMOUR, create “greater accountability for tech companies.” Whether this legislation will be effective is another story.

    Sophie explains that the current legislation allows tech companies to effectively “mark their own homework”. She points out that search platforms drive plenty of traffic to deepfake pornography sites – can the Online Safety Act clamp down on this? “The government needs to tackle Big Tech and their role in promoting and profiting off deepfake abuse, and get the sites and web services that are profiting off of abuse blocked from the mainstream internet.”

    Professor Clare McGlynn from Durham University notes that while the Online Safety Act has the potential to tackle deepfake pornography, “There is a real risk that the legislation is a damp squib, all rhetoric and little change.” She points out that Ofcom, the UK’s communications regulator, is currently consulting on the guidance it will use to enforce the Act. “Ofcom needs to challenge the social media companies to make a step-change in their approaches […] It should focus on proactive regulation being human-rights-enhancing. It can enable women to live freer lives online.”

    Ultimately, though, we need to address the misogynistic culture that empowers users to create harmful, non-consensual content of women. Helen Mort survived being deepfaked; she asks, “What are the cultural and social factors that make people abuse images in this non-consensual way?”

    We’re still looking for answers.


    GLAMOUR has reached out to representatives for Taylor Swift and X for comment.

    If you have had your intimate images shared without your consent, remember that you are not alone, and there is help available. Get in touch with the Revenge Porn Helpline at help@revengepornhelpline.org.uk. There is also a step-by-step guide on notyourporn.com, which should be followed before taking any action.

    For more from Glamour UK’s Lucy Morgan, follow her on Instagram @lucyalexxandra.



    Lucy Morgan
