Deepfake Porn Victims Are Seeking Federal Protections Through Legislation

While some states like Virginia and California have passed laws targeting deepfake pornography, the lack of federal protection can leave victims without legal recourse. Honza Cervenka, a lawyer who specializes in nonconsensual pornography, told Refinery29 that for these videos and images to be considered image-based sexual abuse, the breasts or genitals of the person would have to be shown, which is often not the case in deepfake pornography. “It sort of falls through the cracks of many of the laws that were written with the original revenge pornography, rather than this more sophisticated deepfake imagery,” Cervenka said. 

Uldouz Wallace, an Iranian actress, was one of the stars targeted by the 2014 iCloud hack, in which private photos of celebrities including Kirsten Dunst, Jennifer Lawrence, and Kate Upton were leaked online. Wallace, who was 25 when her private photos were hacked, watched in the ensuing years as deepfake pornography was made from her photos. “It’s several layers of different types of abuse,” Wallace said, “with the deepfake aspect of it after the whole initial hack and leak. There’s just so much [fake content] now that I don’t even know what’s what.”

Wallace is now affiliated with the Sexual Violence Prevention Association (SVPA), an organization that uses “advocacy, education, and community engagement” to “create a world where everyone can live free from the threat of sexual violence.” In an open letter, SVPA is calling on Congress to ban deepfake porn. “Right now, there are no [federal] laws banning the creation or distribution of deepfake porn,” the letter reads. “Until there are consequences, deepfake pornography will continue to increase.”

Omny Miranda Martone, the Founder & CEO of SVPA, said the organization is committed to helping pass federal legislation against deepfake pornography and educating people on why it’s so harmful. “People are like, well, why do [victims] even care? It’s not real anyways. It’s not actually them,” Martone said. “I don’t think a lot of people fully understand the consent piece of this – that you don’t have the person’s consent and this is a violation of autonomy and privacy.”

As the use of artificial intelligence and deepfake technology becomes even more common, there is an increasing need for protections to be set through bills like the Preventing Deepfakes of Intimate Images Act, which was introduced by Rep. Joseph Morelle (D–N.Y.). “As artificial intelligence continues to evolve and permeate our society, it’s critical that we take proactive steps to combat the spread of disinformation and protect individuals from compromising situations online,” Morelle said. As of publication, the bill has not advanced through the House of Representatives.
