- cross-posted to:
- politics@lemmy.world
In 2023, more deepfake abuse videos were shared than in every other year in history combined, according to an analysis by independent researcher Genevieve Oh. What used to take skillful, tech-savvy experts hours to Photoshop can now be whipped up at a moment’s notice with the help of an app. Some deepfake websites even offer tutorials on how to create AI pornography.
What happens if we don’t get this under control? It will further blur the lines between what’s real and what’s not — as politics become more and more polarized. What will happen when voters can’t separate truth from lies? And what are the stakes? As we get closer to the presidential election, democracy itself could be at risk. And, as Ocasio-Cortez points out in our conversation, it’s about much more than imaginary images.
“It’s so important to me that people understand that this is not just a form of interpersonal violence, it’s not just about the harm that’s done to the victim,” she says about nonconsensual deepfake porn. She puts down her spoon and leans forward. “Because this technology threatens to do it at scale — this is about class subjugation. It’s a subjugation of entire people. And then when you do intersect that with abortion, when you do intersect that with debates over bodily autonomy, when you are able to actively subjugate all women in society on a scale of millions, at once digitally, it’s a direct connection [with] taking their rights away.”
It’s disappointing that AOC supports this capitalist law. This law is not against harassment. The DEFIANCE Act creates a new kind of intellectual property.
Friendly reminder that we’ve had Photoshop for decades. Legislation can’t keep up with technology, and trying to make it do so will almost always come at the cost of constitutional rights, like freedom of expression.
If I want to photoshop a dick on Trump’s face, nobody should be allowed to tell me no. It’s not fucking “interpersonal violence”.
Photoshopping a dick onto Trump’s face is 100% protected expression. Producing a photoreal deepfake of him balls deep in Lindsey Graham’s ass, while Mitch McConnell can be seen in a mirror holding the camera and wearing a ballgag and cuck strap, then posting it online either without context or trying to pass it off as real, is a problem.
Oddly specific. Donnie, is that you, trying to get ahead of a video that’s about to leak?
About to leak? Somebody hasn’t been paying attention to the news.
The problem isn’t just scale, it’s ease of use. Photoshop took time and skill, and it was usually still pretty apparent that a photo had been manipulated. On top of that, the original elements could often be traced back to their source photos by anyone willing to search enough to debunk it. Now AI gives near-flawless photo-manipulation skills to every single person, infinitely upping the likelihood that a complete fabrication made of unique elements, untraceable to any original photo, can cause serious harm.
Remember the pope in the puffy jacket photo? It had telltale signs of AI generation, but it still fooled an insane number of people. Now make the photo abusive and, with a small amount of work, erase the AI flaws. Then release it at an opportune time for the bad actor (I would bet a lot that we’ll see some of this as the election nears; a truly groundbreaking “October surprise”). What’s that old saying? “A lie will travel halfway around the world before the truth has a chance to pull its boots on.”
You’re right that it’s hard for legislation to keep up with technology. But that’s because technology companies are insanely rich and can lobby endlessly, and we have corrupt-as-fuck legislators. We could keep up with technology, but the system is broken in favor of those who want zero oversight, and it breaks further every time one of them is successful. Regulating massive companies to hobble their ability to cause lasting damage should not be mentioned alongside terms like “freedom of expression.” Yes, the power to create these images is technically in the hands of the people feeding the AI the prompt, but restricting a company’s ability to hand dangerous tools to anyone and everyone isn’t the same thing as restricting people’s right to create. I think that’s a dangerous way of thinking.