You don't need to go to the dark web or be particularly computer savvy to find deepfake porn. As NBC News found, two of the largest websites that host this content are easily accessible through Google. The creators of these websites use the online chat platform Discord to advertise their wares, and people can pay with Visa and Mastercard. Business is booming so much that "two popular deepfake creators are advertising for paid positions to help them create content".

It comes as no surprise that it's women who are largely affected by the rise of deepfake porn. A 2019 report by Sensity, a company that detects and monitors deepfakes, found that 96% of deepfakes were nonconsensual sexual deepfakes, and of those, 99% featured women. Increasingly, the women targeted aren't just celebrities – you can get deepfake porn made to order, featuring anyone you like. "A creator offered on Discord to make a five-minute deepfake of a 'personal girl,' meaning anyone with fewer than 2 million Instagram followers, for $65," NBC reports.

The rise of deepfake porn – which has been facilitated by major companies – is already having a massive impact on ordinary people's lives. Earlier this year a Twitch streamer called Brandon "Atrioc" Ewing admitted to buying and watching deepfake porn of his female colleagues. He only admitted to this, by the way, because he slipped up in a live stream and accidentally showed browser windows open to a website of someone who made deepfakes of streamers. After he was caught out, he issued a pretty terrible apology video (which featured his wife crying in the background) and claims to have wired a Los Angeles-based law firm "about $60,000" to cover any woman on Twitch who wanted to use their legal services to get those images and videos removed from websites.

One of the women allegedly targeted by the deepfake porn, streamer QTCinderella, spoke out about the toll it had taken on her mental health. "This is what it looks like to feel violated, this is what it looks like to feel taken advantage of," she said in a 30 January live stream. "This is what it looks like to see yourself naked against your will being spread all over the internet. It should not be part of my job to have to pay money to get this stuff taken down. It should not be part of my job to be harassed, to see pictures of me 'nude' spread around."

And yet, increasingly, this is the reality of being a woman who is in the public eye even the smallest bit: you have to balance your work with the exhausting and sometimes expensive work of fighting back against nonstop harassment. The rise of AI-generated imagery has taken that harassment to hideous new heights. This isn't just porn – it's terrorism, meant to punish and silence women.

Writing about all this is tricky because, depressingly, any sort of coverage risks directing more people to deepfake sites. After the Twitch controversy, for example, Google searches for deepfake porn boomed. At this point, however, there's no putting the deepfake genie back in the bottle. What we desperately need is for lawmakers and technology companies to start seeing the problem of nonconsensual deepfake porn as the emergency it is, and to hold its creators and facilitators to account.