IEEE Spectrum – A new report finds that deepfake software is still prohibitively hard to use, but that may not be true for much longer – “If you wanted to make a deepfake video right now, where would you start? Today, an Amsterdam-based startup has published an audit of all the online resources that exist to help you make your own deepfakes. And its authors say it’s a first step in the quest to fight the people doing so. In weeding through software repositories and deepfake tools, they found some unexpected trends—and verified a few things that experts have long suspected.

The deepfake apocalypse has been lurking just over the horizon since about 2017. That’s the year the term was coined by the eponymous Reddit user “u/deepfakes,” who used face-swapping technology to graft famous actresses into porn. Fears of other kinds of misuse quickly spread: In the two years since, U.S. congressional panels have convened to assess whether faked video scandals will undermine democracy. Apps have popped up that use deepfake methods to turn any photo of a fully clothed woman into a naked snapshot. There have been reports of fraud and identity theft abetted by deepfakes.

And yet, so far rumors have largely outpaced actual deployment. To be sure, nonconsensual pornographic videos are a scourge. But deepfakes have not spread beyond these boundaries. Politics is dogged more often by “cheapfakes” and “shallowfakes,” low-tech media manipulations best characterized by the infamous video of U.S. Speaker of the House Nancy Pelosi slowed down to make her look unwell or drunk. “The apocalypse is a little overhyped in my view,” says Siddharth Garg, a researcher at New York University.
So—is a true crisis coming, and if so, when? The new report, issued today by the counter-deepfake firm Deeptrace, based in Amsterdam, tries to answer those questions. Deeptrace is one of several new startups and academic groups that aim to build “deepfake detectors” to guard against the coming storm of AI-generated content. But the company’s employees realized they first needed to better understand what exactly is out there. So they conducted the first major audit of the deepfake environment on the open web…”