Google won’t downrank top deepfake porn sites unless victims mass report
Today, Google announced new measures to combat the rapidly increasing spread of AI-generated non-consensual explicit deepfakes in its search results.
Citing “a concerning increase in generated images and videos that portray people in sexually explicit contexts, distributed on the web without their consent,” Google said it consulted with “experts and victim-survivors” to make “significant updates” to its widely used search engine to “further protect people.”
Specifically, Google made it easier for targets of fake explicit images, who experts say are overwhelmingly women, to report and remove deepfakes that surface in search results. Additionally, Google took steps to downrank explicit deepfakes “to keep this type of content from appearing high up in Search results,” the world’s leading search engine said.