White House urges tech industry to shut down marketplace for sexually abusive AI deepfakes

By MATT O'BRIEN and BARBARA ORTUTAY (AP Technology Writers)

President Joe Biden's administration is pushing the technology industry and financial institutions to shut down a growing marketplace for sexually abusive images created using artificial intelligence.

New generative AI tools have made it easy to transform someone's likeness into a sexually explicit AI deepfake and share those realistic images across chat rooms or social media. The victims, whether celebrities or children, have little chance of stopping it.

The White House on Thursday is calling on companies to cooperate voluntarily in the absence of federal legislation. By committing to a set of concrete actions, officials hope the private sector can curb the creation, distribution and monetization of such non-consensual AI images, including explicit images of children.

“When generative AI came on the scene, everyone was speculating about where the first real dangers would emerge. And I think we have the answer,” said Biden's chief science adviser Arati Prabhakar, director of the White House Office of Science and Technology Policy.

She told The Associated Press that AI tools have driven a “phenomenal increase” in the distribution of non-consensual intimate images, largely targeting women and girls in ways that can upend their lives.

“Whether you're a teenager, whether you're a gay kid, these are the issues that people are facing right now,” she said. “We've seen an acceleration because generative AI is evolving very quickly. And the fastest thing that can happen is for companies to step up and take responsibility.”

A document seen by the AP ahead of its release Thursday calls for action not only from AI developers but also from payment processors, financial institutions, cloud computing providers, search engines and the gatekeepers — namely Apple and Google — that control what makes it into mobile app stores.

The private sector must step up to stop the “monetization” of image-based sexual abuse, particularly by restricting payment access for websites that promote explicit images of minors, the government said.

Prabhakar said many payment platforms and financial institutions have already said they will not support companies that distribute abusive images.

“But sometimes it's not enforced; sometimes those terms of service don't exist,” she said. “And that's an example of something that could be enforced much more strictly.”

Providers of cloud services and mobile app stores could also “restrict web services and mobile applications that are marketed with the aim of creating or modifying sexual images without the consent of the persons concerned,” the document says.

And whether an image was generated by artificial intelligence or a real nude photo was posted online, survivors should be able to get online platforms to remove it more easily.

The most famous victim of pornographic deepfake images is Taylor Swift, whose ardent fan base hit back in January when abusive AI-generated images of the singer-songwriter began circulating on social media. Microsoft vowed to strengthen its safeguards after some of the Swift images were traced back to its AI visual design tool.

Last summer, the Biden administration brokered voluntary commitments from Amazon, Google, Meta, Microsoft and other major technology companies to equip new AI systems with a series of safeguards before releasing them to the public.

Biden then signed an ambitious executive order in October that aims to guide the development of AI so that companies can profit from it without compromising public safety. While the focus was on broader AI issues, including national security, it also highlighted the emerging problem of AI-generated child abuse imagery and the search for better ways to detect it.

However, Biden also said the government's AI safeguards would need to be backed by legislation. A bipartisan group of U.S. senators is now urging Congress to spend at least $32 billion over the next three years to develop artificial intelligence and fund measures to guide it safely, though it has largely put off calls to enshrine those safeguards into law.

Encouraging companies to get involved and make voluntary commitments “does not change the fundamental need for Congress to take action here,” said Jennifer Klein, director of the White House Gender Policy Council.

Longstanding laws already criminalize the production and possession of sexual images of children, even if they are fake. Federal prosecutors filed charges earlier this month against a Wisconsin man who allegedly used a popular AI image generator, Stable Diffusion, to create thousands of realistic AI-generated images of minors engaged in sexual acts. A lawyer for the man declined to comment after his arraignment on Wednesday.

But there is little oversight of the technical tools and services that make it possible to create such images. Some are hosted on murky commercial websites that reveal little about who runs them or the technology they are based on.

The Stanford Internet Observatory said in December that it had found thousands of images of suspected child sexual abuse in the massive AI database LAION, an index of online images and captions used to train leading AI image generators such as Stable Diffusion.

London-based Stability AI, which owns the latest versions of Stable Diffusion, said this week that it had “not approved” the release of the earlier model the Wisconsin man allegedly used. Such open-source models are difficult to put back in the bottle because their technical components are publicly available on the internet.

Prabhakar said it is not just open-source AI technology that is causing harm.

“It's a broader problem,” she said. “Unfortunately, a lot of people in that category seem to be using image generators. And we've just seen such an explosion in that space. But I don't think it's neatly divided into open source and proprietary systems.”
