Incredible: White House, SAG-AFTRA respond as Taylor Swift ‘considers legal action’ over sexually explicit AI images…

The images, made using AI, circulated on X on Wednesday January 24; it took the company over 17 hours to remove them.

“We are alarmed by the reports of the circulation of images that you just laid out, false images to be more exact, and so while social media companies make their own independent decisions about content management, we believe they have an important role to play in enforcing their own rules to prevent the spread of misinformation and non-consensual intimate imagery of real people,” press secretary Karine Jean-Pierre said in a press conference on Friday January 26.

“Sadly though, too often, we know that lack of enforcement disproportionately impacts women, and it also impacts girls, who are the overwhelming targets of online harassment and abuse.”

She added that the issue is “one that the Biden-Harris Administration has been prioritizing since day one”, and continued: “Of course Congress should take legislative action. That’s how you deal with some of these issues.”

New York Congressman Joe Morelle has already drafted legislation – the Preventing Deepfakes of Intimate Images Act – that he hopes to push through Congress.

SAG-AFTRA, the actors' union of which Taylor, 34, is a member, also shared a statement calling on technology companies to “stop exploitation of this nature”.

“The sexually explicit, A.I.-generated images depicting Taylor Swift are upsetting, harmful, and deeply concerning,” the union said in a statement. “The development and dissemination of fake images — especially those of a lewd nature — without someone’s consent must be made illegal. As a society, we have it in our power to control these technologies, but we must act now before it is too late.”

“We support Taylor and women everywhere who are the victims of this kind of theft of their privacy and right to autonomy,” the statement concluded.

Taylor has not commented publicly, but reports have suggested that she is considering taking legal action. The term “deepfake” describes a video or image of a person that has been digitally altered so that they appear to be someone else, or to be doing something other than what the original video or image showed.