With January punctuated by grim deepfake images of Taylor Swift and George Carlin, the Screen Actors Guild says such imagery "must be made illegal".
The mixture of fascination, hype and unease that surrounded artificial intelligence in 2023 has only escalated as we've tottered through the first month of 2024.
Earlier in January, a purportedly AI-generated video of the late George Carlin delivering a stand-up routine emerged, making headlines and prompting an understandably damning response from the comedian's daughter, Kelly Carlin.
In more recent days, deepfake pornographic images of Taylor Swift began doing the rounds on Twitter/X; they circulated for several hours before finally being removed. At the time of writing, the social media company has even taken the drastic step of blocking all searches of Swift's name.
That such a high-profile and admired performer would have her image used in this way appears to have galvanised opinion over the misuse of generative AI, with the White House, Microsoft and the Screen Actors Guild (SAG-AFTRA) all issuing their responses.
SAG-AFTRA weighed in first, on 26 January, with a statement describing the Swift images as "upsetting, harmful and deeply concerning" and arguing that generating them should be made illegal:
The sexually explicit, A.I.-generated images depicting Taylor Swift are upsetting, harmful, and deeply concerning. The development and dissemination of fake images — especially those of a lewd nature — without someone’s consent must be made illegal. As a society, we have it in our power to control these technologies, but we must act now before it is too late. SAG-AFTRA continues to support legislation by U.S. Rep. Joe Morelle, the Preventing Deepfakes of Intimate Images Act, to make sure we stop exploitation of this nature from happening again.
The Guild also addressed the George Carlin video, with the key passage perhaps being this one:
Companies or individuals attempting to replicate an individual’s creative work for profit, publicity or clout are infringing upon intellectual property rights and devaluing the essence of human artistry. We must remember that everything generated by A.I. originates from a human creative source and human creativity must be protected.
A White House spokesperson, meanwhile (as reported by VentureBeat), said the administration was "alarmed" by the Swift images, adding that while "legislative action" needed to be taken, "social media companies […] have an important role to play in enforcing their own rules to prevent the spread of misinformation, and non-consensual, intimate imagery of real people."
Microsoft, which has a big stake in the AI bubble through its investment in OpenAI as well as its own Copilot chatbot tech, has also condemned the Swift images. CEO Satya Nadella, speaking to NBC Nightly News, said that there needed to be more "guardrails" in place, and that "law enforcement and tech platforms" should work together to stamp out the creation of fake images.
(Somewhat embarrassingly for Microsoft, The Verge reports that those Swift images may have been generated using the company's own Microsoft Designer image creation software.)
In recent days, the George Carlin video has also been removed from the web after the late comedian's estate filed a copyright infringement lawsuit against the company that made it. Interestingly, that company, Dudesy, has since backpedalled on its initial claim that the bogus stand-up special was created by AI; "The YouTube video 'I'm Glad I'm Dead' was completely written by Chad Kultgen," a spokesperson told Ars Technica, referring to the company's co-founder.
If there's a silver lining to these high-profile misuses of generative AI, it's that they may provide the catalyst for legislation that will at least help to limit the harm such imagery can do to the wider public. Campaigners and individuals have repeatedly pointed out the damage the spread of deepfake imagery can do; a documentary on the subject, Sophie Compton and Reuben Hamlyn's Another Body, was screened to much acclaim at the South by Southwest festival in 2023.