
Deepfakes of Taylor Swift show that the dark side of AI everybody widely expected has arrived

When generative AI tools first launched, many on social media and in our community pointed out how the software was likely to go very badly wrong. The latest entry in the "we told you so" column is here in all its glory.

According to reports (and yes, we’ve seen the images but aren’t going to publish them out of respect for Ms Swift), somebody took it upon themselves to use AI software to create sexually explicit images of Taylor Swift — one of which depicts her undressed and surrounded by numerous men in what appears to be a locker room (presumably a reference to her romance with Travis Kelce).

The development is the latest in a long string of scandals for the popular software. Although the software has some extraordinary capabilities that could benefit us all, the dark sides of AI are emerging for everyone to see. This incident joins last week’s discovery of a deepfaked Joe Biden robocall in New Hampshire that tried to falsely convince people not to vote.

It is unclear whether the United States has any formal regulation to prevent this. President Joe Biden signed an executive order on AI last year that addresses the creation of these kinds of images when they depict real people.
