Social Question


(NSFW) Is using deep fakes for porn a good thing? (NSFW)

Asked by JLeslie (65420 points) 3 months ago

I heard something about using deep fakes for porn mentioned on TV today. I don’t really know what specifically they were referring to, but it seemed to me that if we can computer-generate sex scenes, we don’t have to use and abuse real women for them. I guess porn stars might say it cuts into their ability to make money, but I think it doesn’t have to. They could use the actress’s image and pay her if she is famous in the porn world.


8 Answers

Forever_Free

This topic has been around for a while.

There are currently no federal laws in the United States against the creation and sharing of non-consensual deepfake pornography. Some are determined to change this, calling for a federal law that makes non-consensual deepfake porn illegal, and for changes to Section 230 of the Communications Decency Act, which shields online platforms from liability over user-generated content. This is the online landscape that has allowed the creation and trading of non-consensual deepfake pornography to develop into a thriving business.

ragingloli

Right now, deep fakes in porn are used to replace the faces of porn actors with the faces of non-porn celebrities without their consent. You take an existing porno and, for example, put the face of Scarlett Johansson on the porn actress.
And that is not a positive thing.

JLeslie

@ragingloli Of course I am completely against that.

Blackberry

It’s a bad situation. Basement dwellers can use a famous person’s face, or anyone’s, to put them in an existing porn.

So the classic scenario that you will see, 100%, is loser guys making porn with the face of any woman who doesn’t like them. You’ll see deep fakes of people’s moms used for bullying kids in school, etc.

Blackwater_Park

IMO, this will pass. It will become so easy, so common, and so petty that it will fall into the noise floor of the internet and nobody will care. At that point, real footage will be indistinguishable from the fakes and won’t be believable either.

cookieman

The latest target of this, as @ragingloli explained, is Taylor Swift. Seemingly because she is “ruining” football — at least according to a certain subset of insecure man-babies.

jca2

The news was saying that the problem is that it’s almost impossible to find out the origin of the fake images, so nobody will be punished.

The news also said this is why you should put your social media settings on “private.”

I think this is the tip of the iceberg as far as what we’re going to see generated, falsely, of celebrities and even regular people. The news clip that was talking about Taylor Swift also talked about some high school girls from a local town (not sure if it was NJ or Long Island); their faces were put on someone else’s bodies and videos were made.

cookieman

@jca2: These A.I. tools are easy to access. The Taylor Swifts of the world, as horrible as it is, can handle themselves. I worry about the new level of bullying and harassment against kids and teens that this will bring.
