The creation of sexually explicit deepfake content is likely to become a criminal offense in England and Wales as concern grows over the use of artificial intelligence to exploit and harass women.
I just imagine someone showing up to my work and presenting that contract and next thing you know I’m stuck in the dryer with only my stepson Esteban to help me…
You’re not the first to think of it, and it’s where this whole idea will fall flat on its face.
There’s just no way to actually check whether the subject of a photo consented to having it taken. That was difficult enough with physical cameras, and it’s far harder now that no camera is involved in generating the image.
I mean, if I were to post an image here in this comment, how could the Fediverse possibly verify that I have the right to post it?
Step one… create consent deepfake…
I don’t like that I thought of it, but it pains me to say it will be used as a defense at some point.