OpenAI: 'We also don’t allow GPTs dedicated to fostering romantic companionship'

ChatGPT’s new AI store is struggling to keep a lid on all the AI girlfriends::OpenAI: ‘We also don’t allow GPTs dedicated to fostering romantic companionship’

Why? Let it happen.

There are zero issues with consent. It is a chatbot, not a living or sentient being.

There are zero issues with STDs or pregnancy scares.

No one is going to want this as a substitute for a real relationship, and if they do, you might be doing the world a favor by allowing them a way out of the dating pool.

Sure, it could be used for honeypots, but we have had those for pretty much all of history. People fake affection, or fake who they are, to get what they want.

I just don’t see how this is at all different from the forms of erotica we already have and that society has adapted to.

Ghostalmedia

The Fart or Flashlight app of this decade. What a time to be alive.

There was an app that could discern between a fart and flashlight? Man, unbelievable!

The Pantser

Not hotdog

Fartlight, the most popular app of 2009

Ghostalmedia

It sure beats playing Fart or Flashlight the old way!

@AllonzeeLV@lemmy.world

To be fair, it cares about you exactly as much as your OnlyFans crush.

Probably a cheaper obsession.

Her 2 (2024)?


danielfgom

It’s just a big money grab! Everyone is trying to get rich quick. Like with the App Store. Everyone is hoping their bot breaks into the big time and makes them rich.

This is what is terrible about society. Few are making bots that help people; instead they make bots that appeal to base desires. A race to the bottom, if you will… will man ever learn?

@_number8_@lemmy.world

why? why not let people just retreat into fantasy? it’s probably healthier than many common coping mechanisms. i mean, it’s a chatbot, how much can you do with it?

let people have their temporary salve to get them thru whatever they were going thru such that they were resorting to this. and if it’s not temporary, ok, fine? better to have some outlet than be even more mentally isolated. maybe in 50 years this will be common, who knows.

I am pretty sure it’s just to avoid controversy; look up the recent news about LAION for an example. GPT-4 isn’t just text anymore; it can generate images too.
Altman talked about how we may someday all have our own personal AIs tailored to our own needs and sensitivities. But almost everyone has a different idea of whether and where there should be a line.

If I have an AI tailored for me and my sensitivities, then it should have no filter; whatever filter it has should be defined and trained by me.

Someone else artificially trying to adjust my personality through AI to fit whatever arbitrary norms they believe it should have is cancer.

@webghost0101@sopuli.xyz

I am inclined to agree. I believe that once society is able to fill everyone’s needs, and everyone can summon any AI VR experience they want, crime will cease to exist; there would be nothing to gain from committing harm. But I fear that simulated role-play in the context of psychological torture or CSAM could make dangerous people more confident before we get to that post-scarcity point. Maybe you’d say ChatGPT isn’t realistic enough for that now, but it will be soon.

Training an LLM entirely by yourself with self-curated text is beyond what is feasible; most AI researchers today don’t even know what’s in all of the data they use. It’s more than you could read even over an extended lifetime, and at best you can fine-tune a standard base model.
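The scale point above can be made concrete with a back-of-envelope calculation. The corpus size, words-per-token ratio, and reading speed below are illustrative assumptions, not figures from the thread:

```python
# Rough check of the claim that a modern pretraining corpus is more text
# than one person could read in a lifetime.
# Assumed figures (illustrative only): a 10-trillion-token corpus,
# ~0.75 English words per token, and a brisk 300 words per minute.

TOKENS = 10e12
WORDS = TOKENS * 0.75
WORDS_PER_MINUTE = 300

minutes = WORDS / WORDS_PER_MINUTE
years = minutes / (60 * 24 * 365)

print(f"Nonstop reading time: {years:,.0f} years")  # tens of thousands of years
```

Even with generous reading speed and no sleep, auditing such a corpus by hand is off by several orders of magnitude, which is why curation happens with automated filters rather than human review.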

@devfuuu@lemmy.world

These kinds of things are not temporary. We know that humans can’t control themselves and aren’t rational enough to “just use it a bit”. It’s highly addictive and leads people to remove themselves from reality.

@Hyperlon@lemmy.world

So?

@RainfallSonata@lemmy.world

> let people have their

I’d be very interested to see the gender breakdown, here.

@cyd@lemmy.world

Liability. Imagine an AI girlfriend who slowly earns your affection, then at some point manipulates you into sending bitcoins to a prespecified wallet set up by the model maker. Because models are black boxes, there is no way to verify by direct inspection that an AI hasn’t been trained with an ulterior agenda (the “execute order 66” problem).

@Kittenstix@lemmy.world

Yep, I was having a conversation with a guy who informs policymakers on AI; he had given a whole presentation at a school board meeting I went to a few nights ago.

He said that’s his highest recommendation when it comes to what should be done on the lawmaker side, pass bills that push for opening up those black boxes so we can ensure transparency.

@cyd@lemmy.world

Problem is, there isn’t a way to open up the black boxes. It’s the AI explainability problem. Even if you have the model weights, you can’t predict what they will do without running the model, and you can’t definitively verify that the model was trained as the model maker claimed.

@Kittenstix@lemmy.world

I see, my knowledge is surface deep so I admit this is new information to me.

Is there no way to ensure LLMs are safe for like kids to use as a tool for education? Or is it just inherently going to come with some risk of exploitation and we just have to do our best to educate students of that danger?

Some guy in the UK was allegedly convinced by his chatbot girlfriend to assassinate Queen Elizabeth. He just got sentenced a few months ago. Of course he’s been determined to be psychotic, but I could imagine people who would qualify as sane getting too deep and reading too much into what an AI is saying.

I believe Futurama has a lesson on this

@TexasDrunk@lemmy.world

I knew I should’ve shown him Electro-Gonorrhea: The Noisy Killer

If you are against this, you are also against dildos.

…what

@paddirn@lemmy.world

I’d love to have an AI assistant/girlfriend like JOI from Bladerunner 2049, something I could jerk off to one minute, then have her prepare my taxes and order a pizza the next. However, these ChatGPT girlfriends all seem like they’re just subscription chatbots. Maybe some day we’ll get there and nerds will work up a local, open-source slutty AI girlfriend, but for now they’re all just crap.

God damn, now we have to hear about whether having a fucking AI chatbot is considered cheating.

Just give the people what they want

Herr Woland

Time to add “mass manipulation by AI girlfriend” to the list

Porn and connection with others, even virtual, are pretty much the drivers of adoption for all technology.
