ChatGPT has a style-over-substance trick that seems to dupe people into thinking it’s smart, researchers found.

Developers often prefer ChatGPT’s responses about code to those submitted by humans, despite the bot frequently being wrong.

It’s like crypto, or really any other con job.

It makes idiots feel smart.

Make a mark feel like they’re smart, and they’ll become attached to the idea and defend it to the death, because the alternative is admitting they aren’t really smart and fell for a scam.

When smart people try to explain that to the idiots, it just makes them defend the scam even harder.

Try to tell people ChatGPT isn’t great, and they just ramble on about some nonsensical stuff they don’t even understand themselves, then claim anyone who disagrees just isn’t smart enough to get it.

It’s a great business plan if you have zero morals, which is why the method never really goes away, just moves to another product.

I have seen someone type “tell me how make a million dollar business” into ChatGPT. Of course that’s not going to work. But LLMs have immediate, obvious value that crypto does not, and I think making the comparison reveals a lack of experience with those useful applications. I’m using ChatGPT nearly every day as a tool to help with coding. It’s not a replacement for a person, but it is like giving a person a forklift.

@joe@lemmy.world

A caveat: this user analysis involved just 12 programmers, who were asked whether they preferred ChatGPT’s responses or the human-written Stack Overflow answers to 2,000 randomly sampled questions.

Nothing to see here.

What they should have done is ask those same 12 programmers to post a common everyday question on Stack Overflow and then, while waiting for a response, put the same question to ChatGPT.

I’d bet 50 bucks almost all of them would get an acceptable answer to their question out of ChatGPT 4 in far less time than it takes the moderators at Stack Overflow to delete the question. I can’t imagine any of the questions would actually be answered on SO.

@joe@lemmy.world

Right. The problem with SO is that you don’t actually get to ask any questions, so reason would suggest anything is at least as good as SO, even asking a house plant, or Siri, or whatever. Something that actually answers your question would obviously be a better option.

Stack Overflow brought their irrelevance on themselves, I suspect.

@daellat@lemmy.world

Certainly it’s gotten worse, as we’ve probably all seen in the news. When GPT-4 came to the API it was impressive at times. One caveat always remained: don’t blindly trust it, but that goes for Stack Overflow replies too.
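
For anyone who hasn’t touched it: “asking GPT-4 through the API” is basically one chat-completions call. A minimal sketch, assuming the official openai Python package (v1.x) and an OPENAI_API_KEY in the environment; the model name and the question are placeholders, not anything from the study above.

```python
# Minimal sketch: ask a coding question through the chat-completions API.
# Assumes `pip install openai` (v1.x) and OPENAI_API_KEY set in the environment.
from openai import OpenAI

client = OpenAI()  # picks up OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-4",  # placeholder model name
    messages=[
        {"role": "system", "content": "You are a concise coding assistant."},
        {"role": "user", "content": "How do I reverse a linked list in Python?"},
    ],
)

# The answer comes back as plain text; read it the way you'd read an SO reply,
# i.e. don't trust it blindly.
print(response.choices[0].message.content)
```

Same rule as with Stack Overflow answers applies: read the output before you paste it anywhere.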

Ohh cool, a downvote and smug reply. Go back to reddit or something.

Lol https://mastodon.social/@rodhilton/110894818243613681

@abhibeckert@lemmy.world

I’ve seen that in the news, but I haven’t experienced it at all. In fact, I’m getting far better results now than I ever did before, though I suspect that’s mostly on me: experience using almost any tool will improve the output.
