ChatGPT web traffic falls 10%, analytics show
www.theregister.com
Slow June, people voting with their feet amid this AI craze, or something else?

froggers

I still use it sometimes, but ohhh boy it can be a wreck. Like I’ve started using the Creation Kit for Bethesda games, and you can bet your ass that anything you ask it, you’ll have to ask again. Countless times it’s a back-and-forth of:

Me: Hey ChatGPT, how can I do this or where is this feature?

ChatGPT: Here is something that is either not relevant or just does not exist in the CK.

Me: Hey that’s not right.

ChatGPT: Oh sorry, here’s the thing you are looking for. (And then it’s still a 50-50 chance of it being real or fake.)

Now I realize that the Creation Kit is kinda niche, and the info on it can be a pain to look up, but it’s still annoying to wade through all the shit that it’s throwing in my direction.

With things that are a lot more popular, it’s a lot better, though (still not as good as some people want everyone to believe).

I’ve been building a tool that uses ChatGPT behind the scenes and have found that that’s just part of the process of building a prompt and getting the results you want. It also depends on which chat model is being used. If you’re super vague, it’s going to give you rubbish every time. If you go back and forth with it though, you can keep whittling it down to give you better material. If you’re generating content, you can even tell it what format and structure to give the information back in (I learned how to make it give me JSON and markdown only).

Additionally, you can give ChatGPT a description of what its role is alongside the prompt, if you’re using the API and have control of that kind of thing. I’ve found that can help shape the responses up nicely right out of the box.
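For anyone curious what that looks like in practice, here’s a minimal sketch assuming the official OpenAI Python SDK; the model name, role text, and prompts are placeholders, not anything specific from this thread:

```python
# The system message carries the "role description"; the user message asks
# for a specific output format. Assumes OPENAI_API_KEY is set in the env.
from openai import OpenAI

client = OpenAI()

response = client.chat.completions.create(
    model="gpt-3.5-turbo",  # whichever chat model you're using
    messages=[
        {
            "role": "system",
            "content": "You are a release-notes writer. Respond in Markdown only, no preamble.",
        },
        {
            "role": "user",
            "content": "Turn these commit messages into a bulleted changelog: ...",
        },
    ],
)

print(response.choices[0].message.content)
```

The same system slot is where you can pin down format and structure (JSON only, Markdown only, a fixed schema), which cuts down on the back-and-forth whittling.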

ChatGPT is very, very much a “your mileage may vary” tool. It needs to be set up well at the start, but so many companies have haphazardly jumped on using it and they haven’t put in enough work prepping it.

@80085@lemmy.world

What method did you use to generate only JSON? I’m using it (gpt-3.5-turbo) in a prototype application, and even with giving it an example (one-shot prompting) and telling it to only output JSON, it sometimes gives me invalid results. I’ve read that the new function-calling feature is still not guaranteed to produce valid JSON. Microsoft’s “guidance” (https://github.com/microsoft/guidance) looks like what I need, but I haven’t got around to trying it yet.
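One common stopgap (not something from this thread, just a pattern worth sketching) is to validate the reply yourself and re-ask on failure. A rough sketch, assuming the official OpenAI Python SDK; the helper name and prompts are hypothetical:

```python
# Validate-and-retry loop for JSON output. This doesn't guarantee valid JSON
# either; it just narrows the failure window.
import json
from openai import OpenAI

client = OpenAI()

SYSTEM = "Reply with a single JSON object only. No prose, no code fences."

def ask_for_json(user_prompt: str, max_retries: int = 3) -> dict:
    messages = [
        {"role": "system", "content": SYSTEM},
        {"role": "user", "content": user_prompt},
    ]
    for _ in range(max_retries):
        reply = client.chat.completions.create(
            model="gpt-3.5-turbo",
            temperature=0,  # fewer formatting surprises
            messages=messages,
        ).choices[0].message.content
        try:
            return json.loads(reply)
        except json.JSONDecodeError:
            # Feed the bad reply back and ask for a correction.
            messages.append({"role": "assistant", "content": reply})
            messages.append({
                "role": "user",
                "content": "That was not valid JSON. Reply again with JSON only.",
            })
    raise ValueError("model never produced valid JSON")
```

Constrained-generation tools like guidance take a different approach, templating the output so the model only fills in the blanks, which is how they get much closer to guaranteed structure.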

I use it for work from time to time, mostly when I have issues with HTML/CSS or some quick bash scripts. I’d probably miss Copilot more. It saves a lot of time with code suggestions.

@zeppo@lemmy.world

I love Stable Diffusion but I really have no use for ChatGPT. I’m amazed at how good the output can be… I just don’t have a need to generate text like that. Also, OpenAI has been making it steadily worse with ‘safety’ restrictions. I find it super annoying and even insulting when Bing-Sydney goes “THIS CONVERSATION IS OVER”. It’s like being chastised by Facebook or Twitter for being ‘violent’ when you made a joke.

The ability to generate photographs and illustrations of practically anything, though, is fantastic. My girlfriend has been flagellating me into creating a bunch of really useless crap to promote her business on social media using SD, and I actually enjoy that part. I’ve made thousands of photos of scenery.

pngn

I’m not really surprised at all. A lot of people I know wouldn’t stop talking about it for the grand total of maybe 2 weeks, but then it all went quiet. In fairness, this is a sample of people who are all non-tech people, so I think a lot of it is just the fact they probably forgot the name of it or how to turn their computer on (definitely the case for some).

I have a number of language models running locally. I am really liking the gpt4all install with the Hermes model. So in my case I used ChatGPT right up until I had one I could keep private.
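If anyone wants a starting point, here’s a minimal sketch using the gpt4all Python bindings; the exact model filename is a guess at whatever Hermes build GPT4All’s model browser lists, and it gets downloaded (several GB) on first run:

```python
# Minimal local-inference sketch using the gpt4all Python bindings
# (pip install gpt4all). Model filename is an assumption; pick any model
# from GPT4All's catalog. Runs fully offline after the initial download.
from gpt4all import GPT4All

model = GPT4All("nous-hermes-llama2-13b.Q4_0.gguf")

reply = model.generate(
    "Explain what a federated social network is, in two sentences.",
    max_tokens=200,
)
print(reply)
```

The GPT4All desktop app does the same thing with a GUI and no Python, which is probably the easier answer to the “how do I get started” question below.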

How do I get started with the models you are mentioning?

On that note, what would people recommend for a locally hosted (I have a graphics card) ChatGPT-like LLM that is open source and doesn’t require a lot of other things to install?

(Just one command-line installation! That is, if you have pip, pip3, Python, PyTorch, CUDA, conda, Jupyter notebooks, Microsoft Visual Studio, C++, a Linux partition, and Docker. Other than that, it’s just a one-line installation!)

I looked into this too and it’s pretty resource heavy. I actually had a really good conversation with ChatGPT about making a separate instance of itself locally. It’s worth talking to it about that and some of the price options.

I use it now and again but I couldn’t imagine paying $20+ a month for it.

@dep@lemmy.world

It was in the major TV news cycle for weeks but now it’s back to normal levels I’d say. Curious onlookers without a real need have moved on.

AwkwardLookMonkeyPuppet

It’s not a craze. ChatGPT is going to change 80% of the jobs on the planet, and most people don’t even know what it is.

@Magiwarriorx@lemmy.world

I still use free GPT-3 as a sort of high-level search engine, but lately I’m far more interested in local models. I haven’t used them for much beyond SillyTavern chatbots yet, but some aren’t terribly far off from GPT-3 from what I’ve seen (EDIT: though the models are much smaller at 13bn to 33bn parameters, vs GPT-3’s 175bn parameters). Responses are faster on my hardware than on OpenAI’s website and it’s far less restrictive, with no “as a large language model…” warnings. Definitely more interesting than sanitized corporate models.

The hardware requirements are pretty high, around 24GB of VRAM to run 13bn-parameter models at 8k context, but unless you plan on using it for hundreds of hours you can rent a RunPod instance or something for cheaper than a used 3090.
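Rough back-of-envelope numbers behind that figure, assuming a LLaMA-style 13B model with 4-bit quantized weights and an fp16 KV cache (all assumptions for illustration, not from the comment above):

```python
# Back-of-envelope VRAM estimate for a LLaMA-style 13B model at 8k context.
PARAMS     = 13e9   # model parameters
LAYERS     = 40     # transformer layers in a typical 13B LLaMA-style model
HIDDEN     = 5120   # hidden size
CONTEXT    = 8192   # tokens of context
BYTES_FP16 = 2
BYTES_4BIT = 0.5

weights_fp16 = PARAMS * BYTES_FP16 / 1e9   # ~26 GB: doesn't fit in 24 GB
weights_4bit = PARAMS * BYTES_4BIT / 1e9   # ~6.5 GB after 4-bit quantization

# KV cache: keys + values for every layer, stored in fp16
kv_cache = 2 * LAYERS * CONTEXT * HIDDEN * BYTES_FP16 / 1e9   # ~6.7 GB

print(f"fp16 weights:  {weights_fp16:.1f} GB")
print(f"4-bit weights: {weights_4bit:.1f} GB")
print(f"8k KV cache:   {kv_cache:.1f} GB")
```

Quantized weights plus the KV cache plus activation and framework overhead is why a 24GB card like a 3090 ends up being the comfortable floor for 13B at long context.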

What exact ones are you using and how can I use them?

@Magiwarriorx@lemmy.world

This vid goes over it in better detail than I can.

@simple@lemmy.world

Personally I’ve abandoned ChatGPT in favor of Claude. It’s much more reliable.

@eleitl@lemmy.world

The recent changes made it faster but near useless for coding.

@80085@lemmy.world

GPT-4 is quite a bit better, but the subscription is expensive. I subscribe because I think it saves me quite a bit of time. I use it almost every day for things like refactoring (shorter) blocks of code, “translating” code into different languages or frameworks, or just for generating examples for completing tasks using frameworks or libraries I’m unfamiliar with.

I tried it a few times with poor results; it will eventually get better, I guess.

@JshKlsn@lemmy.ml

I stopped using it when they made it paid, something like $25 CAD per month.

Then they released a “free” version with a waitlist, which always seemed full.

Have they changed it back since? I just kinda stopped caring once I couldn’t access it when I needed to. And $25 CAD is crazy!
