Companies are going all-in on artificial intelligence right now, investing millions or even billions into the area while slapping the AI initialism on their products, even when doing so seems strange and pointless.
Heavy investment and increasingly powerful hardware tend to mean more expensive products. To find out whether people would be willing to pay extra for hardware with AI capabilities, TechPowerUp put the question to its forum users.
The results show that over 22,000 people, a massive 84% of the overall vote, said no, they would not pay more. More than 2,200 participants said they didn’t know, while just under 2,000 voters said yes.
They’ll pay for it. When the tech companies decide it’s a thing to make money off and advertise, all the good ants will buy, buy, buy, and the rest of the time they will work, work, work for it.
They want you to buy the hardware and pay the additional energy costs so they can deliver Clippy 2.0, the watching-you-wank edition.
I wouldn’t even pay less.
AI in Movies: “The only Logical solution, is the complete control/eradication of humanity.”
AI in Real Life: “Dave, I see you only have beer, soda, and cheese in your fridge. I am concerned for your health. I can write you a reminder to purchase better food choices.” Dave: “THESE AI ARE EVIL, I WILL NEVER SUBMIT TO YOUR POWER!”
Please drink verification can.
Yeah, this is probably more likely. It’s just so depressing.
And what do the companies take away from this? “Cool, we just won’t leave you any other options.”
Plenty of companies offer sane, normal solutions and make bank in the process.
84% said no.
16% punched the person asking them for suggesting such a practice. So they also said no. With their fist.
Most people already have pretty decent AI hardware in the form of a GPU.
Sure, dedicated hardware might be more efficient for mobile devices, but that’s already done better in the cloud.
The Google Coral TPU has been around for years and it’s cheap. It works well for object detection.
https://docs.frigate.video
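For anyone curious, Frigate is configured through a YAML file, and pointing it at a Coral is a few lines. This is only a sketch of the detector section (cameras and everything else omitted; see docs.frigate.video for a full config):

```yaml
# Minimal Frigate detector section using a USB Coral.
# A PCIe/M.2 Coral would use a pci device path instead of "usb".
detectors:
  coral:
    type: edgetpu
    device: usb
```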
There’s a lot of use cases in manufacturing where you can do automated inspection of parts as they go by on a conveyor, or have a robot arm pick and place parts/boxes/pallets etc.
Those types of systems have been around for decades, but they can always be improved.
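The decades-old, pre-ML version of that kind of inspection often boils down to classical image processing. As a toy illustration (everything here is hypothetical: the function name, the frame, and the tolerance are made up), here is a threshold-and-area check in NumPy — binarize the camera frame and compare the part’s pixel area against a known-good reference:

```python
import numpy as np

def part_passes(frame, template_area, threshold=128, tolerance=0.05):
    # Classical inspection check: binarize the grayscale frame and
    # compare the bright-pixel count against a known-good part's area.
    mask = frame > threshold
    area = int(mask.sum())
    return abs(area - template_area) / template_area <= tolerance

# Hypothetical 8x8 "camera frame": a 4x4 bright part on a dark belt.
frame = np.zeros((8, 8), dtype=np.uint8)
frame[2:6, 2:6] = 200
print(part_passes(frame, template_area=16))  # complete part -> True
```

The ML upgrade replaces the hand-tuned threshold with a trained classifier, but the pipeline around it (camera, trigger, reject gate) looks much the same.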
I honestly have no idea what AI does to a processor, and would therefore not pay extra for the badge.
If it provided a significant speed improvement or something, then yeah, sure. Nobody has really communicated to me what the benefit is. It all seems like hand waving.
It’s bad enough they shove it on you on some websites. I’m really not interested in being their lab rat.
Depends on what kind of AI enhancement. If it’s just more things nobody needs and solves no problem, it’s a no-brainer. But for computer graphics, for example, DLSS is a feature people do appreciate, because it makes sense to apply AI there. Who doesn’t want faster and perhaps better graphics from AI rather than brute force? It also saves on electricity costs.
But that isn’t the kind of thing most people taking a survey would even think of, since the benefit is readily apparent and doesn’t even need to be explicitly sold as “AI”. They’re most likely thinking of products where the manufacturer put an “AI powered” sticker on because their stakeholders told them it would increase sales, or because it let them overstate the product’s value.
Of course people are going to reject white-collar scams if they think that’s what “AI enhanced” means. If legitimate use cases with clear advantages are produced, they will speak for themselves, and I don’t think people would be opposed. But obviously there are a lot more companies that want to ride the AI wave than there are legitimate use cases, so there will be quite a bit of snake oil sold.
Well, I think a lot of these CPUs come with a dedicated NPU. Idk if it would be more efficient than the tensor cores on an Nvidia GPU, for example, though.
Edit: whatever NPU they put in does have the advantage of being able to access your full CPU RAM, though, so I could see it being kinda useful for things other than custom Zoom background effects.
But isn’t RAM slower than a GPU’s VRAM? Last year people were complaining that local models had suddenly become very slow on the same GPU. It turned out a new Nvidia driver had automatically enabled a setting that lets the GPU spill into system RAM when VRAM fills up. That annoyed people trying to run bigger models, since a crash (and a retry at lower settings) would be preferable to the massively increased generation time once regular RAM is involved.
RAM is slower than GPU VRAM, but that extreme slowdown is mostly due to the bottleneck of the PCIe bus the data has to cross to reach the GPU.
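The gap is big enough that a rough back-of-the-envelope comparison makes the point. These are approximate published figures (PCIe 4.0 x16 usable bandwidth, dual-channel DDR5-5600, and an RTX 3090’s GDDR6X as one concrete example), not measurements:

```python
# Rough bandwidth comparison, approximate published figures in GB/s.
pcie4_x16 = 31.5       # PCIe 4.0 x16 usable bandwidth
ddr5_dual = 89.6       # dual-channel DDR5-5600 system RAM
gddr6x_3090 = 936.0    # RTX 3090 GDDR6X VRAM

# Weights spilled to system RAM must cross the PCIe bus every pass,
# so the effective ceiling is the slowest link in the chain.
effective = min(pcie4_x16, ddr5_dual)
print(f"VRAM is ~{gddr6x_3090 / effective:.0f}x faster than the spill path")
```

So any layer that spills runs at roughly 1/30th of VRAM speed, which is why the slowdown feels like a cliff rather than a gentle degradation.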
This is yet another dent in the “exponential growth, AGI by 2028” argument I see popping up a lot. Despite what the likes of Kurzweil, Musk, etc. would have you believe, AI is severely overhyped and will take decades to fully materialise.
You have to understand that most of what you read about is mainly, if not entirely, hype. AI, self-driving cars, LLMs, job automation, robots, etc. are buzzwords the media loves to talk about to generate clicks. But the reality is that all of this stuff is extremely hyped up, without much substance behind it.
It’s no wonder that the vast majority of people hate AI. You only have to look at self-driving cars being unable to handle fog and rain after decades of research, or at LLMs (still dumb after all this time), to see why. The only things that have really progressed quickly since the 80s are cell phones, computers, etc. Electric cars, self-driving cars, stem cells, AI, and so on have all progressed far less rapidly. And even the electronics progress is slowing down soon due to the end of Moore’s Law.
Idk robots are absolutely here and used. They’re just more Honda than Jetsons. I work in manufacturing and even in a shithole plant there are dozens of robots at minimum unless everything is skilled labor.
I might be wrong, but those don’t make use of AI, do they? It’s just programming for some repetitive tasks.
They use machine learning these days in the nice kind, but I misinterpreted you. I interpreted you as saying that robots were an example of hype like AI is, not that using AI in robots is hype. The ML in robots is stuff like computer vision to sort defects, detect expected variations, and other similar tasks. It’s definitely far more advanced than back in the day, but it’s still not what people think.
AI for IT companies is looking more and more like what 3D was for the movie industry:
all fanfare and overhype, a small handful of examples that do seem like a solid step forward, and millions of others that are just a polished turd. Massive investment for something the market has not demanded.
It’s just a gimmick, a new “feature” to justify higher product prices.
barely a feature, just a buzzword
Why would I pay more for x company to have a robot half ass the work of all the employees they’re gonna cut?
So the trades have been unknowingly fucking with AI for decades, because of the time honored tradition of fucking with apprentices.
A lot of forums are filled with absolutely unhinged advice, and sprinkled in there is some good advice. If you know what you’re doing, you can spot the bullshit.
But if you don’t know anything about it, the advice seems perfectly reasonable. There’s a skill in giving unhinged advice. Literally you can’t get your master cert without convincing at least one apprentice to ask where the board stretcher is.
Do I actually have a dedicated vise for Vaseline when I run a tap cycle or is that old timer bullshit? HOW WOULD YOU POSSIBLY KNOW??
It just doesn’t really do anything useful from a layman point of view, besides being a TurboCyberQuantum buzzword.
I’ve apparently got AI hardware in my tablet, but as far as I’m aware, I’ve never, or almost never, actually used it, nor had much of a use for it. Off the top of my head, I can’t think of much that would make use of that kind of hardware, aside from some relatively technical software that is almost as happy running on a generic CPU. Opting for AI capabilities would be paying extra for something I’m not likely to ever make use of.
And the actual stuff that might make use of AI is pretty much abstracted out so far as to be invisible. Maybe the autocorrecting feature on my tablet keyboard is in fact powered by the AI hardware, but from the user perspective, nothing has really changed from the old pre-AI keyboard, other than some additions that could just be a matter of getting newer, more modern hardware/software updates, instead of any specific AI magic.
I guarantee most of this AI bullshit is nothing but a backdoor to harvest more user info, anyway.