Personally I find quantum computers really impressive, and they haven’t been given their rightful hype.

I know they won’t be something everyone has in their house, but they will greatly improve some services.

@Chocrates@lemmy.world

I think AI is falling into disillusionment and Quantum Computers feel at least 10 years behind.

Pennomi

AI is falling into disillusionment for like the 10th time now. We just keep redefining what AI is to mean “whatever is slightly out of reach for modern computers”.

@jacksilver@lemmy.world

Hahaha, I kept saying this to myself while going through this thread. I mean there is a whole wiki page on the concept of AI winters because it’s such a common occurrence - https://en.m.wikipedia.org/wiki/AI_winter

@Aceticon@lemmy.world

The answer for that exists as a superposition of multiple possibilities but as soon as somebody manages to read it it will decohere into just the one.
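Read literally, the joke is just the measurement postulate: probing a superposed state forces it into a single definite outcome. A minimal pure-Python sketch of that collapse (a toy statevector, nothing to do with real quantum hardware):

```python
import random

def measure(amplitudes):
    """Collapse a statevector: pick one basis state with probability
    |amplitude|^2, then return that outcome and the post-measurement state."""
    probs = [abs(a) ** 2 for a in amplitudes]
    outcome = random.choices(range(len(amplitudes)), weights=probs)[0]
    collapsed = [0.0] * len(amplitudes)
    collapsed[outcome] = 1.0
    return outcome, collapsed

# An equal superposition of |0> and |1>, like the answer in the joke.
superposed = [2 ** -0.5, 2 ** -0.5]
outcome, state = measure(superposed)
print(outcome, state)  # one definite answer; the superposition is gone
```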

@kitnaht@lemmy.world

Pretty sure QC is down at 0,0 right now. They haven’t gotten it to work in the way it’s been envisioned yet. The theory is there, but until something is quantifiably working, there’s basically no hype behind it.

They work, but they’re expensive and at the POC stage. They’re mostly just not scaled to the level that we think we can take them to.

@davidgro@lemmy.world

I’d say very slightly past that. Quantum computers do work right now, but it’s the same way the Wright brothers’ first plane worked: as proof of concept and research, but not better than existing tech for solving any problems.

And it’s not that they fail to meet the designers’ expectations; as far as I know they do exactly what they were built to do, as well as predicted with the tech we have. It’s just the press that is expecting more.

@MataVatnik@lemmy.world

Quantum computers are now where neural nets were in the 1980s.

@Glowstick@lemmy.world

I think this graph doesn’t have to move left to right, it can also move right to left. On several occasions quantum computing started to move up the “tech trigger” slope, but without any functional applications for the current technology the point slid back down to the left again.

I think the graph needs at least one more demarcated region. After “tech trigger” there needs to be “real world applications”. Without real world applications you can never progress past the tech trigger phase.

In chemistry this is the equivalent of the energy of activation. If a reaction can’t get over the big first step, then it can’t proceed on to any secondary steps.
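The activation-energy analogy is quantitative in chemistry: the Arrhenius equation k = A·exp(−Ea/RT) makes the reaction rate exponentially sensitive to that first barrier. A quick illustration (the values are arbitrary round numbers, not a real reaction):

```python
import math

R = 8.314  # gas constant, J/(mol*K)

def arrhenius_rate(prefactor, ea_joules_per_mol, temp_kelvin):
    """Arrhenius equation: k = A * exp(-Ea / (R*T))."""
    return prefactor * math.exp(-ea_joules_per_mol / (R * temp_kelvin))

# Halving the activation energy speeds the reaction by orders of magnitude:
slow = arrhenius_rate(1.0, 100_000, 298.15)  # high barrier
fast = arrhenius_rate(1.0, 50_000, 298.15)   # lower barrier
print(fast / slow)  # hundreds of millions of times faster
```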

Somewhere around 0,0 or 1,1

There are amazing possibilities in the theoretical space, but there hasn’t been enough of a breakthrough on how to practically make stable qubits on a scale to create widespread hype

Approaching the point of disillusionment.

They started to work, but hardly anyone cares. They are still far from being good, or affordable.

Brownian Motion

Quantum computing is still climbing the slope from the Tech Trigger to the Peak of Inflated Expectations. There is still little to no major hype: as it’s still in R&D/testing, it is slow, it is expensive, and (very) limited due to all the surrounding tech required to make it work, like cooling, containment, etc.

Compare this to AI.

AI is at the Peak and heading down towards the Trough of Disillusionment. It was (relatively) easy to implement and easy to evolve, as Nvidia did: simply throw more silicon at it. The hype was easy to generate because, even while totally misinformed, media and other people out there thought they could easily sell it. Even though most of what they claimed was turd, it sounded amazing and a game changer even in the early stages, and businesses lapped it up. Now they are feeling the pain and seeing that there are still major hurdles to get past.

@AA5B@lemmy.world

AI is way different. It’s more like a series of hills where Sisyphus is pushing the boulder up to the peak, only to see another higher peak as the boulder rolls down the slope of disillusionment.

The thing is that quite a few things initially called AI have climbed that hype curve, rolled down into disillusionment, and quite a few have climbed back to a plateau of increased productivity. Each time we realize that’s either not AI or only a step toward AI. We’ve gotten a lot of useful functionality but the actual progress seems to be mainly clarifying what intelligence is or is not

@Xeroxchasechase@lemmy.world

The kind of LLM that caused this hype with GPT-3 has been in R&D since the ’60s. I believe we’re in the ’70s of quantum computing. When it’s measured, it’ll be just as easy and relatively cheap to produce and advance as AI is today.

I think we’re still headed up the peak of inflated expectations. Quantum computing may be better at a category of problems that do a significant amount of math on a small amount of data. Traditional computing is likely to stay better at anything that requires a large amount of input data, or a large amount of output data, or only uses a small amount of math to transform the inputs to the outputs.

Anything you do with SQL, spreadsheets, images, music and video, and basically anything involved in rendering is pretty much untouchable. On the other hand, a limited number of use cases (cryptography, cryptocurrencies, maybe even AI/ML) might be much cheaper and faster with a quantum computer. There are possible military applications, so countries with big militaries are spending until they know whether that’s a weakness or not. If it turns out they can’t do any of the things that looked possible from the expectation peak, the whole industry will fizzle.

As for my opinion, comparing QC to early silicon computers is very misleading, because early computers improved by becoming way smaller. QC is far closer to the minimum possible size already, so there won’t be a comparable, “then grow the circuit size by a factor of ten million” step. I think they probably can’t do anything world shaking.
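As a concrete example of the “significant amount of math on a small amount of data” category: Grover’s search gains a quadratic speedup, and for four items a single iteration already finds the marked one with certainty. A classical pure-Python simulation of the statevector (illustrative only; real hardware would run this on qubits, and the quadratic advantage only matters at much larger sizes):

```python
def grover_search(n_items, marked):
    """Simulate one Grover iteration on a classical statevector.
    For N = 4, a single iteration finds the marked item with
    probability 1, versus ~N/2 classical guesses on average."""
    amp = [n_items ** -0.5] * n_items  # uniform superposition
    amp[marked] = -amp[marked]         # oracle: phase-flip the target
    mean = sum(amp) / n_items
    amp = [2 * mean - a for a in amp]  # diffusion: invert about the mean
    return [a * a for a in amp]        # measurement probabilities

print(grover_search(4, marked=2))  # probability 1.0 at index 2
```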

@Smokeydope@lemmy.world

Quantum computers have no place in typical consumer technology; their practical applications are super-high-level STEM research and cryptography. Beyond being cool to conceptualize, why would there be hype around quantum computers from the perspective of most average people, who can barely figure out how to post on social media or send an email?

People thought the same of binary computers in their development phase.

@AA5B@lemmy.world

Pretty much on the blue line. They cost a lot of money for being barely functional, and it’s not clear whether they’ll ever be anything more

@AA5B@lemmy.world

Quantum computers don’t lie: it’s not like they can run generative AI.

ℍ𝕂-𝟞𝟝

Either somewhere on the far left, and we’ll see some actual breakthrough with major impact in the future which actually gets hyped, or on the far right and it already happened, it was just too niche for anyone other than a specific small group to notice.

Trough of disillusionment

@aodhsishaj@lemmy.world

You think we’ve made it that far?

Amazing computational speedups if you regularly use any of these incredibly specific algorithms. Otherwise useless.

Quantum as a service may exist as a business.

@bunchberry@lemmy.world

Uh… one of those algorithms in your list is literally for speeding up linear algebra. Do you think just because it sounds technical it’s “businessy”? All modern technology is technical; that’s what technology is. It would be like someone saying, “GPUs would be useless to regular people because all they mainly do is speed up matrix multiplication. Who cares about that except for businesses?” Many of these algorithms offer potential speedups for linear algebra operations, which is the basis of both graphics and AI. One of the algorithms in that list is even for machine learning. There are various algorithms in the literature for potentially speeding up matrix multiplication. It’s huge for regular consumers… assuming the technology could ever progress enough to reach them.

literally for speeding up linear algebra

For a sparse matrix where you don’t need the values of the solution vector.

I.e. a very specific use case.

Quantum computers will be called from libraries that apply very specific subroutines for very specific problems.

Consumers may occasionally call a quantum subroutine in a cloud environment. I very much doubt we will have a quantum chip in our phone.
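That library-call model can be sketched as a dispatcher that offloads only the narrow HHL-friendly cases (sparse systems) to a quantum backend and falls back to classical code otherwise. Everything quantum here is hypothetical: `quantum_backend` stands in for a cloud service that doesn’t exist in consumer form; only the classical fallback actually runs.

```python
def solve_classically(matrix, rhs):
    """Plain Gaussian elimination fallback (no pivoting; toy only)."""
    n = len(matrix)
    a = [row[:] + [rhs[i]] for i, row in enumerate(matrix)]  # augmented matrix
    for col in range(n):
        for r in range(col + 1, n):
            f = a[r][col] / a[col][col]
            a[r] = [x - f * y for x, y in zip(a[r], a[col])]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):  # back substitution
        x[r] = (a[r][n] - sum(a[r][c] * x[c] for c in range(r + 1, n))) / a[r][r]
    return x

def is_sparse(matrix, threshold=0.1):
    """True when at most `threshold` of the entries are nonzero."""
    total = sum(len(row) for row in matrix)
    nonzero = sum(1 for row in matrix for v in row if v != 0)
    return nonzero / total <= threshold

def solve(matrix, rhs, quantum_backend=None):
    """Route only the quantum-friendly cases (sparse systems) to a
    hypothetical quantum subroutine; everything else stays classical."""
    if quantum_backend is not None and is_sparse(matrix):
        return quantum_backend(matrix, rhs)  # hypothetical remote call
    return solve_classically(matrix, rhs)

print(solve([[2.0, 1.0], [1.0, 3.0]], [3.0, 5.0]))  # classical path: [0.8, 1.4]
```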

@aodhsishaj@lemmy.world

Yes, but quantum TPM or TPU chips would allow for far more complex encryption. So you’d likely have a portion of the SoC with a quantum bus or some other function.

However, you’re correct that it’d take a sea change in computing for a qubit-based OS.

Strong, post-quantum encryption doesn’t require quantum computers. It uses different mathematical objects (e.g. matrices).
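The matrix-based idea behind much post-quantum cryptography (learning with errors, the basis of schemes like ML-KEM) can be shown in toy form: the public key hides the secret behind a noisy matrix equation, and all operations are ordinary modular arithmetic, no quantum hardware needed. This is a deliberately insecure miniature with illustrative parameters, nothing like a production scheme:

```python
import random

def lwe_toy_demo(n=8, m=16, q=257, seed=0):
    """Toy learning-with-errors encryption of single bits.
    Parameters are far too small to be secure; correctness holds
    because the accumulated noise stays below q/4."""
    rng = random.Random(seed)
    s = [rng.randrange(q) for _ in range(n)]                    # secret key
    A = [[rng.randrange(q) for _ in range(n)] for _ in range(m)]
    e = [rng.choice([-1, 0, 1]) for _ in range(m)]              # small noise
    # Public key: (A, b) with b = A*s + e (mod q); e hides s.
    b = [(sum(A[i][j] * s[j] for j in range(n)) + e[i]) % q for i in range(m)]

    def encrypt(bit):
        subset = [i for i in range(m) if rng.randrange(2)]      # random rows
        c1 = [sum(A[i][j] for i in subset) % q for j in range(n)]
        c2 = (sum(b[i] for i in subset) + bit * (q // 2)) % q
        return c1, c2

    def decrypt(c1, c2):
        v = (c2 - sum(c1[j] * s[j] for j in range(n))) % q
        return 1 if q // 4 < v < 3 * q // 4 else 0              # round away noise

    return all(decrypt(*encrypt(bit)) == bit for bit in [0, 1, 1, 0])

print(lwe_toy_demo())  # True: every bit round-trips through encrypt/decrypt
```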

@aodhsishaj@lemmy.world

True. However, there is still a use case. You could sign a cert for UEFI much like a payment is signed. Useful for distributed compute.

https://www.nature.com/articles/s41467-023-39519-w

@bunchberry@lemmy.world

Why are you isolating a single algorithm? There are tons of them that speed up various aspects of linear algebra, not just that single one, and there have been many improvements to these algorithms since they were first introduced; there are a lot more in the literature than just in the popular consciousness.

The point is not that it will speed up every major calculation, but these are calculations that could be made use of, and there will likely even be more similar algorithms discovered if quantum computers are more commonplace. There is a whole branch of research called quantum machine learning that is centered solely around figuring out how to make use of these algorithms to provide performance benefits for machine learning algorithms.

If they would offer speed benefits, then why wouldn’t you want the chip that offers those benefits in your phone? Of course, in practical terms, we likely will not have this due to the difficulty and expense of quantum chips, and the fact they currently have to be cooled to near zero degrees Kelvin. But your argument suggests that even if consumers could somehow have access to technology in their phone that would offer performance benefits to their software, they wouldn’t want it.

That just makes no sense to me. The issue is not that quantum computers could not offer performance benefits in theory. The issue is more about whether or not the theory can be implemented in practical engineering terms, as well as a cost-to-performance ratio. The engineering would have to be good enough to both bring the price down and make the performance benefits high enough to make it worth it.

It is the same with GPUs. A GPU can only speed up certain problems, and it would thus be even more inefficient to try and force every calculation through the GPU. You have libraries that only call the GPU when it is needed for certain calculations. This ends up offering major performance benefits and if the price of the GPU is low enough and the performance benefits high enough to match what the consumers want, they will buy it. We also have separate AI chips now as well which are making their way into some phones. While there’s no reason at the current moment to believe we will see quantum technology shrunk small and cheap enough to show up in consumer phones, if hypothetically that was the case, I don’t see why consumers wouldn’t want it.

I am sure clever software developers would figure out how to make use of them if they were available like that. They likely will not be available like that any time in the near future, if ever, but assuming they are, there would probably be a lot of interesting use cases for them that have not even been thought of yet. They will likely remain something largely used by businesses but in my view it will be mostly because of practical concerns. The benefits of them won’t outweigh the cost anytime soon.

Why are you isolating a single algorithm?

To show that quantum computing only helps with very specific parts of very specific algorithms.

A QC is not a CPU, it’s not a GPU, it’s closer to a superpowered FPU.

If they would offer speed benefits, then why wouldn’t you want to have the chip that offers the speed benefits in your phone?

if somehow consumers could have access to technology in their phone that would offer performance benefits to their software that they wouldn’t want it.

Because the same functionality would be available as a cloud service (like AI now). This reduces costs and the need to carry liquid nitrogen around.

The issue is not that quantum computers could not offer performance benefits in theory.

It is this. QC only enhances some very specific tasks.

It is the same with GPUs. A GPU can only speed up certain problems. You have libraries that only call the GPU when it is needed for certain calculations.

Yes, exactly my point. QC is a less flexible GPU.

I don’t see why consumers wouldn’t want it.

Because they would need to use the specific quantum-enhanced algorithms frequently enough to pay for local, always-on access.

They will likely remain something largely used by businesses but in my view it will be mostly because of practical concerns. The benefits of them won’t outweigh the cost anytime soon.

Agree. Unless some magic tech, like room-temperature superconductors, turns up, there will only be quantum-as-a-service for some very specific business needs.

@bunchberry@lemmy.world

Because the same functionality would be available as a cloud service (like AI now). This reduces costs and the need to carry liquid nitrogen around.

Okay, you are just misrepresenting my argument at this point.

Actually I think we are mostly agreeing.

The difference is that you think that the technology will quickly be made cheap and portable enough for mass consumption and I think it will remain, for quite some time, niche and expensive, like high end, precision industrial equipment.

I’m so dreadfully sorry. I cannot help myself. Please forgive me.

It’s “zero kelvins” not “zero degrees Kelvin.”

@bunchberry@lemmy.world

You don’t have to be sorry, that was stupid of me to write that.
