[HN Gopher] Quantum computing hype is bad for science
       ___________________________________________________________________
        
       Quantum computing hype is bad for science
        
       Author : nkurz
       Score  : 91 points
       Date   : 2021-07-21 12:12 UTC (1 day ago)
        
 (HTM) web link (www.linkedin.com)
 (TXT) w3m dump (www.linkedin.com)
        
       | briantakita wrote:
       | The Science industry is built on using buzzwords to get research
        | $$$. It's probably the same with most industries that reach a
        | certain institutional scale. Institutional involvement leads
        | to politics, which leads to buzzwords & right-speak.
       | 
       | Another case of "for the love of money is the root of all kinds
       | of evil".
        
       | smoldesu wrote:
       | The hype around QC is going to be _essential_ in getting more
       | people interested. If we can encourage people to learn the
       | concepts of entropy, state and other fundamentals while their
       | minds are still plastic, I say go for it!
       | 
       | This article has a very elitist tone, which I can mostly forgive
       | because of the subject matter. Hell, I even understand where
       | they're coming from with regards to how poorly AI was
        | marketed/integrated into our modern workflows. _However_, I
        | think the conclusion you're reaching for is that 'transparency
        | matters', which is true (albeit not particularly profound). The
       | best solution that I can imagine is ensuring that the next
       | generation of programmers has access to quantum runtimes.
        
         | tsimionescu wrote:
         | There are many more decades of research to be done before there
         | will be any kind of need for quantum computation programmers.
         | Right now the field needs quantum physicists and computer
         | science mathematicians to actually develop the physical
         | computers and basic operation concepts, not programmers per se.
         | 
         | There is also a good chance that QC will remain a small niche
         | in the computing landscape even with fully functional QCs,
          | similar to DSP programming, hardware, or real-time code. QC
          | algorithms have classical parts that run on classical
          | computers; very little of the actual logic of the program is
          | related to quantum effects, even for something like Shor's
          | algorithm.
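          | 
          | To make that concrete, here is a minimal Python sketch (a
          | toy illustration, not anyone's production code) of Shor's
          | algorithm with the single quantum step stubbed out by a
          | classical brute-force search; everything else is ordinary
          | classical code:
          | 
          |   from math import gcd
          |   from random import randrange
          | 
          |   def find_order(a, n):
          |       # The only quantum part of Shor's algorithm:
          |       # find the period r with a^r = 1 (mod n). A real
          |       # QC does this via phase estimation; here it is
          |       # brute-forced classically.
          |       r = 1
          |       while pow(a, r, n) != 1:
          |           r += 1
          |       return r
          | 
          |   def shor(n):
          |       while True:
          |           a = randrange(2, n)
          |           if gcd(a, n) > 1:
          |               return gcd(a, n)  # lucky classical guess
          |           r = find_order(a, n)  # quantum subroutine
          |           if r % 2 == 0 and pow(a, r // 2, n) != n - 1:
          |               # classical post-processing
          |               return gcd(pow(a, r // 2) + 1, n)
          | 
          |   print(shor(15))  # prints 3 or 5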
        
           | jnwatson wrote:
           | I think DSP programming is a good analogy. There will be a
           | handful of codecs/waveforms/algorithms that you treat as a
           | black box that you load into the QC, and then the other 99.9%
           | of your system will be classical.
        
         | mcguire wrote:
          | Interested in _what_, precisely? Why do you believe usable
         | quantum computers will exist in those people's lifetimes?
        
         | reikonomusha wrote:
         | Why do more people need to be interested?
         | 
         | Why do we need to ensure access to quantum computers to
         | programmers?
         | 
         | Seriously, any programmer can fire up a quantum simulator for
          | any number of quantum instruction languages and be more
         | productive than with a real quantum computer.
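          | 
          | For instance, here is a toy statevector demo in plain NumPy
          | (no vendor SDK assumed) that prepares and samples a Bell
          | state noiselessly, something today's hardware only manages
          | with visible error:
          | 
          |   import numpy as np
          | 
          |   H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
          |   CNOT = np.array([[1, 0, 0, 0], [0, 1, 0, 0],
          |                    [0, 0, 0, 1], [0, 0, 1, 0]])
          | 
          |   state = np.array([1, 0, 0, 0], dtype=complex)  # |00>
          |   state = np.kron(H, np.eye(2)) @ state  # H on qubit 0
          |   state = CNOT @ state                   # entangle
          |   probs = np.abs(state) ** 2             # Born rule
          |   print(np.random.choice(["00", "01", "10", "11"],
          |                          size=8, p=probs))
          |   # only "00" and "11" ever appear, each with prob 1/2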
         | 
         | Because of the hype, we've all been led to believe that we are
         | "ready" to program quantum machines and we just need to train
         | more people through boot camps, hackathons, and summer schools.
         | It's simply not true.
         | 
         | The quantum computers of today _are_ programmable (barely), and
         | the programs _do_ run (though you can only run a dozen or so
         | "statements" before you get junk results), but they're so
         | wildly bad compared to what you'd expect out of a textbook that
         | you easily conclude "the scientists have work to do".
         | 
         | Scientists _do_ have more work to do, but it seems like every
         | month there's a perfectly respected scientist who gets a $15MM
         | series A and starts spouting the same misinforming junk that
         | quantum computing is going to help FedEx with logistics, or
         | steel mills with operations planning. Then they hire a bunch of
         | good academic people, pay them software-engineer salaries, and
          | string them along to help perpetuate the fundraising
          | machine--not by actually doing science, of course--hoping
          | that a quantum computer/software/applications/algorithms
          | also get built as a by-product.
         | 
         | Money is very attractive to people, especially physicists who
         | frequently find themselves jumping ship for an alternative,
         | higher-paying career. There must be around 100 quantum
         | companies now, most of them startups, and--to my knowledge--
         | zero of them providing anything demonstrated to be a valuable
         | commercial product. Some of them are definitely doing good work
         | here and there, but in the bigger picture, the profit motive--
         | whether shareholder value or venture capital returns--
         | consistently undermines their ability to do research.
        
           | smoldesu wrote:
            | So with all that being said, your solution is to ostracise
            | more people? That checks out.
        
             | reikonomusha wrote:
             | I didn't propose a solution. But right now, as it stands,
             | money is motivating the perpetuation of misinformation. I'm
             | OK with that ending.
        
             | mcguire wrote:
             | Who is ostracizing anyone? The only solution I see
             | suggested is to take a quick look at the teeth of the horse
             | you are being sold.
        
           | iamstupidsimple wrote:
           | QC represents a fresh start for computer scientists to make
            | their mark, much like machine learning was over the past
            | decade.
        
             | reikonomusha wrote:
             | Computer scientists had QC to "make their mark" for the
             | past 30 years, and will continue to have it for the next
             | 30. It's available irrespective of an industry full of
             | cash-grabbing and misleading marketing.
        
             | mcguire wrote:
             | Why do they need a fresh start? Has the hype machine for
             | machine learning started to come apart?
        
       | superjan wrote:
       | If you have an hour to spare and are interested in what quantum
       | computers might be good for in the long run, I recommend this
        | interview (podcast, Sean Carroll's Mindscape):
       | 
       | https://www.preposterousuniverse.com/podcast/2021/06/28/153-...
        
       | mikewave wrote:
        | I take issue with one part of this. Disclaimer: I work for
        | D-Wave....
       | 
       | > It is unclear how exactly one can verify that a "quantum code"
       | actually runs on a quantum computer (instead of a classical node
       | inserted between the cloud and the QC provider), and there is a
       | huge window for fraud there.
       | 
       | I would like to know what the article's author would take as
       | proof of this. All I can offer presently is my personal assurance
       | as a team member helping to keep D-Wave Leap running. Every day,
       | I work with a team of talented scientists, engineers, devops, and
       | developers to help ensure everything from pipeline performance to
       | monitoring cryogenics, and if there's one thing that I am certain
        | of, it's that our end-user submissions run on real hardware at
        | a few millikelvin above absolute zero.
       | 
       | I am certain beyond any shadow of a doubt that we are using
       | quantum effects to achieve low-energy solutions to difficult
        | problems using annealing. I'm also certain that we're making
        | huge progress: from our massive lead in terms of raw qubit count
       | (5000+) to our work making each of those qubits connect to more
       | and more of their neighbours with less noise over time. There are
       | exciting things coming....
       | 
       | If other companies are getting away with anything less and
       | promising they're doing real-time quantum computing in the cloud,
       | (1) it would be a huge surprise, and (2) their lives must be a
       | lot cushier than ours, because it is a lot of work keeping
       | something like this running. You want to talk about the woes of
       | having to manage on-prem and hybrid cloud workloads, well, does
       | your datacenter have plumbing for liquid helium? You monitor the
       | temperature on a few server racks, but do you have to measure ten
       | thousand different datapoints about air and fluid temperature and
       | pressure?
       | 
       | Honestly, it's a lot of fun, it's an exciting thing to be working
       | on, and I don't agree with the author when he complains about
        | brain drain. You want brain drain? Go and look at the
        | infinitude of startups hawking SaaS grift-ware like it's the
        | best thing since sliced bread. Sorry if we find it more
        | interesting to work on this than on the next B2B way of
        | slicing off a chunk of
       | someone else's revenue for providing something obvious. "We do
       | these things because they are hard", as the saying goes.
        
         | prof-dr-ir wrote:
          | If all you can offer is your "personal assurance", then that
          | seems to confirm the author's point exactly.
         | 
         | From my reading the article's point was really to note the risk
         | of fraud. It did not claim that this kind of fraud is actually
         | happening now.
         | 
         | I also think that the article is more nuanced on the topic of
         | brain drain than you make it out to be. Is your argument not
         | just whataboutery? And what do you think of the article's claim
         | that "it may not be a zero-sum game"?
        
           | reikonomusha wrote:
           | The thing is, the hardware doesn't do anything useful. So you
           | can in theory fake bad results... but that doesn't seem so
           | dangerous. If it's bad, there are few people to defraud,
           | except maybe investors that are bad at due diligence.
           | 
           | You can also try to fake good results (or even have truly
           | good results!), and trust me, the scientific community will
            | require unambiguous proof. D-Wave went through the wringer
           | pretty thoroughly some years ago for their claims.
           | 
           | There's another angle too: If the service actually does
           | something commercially useful or better, in some sense, it
           | might not matter what the specifics of the implementation
           | are. Ultimately customers are going to look at price and
           | performance and make decisions that way.
        
             | leoc wrote:
             | It might not matter commercially, but it certainly matters
             | from a scientific POV.
        
               | reikonomusha wrote:
               | The scientific stakeholders aren't proving themselves
                | with an opaque public cloud API. They write detailed
                | research papers with data that go into peer-reviewed
               | journals. The data is pretty profoundly scrutinized by
               | the community.
               | 
               | If a scientist or company that purportedly does science
                | _doesn't_ do that, they're not taken seriously by other
                | members of the scientific community. No one is truly
                | believed at face value. I don't see much risk of the
                | community of scientists being bamboozled through abject
                | fraud. And there hasn't been any such
               | issue yet. (There have been retracted published claims,
               | but the retractions happened as a result of scientific
               | scrutiny.)
        
         | reikonomusha wrote:
         | To add to this as a former employee of a different quantum
         | company that provides cloud services, there was absolutely no
         | funny business about faking results. If you asked for quantum
         | computer results, you got them, and it's painfully obvious too.
         | 
          | Obviously it's in the realm of possibility that a company
          | could fake results, but I think if anybody was caught doing
          | that, they'd tank
         | their reputation within the community extraordinarily quickly.
         | 
         | I haven't heard of any serious or reputable company doing this.
         | 
         | As for other things you've said, I definitely disagree with you
         | and agree with the article that there is brain drain. That's
         | not to say every commercial entity is fully or continuously
          | responsible for it, but D-Wave, IBM, Google, and every other
         | company that currently or formerly over-promises or outright
         | lies has drawn people out of academia into frequently senseless
         | industrial positions.
        
           | normac2 wrote:
           | > To add to this as a former employee of a different quantum
           | company that provides cloud services, there was absolutely no
           | funny business about faking results. If you asked for quantum
           | computer results, you got them, and it's painfully obvious
           | too.
           | 
           | As someone totally unfamiliar with this world, I'm wondering
           | why it's painfully obvious? Slow?
        
             | reikonomusha wrote:
             | The results look terrible and incredibly noisy. Throw more
             | than a handful of instructions at any quantum computer
             | that's publicly accessible today, and you'll get noisy
             | mumbo jumbo.
             | 
             | The noise characteristics are pretty signature-like. It'd
             | be an engineering effort unto itself to produce realistic-
             | looking noise models and simulations.
        
         | jacquesm wrote:
         | That's fine, but until the results outshine the conventional
         | solutions there is no way for an outsider to tell how they were
         | obtained.
        
           | reikonomusha wrote:
           | Even this is becoming less true with access to deeper levels
           | of various vendors' stacks. It's possible to actually do
           | pulse-level experiments on various platforms, where the
           | results will match theoretical predictions. Again, to _fake_
           | that has no benefit to anyone and in fact would take an
           | enormous amount of work to create a time-domain solver. For
           | just a handful of qubits, it's not even feasible, at least to
           | do it accurately.
        
             | JanisErdmanis wrote:
              | There is no enormous amount of work needed to create a
              | time-domain solver. Libraries like `QuantumOptics.jl`,
              | and analogous ones for MATLAB, Python, and Mathematica,
              | let you define the Hamiltonian of an idealized system
              | and solve it. For 16 qubits the matrix dimension is
              | 2^16 = 65536, which can be solved very quickly on a
              | local machine. Furthermore, the Hamiltonian matrix is
              | sparse, enabling more optimizations.
              | 
              | At the moment a state-of-the-art supercomputer can
              | simulate 47 qubits, and for 46 the necessary resources
              | are about 4 times smaller. So if by "a handful" you
              | meant on the order of 30, then yes: only a handful of
              | qubits.
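              | 
              | The scaling is easy to check: a full statevector
              | holds 2^n complex amplitudes (16 bytes each in
              | complex128), so memory doubles with every added
              | qubit. A quick back-of-the-envelope in Python:
              | 
              |   for n in (16, 30, 46):
              |       gib = 16 * 2**n / 2**30
              |       print(f"{n} qubits: {gib:,.3f} GiB")
              |   # 16 qubits: 0.001 GiB (a megabyte)
              |   # 30 qubits: 16.000 GiB
              |   # 46 qubits: 1,048,576.000 GiB (~1 PiB)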
        
         | unixhero wrote:
         | All I have to say is ... /THREAD!!!
        
       | [deleted]
        
       | tus89 wrote:
       | _cough_ machine learning _cough_
        
         | sgt101 wrote:
          | _sneeze_ AlphaFold2 _sneeze_
        
         | qayxc wrote:
         | There's absolutely no comparing the two fields.
         | 
         | ML already has a _massive_ impact on industry and society as a
          | whole. The future of many careers will forever be altered
          | even by current ML applications, let alone future
          | developments.
         | 
          | From automated face recognition to customer service, job
          | interviews, risk assessment, and protein folding, ML has
          | become
         | part of our daily lives already to varying degrees (of both
         | impact and success).
         | 
         | It's a field that won't go away and will only grow and probably
         | change quite a bit in the next decades. Admittedly we're far
         | from a Butlerian Jihad-situation, but there's no denying that
         | ML is much more than just hype.
         | 
         | AGI, now that's a different story.
        
       | pphysch wrote:
       | All hype is "bad" for science. Hype implies emotional investment
       | or faith in an expectation, and science is specifically about
       | _challenging_ expectations (i.e. hypotheses) via practical
       | experimentation.
       | 
       | There is plenty of experimental research, and early practical
       | results, being achieved in quantum computing. There is also lots
       | of snake oil being peddled by sleazy entrepreneurs. This is true
       | for all developing fields.
        
         | yourenotsmart wrote:
         | Is it though?
         | 
            | Hype draws in people who are neither familiar with nor
            | well informed about the matter, hoping for big returns. Of
            | course, they'll come out severely disappointed, but
            | science and technology as a whole will have advanced
            | thanks to their efforts.
            | 
            | Going in blind on something with a lot of hype is often an
            | "ice-breaker" for humanity into new areas of study.
        
           | systemvoltage wrote:
            | The article talks about this in good detail, explaining
            | why hype leads to:
           | 
           | 1. Brain drain of talent
           | 
           | 2. Ponzi schemes
           | 
           | 3. Damage to the reputation of science
        
       | gadders wrote:
       | Things that are always 10 years away:
       | 
       | [1] Those aeroplanes that can fly from London to Australia in 2
       | hours
       | 
       | [2] Cold Fusion
       | 
       | [3] Quantum computing
        
         | tsimionescu wrote:
         | I really don't understand your list. Are you trying to lump QCs
          | in with pseudo-science gobbledygook like cold fusion or
          | Musk's surface-to-surface Starship rides?
        
         | gswdh wrote:
         | Don't forget solid state batteries.
        
         | chki wrote:
          | The distance from London to Sydney is 17,000 km. The current
          | flight airspeed record is 3,500 km/h. I would doubt that
          | anything close to 8,500 km/h is physically possible without
          | using rockets. That's the speed of the fastest missile.
        
           | mrhyyyyde wrote:
           | The X-15 has recorded 7,274 km/h airspeeds. I'm no airspeed
           | record expert but did some reading.
        
             | dogorman wrote:
             | The X-15 was a rocket.
        
           | mcguire wrote:
           | Exactly. That would be the point.
        
             | chki wrote:
             | But I would claim that nobody has ever said "flight times
             | from London to Sydney will go down to 2-3 hours in 10
             | years". Si the above example does not really make sense.
        
         | bawolff wrote:
          | Is anyone (not selling snake oil) claiming that large-scale
          | QC is actually only 10 years away? I certainly haven't heard
          | that.
        
           | reikonomusha wrote:
           | Google just announced they plan to have a million qubits in
           | less than 10 years. They've not yet demonstrated the ability
           | to go past 100.
        
             | Hedgemaster wrote:
              | Well, Google's CEO also caused a furore by stating that
              | in 10 years quantum computers would break currently used
              | encryption... But guess what? In 10 years Sundar won't
              | be with the company anyway ;)
        
           | krastanov wrote:
            | I work in this field, and I would say anywhere between 20%
            | and 70% of my colleagues believe that useful error-
            | corrected qubits will exist in less than 10 years
            | (depending on what exactly you ask). So I guess the answer
            | is yes, there
           | are respected scientists that would wager that we will have
           | enough useful qubits for interesting chemistry simulations in
           | 10 years.
        
         | lr4444lr wrote:
         | Cold fusion is still a hypothesis. As for that airplane ride, I
         | dunno who was selling you that. What kind of fuel would the jet
         | even use?
         | 
          | Quantum computing, though, is already here; it's just not
          | practical for much outside of a lab setting.
        
           | tsimionescu wrote:
           | QC is here in theory, but it is not practical for anything -
           | the experiments so far were only intended to prove that the
           | thing was an actual quantum computer in the complexity theory
           | sense. But it is impossible to actually use Google's device
           | or the one in China to compute anything at all, even
           | something like factoring 4 using Shor's algorithm is beyond
           | the current capabilities.
           | 
            | There is perhaps some more debate about D-Wave's device,
            | both its status as a QC and its usefulness.
        
       | RIMR wrote:
       | Quantum computing deserves hype, but I'd like to see the stupid
       | hype die down.
       | 
       | I've heard claims that quantum computers "connect to alternate
       | timeline versions of themselves" and would allow us to
       | communicate with people from parallel universes. I've heard that
       | they'll let you bypass traditional cryptography with such ease
       | that you could steal all of the bitcoins in circulation in an
       | afternoon. I've heard that it could guarantee a lottery win with
       | only 100 picks.
       | 
        | A bunch of high-concept nonsense that is simply not what
        | quantum computing is going to enable.
        
       | gigel82 wrote:
       | Quantum computers don't exist; qubits as they exist today are
       | simply sources of entropy. So every time someone does a big
       | fanfare announcement of this many qubit "computer" I chuckle a
       | bit; ok, you got a bigger "random number generator", cool cool
       | cool :)
       | 
       | If you downvote, please also include a link to something that
       | proves a quantum computer exists (outside of theoretical papers);
       | I'm genuinely interested in being proven wrong.
        
         | throw149102 wrote:
          | They've factored 15 into 5 times 3. It's a real computer,
          | even if
         | it's too small to do anything useful.
         | https://arxiv.org/abs/quant-ph/0112176
        
           | gigel82 wrote:
           | Thank you for including a link to that experiment, it's
           | pretty cool. My naive definition of a quantum computer would
           | be a general purpose machine that can execute quantum
           | programs on controlled inputs and produce valid outputs
           | (expanded from the definition of a classical computer: "a
           | programmable electronic device designed to accept data,
           | perform prescribed mathematical and logical operations at
           | high speed, and display the results of these operations.").
        
         | mikewave wrote:
         | I'm sitting thirty feet away from one, friend. It exists. The
         | pulse tube cooler is making a comforting squelchy sound. There
         | is more entropy in your post.
        
           | gigel82 wrote:
           | So, what does it actually do besides squelching and producing
           | random numbers?
        
         | reikonomusha wrote:
          | Quantum computers do more than produce random numbers. They
          | follow the statistics predicted by quantum mechanics. These
          | computers also run programs.
         | 
         | Maybe your definition of "quantum computer" doesn't agree with
         | the field at large. What's your definition?
         | 
         | What do you think about Google's supremacy experiment? Do you
         | have objections to their results? [0]
         | 
         | This is one of many papers by Google, IBM, Rigetti, and many
         | other quantum computer manufacturers.
         | 
         | [0] https://www.nature.com/articles/s41586-019-1666-5
        
           | [deleted]
        
           | gigel82 wrote:
           | Just to be pedantic, that experiment literally generates
           | certifiable random numbers; it is also unclear if they have a
           | real physical device; it appears their "Sycamore" design is
           | theoretical and actually simulated on the Julich (classic)
           | supercomputer.
        
             | reikonomusha wrote:
              | When I said they do more than "produce random numbers", I
              | meant that they do more than produce randomness that is
              | out of their control (i.e., due to noise). By the physics
              | of a
             | quantum computer, at their very foundation, they're random
             | number generators. What you program on a quantum computer
             | is, more or less, the shape of the distribution from which
             | they sample.
             | 
             | A linear congruential generator from Knuth programmed on a
             | classical computer produces controlled pseudorandom
             | numbers. So what? Whether a LCG or a program to produce
             | controlled samples from a goofy Porter-Thomas distribution,
             | they're both coming from machines that were programmed to
              | do a job. If the machine were neither a computer nor
             | programmable, then the job could not be done.
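              | 
              | For reference, such an LCG (with Knuth's MMIX
              | parameters) takes a few lines of Python:
              | 
              |   def mmix_lcg(seed):
              |       a = 6364136223846793005
              |       c = 1442695040888963407
              |       x = seed
              |       while True:
              |           x = (a * x + c) % 2**64
              |           yield x
              | 
              |   g = mmix_lcg(2021)
              |   print([next(g) % 100 for _ in range(5)])
              |   # same output on every run with the same seed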
             | 
             | You haven't refuted the point of the published existence of
             | a computer. The paper includes both the results of a
             | program running on a quantum computer, and a comparison of
             | the results as simulated by a classical computer, the
             | latter taking several orders of magnitude to complete at
             | several orders of magnitude increased cost.
        
       | okareaman wrote:
       | The AI winter didn't seem to hurt AI
       | 
       | https://en.wikipedia.org/wiki/AI_winter
        
         | mcguire wrote:
         | The AI winter of the 1980s destroyed quite a few careers. Many
         | of the TAs when I was an undergrad graduated into that winter.
        
           | yourenotsmart wrote:
           | It destroyed many people's careers, but it didn't destroy AI.
           | There's a difference.
        
             | sgt101 wrote:
             | It destroyed a particular vision of AI - that of Knowledge
             | Engineering.
        
         | sampo wrote:
          | AI is very useful, even if the pioneers had to give up on the
          | dream of human-level or human-like intelligence. There is
          | quantum cryptography, but other than that I find it hard to
          | see practical value for quantum computers before they reach
          | the point, maybe around 80 qubits fit for general
          | computation, where they start to be faster than classical
          | computers. Faster at solving meaningful problems, not useless
          | problems designed for the quantum computer to solve fast.
        
         | bitwize wrote:
          | The pre-winter AI field was obliterated by the winter. What
          | is now called AI comprises the parts of the field that got
          | less
         | attention and funding back in the day, mainly prediction by
         | statistical analysis of past data.
        
       | ssivark wrote:
       | That this is a post on LinkedIn shouldn't take away from the
       | important points it is making :-)
       | 
       | As a physicist who knows a little bit about quantum computing, my
        | understanding is that we're far, far away from building usable
       | quantum computers (it's still at an applied research stage, and
       | nowhere close to "just an engineering/design problem") -- all
       | hype be damned.
        
         | bopbeepboop wrote:
         | Would you be willing to elaborate on why you think it's a
         | research problem rather than an engineering problem?
        
         | derekp7 wrote:
         | I like to draw an analogy to slide rules. A typical slide rule
         | has a precision of about 3 - 4 significant digits (it gets
         | worse at the higher end of the scale, better at the lower end).
         | To get another digit out of it, you need one 10 times long, or
         | a way to make the markings 10 times more accurate. So
         | effectively you have an intractable problem trying to build a
         | slide rule that is precise to a large number of significant
         | digits.
         | 
         | Is the issue with quantum computers somewhat similar? I know
          | next to nothing about the mechanical aspects of them, but
          | based on what I've read it is considered a breakthrough
          | whenever
         | another qubit is added.
        
           | kens wrote:
           | As a historical note, this is a primary reason why digital
           | computers replaced analog computers. If you want another
           | digit of accuracy out of an analog computer, you need
           | components that are 10 times as accurate, requiring expensive
           | precision resistors and capacitors. But if you want another
           | digit of accuracy out of a digital computer, you just process
           | four more bits and you can still use cheap, inaccurate
           | components.
           | 
           | While analog computers are almost entirely forgotten now,
           | they were widely used even into the 1970s. They could solve
           | differential equations almost instantaneously, while digital
           | computers needed to chug through calculations before
            | producing the answer. But digital computers steadily became
            | faster until they, too, could generate answers in real
            | time, while being more accurate and easier to program.
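            | 
            | The "four more bits" figure is just the bits-per-decimal-
            | digit ratio, rounded up:
            | 
            |   import math
            |   print(math.log2(10))  # 3.3219... bits per digit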
        
           | bawolff wrote:
            | I think it's much too early to tell. We understand slide
            | rules really well. We are still learning what the best
            | ways to build quantum computers are, and how different
            | methods scale.
        
             | NohatCoder wrote:
             | So far the empirical evidence is pretty unanimous: None of
             | the methods scale.
        
           | reikonomusha wrote:
           | It's certainly not a breakthrough when another qubit is
            | added. Currently, it's typically a breakthrough when
            | another 9 is added to the fidelity, or another 0 to the
            | qubit lifetime.
        
           | fsh wrote:
           | This is a pretty good analogy for the state of quantum
           | computers right now!
           | 
           | One difference is that (in principle) it is possible to do
           | quantum error correction. Essentially this turns a number of
           | imperfect "physical" qubits into a perfect "logical" qubit.
           | However, this requires extremely low error rates of the
           | physical qubits to begin with and creates a lot of overhead.
           | All existing quantum computers are much too small and noisy
           | to implement quantum error correction except for some proof-
           | of-principle experiments. I am somewhat pessimistic that any
           | of the current technologies can be improved enough to make it
           | possible in practice.
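            | 
            | As a toy flavor of the idea, consider the classical 3-bit
            | repetition code (the ancestor of the quantum bit-flip
            | code; real QEC must also handle phase errors and cannot
            | simply copy states). Majority voting helps only when the
            | physical error rate p is already small:
            | 
            |   def logical_error(p):
            |       # a majority vote fails if 2 or 3 copies flip
            |       return 3 * p**2 * (1 - p) + p**3
            | 
            |   for p in (0.3, 0.1, 0.01, 0.001):
            |       print(p, "->", logical_error(p))
            |   # 0.3   -> 0.216     (barely better than 0.3)
            |   # 0.01  -> 0.000298  (far better than 0.01)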
        
         | reikonomusha wrote:
         | I like to tell people that actual, programmable quantum
         | computers _do_ exist (which is a very important point--they're
         | not vaporware), but exactly like you say, in order to make them
         | useful and scalable, more "actual science" needs to happen.
        
           | xondono wrote:
           | They really don't though. We have some things that if you
           | squint a little, and are willing to stretch the words, look
           | like a quantum computer.
        
             | reikonomusha wrote:
             | They really do, and there are mountains of published
             | experiments unambiguously verifying such.
        
             | anon_tor_12345 wrote:
             | they do and you can get time on one (5 qubits) right now
             | 
             | https://quantum-computing.ibm.com/composer/files/new
             | 
             | if you don't think these are computers then you just don't
             | know what a computer really is
             | 
             | https://en.wikipedia.org/wiki/Circuit_(computer_science)
             | 
             | they're not useful at all but they're still real actual
             | unadulterated computers.
        
               | Hedgemaster wrote:
                | Please note the difference between a 'logical qubit'
                | and a 'physical qubit'. Currently they don't have even
                | one logical qubit, and for a quantum computer to be of
                | any use it should have >10k logical qubits...
        
               | xondono wrote:
               | By this logic a 74LS138 is a "digital computer".
               | 
               | It's limited because it has only 3 bits, but if you play
               | with the input bits, the output bits change!
        
             | mikewave wrote:
             | You can access a quantum computer - to be precise, a
             | quantum _annealer_ right now, for free, via D-Wave Leap. It
             | may not be gate model, but it does compute using quantum
             | effects, and it is useful for optimization problems,
             | materials research, and other applications.
             | 
             | No squinting required.
        
               | Laakeri wrote:
               | Is there some fair comparison of D-Wave annealer vs
               | classical methods on optimization problems? I remember
               | seeing papers where it was compared to some naive methods
               | or the runtime of D-Wave approximation algorithm was
               | compared to the runtime of classical exact algorithm --
               | obviously apples vs oranges.
        
               | xondono wrote:
               | Except there's a lot of people (myself included) who
               | don't consider a quantum annealer to be a quantum
               | computer.
               | 
                | There has also been very little, if any, actual
                | research in other fields powered by quantum computing.
               | 
               | We can keep moving the goal posts and claim that we have
                | made it, but the fact is that QC keeps overpromising
                | and underdelivering.
        
       | [deleted]
        
       | AlexCoventry wrote:
       | The hype is bad in its own right, but it's a symptom of how
       | science is being funded and rewarded, which is a much bigger
       | problem.
        
       | daxfohl wrote:
       | My feeling is there's as much anti-hype as there is hype these
       | days. I'll continue with the maxim "progress is always slower
       | than you think in the short term, and faster than you think in
       | the long term."
        
         | reikonomusha wrote:
         | The quantum computing industry definitely had a lot more hype
         | than anti-hype. There is a very small minority of scientists
         | who actually speak up against false or misleading claims, but a
         | majority are either silent (why gratuitously jeopardize your
         | own career or your funding avenues?) or amplify the hype
         | (because their newfound startup depends on lay investors being
         | excited for any reason so they'll continue to put tens of
         | millions of dollars in).
         | 
         | (To be clear, there's a ton to be excited about in quantum
         | computing, and there are truly legitimate careers to be had
          | both as a scientist and as an engineer. But what's exciting
         | currently isn't very marketable or fashionable!)
        
       | chestertn wrote:
        | This is also harming other fields. I work as a researcher in a
        | traditional engineering field with a very long history and
        | well-established methods.
       | 
       | There is a push to use AI and Quantum such that in order to get
       | funding or publish papers you need to say that you're applying
       | XYZ AI technique to solve a well known engineering problem.
       | 
        | Because funding agencies want to sell to their investors or
        | government managers the idea that they are in on the new hot
        | trend, if you want funding money you need to have something
        | related in your proposal. Of course, having previously
        | published papers on the topic helps, so that motivates people
        | to submit papers on the topic. The journal editors know that
        | the topic is hot, so they prioritize papers on it, as their
        | metrics will increase.
       | 
        | The result is tons of rushed papers saying "Applying XYZ AI
        | technique to a well-known engineering problem", usually
        | without examining previous research methods or proper
        | benchmarks.
       | 
        | In the end, the only barrier to this happening is the
        | individual moral standing of each researcher. Unfortunately,
        | careerism usually trumps it.
       | 
       | Sorry if this was too bleak.
        
         | 908B64B197 wrote:
         | > There is a push to use AI and Quantum such that in order to
         | get funding or publish papers you need to say that you're
         | applying XYZ AI technique to solve a well known engineering
         | problem.
         | 
          | How different is AI from good old-fashioned stats again?
        
       ___________________________________________________________________
       (page generated 2021-07-22 23:01 UTC)