[HN Gopher] Disentangling the facts from the hype of quantum computing
       ___________________________________________________________________
        
       Disentangling the facts from the hype of quantum computing
        
       Author : rbanffy
       Score  : 11 points
       Date   : 2022-09-19 16:05 UTC (1 day ago)
        
 (HTM) web link (spectrum.ieee.org)
 (TXT) w3m dump (spectrum.ieee.org)
        
       | mikewave wrote:
       | >Over the past five years, there has been undeniable hype around
       | quantum computing--hype around approaches, timelines,
       | applications, and more. As far back as 2017, vendors were
       | claiming the commercialization of the technology was just a
       | couple of years away--like the announcement of a 5,000-qubit
       | system by 2020 (which didn't happen).
       | 
       | Inaccurate. D-Wave did in fact launch a 5000+ qubit system named
       | Advantage in 2020.
        
       | xhkkffbf wrote:
        | There are plenty of challenges. In my eyes, the machines are
        | going to require precision that grows exponentially with the
        | number of qubits. I don't have any proof, but that's what my
        | instincts say.
        
          | somat wrote:
          | It reminds me of why we use digital computers in the first
          | place. There was a period in the '50s when the future of
          | computing was unsure: was it going to be analog or digital?
          | 
          | Digital won, mainly because the precision required for the
          | components was far, far lower. Because of this, a digital
          | computer could be smaller, faster, and cheaper than its
          | analog version.
          | 
          | Makes me wonder if quantum computers are just the analog
          | computers of our day.
        
           | gatane wrote:
           | Have you seen Veritasium's analog revival video?
           | 
           | Some analog systems could be used as cheaper accelerators
           | (like GPUs) for ML.
        
           | kragen wrote:
            | This is a fundamental confusion born of the historical
           | accident that resulted in digital computers being called
           | "computers" instead of "switches" or "controllers" or
           | "logics" or "coders" or "analytical engines" or "programmable
           | data processors" or something. Analog "computers" are a
           | totally different kind of thing. You can't run a compiler,
           | solve an equation, play chess, or encrypt a message on an
           | analog computer. It's not that they do those things slower or
           | more expensively or in more space than digital "computers";
           | they just can't do them at all. (Except by simulating a
           | digital computer, of course.)
           | 
           | We call computers "computers" because the first ones were
           | built to do what analog "computers" do: numerical integration
           | of ordinary differential equations. But they can also do
           | those other things.
           | 
            | In theory, the relationship with quantum computers is kind
            | of similar. Quantum computers can in principle do anything
            | a reversible classical digital computer can do in a similar
            | number of operations, and also some other things (though
            | probably not, say, solving SAT in polynomial time). And
           | classical digital computers can simulate QC, but as with
           | simulating a digital computer on an analog one, the
           | simulation is so inefficient as to be infeasible in all but
           | trivial cases.
           | 
           | But maybe that's backwards? Does it depend on your efficiency
           | metric? Certainly you need quite a lot of GPUs to approach
           | the cost of a dilution refrigerator. We'll see.
        
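        To make the simulation point above concrete, here is a minimal
        sketch (Python/NumPy, illustrative only, not from the thread) of
        brute-force classical simulation of a quantum computer: an
        n-qubit state is a vector of 2^n complex amplitudes, so memory
        and per-gate cost double with every qubit added.
        
          import numpy as np
          
          # Brute-force statevector simulation. n = 20 already needs
          # 2**20 amplitudes (~16 MB); n = 50 would need ~18 PB.
          def apply_single_qubit_gate(state, gate, target, n):
              # Give the target qubit its own axis, contract the 2x2
              # gate against it, then restore the axis order.
              state = state.reshape([2] * n)
              state = np.tensordot(gate, state, axes=([1], [target]))
              state = np.moveaxis(state, 0, target)
              return state.reshape(-1)
          
          n = 20
          state = np.zeros(2**n, dtype=complex)
          state[0] = 1.0                            # |00...0>
          H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
          state = apply_single_qubit_gate(state, H, 0, n)
        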
       | meltyness wrote:
        | I'd call this review somewhat tight-lipped.
       | 
       | It does point to Intel's high-water marks:
       | https://www.intel.com/content/www/us/en/research/quantum-com...
       | 
       | And to the nittier-grittier:
       | 
       | - https://www.nature.com/articles/s41928-022-00727-9 (300mm fab)
       | 
       | - https://meetings.aps.org/Meeting/MAR22/Session/M28.4
       | (cryoprober talk abstract)
       | 
       | - https://newsroom.intel.com/wp-content/uploads/sites/11/2020/...
       | (which highlights that all of the key information regarding
        | Intel's work mentioned in this article was being presented in
       | early December of 2020)
        
       | abhayhegde wrote:
       | > Let's remember that it took Google 53 qubits to create an
       | application that could accomplish a supercomputer function. If we
       | want to explore new applications that go beyond today's
       | supercomputers, we'll need to see system sizes that are orders of
       | magnitude larger.
       | 
        | Although the current order of magnitude (tens of qubits) is
        | sufficient (on paper) for doing certain tasks faster than
        | today's supercomputers, we need to reach the 100-1,000 qubit
        | range for error-corrected results. Maintaining coherence long
        | enough to do useful calculations with tens of qubits is still
        | a challenge, let alone designing systems with hundreds.
        
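        For a sense of the error-correction overhead, here is a
        back-of-the-envelope sketch (Python; the constants are rough
        textbook surface-code figures, not from the article): a
        distance-d surface code spends roughly 2d^2 - 1 physical qubits
        per logical qubit, and the logical error rate falls off
        exponentially in d once the physical error rate p is below the
        threshold p_th.
        
          # Rough surface-code overhead (illustrative numbers only).
          p, p_th = 1e-3, 1e-2   # assumed physical error rate, threshold
          
          for d in (3, 7, 15, 25):
              physical = 2 * d**2 - 1                  # per logical qubit
              p_logical = (p / p_th) ** ((d + 1) / 2)  # standard scaling ansatz
              print(f"d={d:2d}: {physical:5d} physical qubits, "
                    f"p_logical ~ {p_logical:.0e}")
        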
       | dvh wrote:
        | It's 2022 and the record for Shor's algorithm on a quantum
        | computer is still only factoring the number 21.
        
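        For context, the quantum part of Shor's algorithm only finds the
        period r of f(x) = a^x mod N; the factors then come out
        classically. A minimal sketch for N = 21 (Python; the period is
        brute-forced here purely to drive the demo):
        
          from math import gcd
          
          # Classical post-processing of Shor's algorithm: given the
          # period r of f(x) = a**x mod N, the factors come from gcds.
          def factor_from_period(N, a, r):
              if r % 2:                     # need an even period
                  return None
              y = pow(a, r // 2, N)
              if y == N - 1:                # a**(r/2) = -1 mod N: dead end
                  return None
              return gcd(y - 1, N), gcd(y + 1, N)
          
          N, a = 21, 2
          # Brute-force the period here just to drive the demo; on a
          # quantum computer this is the step done with the QFT.
          r = next(r for r in range(1, N) if pow(a, r, N) == 1)  # r = 6
          print(factor_from_period(N, a, r))                     # (7, 3)
        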
       ___________________________________________________________________
       (page generated 2022-09-20 23:01 UTC)