[HN Gopher] Replication of Quantum Factorisation Records with a ...
       ___________________________________________________________________
        
       Replication of Quantum Factorisation Records with a VIC-20, an
       Abacus, and a Dog
        
       Author : teddyh
       Score  : 53 points
       Date   : 2025-07-18 19:09 UTC (3 hours ago)
        
 (HTM) web link (eprint.iacr.org)
 (TXT) w3m dump (eprint.iacr.org)
        
       | rahimnathwani wrote:
       | Previous: https://news.ycombinator.com/item?id=44538693
        
       | cbm-vic-20 wrote:
       | > We verified this by taking a recently-calibrated reference dog,
       | Scribble, depicted in Figure 6, and having him bark three times,
       | thus simultaneously factorising both 15 and 21. This process
       | wasn't as simple as it first appeared because Scribble is very
       | well behaved and almost never barks.
        
         | trhway wrote:
          | one can always press the doorbell button - works like a
          | charm with my Chihuahua. Though he prefers to factorize
          | numbers more like 529 than 21.
        
       | tomgag wrote:
       | I guess I'll post it here as well. This is my personal take on
       | the whole story: https://gagliardoni.net/#20250714_ludd_grandpas
       | 
       | A relevant quote: "this is your daily reminder that "How large is
       | the biggest number it can factorize" is NOT a good measure of
       | progress in quantum computing. If you're still stuck in this
       | mindset, you'll be up for a rude awakening."
       | 
       | Related: this is from Dan Bernstein:
       | https://blog.cr.yp.to/20250118-flight.html#moon
       | 
       | A relevant quote: "Humans faced with disaster tend to
       | optimistically imagine ways that the disaster will be avoided.
       | Given the reality of more and more user data being encrypted with
       | RSA and ECC, the world will be a better place if every effort to
       | build a quantum computer runs into some insurmountable physical
       | obstacle"
        
         | jgeada wrote:
         | Except that factorization is exactly what is needed to break
         | encryption, and so knowing what QC can do in that realm of
         | mathematics and computing is _exactly_ the critical question
         | that needs to be asked.
         | 
          | And a reminder that in the world of non-QC computing, right
          | from its very roots, the ability of computers improved in
          | mind-bogglingly large steps _every_ year.
         | 
          | QC records, other than the odd statistic about how many
          | qubits they can make, have largely not made _any_ strides in
          | being able to solve real-world-sized problems (with the
          | exception of those that use QCs purely as analog computers
          | to model quantum behavior)
        
           | tomgag wrote:
           | I beg you to read the full story and to not extrapolate from
           | the quote.
           | 
            | Also, in the world of QC, right from its very roots, the
            | ability of QC improved in mind-bogglingly large steps
            | every year. You just cannot see it if you look only at the
            | wrong metric, i.e., factorization records.
           | 
           | It's a bit like saying "classical computing technology has
           | not improved for 50 years, it's only recently that we finally
           | start to have programs that are able to write other
           | programs".
        
             | madars wrote:
             | A great resource for visually seeing progress is
             | https://sam-jaques.appspot.com/quantum_landscape (click
             | "Prev"/"Next" to see other years) - it makes very clear
             | that incredible progress _is_ happening - this is a log-log
             | plot.
        
             | jgeada wrote:
              | There is a reason QC factorization records haven't
              | shifted much in recent years. The number of qubits by
              | itself isn't enough. You need to be able to do
              | computation on them, and for long enough to run Shor's
              | algorithm until it produces a solution. How the qubits
              | are connected, how reliable the logic gates are, and how
              | long you can maintain quantum coherence with enough
              | fidelity to get results are equally important.
             | 
              | That no significant factorization milestone has moved is
              | a critical black eye for this field. Even worse, that
              | _no one_ has ever been able to truly run Shor's
              | algorithm on even trivial numbers is a shocking
              | indictment of the whole field.
        
               | tomgag wrote:
               | The reasons you listed are exactly why the lack of
               | factorization records _should not_ be seen as a
               | "critical black eye to this field", because they are not
               | a relevant measure of progress. Again, think of the
               | parallel with LLMs: it took decades to get out of the "AI
               | winter", because that's what non-linear technological
               | progress looks like.
               | 
               | With QC, the risk (and I am not saying this is going to
               | happen, but I'm saying that it is a non-overlookable
               | risk) is that we end up transitioning from "QC can only
               | factorize 15" to "RSA-2048 is broken" in such a sudden
               | way that the industry has no time to adapt.
        
               | mlyle wrote:
               | I think the main thing is: quantum computing doesn't
               | really work right now, at all.
               | 
               | Imagine if you had crummy, unreliable transistors. You
               | couldn't build any computing machine out of them.
               | 
               | Indeed, in the real world progress looked like:
               | 
               | * Useless devices (1947)
               | 
               | * Very limited devices (hearing aids)
               | 
               | * Hand-selected, lab devices with a few hundred
               | transistors, computing things as stunts (1955)
               | 
                | * The IBM 1401-- practical transistorized computers
               | (1959)-- because devices got reliable enough and
               | ancillary technologies like packaging improved.
               | 
               | In other words, there was a pattern of many years of
               | seemingly negligible progress and then a sudden step once
               | the foundational component reached a critical point. I
               | think that's the point of the person you're talking to
               | about this.
               | 
               | And then just a couple of years later we had the
               | reliability to move to integrated circuits for logic.
               | 
               | If you looked at the "transistorized factorization
               | record" it would be static for several years, before
               | making a couple steps of several orders of magnitude
               | each.
        
         | kevinventullo wrote:
         | _A better measure of progress (valid for cryptanalysis, which
         | is, anyway, a very minor aspect of why QC are interesting IMHO)
         | would be: how far are we from fully error-corrected and
          | interconnected qubits? I don't know the answer, or at least I
         | don't want to give estimates here. But I know that in the last
         | 10 or more years, all objective indicators in progress that
         | point to that cliff have been steadily improving: qubit
         | fidelity, error rate, coherence time, interconnections... At
          | this point I don't think it's wise to keep trashing the field
         | of quantum security as "academic paper churning"._
         | 
         | I think the problem is that "objective indicators pointing to
         | the cliff" is pretty handwavy. Could there be a widely agreed-
         | upon function of qubit fidelity, error rate, coherence time,
         | and interconnections that measures, even coarsely, how far we
         | are from the cliff? It seems like the cliff has been ten years
         | away for a very long time, so you might forgive an outsider for
         | believing there has been a lot of motion without progress.
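One real attempt at such a composite number is IBM's "quantum volume", which folds circuit width, depth, connectivity, and gate fidelity into a single benchmark: log2(QV) is the largest n for which the machine can faithfully run random n-qubit circuits of depth n. A deliberately crude sketch follows; the depth model (a machine can run roughly 1/eps gates before an error, passed here as an integer gate budget) is a rule of thumb, not the official heavy-output-probability protocol.

```python
# Toy estimate of log2(quantum volume). log2(QV) = max over width n
# of min(n, d(n)), where d(n) is the largest circuit depth the machine
# still runs faithfully. Crude model: depth ~ gate_budget / n, with
# gate_budget ~ 1/eps, the expected number of gates before an error.

def log2_quantum_volume(num_qubits, gate_budget):
    best = 0
    for n in range(2, num_qubits + 1):
        achievable_depth = gate_budget // n   # crude depth model
        best = max(best, min(n, achievable_depth))
    return best

# 27 qubits with ~1% effective gate error (budget of 100 gates):
print(log2_quantum_volume(num_qubits=27, gate_budget=100))   # → 10
```

Under this toy model, cutting the error rate by 10x (gate_budget=1000) lifts the result to 27, i.e., the machine becomes limited by qubit count rather than fidelity -- which is the kind of threshold behavior the "cliff" language gestures at.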
        
       | hagbard_c wrote:
       | After having read this paper I'm busy working on the replication
       | of String Theory with a plate of Spaghetti, a packet of instant
       | Ramen noodles and a pair of Octopuses. I would have used a single
       | octopus but those 8 arms don't cover the 12 dimensions in String
       | Theory. Technically a single squid might suffice - it has 8 arms,
       | 2 tentacles and 2 fins which makes 12 - but that wouldn't be fair
       | to the dimensions which get stuck with the fins while others get
       | to walk away with those tentacles.
        
       ___________________________________________________________________
       (page generated 2025-07-18 23:00 UTC)