[HN Gopher] 2023 ACM Turing Prize awarded to Avi Wigderson
___________________________________________________________________
2023 ACM Turing Prize awarded to Avi Wigderson
Author : nanna
Score : 212 points
Date : 2024-04-10 12:36 UTC (10 hours ago)
(HTM) web link (awards.acm.org)
(TXT) w3m dump (awards.acm.org)
| srvmshr wrote:
| From ACM:
|
| _ACM has named Avi Wigderson as recipient of the 2023 ACM A.M.
| Turing Award for foundational contributions to the theory of
| computation, including reshaping our understanding of the role of
| randomness in computation, and for his decades of intellectual
| leadership in theoretical computer science._
|
| _Wigderson is the Herbert H. Maass Professor in the School of
| Mathematics at the Institute for Advanced Study in Princeton, New
| Jersey. He has been a leading figure in areas including
| computational complexity theory, algorithms and optimization,
| randomness and cryptography, parallel and distributed
| computation, combinatorics, and graph theory, as well as
| connections between theoretical computer science and mathematics
| and science._
|
| Also of interest: he won the Abel Prize in 2021, making him a
| rather unique case of taking the top honors in both
| theoretical/abstract math & CS
| polygamous_bat wrote:
| > Also of interest: he won the Abel Prize in 2021, making him
| a rather unique case of taking the top honors in both
| theoretical/abstract math & CS
|
| The overlap between theoretical CS and math is way larger than
| most people know. For a simple example, check out the
| theoretical CS course catalog at MIT:
| https://catalog.mit.edu/subjects/6/ and how many of them are
| cross listed as course 18 (math) classes.
| vecter wrote:
| Theoretical CS is basically a branch of applied math
| waldrews wrote:
| Perhaps 'an applied branch of math' since the term 'applied
| math' is claimed by something usually disconnected from the
| discrete math subjects CS tends to study.
| Kranar wrote:
| Theoretical CS is not at all applied math. Ordinary
| computer science overlaps with applied math, but
| theoretical CS is very abstract.
| jchonphoenix wrote:
| Technically the Fields Medal is the top honor in mathematics.
|
| But who am I to talk.
| Kranar wrote:
| There's no formal hierarchy, so there's nothing "technical"
| about it. The Fields Medal is certainly prestigious, but it's
| not universally applicable: it has a bunch of restrictions that
| don't have anything to do with math itself, including an age
| limit. For example, no one who has ever been awarded an Abel
| Prize would qualify for a Fields Medal, strictly due to age.
| math_dandy wrote:
| Six Abel laureates (out of 22) had previously won the
| Fields Medal: Serre, Atiyah, Thompson, Milnor, Deligne, and
| Margulis.
| srvmshr wrote:
| True, the Fields Medal has been considered one of the two top
| honors in math since the Abel Prize was established. Before
| that it was the only prize of that stature.
|
| There was a growing need for another award that bridged a few
| gaps. The Fields Medal, with its age cutoff of 40 and its
| four-year cycle, tends to reward prodigies and has failed to
| honor several prominent mathematical breakthroughs that came
| only after decades of painstaking research.
|
| As the field has progressed, monumental breakthroughs are
| harder to come by early in a career. Much of the ingenuity
| comes from cross-study of disciplines, e.g. the Riemann
| hypothesis being approached via algebraic geometry and
| topology rather than number theory alone. That requires years
| of mastery, not just prodigious talent. Also, the prize money
| attached to the Abel Prize is a good incentive for research
| into pure math.
| unethical_ban wrote:
| Congratulations.
|
| This was a pretty well written summary by ACM.
| COGlory wrote:
| Here's a great Quanta article as well:
|
| https://www.quantamagazine.org/avi-wigderson-complexity-theo...
|
| One thing I enjoyed was the variety of poses they had Wigderson
| in. It just looks so awkward. "Here, sit in this chair and look
| out the window longingly".
| jkaptur wrote:
| > The unreasonable effectiveness of randomness led him to think
| about the nature of randomness itself. He, like other
| researchers at the time, questioned how necessary it was for
| efficient problem-solving and under what conditions it could be
| eliminated altogether. "Initially, it was not clear if this was
| only our own stupidity, that we cannot remove randomness," he
| said. "But the larger question was whether randomness can
| always be eliminated efficiently or not." He realized that the
| need for randomness was intimately tied to the computational
| difficulty of the problem.
|
| Does anyone have a sketch of how this works? My understanding
| of complexity classes is that they consider worst case
| performance. Even if you have a great pseudorandom number
| generator and a great randomized algorithm, how do you prove
| that no instance of RNG + seed + problem instance takes
| exponential time?
| hinkley wrote:
| Randomization prevents the worst degenerate cases from being
| repeatable and thus exploitable (by an attacker, or as a
| self-own).
|
| Using it is really an applied-CS concern rather than a
| theoretical-CS one. The worst case can still happen; it just no
| longer happens consistently to the same customer or at the same
| time every day.
|
| Eliminating the randomized solution was not on my radar so
| I've got some homework to do.
| pca006132 wrote:
| No, there are complexity classes for randomized algorithms.
| And randomness is also crucial for some cryptographic
| protocols; typically you want to show that the probability
| that the adversary can do what they want is very slim, e.g.
| 2^-n for some large n.
| hinkley wrote:
| In crypto we are worried about the best case scenario
| (from the attacker's perspective).
|
| Randomization in algorithms, to avoid worst case
| behavior, would be things like shuffling input, XORing
| hashes for query parameters per run or per hash table, or
| using random tiebreakers.
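|
| Something like this toy sketch in Python (illustrative only,
| not from any real codebase) is what I have in mind: with a
| random pivot, no fixed input reliably triggers the quadratic
| worst case on every run.
|
|   import random
|
|   def quicksort(xs):
|       # The O(n^2) worst case still exists, but since the pivot
|       # is chosen at random, a crafted input can't hit it
|       # consistently across runs.
|       if len(xs) <= 1:
|           return xs
|       pivot = random.choice(xs)
|       less = [x for x in xs if x < pivot]
|       equal = [x for x in xs if x == pivot]
|       greater = [x for x in xs if x > pivot]
|       return quicksort(less) + equal + quicksort(greater)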
| pca006132 wrote:
| Yeah, randomization in algorithms and in crypto serves
| different purposes. But overall, introducing randomness lets
| you do amazing things that deterministic algorithms/schemes
| cannot do.
| kadoban wrote:
| > In crypto we are worried about the best case scenario
| (from the attacker's perspective).
|
| That seems mistaken to me. The best case in most crypto,
| from an attacker's perspective, is "I tried one key and
| just happened to get it right". Aren't we more interested
| in expected outcomes?
| hinkley wrote:
| In the sense of big O you're right and I misspoke.
|
| When we are deciding to retire an algorithm, it often comes
| down to how many unaddressed weaknesses there are and the
| assumption that they compose. That's the best case I was
| referring to.
| wbl wrote:
| That's completely untrue.
|
| For example there's an easy randomized algorithm to
| determine a quadratic nonresidue modulo a prime with
| success probability 1/2 per Legendre symbol evaluation.
| In expectation this is 2 iterations, and that's
| independent of the prime. The best known deterministic
| algorithm takes log(p)^2 such evaluations in the worst
| case, even assuming the generalized Riemann hypothesis.
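|
| A minimal sketch of that randomized search in Python
| (illustrative only; assumes p is an odd prime):
|
|   import random
|
|   def random_nonresidue(p):
|       # Half of the nonzero residues mod an odd prime p are
|       # nonresidues, so each candidate succeeds with probability
|       # ~1/2 and we expect about 2 iterations.
|       while True:
|           a = random.randrange(2, p)
|           # Euler's criterion: a^((p-1)/2) == -1 (mod p) exactly
|           # when a is a quadratic nonresidue.
|           if pow(a, (p - 1) // 2, p) == p - 1:
|               return a
|
|   # e.g. random_nonresidue(101) returns a nonresidue mod 101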
| bo1024 wrote:
| BPP is the complexity class for decision problems that a
| _randomized_ algorithm can solve in polynomial time, in the
| sense that on every input (worst-case), the algorithm is
| right with at least 2/3 probability (over its own
| randomness).
|
| One example is deciding whether a given number is prime. For
| a long time, we knew randomized algorithms like the Miller-
| Rabin test that are correct with high probability. That is,
| we knew PRIMES is in BPP. But we didn't know a deterministic
| algorithm. In 2002, a deterministic algorithm (AKS) was
| discovered, proving that PRIMES is in P.
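|
| For a concrete picture of "correct with high probability",
| here's a compact Miller-Rabin sketch in Python (illustrative
| only, not production code):
|
|   import random
|
|   def is_probably_prime(n, rounds=20):
|       # If n is composite, each random base exposes it with
|       # probability >= 3/4, so the error probability is at most
|       # 4**(-rounds). If n is prime, the answer is always True.
|       if n < 2:
|           return False
|       for q in (2, 3, 5, 7, 11, 13):
|           if n % q == 0:
|               return n == q
|       d, r = n - 1, 0
|       while d % 2 == 0:            # write n - 1 = d * 2^r, d odd
|           d //= 2
|           r += 1
|       for _ in range(rounds):
|           a = random.randrange(2, n - 1)
|           x = pow(a, d, n)
|           if x in (1, n - 1):
|               continue
|           for _ in range(r - 1):
|               x = pow(x, 2, n)
|               if x == n - 1:
|                   break
|           else:
|               return False         # a witnesses compositeness
|       return True                  # probably prime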
|
| One of the central open problems in Computer Science is
| whether BPP=P. That is, for any problem we can solve with
| randomness (in the above sense), can we solve it without
| randomness? Among many other contributions, Wigderson's work
| has made a ton of progress toward this problem.
|
| The general idea is to try to "derandomize" any randomized
| algorithm that runs in polynomial time and is correct with
| 2/3 probability. We feed it bits from a pseudorandom generator
| (instead of truly random numbers). If the PRG is really, really
| good, then we can get a deterministic algorithm by running the
| original algorithm once per PRG seed (of which there are only
| polynomially many) and taking the majority answer.
|
| Now, if we can prove certain problems are hard, then we can
| also prove that such really really good PRGs exist. This is
| basically because telling apart the PRG from true randomness
| would be a computationally hard problem.
| sandspar wrote:
| That "sitting in chair looking out the window" pose is a Martin
| Scorsese or Sopranos type pose. The elderly gangster in the old
| folks' home etc.
| pthreads wrote:
| Just picked up Wigderson's book and I am liking it so far :
|
| https://press.princeton.edu/books/hardcover/9780691189130/ma...
| teleforce wrote:
| Final draft version of the book available here for personal
| research and education:
|
| https://www.math.ias.edu/avi/book
| tinhhuynh_97 wrote:
| Agreed
| myth_drannon wrote:
| I looked at the book and it's more for graduate or advanced
| undergrad students.
|
| Can someone recommend a more basic book on the topic of
| computation for someone with a rusty comp-sci/math undergrad
| background?
| cryptoxchange wrote:
| Introduction to the Theory of Computation by Michael Sipser
| pthreads wrote:
| Try What Can be Computed by John MacCormick :
|
| https://press.princeton.edu/books/hardcover/9780691170664/wh...
| dataexp wrote:
| In (*), Scott Aaronson relates the impact that one of Avi
| Wigderson's talks had on his career.
|
| https://scottaaronson.blog/?p=2925
| wslh wrote:
| More info at "Israeli Wins Turing Prize, Computing's Highest
| Honor, for Insights on Randomness": [1] and its archived version
| [2]
|
| [1] https://www.haaretz.com/israel-news/2024-04-10/ty-
| article/.p...
|
| [2] https://archive.is/e8uix
| wslh wrote:
| Looking at the downvotes it seems like HN is now a hub for
| antisemitism.
| hinkley wrote:
| > Computer scientists have discovered a remarkable connection
| between randomness and computational difficulty (i.e.,
| identifying natural problems that have no efficient algorithms).
| Working with colleagues, Wigderson authored a highly influential
| series of works on trading hardness for randomness. They proved
| that, under standard and widely believed computational
| assumptions, every probabilistic polynomial time algorithm can be
| efficiently derandomized (namely, made fully deterministic). In
| other words, randomness is not necessary for efficient
| computation. This sequence of works revolutionized our
| understanding of the role of randomness in computation, and the
| way we think about randomness.
|
| How would I go about catching up with this aspect of his
| research? It's not often that I've never heard of a Turing
| winner, but this guy is completely off of my radar.
| layer8 wrote:
| I wonder what the "standard and widely believed computational
| assumptions" are. Presumably, probabilistic approximations to
| NP-complete problems are not polynomial-time? Or the
| derandomized versions would still be just approximations?
| reader5000 wrote:
| I think his argument assumes the existence of pseudorandom
| generators which map a small number of "true" random bits to a
| large number of bits that look random to any polytime observer.
| The "derandomization" is that we just have to check all
| possible settings of the seed bits; if the seed length is only
| logarithmic in the size of the problem, exhaustive checking
| takes only polynomially many runs.
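|
| A toy Python sketch of that enumeration (the names
| randomized_decider, prg and seed_bits are placeholders, not
| anything from the actual construction):
|
|   def derandomize(randomized_decider, prg, seed_bits, x):
|       # Run the randomized decision algorithm once per PRG seed
|       # and take the majority vote. With seed_bits = O(log n)
|       # this is only polynomially many deterministic runs, and
|       # if the PRG fools the algorithm, the majority vote agrees
|       # with what true randomness would have given.
|       yes_votes = 0
|       for seed in range(2 ** seed_bits):
|           bits = prg(seed)   # expand the short seed to many bits
|           if randomized_decider(x, bits):
|               yes_votes += 1
|       return 2 * yes_votes > 2 ** seed_bits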
| polymipisexp wrote:
| Nisan and Wigderson prove many different corollaries of
| their construction in their seminal 'Hardness vs Randomness'
| paper, but their requirement for general derandomization
| (P = BPP) is that there's some function f computable in time
| 2^O(n) such that, for some e > 0, the correlation between f
| and the output of any circuit of size 2^(en) is sufficiently
| low (if I understand correctly).
| bo1024 wrote:
| They are generally esoteric conjectures similar in spirit to
| P != NP. For example, the third paper uses the assumption "E
| requires exponential circuits". Here E is the class of
| problems solvable in exponential time, or 2^O(n) time, on a
| Turing Machine. Another model of computation besides Turing
| Machines is Boolean circuits, i.e. AND, OR, and NOT gates.
| The conjecture is that not every problem in E can be solved
| by "small" (subexponential-sized) circuits.
|
| The basic idea of the work is that if these problems are
| hard, then we can use them to build pseudorandom generators
| that are "just as good" as true random, which we can use to
| turn truly random algorithms into pseudorandom algorithms
| with the same performance.
| jumpCastle wrote:
| You can try his book. https://www.math.ias.edu/avi/book
| ode wrote:
| Two of Wigderson's major papers mentioned in the announcement are
| co-authored with Noam Nisan, one of the professors behind the
| well known on-line course 'From Nand to Tetris'.
| thedatamonger wrote:
| From the related article: https://www.quantamagazine.org/avi-
| wigderson-complexity-theo...
|
| > ... if a statement can be proved, it also has a zero-knowledge
| proof.
|
| Mind blown.
|
| >Feeding the pseudorandom bits (instead of the random ones) into
| a probabilistic algorithm will result in an efficient
| deterministic one for the same problem.
|
| This is nuts. AI is a probabilistic computation ... so what
| they're saying - if i'm reading this right - is that we can
| reduce the complexity of our current models by orders of
| magnitude.
|
| If I'm living in noobspace someone please pull me out.
| IshKebab wrote:
| I don't know exactly what it's saying but it definitely isn't
| that. AI already uses pseudorandom numbers and is
| deterministic. (Except some weird AI accelerator chips that use
| analogue computation to improve efficiency.)
| ilya_m wrote:
| > AI is a probabilistic computation ... so what they're saying
| - if i'm reading this right - is that we can reduce the
| complexity of our current models by orders of magnitude.
|
| Unfortunately, no. First, the result applies to decision, not
| search problems. Second, the resulting deterministic algorithm
| is _much_ less efficient than the randomized algorithm, though
| it still belongs to the same complexity class (under some mild
| assumptions).
| mxkopy wrote:
| Can't you build search from decision by deciding on every
| possible input?
___________________________________________________________________
(page generated 2024-04-10 23:00 UTC)