[HN Gopher] Native Type Theory
       ___________________________________________________________________
        
       Native Type Theory
        
       Author : gbrown_
       Score  : 110 points
       Date   : 2021-02-20 10:59 UTC (12 hours ago)
        
 (HTM) web link (golem.ph.utexas.edu)
 (TXT) w3m dump (golem.ph.utexas.edu)
        
       | 6gvONxR4sf7o wrote:
       | This whole post is above my head, reminiscent of reading
       | extremely generic Haskell code. It looks super interesting and
       | has been done very compactly. For someone who really gets the
       | stuff it's probably an amazing way to communicate the thing, but
       | for the rest of us, the jargon just forms a barrier. That's just
       | how specialties work, so I'm not complaining or anything.
       | 
       | A question:
       | 
       | > The embedding preserves limits and exponents
       | 
       | What's this mean? I'm especially curious about exponents because
       | wildly different areas of math talk about exponents in ways that
       | must be related but I don't understand the connection.
        
         | Twisol wrote:
         | Exponents are probably the easier aspect to understand. For
         | instance, what is the number of distinct functions between a
         | set X and a set Y? Well, for every input value of X, we get to
         | choose any single output value of Y. There are |Y| choices of
         | output, so we have |Y| * |Y| * ... * |Y|, where there are |X|
         | factors in that multiplication. In short, there are |Y| ^ |X|
         | functions from X to Y. So we'll often denote the function space
         | by Y^X itself, and thus it gets called an "exponential".
         | 
         | In the category Set, the "exponential" of two sets X and Y is
         | exactly the set of functions from X to Y. In other categories
         | with a concept of exponential, you get a similar
         | internalization of collections of the category's morphisms as
         | objects of the category. You also almost always have an
          | exponential law, (Z^Y)^X = Z^(X*Y). (If you interpret ^ as
         | "if" and * as "and", you get a common law of logic that we
         | often use when refactoring nested if-statements!)
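          | 
          | In Haskell that exponential law is just curry/uncurry (a
          | standard illustration of the point, not something from the
          | article):

```haskell
-- (Z^Y)^X ~ Z^(X*Y): a function of a pair carries the same data as a
-- function returning a function. These are Haskell's curry/uncurry,
-- renamed here to avoid clashing with the Prelude.
curried :: ((x, y) -> z) -> (x -> y -> z)
curried f = \x y -> f (x, y)

uncurried :: (x -> y -> z) -> ((x, y) -> z)
uncurried g = \(x, y) -> g x y

-- Example: addition in both forms.
addPair :: (Int, Int) -> Int
addPair (a, b) = a + b

main :: IO ()
main = do
  print (curried addPair 2 3)                 -- 5
  print (uncurried (curried addPair) (2, 3))  -- 5
```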
         | 
         | For a functor to "preserve" an exponential operation (or any
         | other operation) is to be able to do the exponentiation either
         | before or after the functor without changing the result. In
         | symbols, F(X^Y) = F(X)^F(Y). (Compare with linear
         | transformations from linear algebra, in which f(a*x + y) =
         | a*f(x) + f(y).)
         | 
         | Limits are a bit more fundamentally categorical, and I still
         | don't have a good intuition for them. If you think of a functor
         | as projecting the shadow of some index category onto your
         | target category, a limit is in some sense a "best pre-
         | approximation" in the target category, such that if you have a
         | function from elsewhere in the target category into the part
         | under your functor, you can equivalently make a pit stop at the
         | limit object before continuing on into the functor's image.
         | 
         | In some sense, it's like an optimal border control point just
         | outside your functor's image -- you can systematically divert
         | traffic through the limit without actually changing the
         | results.
         | 
         | I'm not sure how to intuitively interpret "preserves limits",
         | since I guess you'd need another functor lying around whose
         | limit is preserved under composition with your first functor.
         | Something like F(Lim(G)) = Lim(F . G), perhaps. But now I'm
         | just reading Wikipedia.
        
       | calhoun137 wrote:
       | I really like this idea! I first got into type theory as a result
       | of an answer by Carl Mummert to a question I asked on Math
       | StackExchange[1]. I even wrote a blog post about it a year
       | later[2].
       | 
       | Ever since then, I have considered type theory to be my favorite
       | version of the foundations of math. I have often played around in
        | foundational theories to discover as many connections as
       | possible between foundations of math and computer science. That
       | being said, I think the perspectives offered by set theory and
       | category theory are extremely valuable, and that these three
        | approaches complement each other, and sometimes one is better
       | than another for a specific purpose.
       | 
        | This is the first time I have really seen the possibility to
       | reformulate computer science to bring it closer to mathematics.
       | How exciting!
       | 
       | [1] https://math.stackexchange.com/questions/1519330/is-it-
       | possi...
       | 
       | [2] https://medium.com/@calhoun137/a-programmers-way-of-
       | thinking...
        
       | carapace wrote:
       | This is terrific but also frustrating: if you want people to use
       | this stuff you've got to "bring it down the mountain".
       | 
       | Alternately we could have a tiny elite writing crystalline
       | mathematically perfect software and the rest of us mere mortals
       | can just plug it together like Lego blocks. Might not be
       | terrible?
        
       | amw-zero wrote:
       | I can't read things like this without bringing up Leslie Lamport,
       | the creator of TLA+. His position is that all you need for
       | understanding computation is "regular" mathematics, i.e. set
       | theory and predicate logic. With just this level of math, we have
       | been able to specify, prove, and model-check sequential and
       | distributed computations. For several decades.
       | 
       | I was briefly interested in type theory, dependent types, etc.,
       | but have yet to have the payoff from the investment into it.
       | Whereas investing in set theory and predicate logic has had
       | immediate benefits to my day job.
       | 
       | Can someone explain what type theory gets us that we don't
       | already have with set theory?
       | 
        | Also, please have a read of one of Lamport's clearest arguments
       | on the subject: https://lamport.azurewebsites.net/pubs/teaching-
       | concurrency.....
        
         | philzook wrote:
         | "Set theory" can mean different things. The looser
         | interpretation of set theory is just using the intuitive notion
         | of sets to model things. I think basically everyone is for that
         | and learning these topics is definitely the lower hanging
         | fruit.
         | 
          | The usual formal interpretation of Set Theory is Zermelo-
          | Fraenkel set theory, which is roughly first order logic with
          | some axiom schemas about how sets behave:
          | https://en.wikipedia.org/wiki/Zermelo%E2%80%93Fraenkel_set_t....
          | I would be somewhat skeptical that this notion of set theory
          | is all that practically useful, although people have managed
          | (Metamath). In my opinion, first order logic is great in
          | simplicity and degree of automatability, but clunky and
          | awkward in terms of expressivity. Usually the awkwardness of
          | expressing yourself outpaces the automation, and at that
          | point there may be such a mess that it would be terrible to
          | do it manually.
          | 
          | Type theory and dependent types are different formalizations
          | of
         | intuitive set theory. In a sense, types are kind of sets. From
          | a pure English perspective, the words are close to synonyms and
         | in practice the analogy is pretty close. 3 being of type Int is
         | intuitively like saying 3 is in the set of ints.
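          | 
          | To make that analogy concrete, here's a small Haskell sketch
          | (the names are my own): a subset like "the even ints" can't
          | be a plain type, but its membership check can be moved into a
          | constructor, after which being of type Even plays the role of
          | being in the set:

```haskell
-- "n is in the set of even ints" becomes "n can be wrapped as an Even".
-- The membership predicate runs once, at construction time.
newtype Even = Even Int deriving Show

mkEven :: Int -> Maybe Even
mkEven n | even n    = Just (Even n)
         | otherwise = Nothing

main :: IO ()
main = do
  print (mkEven 4)  -- Just (Even 4)
  print (mkEven 3)  -- Nothing
```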
         | 
         | Type theory seems to me to lead to a cleaner system of
         | syntactic formal reasoning, which sounds esoteric except that
         | syntactic reasoning is what computers have to do since we can't
         | convince them with English paragraphs. And perhaps we don't
         | want to, since English is ambiguous.
         | 
          | I think the success Lamport alludes to is that even though he
          | bases the TLA+ system on ZFC in some sense, users stick to a
          | fragment of very intuitive notions of sets, where perhaps your
          | foundations don't matter so much and all should basically
         | agree. This may be more of a pedagogical emphasis than a true
         | difference in systems, since almost everyone interested in
         | dependent types finds the parts that go off into the
         | stratosphere the most interesting.
        
           | carapace wrote:
           | Hey, BTW, your combination of Otten's sequent prover with TeX
           | output is awesome! I think you're really on to something
            | there. https://www.philipzucker.com/javascript-automated-proving/
        
         | exdsq wrote:
         | Lamport is giving a talk at my workplace on Monday about 21st
         | century proofs, happy to add this question to the Q&A and let
         | you know his answer, I'm interested in this too.
        
         | GregarianChild wrote:
         | The issue is automation: At this point (Feb 2021), the support
         | in _interactive_ theorem provers is just much better for type-
         | theory based mathematics in contrast to set-theory based maths.
         | Whether that is due to technical advantages of type-theory over
         | set theory, or a historical accident, reflecting the much
         | larger amount of attention on type-theory as basis for
         | interactive provers, is a subject of much controversy. Let me
         | cite a leading light in the formalisation of mathematics [1]:
         | _" I would guess that the amount of effort and funding going
         | into type theory exceeds the amount that went into set theory
         | by an order of magnitude if not two [...] I am certain that we
         | would now have highly usable and flexible proof assistants
         | based on some form of axiomatic set theory if this objective
         | had enjoyed half the effort that has gone into type theory-
         | based systems in the past 25 years."_
         | 
         | Paper [2] has brought renewed attention to set-theory as a
         | basis for automating mathematics.
         | 
         | [1] https://cs.nyu.edu/pipermail/fom/2018-June/021032.html
         | 
         | [2] https://arxiv.org/abs/1707.04757
        
           | amw-zero wrote:
           | That's a very insightful reply. I hadn't thought about the
           | fact that the number of working computer scientists today
           | probably does dwarf the number of mathematicians around when
           | ZFC was a current topic.
        
         | zozbot234 wrote:
         | Type theoretical and categorical foundations can be understood
         | as a kind of 'structural' set theory. Structural set theory is
         | a formalization of (naive) set theory which stays quite a bit
         | closer to commonly-understood mathematical semantics, avoiding
         | a lot of unwanted 'baggage' that is introduced by 'material'
         | set theories such as ZF(C).
        
           | GregarianChild wrote:
            | This 'baggage' consists of statements like
            |     sqrt(17) ∈ log(.)
            | 
            | (where log(.) is the logarithm function) which are
            | syntactically valid and have a truth value, which depends on
            | the exact nature of how you code up various mathematical
            | things in ZF(C). You cannot form such propositions in type-
            | theory. The downside of type-theory is that it seems
            | difficult to have simple inclusions like:
            |     Naturals ⊆ Integers ⊆ Rationals ⊆ Reals
            | 
            | and that certain natural operations (like quotients) are
            | seemingly laborious in type-theory.
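            | 
            | In Haskell, for instance, that chain shows up as explicit
            | injection functions rather than subset inclusions (standard
            | Prelude/Data.Ratio conversions; Double merely stands in for
            | the reals):

```haskell
import Data.Ratio ((%))

-- Each step of Naturals -> Integers -> Rationals -> Reals is a coercion
-- you must write (or have inferred), not a subset relation.
intToRational :: Integer -> Rational
intToRational = fromIntegral

rationalToReal :: Rational -> Double
rationalToReal = fromRational

main :: IO ()
main = do
  print (intToRational 3)         -- 3 % 1
  print (rationalToReal (1 % 2))  -- 0.5
```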
        
             | zozbot234 wrote:
             | These inclusions are anything but "simple", even in
             | material set theory. Consider the real numbers; in common
             | formalizations, a "real number" is an equivalence class of
             | infinite sequences of rational numbers. You can't
             | understand the 'inclusion' of rationals in reals as
             | material (set theoretical) inclusion; you need an
             | adapter/wrapper function. Type theory works the same way.
             | 
             | Quotients are not "natural" from a constructive POV, and
             | that's ultimately where the difficulties you're referencing
             | come from. If you're OK with being non-constructive,
             | quotients can be formalized ad hoc. (One of the positive
             | side effects of homotopy-theoretical foundations is that
             | they might help clarify what exactly is going on when one
             | reasons on quotients.)
        
               | GregarianChild wrote:
               | In ZFC, you define reals as you sketch, and then specify
                | rationals, integers etc. as specific subsets; then you get
               | inclusions, which mainstream mathematics uses without
               | worries. And yes, mainstream mathematics is non-
               | constructive (as is Isabelle/HOL).
               | 
               | (I don't want to have a contentious discussion about
               | FOMs, or which prover is best, just trying to sketch the
               | trade-offs, and historical background.)
        
       | acjohnson55 wrote:
       | Well, that paper escalated quickly!
       | 
       | I've dabbled in enough category theory to have some inkling of
       | the meaning of this, but not much. I've kind of given up on
        | expecting an epiphany from this stuff that's applicable to coding in
       | a way that significantly changes my practice in any short period
       | of time, but I'm still tempted by the feeling that the big
       | revelation is just a couple concepts away.
        
         | calhoun137 wrote:
         | > I've kind of given up on expecting an epiphany from this
          | stuff that's applicable to coding in a way that significantly
         | changes my practice
         | 
         | Same. It makes me kind of sad that pure math is not more
          | helpful here! I think it's possible that coding standards today
         | are really high quality, and that any improvements are going to
         | require a revolutionary paradigm shift in how we think about
         | programming at an extreme level of generality. That's where
          | pure math comes in; it's a good place to shop around for new
         | ideas, since we are kind of at a local maximum now in terms of
         | best practices, imo.
         | 
          | This is how I use math while coding: instead of tinkering with
          | numbers, I just calculate the thing and get all the numbers
          | right the first time. When trying to solve a problem, it's
          | super helpful to have spent years proving theorems in pure
          | math; it taught me to compile programs in my head. When I am
          | suuuuuuper stuck, I check whether what I am trying to do has
          | already been done in math somewhere. And when constructing an
          | overall plan for a project, I use abstract concepts from
          | mathematics to anticipate the limits of various
          | implementations in order to pick the best one.
        
           | tsss wrote:
           | Unfortunately, I think quality simply isn't needed for all
           | the web shops and trivial CRUD applications that most
           | programmers work on.
        
         | carapace wrote:
         | Have you seen "Compiling to categories"?
         | 
         | > It is well-known that the simply typed lambda-calculus is
         | modeled by any cartesian closed category (CCC). This
         | correspondence suggests giving typed functional programs a
         | variety of interpretations, each corresponding to a different
         | category. A convenient way to realize this idea is as a
         | collection of meaning-preserving transformations added to an
         | existing compiler, such as GHC for Haskell. This paper
         | describes such an implementation and demonstrates its use for a
         | variety of interpretations including hardware circuits,
         | automatic differentiation, incremental computation, and
         | interval analysis. Each such interpretation is a category
         | easily defined in Haskell (outside of the compiler). The
         | general technique appears to provide a compelling alternative
         | to deeply embedded domain-specific languages.
         | 
         | http://conal.net/papers/compiling-to-categories/
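          | 
          | To give a flavour of the idea, here is a toy sketch (my own,
          | far simpler than the paper's GHC plugin): the same composite
          | arrow interpreted once as an actual function and once as a
          | textual "wiring diagram":

```haskell
-- A stripped-down category class: identity and composition only.
class Cat k where
  idC   :: k a a
  (>>>) :: k a b -> k b c -> k a c

-- Interpretation 1: ordinary functions.
newtype Fun a b = Fun (a -> b)
instance Cat Fun where
  idC = Fun id
  Fun f >>> Fun g = Fun (g . f)

-- Interpretation 2: a textual description of the pipeline.
newtype Diagram a b = Diagram String
instance Cat Diagram where
  idC = Diagram "wire"
  Diagram f >>> Diagram g = Diagram (f ++ " -> " ++ g)

double, inc :: Fun Int Int
double = Fun (* 2)
inc    = Fun (+ 1)

dDouble, dInc :: Diagram Int Int
dDouble = Diagram "double"
dInc    = Diagram "inc"

main :: IO ()
main = do
  let Fun f = double >>> inc
  print (f 10)                      -- 21
  let Diagram d = dDouble >>> dInc
  putStrLn d                        -- double -> inc
```

          | The plugin does the interesting part: it translates ordinary
          | typed lambda terms into such categorical vocabulary
          | automatically, so each interpretation comes for free.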
        
         | proc0 wrote:
          | CT helps advance Denotational Semantics, so syntax is written
          | closer to math syntax, which I think approaches computation in
          | a much stricter way and therefore leverages the "magic" of
          | math. Of course the price is paid in the complexity of the
          | theory and the learning curve needed to understand it, which
          | kills its practicality.
        
         | threatofrain wrote:
          | But category theory is a very foundational theory. It would
         | similarly be like learning set theory without further goals,
         | which would be fine if you think of it as alternative Sudoku.
         | 
         | As a foundational language it's there to formalize mathematical
         | objects, but that doesn't mean people think in either set or
         | category theory, as opposed to higher level objects in their
         | domain.
        
           | acjohnson55 wrote:
           | Totally agree. Just musing on my desire to fully understand
           | this paper, but with the knowledge that all the steps to get
           | there probably won't do much for me outside their own
           | satisfaction. Which, admittedly, might be fun.
        
         | roenxi wrote:
         | It seems unlikely that there will be any profound insights from
         | category theory on how to write code, similarly to how
         | thousands of years of advances in the theory of numbers have
         | had notably few implications for how to count things.
         | 
         | Category theory is likely to contribute by giving specialists a
         | language they can use to communicate, and it seems likely to me
         | that at some point general tools rooted in category theory will
         | have some sort of interesting implication for testing and
            | debugging. Taking Haskell for example: the language isn't
         | anything to write home about but the type checking in the
         | tooling enabled a big improvement in writing bug free code.
         | 
            | So there is probably a lot on offer for the sort of people
         | who write tooling, and less on offer for everyone else.
        
           | pgustafs wrote:
           | "It seems unlikely that there will be any profound insights
           | from logic on how to compute, similarly to how thousands of
           | years of advances in the theory of numbers have had notably
           | few implications for how to count things."
        
             | roenxi wrote:
             | It would be helpful if you fleshed that out a little, I'm
             | curious about how you think insights from logic helps
             | people program. I don't think I could deduce whether a
             | developer knew or was ignorant of formal logic in a code
             | review. Unless they give it away with a rather direct
             | comment.
             | 
             | Recognising that a programming language is equivalent to
             | some mathematical construct is a very different thing from
             | getting a practical insight about what decisions to make.
             | My shirt is some sort of topological embedded space, but
             | the tailoring as a field has not gained much practical
             | insight from the study of topology. Although I would
                | encourage any aspirations the tailors might have to a
             | broad education.
        
               | pgustafs wrote:
               | I was referring to the fact that computers themselves are
               | the descendants of logicians -- Church, Turing, von
               | Neumann. Your argument above could have just as easily
               | been used to attack Turing machines before the invention
               | of the general purpose computer.
        
               | calhoun137 wrote:
               | +1 for the request for a rephrasing.
               | 
               | When Faraday was asked about what the practical use of
               | his discovery of electromagnetic induction was, he
               | famously replied, "what is the use of a newborn baby?"
               | 
                | Which is how I read the OP.
        
               | pgustafs wrote:
               | Here's another great quote along the same lines:
               | 
               | "No one has yet discovered any warlike purpose to be
               | served by the theory of numbers or relativity, and it
               | seems unlikely that anyone will do so for many years."
               | 
               | GH Hardy 1941
        
           | GregarianChild wrote:
           | Category theorists have made this promise since the early
           | 1990s, when the overlap between categorical methods and pure
           | functional programming (think type theories) became clear.
           | The really big win here was Moggi's discovery that monads
           | neatly generalise many similar ideas in programming.
            | (Moggi/Plotkin should have won a Gödel Prize for this!) This
           | led to a great deal of activity towards generalising this to
           | other forms of computation, in particular parallel computing
           | (key problem: generalise the beautiful relationship between
           | cut-elimination and beta-reduction in type theory, to cut-
           | elimination and parallel composition + hiding). In my
           | opinion, this has _comprehensively failed_ to date, in the
           | sense that
           | 
           | (A) while you can express some parallelism in a way that can
           | be read as a generalisation of cut-elimination / category
           | theory, that is not surprising since (simplifying a great
           | deal for brevity) category theory is a foundation of
           | mathematics; but
           | 
           | (B) when you express non purely functional computation
           | categorically, you get extremely complicated formal systems
           | that obscure the fundamental simplicity of what they abstract
            | from, and generally have only what they abstract from as the
            | actual model of the abstraction.
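            | 
            | (Moggi's win in miniature, as a standard Haskell
            | illustration of my own: one sequencing interface, >>=,
            | covering both failure and nondeterminism.)

```haskell
-- A partial operation: halving fails on odd numbers.
halve :: Int -> Maybe Int
halve n | even n    = Just (n `div` 2)
        | otherwise = Nothing

main :: IO ()
main = do
  -- Maybe monad: failure propagates through the pipeline.
  print (Just 8 >>= halve >>= halve)   -- Just 2
  print (Just 6 >>= halve >>= halve)   -- Nothing
  -- List monad: the same operator expresses nondeterminism.
  print ([1, 2, 3] >>= \x -> [x, -x])  -- [1,-1,2,-2,3,-3]
```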
           | 
           | Let's take Example 47 from the _Native Type Theory_ paper:
           | what exactly is the formal property the typing system
           | induces? (The paper makes vague allusions to a _" foreign
           | function interface"_ with rho-calculus, but there is no
           | _published_ understanding of even the equational theory of
           | rho-calculus. Why should I trust this paper?) What is the
            | logic that the paper promises ( _" Native Type Theory
           | provides a shared framework of reasoning"_)? What are the
           | proof rules, so I can type them into Lean or Coq? Show me
           | that they are _better_ than the simple logics I get from
           | looking at lambda-calculus operationally.
           | 
           | Each abstraction has costs and benefits. As of 2021, the
           | costs of categorical methods _outside of pure functional
           | languages_ outweigh their benefits. I worked in categorical
            | methods for programming, and changed fields when I
           | realised this.
        
             | siraben wrote:
             | > generalise the beautiful relationship between cut-
             | elimination and beta-reduction in type theory, to cut-
             | elimination and parallel composition + hiding
             | 
             | How would you generalize the correspondence between cut-
             | elimination and beta-reduction to cut-elimination and
             | parallel composition, and what is meant by hiding here? I
             | would be curious to know of relevant work in the area.
        
               | GregarianChild wrote:
               | The generalisation of Curry-Howard to parallelism was
               | outlined in early interpretations of linear logic, I
               | think Abramsky's [1] was the first, there has been a lot
               | more work in this direction since. A good place to start
               | to understand the relationship between functional and
               | concurrent computation is Milner's _Functions as
                | Processes_ [2], it's incredibly simple, my mind was
               | blown when I understood it (if you understand CPS
               | transforms, you will find it natural, indeed, I'd say the
               | Milner translation is the essence of CPS, see also [4]).
               | A Curry-Howard correspondence of sorts has now been
               | produced for linear logic and session-typable name
               | passing processes [3].
               | 
               | [1] S. Abramsky, _Computational Interpretations of Linear
                | Logic._
                | https://www.sciencedirect.com/science/article/pii/0304397593...
               | 
               | [2] R. Milner, _Functions as Processes._
               | https://hal.inria.fr/inria-00075405/document
               | 
               | [3] L. Caires, F. Pfenning, _Session Types as
                | Intuitionistic Linear Propositions._
                | https://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.22...
               | 
               | [4] H. Thielecke, _Continuations, functions and jumps._
               | https://www.cs.bham.ac.uk/~hxt/research/Logiccolumn8.pdf
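                | 
                | (For readers who don't know CPS transforms: the basic
                | move, shown here in a standard textbook example of my
                | own rather than anything from Milner's paper, is that
                | every function takes an explicit continuation instead
                | of returning a value.)

```haskell
-- Direct style: square (2 + 3). In CPS, each step hands its result to
-- the next step explicitly, making control flow a first-class value.
addCPS :: Int -> Int -> (Int -> r) -> r
addCPS x y k = k (x + y)

squareCPS :: Int -> (Int -> r) -> r
squareCPS x k = k (x * x)

example :: (Int -> r) -> r
example k = addCPS 2 3 (\s -> squareCPS s k)

main :: IO ()
main = print (example id)  -- 25
```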
        
               | siraben wrote:
               | Thanks! What sort of work did you do in the field of
               | applying categorical methods to programming?
        
           | specialist wrote:
           | > _profound insights from category theory on how to write
           | code_
           | 
           | Sure. But as u/feniv hinted upthread, I'm also optimistic
           | about the impact on language design (noodling).
           | 
           | I too have a hobby toy language. For the type system, I just
           | want to use the best available science. Hopefully something
           | practical, easy to explain, simple to use, with good enough
           | ergonomics.
           | 
           | While type theory is really important, it's just not my area
           | of interest, and I'd like to defer to the experts.
        
           | calhoun137 wrote:
           | > It seems unlikely that there will be any profound insights
           | from category theory on how to write code
           | 
           | I strongly disagree with this. To me it does not seem
           | "unlikely", quite the opposite. The reason I feel so strongly
           | about this is that computer science is a subset of pure
            | mathematics; it's the part that deals with algorithms and
           | computation. There are many questions in computer science
           | which cannot be answered without an appeal to pure math
            | (e.g., what is the best possible compression algorithm?).
           | 
           | All of computer science began as a result of the work of
           | Hilbert on the foundations of logic[1]. This is what led to
           | Turing's paper on Turing Machines. Since that time,
           | mathematicians have done a lot of exciting work on
           | foundations.
           | 
           | Category theory has completely transformed the way we think
           | about abstract mathematical objects; and type theory is very
           | exciting in that it brings mathematics closer to a
           | programming language which is compiled in the human brain and
           | is done with pen and paper.
           | 
            | It seems to me very likely that in the future, code writing
           | standards will change and get better, and will not remain the
           | same. Towards that end, it makes sense that improvements will
           | come from a better understanding of foundations, since that
           | is where it all started.
           | 
           | Curious for your thoughts?
           | 
           | [1] https://en.wikipedia.org/wiki/Hilbert%27s_second_problem#
           | :~:....
        
             | threatofrain wrote:
             | What has been your interest in category theory? How has
             | category theory empowered your work?
        
               | calhoun137 wrote:
               | My interest in category theory is just as a hobby and an
               | amateur; but I have gone pretty deep reading lots of
               | books and so on. To be fair, category theory has not
               | empowered my work (in programming) at all.
        
             | GregarianChild wrote:
             | None of Zuse, Babbage, Torres y Quevedo, Ludgate,
             | Dickinson, Desch, Atanasoff-Berry, Mauchly / Eckert, nor
             | many of the other pioneers of computing came via Hilbert
             | and FOM problems. An important source of early computing
             | was the practical need for calculation (e.g. logarithm
             | tables), leading to mechanical (and later electro
             | mechanical) automata for automatic calculation, a tradition
             | going back at least to ancient Greece (e.g. Antikythera
             | mechanism).
             | 
             | Regarding the arrow of influence: a Fields medalist spent a
             | decade coming up with a new foundation of mathematics (or
             | at least algebraic topology), only to realise that the
             | computer science department already teaches Coq to
             | undergraduates!
        
               | calhoun137 wrote:
               | > None of Zuse, Babbage, Torres y Quevedo, Ludgate,
               | Dickinson, Desch, Atanasoff-Berry, Mauchly / Eckert, nor
               | many of the other pioneers of computing came via Hilbert
               | and FOM problems
               | 
               | I don't think this is a fair comparison. The modern
               | computer is really distinct from everything that came
               | before. That's because it was built according to the
               | theory of Turing Machines.
               | 
               | One of the most important historical papers for the
               | development of modern computers was the Report on the
               | ENIAC by Von Neumann[1].
               | 
                | Von Neumann took the ideas of others working in the
               | field, and was able to apply his understanding of
               | mathematical logic to formulate the principles which led
               | to the first working modern computer, which Von Neumann
               | built in the basement of IAS. At that time, there was a
               | major debate at IAS between Einstein and Von Neumann,
               | which centered around whether or not to only do pure math
               | at IAS, with the idea that building a computer was part
               | of experimental science. [2]
               | 
               | > Regarding the arrow of influence: a Fields medalist
               | spent a decade coming up with a new foundation of
               | mathematics (or at least algebraic topology), only to
               | realise that the computer science department already
               | teaches Coq to undergraduates!
               | 
               | LOL! That is an interesting and funny story. However, I
               | don't think this example demonstrates that in the future,
               | mathematics will not be the source of improvements to
               | code writing standards.
               | 
                | Question: if code-writing standards improve, where else
                | will these improvements come from other than pure
                | mathematics? I consider this question to be a problem
                | similar in type to maximum compression algorithms, i.e.
                | a question whose solution can only be verified using the
                | language of pure mathematics. Therefore it seems likely
                | that these improvements can have their roots in pure
                | mathematics as well. At least, I would not say it "seems
                | unlikely".
               | 
               | [1] https://en.wikipedia.org/wiki/First_Draft_of_a_Report
               | _on_the...
               | 
               | [2] https://www.amazon.com/Turings-Cathedral-Origins-
               | Digital-Uni...
        
               | GregarianChild wrote:
                | Regarding your question, I think mathematics and
                | programming are about the same thing: _being precise_!
               | 
                | Programming is much more demanding, much more unforgiving
                | in this regard, because you are talking to a stupid
                | machine rather than a smart human. The powers automation
                | gives mathematicians (once you've climbed the mountain of
                | initial unfamiliarity) are so spectacular, and take the
                | average Joe so far beyond what even Gauss, Newton,
                | Grothendieck or Archimedes had at their disposal, that I
                | expect that over the next century mathematics will be
                | rewritten to suit computing, rather than vice versa. K.
                | Buzzard's work is one step in this direction; scriptable
                | notebooks like Jupyter or Mathematica are another.
        
               | calhoun137 wrote:
               | > I expect that mathematics will be rewritten to suit
               | computing, rather than vice versa
               | 
               | I agree with this. I believe pure mathematics is
               | suffering greatly because many mathematicians refuse to
               | fully embrace the computational power of modern
               | technology.
               | 
                | My belief is that the age of pretty formulas is coming
                | to an end, and that mathematics will focus more and more
                | on the computational aspects of the subject. Problem
                | sets in pure math courses will be done using programs
                | far more advanced than anything that exists today, and
                | everyone will think no more of those programs than we do
                | of calculators.
               | 
                | Apologies for the self-plug, but this has been my vision
                | with mathinspector[1]. I've been working very hard on
                | that, which is why I got so interested in your
                | statement. Thank you for clarifying your thinking here.
                | It makes sense to me, and you could be right.
               | 
               | [1] https://github.com/MathInspector/MathInspector
        
               | randomNumber7 wrote:
               | > I agree with this. I believe pure mathematics is
               | suffering greatly because many mathematicians refuse to
               | fully embrace the computational power of modern
               | technology.
               | 
                | They can't even give meaningful names to their variables.
                | When showing code to mathematicians, we should always
                | rename all variables to single characters, then look
                | down on them because they can't understand it ;)
        
               | GregarianChild wrote:
                | The MathInspector is nice. It reminds me of the
                | _"Incredible Proof Machine"_ [1], which I find to be a
                | neat tool for teaching logic.
               | 
               | [1] https://incredible.pm/
        
               | calhoun137 wrote:
               | Thank you!!! It's so funny that you mentioned "scriptable
               | notebooks like Jupyter or Mathematica" since I have been
               | spending all of my time on mathinspector recently.
               | 
               | I think our points of view are actually very strongly
               | aligned. However I believe the next big idea is likely to
               | come from outside of computer science.
               | 
               | Personally, I am betting on biology. So many of the most
               | sophisticated techniques are based on biology, e.g.
                | neural nets and genetic algorithms. I have done a lot of
                | work on extending the theory of computation with a new
                | axiom which gives Turing machines the ability to
                | self-replicate[1], [2].
               | 
                | In many parts of science there is cross-pollination,
                | where new ways of thinking about subject X come from a
                | new discovery in subject Y. Typically, research will
                | follow a groupthink pattern until it hits a brick wall,
                | and then you need a really big breakthrough idea. This
                | line of reasoning leads to the conclusion, imo, that the
                | next big idea is approximately equally likely to come
                | from pure computer science, pure mathematics, or
                | somewhere else.
               | 
               | [1]
               | https://math.stackexchange.com/questions/3605352/what-is-
               | the...
               | 
               | [2] https://medium.com/swlh/self-replicating-computer-
               | programs-8...
        
               | GregarianChild wrote:
               | It's hard to predict the future.
               | 
                | Alan Turing invented neural nets in a little-known 1948
                | paper entitled _Intelligent Machinery_ (see [1]). Since
                | then, the use of NNs has moved decisively away from
                | inspiration by nature. I reckon nature's last big win in
                | AI was _convolutional_ NNs: Kunihiko Fukushima's
                | neocognitron was published in 1980, and was inspired by
                | 1950s work of Hubel and Wiesel [2]. Modern deep learning
                | is largely an exercise in distributed systems: how can
                | you feed stacks and stacks of tensor cores and TPUs with
                | floating point numbers while minimising _data movement_
                | (the real bottleneck of all computing)?
               | 
               | Not unlike, I think, how airplanes were originally
               | inspired by birds, but nowadays the two have mostly
               | parted ways, for solid technical reasons.
               | 
               | [1] http://www.alanturing.net/turing_archive/pages/Refere
               | nce%20A...
               | 
               | [2] https://en.wikipedia.org/wiki/Neocognitron
        
               | calhoun137 wrote:
                | > take the average Joe so much beyond what even Gauss,
                | Newton, Grothendieck or Archimedes had at their disposal
               | 
               | I think this comment really sums up very well what is at
               | the core of our discussion: the future of mathematics and
               | science.
               | 
               | My strong belief is that thousands of years from now,
               | Archimedes and Gauss will still be remembered, and
               | everything we think is great now will be forgotten while
                | they are not. That tells me that they were much farther
                | ahead of their time than we are, even though they didn't
                | have modern computers.
               | 
                | Mathematicians and computer scientists both have it
                | totally backwards, imo. On the one hand, mathematicians
                | think they have something to teach _us_ about computer
                | science, but they refuse to use technology properly. On
                | the other hand, when we write code it's all governed by
                | mathematical laws, and there are many questions (though
                | maybe not, you know, coding standards or the philosophy
                | of writing good code) where we could really use the
                | guiding hand of mathematicians. They need to catch up
                | with the times, and we programmers need to accept that
                | they have something valuable to offer and to teach us.
        
       | feniv wrote:
        | Seems like this is an extension of the Curry-Howard-Lambek
        | isomorphism [1] relating type theory, logic, and category
        | theory. I find that to be one of the most beautiful results in
        | computer science, and I have been working for a while on a
        | practical language built on the same principles (a mix of
        | Haskell, Prolog, and APL in a Python-like language).
       | 
       | [1] https://en.m.wikipedia.org/wiki/Curry-Howard_correspondence
        
         | Tabzz98 wrote:
         | Is there any online material on this language?
        
           | feniv wrote:
           | Not yet. I've specced it out, but still working on the
           | implementation. Here's a hand-written example showing how you
           | can specify the constraints to validate a Knight's tour. You
           | can then enumerate over the types to get valid tours.
           | 
           | https://gist.github.com/Feni/d819f120a7d179bc472c94df9471e60.
           | ..
        
       | jimmyed wrote:
        | I am a literal newbie to this field. How is it related to
        | logic?
        
         | blovescoffee wrote:
         | How much of a literal newbie? Are you familiar with predicate
         | logic / any formal logic? Are you familiar with any basic
         | modern algebra like Group theory?
        
           | jimmyed wrote:
            | I was educated in the UK, so I have a good grip on the
            | high-school math there, but not much beyond that. I did try
            | to plough through SICP; that's a work in progress. Too
            | little, you'd say?
        
         | guerrilla wrote:
         | I suggest starting by looking up the Curry-Howard isomorphism.
        
       | siraben wrote:
       | A good introduction to categorical logic is [0], introducing the
       | simply-typed lambda calculus along with the categorical semantics
       | for each construction (cartesian closed categories). The later
       | sections include topics mentioned in this article such as
       | hyperdoctrines, categorifying quantifiers, and dependent types.
       | 
       | [0] https://www.cl.cam.ac.uk/~amp12/papers/catl/catl.pdf
        
         | 3PS wrote:
         | CMU also currently has a course in categorical logic [0].
         | Internally, the only prerequisite is a course in category
         | theory. The course notes can be found at [1].
         | 
         | [0] http://mathieu.anel.free.fr/CLcourse.html
         | 
         | [1] http://mathieu.anel.free.fr/mat/80514-814/Awodey-Bauer-
         | catlo...
        
       | chewxy wrote:
       | I followed this until the part just slightly after "see how
       | natural it is?" Then it's suddenly How to Draw an Owl. I don't
       | follow the monoid part. Anyone can help elaborate?
        
         | siraben wrote:
         | > Dependency is pervasive throughout mathematics. What is a
         | monoid? It's a set M, and then operations m,e on M, and then
         | conditions on m,e. Most objects are constructed in layers, each
         | of which depends on the ones before. Type theory is often
         | regarded as "fancy" and only necessary in complex situations,
         | similar to misperceptions of category theory; yet dependent
         | types are everywhere.
         | 
          | A monoid is (S,*,e): a set S equipped with a binary operation
          | * : (S,S) -> S such that for all x, y, z in S,
          | x*(y*z)=(x*y)*z, and an element e in S such that for all x in
          | S, x*e=e*x=x. Examples of monoids include (String,concat,""),
          | (Int,+,0), and so on.
         | 
         | One recurring theme in universal algebra and category theory is
         | the notion of an algebra[0], in programmer terms it's like a
         | module declaration. (see also F-algebra[1] which is heavily
         | used in functional programming to formalize the notion of
         | folding and more)
         | 
         | Dependent type theory allows one to encode these algebras, e.g.
         | encoding monoids and their laws in Coq's type system CoC[2].
         | Just like how ADTs in Haskell/OCaml and enums in Rust are
         | comprised of a "sum of products", products and sums can be
         | generalized in dependent types as well:
         | 
          | Length-indexed lists via the dependent sum: _Σn:ℕ.List(n)
          | consists of dependent pairs ⟨n, l:List(n)⟩_. Here n ↦ List(n)
          | can be read as a function that takes a natural number n and
          | returns a type List(n).
          | 
          | Type-parametrized types: _ΠS:Set.List(S) consists of
          | dependent functions λS:Set.φ(S)_. ΠS:Set.List(S) can be read
          | as a function that takes a type S and returns a type List(S).
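As an illustration (not from the thread), GHC can approximate the length-indexed example with the DataKinds and GADTs extensions; `Vec`, `SomeVec`, and `vhead` are hypothetical names:

```haskell
{-# LANGUAGE DataKinds, GADTs, KindSignatures #-}
-- Sketch: Vec n a plays the role of List(n), a list whose length is part
-- of its type. Names are made up for this example.
data Nat = Z | S Nat

data Vec (n :: Nat) a where
  VNil  :: Vec 'Z a
  VCons :: a -> Vec n a -> Vec ('S n) a

-- The dependent pair <n, l : List(n)>: hiding the index n in an
-- existential corresponds to the sum Σn:ℕ.List(n).
data SomeVec a where
  SomeVec :: Vec n a -> SomeVec a

-- A total head: the type Vec ('S n) a rules out the empty list at
-- compile time, so no runtime check is needed.
vhead :: Vec ('S n) a -> a
vhead (VCons x _) = x

vlength :: Vec n a -> Int
vlength VNil         = 0
vlength (VCons _ xs) = 1 + vlength xs
```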
         | 
          | So to encode monoids in a dependent type theory using those
          | dependent sums and products, we can just put the operations
          | and laws into some big record, say
          | 
          |     Monoid :=
          |       ΣM:Set.
          |       Σm:M²→M.
          |       Σe:()→M.
          |       Π(a,b,c):M³. m(m(a,b),c) = m(a,m(b,c)).
          |       Πa:M. m(e,a) = a = m(a,e).
         | 
         | Compare this to [2]!
         | 
         | [0] https://en.wikipedia.org/wiki/Universal_algebra#Basic_idea
         | 
         | [1] https://en.wikipedia.org/wiki/F-algebra
         | 
         | [2] https://gist.github.com/puffnfresh/11258181#file-monoid-
         | v-L5
        
           | siraben wrote:
           | > For now we explore the language of a topos. There are
           | multiple views, and often its full power is not used. Toposes
           | support geometric logic, predicate logic, and moreover
           | dependent type theory. We emphasize the latter: dependent
           | types are expressive and pervasive, yet underutilized in
           | mathematics.
           | 
           | The next step of the article is the realization that a
           | topos[0] has this expressive capability from the get-go.
           | 
           | There's a nice list of clarifying examples in the paper[1],
           | starting at the bottom of page 2, for instance, stacks with
           | push and pop, untyped lambda calculus, monad comprehension
           | syntax in Haskell (sort of like a generalization of
           | async/await syntax), and so on. This quote in particular
           | seems exciting:
           | 
           |  _We aim to implement native type theory as a tool which
           | inputs the formal semantics of a language and outputs a type
           | system, which can then be used to condition an entire
           | codebase. Properly integrated, it could introduce expressive
           | type theory directly into everyday programming. To do so, we
           | plan to utilize the progress made in language specification_
           | 
           | [0] https://en.wikipedia.org/wiki/Topos
           | 
           | [1] https://arxiv.org/pdf/2102.04672.pdf
        
           | creata wrote:
           | But don't most dependent type theories have a restrictive
           | notion of equality (e.g. without extensional equality of
           | functions) making many things that we would _like_ to be
           | monoids not fit that definition? Anyway, I agree with your
              | general point: dependent types are _extremely_ simple, and
              | make so many things more elegant (type-safe printf is the
              | common example) that I'm surprised they aren't more popular.
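A sketch of the printf point, under stated assumptions: with dependent types the argument types are computed from the format string itself, which plain Haskell cannot do; a type class can at least make a variadic, fully type-checked builder. `StrBuilder` and `sprintf` are invented names, and this simplified version ignores format strings entirely:

```haskell
{-# LANGUAGE FlexibleInstances #-}
-- Variadic, type-checked string building: the number of arguments is
-- driven by the result type the caller demands.
class StrBuilder r where
  build :: String -> r

-- Base case: no more arguments, return the accumulated string.
instance StrBuilder String where
  build = id

-- Inductive case: consume one showable argument, then recurse.
instance (Show a, StrBuilder r) => StrBuilder (a -> r) where
  build acc x = build (acc ++ show x)

-- sprintf-like entry point: `sprintf 1 True :: String` type-checks, and
-- every argument must have a Show instance or compilation fails.
sprintf :: StrBuilder r => r
sprintf = build ""
```

The dependently-typed version goes further: `printf "%d %s"` would itself have type `Int -> String -> String`, with the arity and argument types read off the format string.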
        
             | siraben wrote:
             | Hm yes. For instance the monoid of endofunctions on S would
             | require one to prove that f . id = id . f = f. In vanilla
             | dependent types this needs the extensionality axiom added,
             | but in category theory it definitely is fine to speak of
             | equality between morphisms with the same source and target.
        
         | lmm wrote:
         | That's just an example of how we naturally use dependent types
         | in everyday mathematics: the set of monoids is the set of
         | triplets of a set M, a function m _from that M^2 to that M_ ,
         | and an element 1 _of that set M_ such that the given laws hold.
         | You can only define the type of monoids (members of the set of
         | monoids) dependently, because 1 isn 't just an element of any
         | set, it's an element of the specific set M.
         | 
         | (Groups might have been a better example than monoids, given
         | that groups are more familiar to mathematicians and the point
         | is to emphasise that this is normal everyday mathematics).
        
           | tomp wrote:
            | Not really: Rossberg's 1ML / F-ing modules provide a way to
            | elaborate such types to plain System F types, i.e.
            | existentials, not dependent types.
            | 
            | The type of such a monoid would be:                   exists[T]
            | . (M : [= T], f : T -> T -> T, e : T)
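For comparison, a hedged Haskell transliteration of such an existential package (not from the comment; the names are made up, and a `show`-like observer field is added so the hidden carrier is usable at all):

```haskell
{-# LANGUAGE ExistentialQuantification #-}
-- The carrier type t is abstract outside the package: clients can
-- combine and observe elements but never learn the concrete type.
data SomeMonoid = forall t. SomeMonoid
  (t -> t -> t)   -- the binary operation f
  t               -- the unit e
  (t -> String)   -- an observer, so results can escape the package

-- Pack (Int, +, 0) behind the existential.
addMonoid :: SomeMonoid
addMonoid = SomeMonoid ((+) :: Int -> Int -> Int) 0 show

-- A client: it can only use what the package exports.
squareUnit :: SomeMonoid -> String
squareUnit (SomeMonoid f e sh) = sh (f e e)
```

This captures the "existentials, not dependent types" point: the type says nothing about the monoid laws, it only hides the carrier.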
        
             | GregarianChild wrote:
             | I think you need HKTs, in other words _System F-omega_ , to
             | carry out Rossberg's idea.
        
               | tomp wrote:
               | Only for applicative functors (i.e. OCaml's version,
               | where Set[int].t == Set[int].t), which 1ML models as
               | _pure_ functions.
        
             | lmm wrote:
             | Sure, but that's distinctly not the way you'd write or
             | think about that definition in everyday mathematics. It
             | encodes the definition but it doesn't really express it.
        
         | tom_mellior wrote:
         | The core sentence of the post comes a bit later, just above the
         | "Conclusion" section header: "This type theory is used in proof
         | assistants such as Coq and Agda; eventually, this expressive
         | logic + dependent types will be utilized in many programming
         | languages." Note the "is used" part, which is in the present
         | tense.
         | 
         | My understanding is that nothing new was constructed here.
          | Maybe the well-known theory of dependent types has been
          | _re_constructed _in a new way_ based on categories; this may
          | be exciting to fans of category theory, but I don't see how
          | it advances programming languages in any way. The actual
          | paper isn't any more helpful either: it name-drops JavaScript
          | in the introduction and at the very end, but doesn't really
          | do anything with it; it certainly doesn't show any kind of
          | model of a type system for even a trivial JavaScript subset.
        
       ___________________________________________________________________
       (page generated 2021-02-20 23:02 UTC)